- 2025-06-06: Repository anonymized to comply with the triple-blind reviewing policy of the ICDM conference.
- 2025-09-21: Paper accepted at the AI4TS Workshop @ ICDM 2025. The paper appears in the proceedings, and a preprint is available on arXiv.
EEGM2 consists of two stages (a conceptual sketch follows this list):
- Self-supervised pretraining
- Downstream classification tasks
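Below is a minimal conceptual sketch of that two-stage idea in PyTorch. The module names, shapes, and losses here are hypothetical stand-ins for illustration only; the repo's actual entry points are `SelfSupervised.py` and `Downstream.py` (see the commands further down).

```python
# Conceptual sketch only -- hypothetical modules, NOT the EEGM2 API.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Stand-in for the EEGM2 backbone (hypothetical architecture)."""
    def __init__(self, channels=19, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(channels, hidden, kernel_size=7, padding=3),
            nn.GELU(),
            nn.Conv1d(hidden, hidden, kernel_size=7, padding=3),
        )

    def forward(self, x):          # x: (batch, channels, time)
        return self.net(x)         # -> (batch, hidden, time)

# Stage 1: self-supervised pretraining via signal reconstruction.
encoder = Encoder()
decoder = nn.Conv1d(64, 19, kernel_size=1)
opt = torch.optim.Adam(
    list(encoder.parameters()) + list(decoder.parameters()), lr=1e-3
)
x = torch.randn(8, 19, 1000)       # fake EEG batch: 8 trials, 19 ch, 1000 samples
opt.zero_grad()
nn.functional.mse_loss(decoder(encoder(x)), x).backward()
opt.step()

# Stage 2: a classification head on top of the pretrained encoder.
head = nn.Linear(64, 2)            # 2 classes, hypothetical
logits = head(encoder(x).mean(dim=-1))  # pool over time, then classify
```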
Pretrained models are provided in the `Pretrained_EEGM2/` folder.
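To inspect a provided checkpoint directly, something like the following should work, assuming standard PyTorch serialization; the file name below is a placeholder for an actual file in `Pretrained_EEGM2/`.

```python
# Sketch: inspecting a provided checkpoint (assumes standard PyTorch
# serialization; "checkpoint.pt" is a placeholder file name).
import torch

state = torch.load("Pretrained_EEGM2/checkpoint.pt", map_location="cpu")
print(type(state))  # typically a state_dict, or a dict wrapping one
```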
Dataset structure and preparation guidelines are described in the `Datasets/` folder.
Model parameters and training settings can be configured in `config.json` (for pretraining) and `config_downstream.json` (for downstream tasks).
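As a quick way to see which settings each file exposes, you can load them with Python's `json` module (a sketch; the actual key names are whatever the repo's JSON files define):

```python
# Sketch: listing the settings exposed by the two config files.
import json

for path in ("config.json", "config_downstream.json"):
    with open(path) as f:
        cfg = json.load(f)
    print(path, "->", sorted(cfg))  # sorted top-level key names
```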
These instructions assume a Unix-based system (e.g., Linux or macOS).
The code has been tested with Python 3.10.16. For Conda users, environment YAML files are provided in the `Env-requirement/` folder.
Create an environment from a specific file:

```bash
conda env create -f Env-requirement/Mamba2-env20250206.yml
```

Run self-supervised pretraining:

```bash
python SelfSupervised.py --config config.json
```

Run downstream classification:

```bash
python Downstream.py --config config_downstream.json
```

Balanced Accuracy and AUROC across varying input sequence lengths.
| Duration | Model | Balanced ACC | AUROC |
|---|---|---|---|
| 10 Seconds | EEGM2 (Light) | 79.14 ± 0.21 | 0.8559 ± 0.000 |
| 10 Seconds | EEGM2 (Fine) | 80.87 ± 0.54 | 0.8864 ± 0.000 |
| 30 Seconds | EEGM2 (Light) | 78.97 ± 0.25 | 0.8575 ± 0.000 |
| 30 Seconds | EEGM2 (Fine) | 81.71 ± 0.12 | 0.8932 ± 0.000 |
| 60 Seconds | EEGM2 (Light) | 76.94 ± 0.33 | 0.8257 ± 0.000 |
| 60 Seconds | EEGM2 (Fine) | 80.68 ± 0.45 | 0.8803 ± 0.000 |
| 100 Seconds | EEGM2 (Light) | 74.57 ± 0.27 | 0.7986 ± 0.000 |
| 100 Seconds | EEGM2 (Fine) | 81.08 ± 0.28 | 0.8869 ± 0.000 |