Enhancing EEG Foundation Models via Dual-Branch Self-Distillation With Bi-Pretext Tasks
| Published in: | ProQuest Dissertations and Theses (2025) |
|---|---|
| Main author: | |
| Publication: | ProQuest Dissertations & Theses |
| Subjects: | |
| Online access: | Citation/Abstract; Full Text - PDF |
| Abstract: | We present a dual-branch self-supervised learning framework for EEG representation learning, combining masked reconstruction and clustering-based objectives. Evaluated across five diverse downstream tasks, our method achieves state-of-the-art performance under both linear probing and fine-tuning protocols. Ablation and visualization analyses confirm the robustness and transferability of the learned features. Our approach offers a promising foundation for future advances in general-purpose EEG analysis. |
| ISBN: | 9798315778073 |
| Source: | ProQuest Dissertations & Theses Global |
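
The record gives no implementation details, but the title and abstract name the overall recipe: a dual-branch (student/teacher) self-distillation setup trained with two pretext tasks, masked reconstruction and a clustering-based objective. The sketch below is only an illustration of that general idea under assumed choices; the layer sizes, EMA teacher update, masking scheme, loss weighting, and all class and parameter names are assumptions, not the dissertation's actual method.

```python
# Illustrative sketch only: architecture details, masking scheme, and loss
# weights are assumptions; the record does not describe the authors' code.
import copy
import torch
import torch.nn as nn
import torch.nn.functional as F

class EEGEncoder(nn.Module):
    """Toy encoder: per-timestep channel projection followed by a small Transformer."""
    def __init__(self, n_channels=19, d_model=64):
        super().__init__()
        self.proj = nn.Linear(n_channels, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.backbone = nn.TransformerEncoder(layer, num_layers=2)

    def forward(self, x):                     # x: (batch, time, channels)
        return self.backbone(self.proj(x))    # (batch, time, d_model)

class DualBranchSSL(nn.Module):
    """Student branch is trained; teacher branch is an EMA copy (self-distillation)."""
    def __init__(self, n_channels=19, d_model=64, n_clusters=32):
        super().__init__()
        self.student = EEGEncoder(n_channels, d_model)
        self.teacher = copy.deepcopy(self.student)          # updated only by EMA
        for p in self.teacher.parameters():
            p.requires_grad = False
        self.recon_head = nn.Linear(d_model, n_channels)    # pretext 1: masked reconstruction
        self.cluster_head = nn.Linear(d_model, n_clusters)  # pretext 2: cluster assignment

    @torch.no_grad()
    def ema_update(self, momentum=0.99):
        for ps, pt in zip(self.student.parameters(), self.teacher.parameters()):
            pt.mul_(momentum).add_(ps.detach(), alpha=1 - momentum)

    def forward(self, x, mask_ratio=0.5, temp=0.1):
        # Randomly mask whole timesteps; only the student sees the masked input.
        mask = torch.rand(x.shape[:2], device=x.device) < mask_ratio   # (batch, time)
        x_masked = x.masked_fill(mask.unsqueeze(-1), 0.0)

        z_student = self.student(x_masked)
        with torch.no_grad():
            z_teacher = self.teacher(x)                                # clean signal
            p_teacher = F.softmax(self.cluster_head(z_teacher) / temp, dim=-1)

        # Pretext 1: reconstruct the original signal at the masked positions.
        recon = self.recon_head(z_student)
        loss_recon = F.mse_loss(recon[mask], x[mask])

        # Pretext 2: match the teacher's soft cluster assignments (self-distillation).
        logp_student = F.log_softmax(self.cluster_head(z_student) / temp, dim=-1)
        loss_cluster = F.kl_div(logp_student, p_teacher, reduction="batchmean")

        return loss_recon + loss_cluster

# Minimal usage: one optimisation step on a random stand-in for an EEG batch.
model = DualBranchSSL()
opt = torch.optim.AdamW(model.parameters(), lr=1e-3)
x = torch.randn(8, 128, 19)        # (batch, time, channels)
opt.zero_grad()
loss = model(x)
loss.backward()
opt.step()
model.ema_update()
```

After pretraining in this style, the teacher or student encoder would be evaluated on downstream EEG tasks via linear probing or fine-tuning, as the abstract describes.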