Enhancing EEG Foundation Models via Dual-Branch Self-Distillation With Bi-Pretext Tasks
| Publication year: | ProQuest Dissertations and Theses (2025) |
|---|---|
| First author: | |
| Publisher: | ProQuest Dissertations & Theses |
| Subject: | |
| Online access: | Citation/Abstract, Full Text - PDF |
| Abstract: | We present a dual-branch self-supervised learning framework for EEG representation learning, combining masked reconstruction and clustering-based objectives. Evaluated across five diverse downstream tasks, our method achieves state-of-the-art performance under both linear probing and fine-tuning protocols. Ablation and visualization analyses confirm the robustness and transferability of the learned features. Our approach offers a promising foundation for future advances in general-purpose EEG analysis. |
| ISBN: | 9798315778073 |
| Source: | ProQuest Dissertations & Theses Global |
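
The record provides no implementation details beyond the abstract, but the described combination of two pretext tasks (masked reconstruction and a clustering-based objective) in a dual-branch self-distillation setup can be sketched in broad strokes. The PyTorch sketch below is purely illustrative and is not the dissertation's method: the encoder (`TinyEEGEncoder`), the zero-masking scheme, the shared prototype head, and all hyperparameters (patch length, mask ratio, temperatures, EMA momentum) are assumptions.

```python
# Hypothetical sketch of a dual-branch, bi-pretext-task objective for EEG:
# a student branch reconstructs masked patches while an EMA teacher branch
# supplies soft cluster targets (SwAV/DINO-style). Illustrative only.
import copy
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyEEGEncoder(nn.Module):
    """Stand-in encoder: (batch, channels, time) -> per-patch embeddings."""
    def __init__(self, n_channels=19, patch_len=50, dim=128):
        super().__init__()
        self.patch_len = patch_len
        self.proj = nn.Linear(n_channels * patch_len, dim)
        layer = nn.TransformerEncoderLayer(dim, nhead=4, batch_first=True)
        self.mixer = nn.TransformerEncoder(layer, num_layers=2)

    def forward(self, x):
        b, c, t = x.shape
        p = self.patch_len
        # Slice the signal into non-overlapping patches along time.
        patches = x.unfold(-1, p, p).permute(0, 2, 1, 3).reshape(b, -1, c * p)
        return self.mixer(self.proj(patches))  # (b, n_patches, dim)

class DualBranchSSL(nn.Module):
    """Student reconstructs masked patches; an EMA teacher provides soft
    cluster targets over learned prototypes. All choices are assumptions."""
    def __init__(self, n_channels=19, patch_len=50, dim=128, n_clusters=64):
        super().__init__()
        self.student = TinyEEGEncoder(n_channels, patch_len, dim)
        self.teacher = copy.deepcopy(self.student)
        for pt in self.teacher.parameters():
            pt.requires_grad_(False)
        self.recon_head = nn.Linear(dim, n_channels * patch_len)
        self.prototypes = nn.Linear(dim, n_clusters, bias=False)

    @torch.no_grad()
    def ema_update(self, momentum=0.99):
        # Self-distillation: the teacher is an exponential moving average
        # of the student, never trained by backprop directly.
        for ps, pt in zip(self.student.parameters(), self.teacher.parameters()):
            pt.mul_(momentum).add_(ps.detach(), alpha=1.0 - momentum)

    def forward(self, x, mask_ratio=0.5, temp_student=0.1, temp_teacher=0.04):
        b, c, t = x.shape
        p = self.student.patch_len
        assert t % p == 0, "sketch assumes time length divisible by patch_len"
        n = t // p
        mask = torch.rand(b, n, device=x.device) < mask_ratio  # True = masked

        # Student sees the masked view (masked time segments zeroed out);
        # the frozen teacher sees the full, unmasked signal.
        seg = mask.repeat_interleave(p, dim=1)  # (b, t)
        z_s = self.student(x * (~seg).unsqueeze(1).to(x.dtype))
        with torch.no_grad():
            z_t = self.teacher(x)

        # Pretext task 1: reconstruct the raw content of the masked patches.
        target = x.unfold(-1, p, p).permute(0, 2, 1, 3).reshape(b, n, c * p)
        loss_rec = F.mse_loss(self.recon_head(z_s)[mask], target[mask])

        # Pretext task 2: match the student's cluster assignments to the
        # teacher's sharper (lower-temperature) assignments.
        logit_s = self.prototypes(F.normalize(z_s, dim=-1)) / temp_student
        with torch.no_grad():
            logit_t = self.prototypes(F.normalize(z_t, dim=-1)) / temp_teacher
            soft_t = F.softmax(logit_t, dim=-1)
        loss_clu = -(soft_t * F.log_softmax(logit_s, dim=-1)).sum(-1).mean()
        return loss_rec + loss_clu
```

A minimal usage loop under the same assumptions: the asymmetric temperatures and the stop-gradient on the teacher branch are the standard self-distillation ingredients that keep the clustering objective from collapsing to a single prototype.

```python
model = DualBranchSSL()
x = torch.randn(8, 19, 1000)   # 8 EEG windows: 19 channels, 1000 samples
loss = model(x)
loss.backward()
model.ema_update()             # move the teacher toward the student
```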