Enhancing EEG Foundation Models via Dual-Branch Self-Distillation With Bi-Pretext Tasks

Saved in:
Bibliographic Details
Published in: ProQuest Dissertations and Theses (2025)
Main Author: Hung, Wei-Lun Allen
Published:
ProQuest Dissertations & Theses
Subjects: Computer science; Computer engineering; Information technology
Online Access: Citation/Abstract
Full Text - PDF

MARC

LEADER 00000nab a2200000uu 4500
001 3214379286
003 UK-CbPIL
020 |a 9798315778073 
035 |a 3214379286 
045 2 |b d20250101  |b d20251231 
084 |a 66569  |2 nlm 
100 1 |a Hung, Wei-Lun Allen 
245 1 |a Enhancing EEG Foundation Models via Dual-Branch Self-Distillation With Bi-Pretext Tasks 
260 |b ProQuest Dissertations & Theses  |c 2025 
513 |a Dissertation/Thesis 
520 3 |a We present a dual-branch self-supervised learning framework for EEG representation learning, combining masked reconstruction and clustering-based objectives. Evaluated across five diverse downstream tasks, our method achieves state-of-the-art performance under both linear probing and fine-tuning protocols. Ablation and visualization analyses confirm the robustness and transferability of the learned features. Our approach offers a promising foundation for future advances in general-purpose EEG analysis. 
653 |a Computer science 
653 |a Computer engineering 
653 |a Information technology 
773 0 |t ProQuest Dissertations and Theses  |g (2025) 
786 0 |d ProQuest  |t ProQuest Dissertations & Theses Global 
856 4 1 |3 Citation/Abstract  |u https://www.proquest.com/docview/3214379286/abstract/embedded/L8HZQI7Z43R0LA5T?source=fedsrch 
856 4 0 |3 Full Text - PDF  |u https://www.proquest.com/docview/3214379286/fulltextPDF/embedded/L8HZQI7Z43R0LA5T?source=fedsrch