Self-Supervised Transformer Model Training for a Sleep-EEG Foundation Model.
bioRxiv; 2024 May 01.
Article in English | MEDLINE | ID: mdl-38293234
ABSTRACT
The American Academy of Sleep Medicine (AASM) recognizes five sleep/wake states (Wake, N1, N2, N3, REM), yet this classification schema provides only a high-level summary of sleep and likely overlooks important neurological or health information. New, data-driven approaches are needed to probe the information content of sleep signals more deeply. Here we present a self-supervised approach that learns the structure embedded in large quantities of neurophysiological sleep data. This masked transformer training procedure is inspired by high-performing self-supervised methods developed for speech transcription. We show that self-supervised pre-training matches or outperforms supervised sleep stage classification, especially when labeled data or compute power is limited. Perhaps more importantly, we also show that our pre-trained model is flexible and can be fine-tuned to perform well on new tasks, including distinguishing individuals and quantifying "brain age" (a potential health biomarker). This suggests that modern methods can automatically learn information that is potentially overlooked by the five-class sleep staging schema, laying the groundwork for new schemas and further data-driven exploration of sleep.
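The abstract does not give implementation details, but masked-transformer pre-training of the kind it describes generally follows the masked-prediction recipe used in speech models: cut each recording into fixed-length patches, hide a random subset, and train an encoder to reconstruct the hidden patches. The sketch below is an illustrative assumption, not the authors' code; every module name, dimension, and hyperparameter (patch_len, d_model, mask_ratio, and so on) is hypothetical.

```python
# Minimal sketch of masked self-supervised pre-training on EEG patches.
# Assumed setup: each 30-s epoch is split into fixed-length patches; a random
# subset of patch embeddings is replaced by a learned [MASK] token, and a
# transformer encoder is trained to reconstruct the masked patches.
import torch
import torch.nn as nn


class MaskedEEGPretrainer(nn.Module):
    def __init__(self, patch_len=200, d_model=128, n_layers=4, n_heads=4, max_patches=64):
        super().__init__()
        self.embed = nn.Linear(patch_len, d_model)             # patch -> token
        self.pos = nn.Parameter(torch.zeros(1, max_patches, d_model))  # learned positions
        self.mask_token = nn.Parameter(torch.zeros(d_model))   # learned [MASK] embedding
        enc_layer = nn.TransformerEncoderLayer(
            d_model, n_heads, dim_feedforward=4 * d_model, batch_first=True)
        self.encoder = nn.TransformerEncoder(enc_layer, n_layers)
        self.head = nn.Linear(d_model, patch_len)               # reconstruct raw patch

    def forward(self, patches, mask_ratio=0.5):
        # patches: (batch, n_patches, patch_len) raw EEG segments
        tokens = self.embed(patches) + self.pos[:, :patches.size(1)]
        mask = torch.rand(tokens.shape[:2], device=tokens.device) < mask_ratio
        tokens[mask] = self.mask_token                           # hide masked patches
        recon = self.head(self.encoder(tokens))
        # compute the loss only at masked positions, as in masked-prediction objectives
        return nn.functional.mse_loss(recon[mask], patches[mask])


# usage: one optimization step on a random stand-in batch of "EEG"
model = MaskedEEGPretrainer()
opt = torch.optim.AdamW(model.parameters(), lr=1e-4)
x = torch.randn(8, 30, 200)   # 8 epochs x 30 patches x 200 samples each
loss = model(x)
loss.backward()
opt.step()
```

After pre-training of this kind, the encoder (without the reconstruction head) would be fine-tuned with a small task-specific head, e.g. for sleep staging, individual identification, or brain-age regression, which is the transfer setting the abstract reports.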

Full text: 1 Collections: 01-international Database: MEDLINE Language: En Journal: BioRxiv Year of publication: 2024 Document type: Article