A robust multi-branch multi-attention-mechanism EEGNet for motor imagery BCI decoding.
J Neurosci Methods; 405: 110108, 2024 May.
Article in English | MEDLINE | ID: mdl-38458260
ABSTRACT
BACKGROUND:
Motor-Imagery-based Brain-Computer Interface (MI-BCI) is a promising technology for assisting communication, movement, and neurological rehabilitation in motor-impaired individuals. Electroencephalography (EEG) decoding techniques based on deep learning (DL) offer notable advantages through automatic feature extraction and end-to-end learning. However, DL-based EEG decoding models tend to show large performance variations due to the intersubject variability of EEG, which stems from inconsistencies in different subjects' optimal hyperparameters.
NEW METHODS:
This study proposes a multi-branch multi-attention-mechanism EEGNet model (MBMANet) for robust decoding. It applies a multi-branch EEGNet structure to extract diverse features, and the different attention mechanisms introduced in each branch provide distinct adaptive weight adjustments. This combination of multiple branches and multiple attention mechanisms enables multi-level feature fusion, yielding robust decoding across different subjects.
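The paper's implementation is not reproduced in this record; the following is a minimal PyTorch sketch of the multi-branch, multi-attention idea described above. The branch count, temporal kernel lengths, attention module (a squeeze-and-excitation-style channel attention), and layer sizes are illustrative assumptions, not values taken from the paper.

```python
# Minimal sketch (not the authors' released code) of a multi-branch
# EEGNet-style model where branches use different temporal kernels and
# attention modules, and their features are fused for classification.
import torch
import torch.nn as nn


class SEAttention(nn.Module):
    """Squeeze-and-excitation-style channel attention (assumed variant)."""
    def __init__(self, channels, reduction=4):
        super().__init__()
        self.fc = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(channels, channels // reduction), nn.ReLU(),
            nn.Linear(channels // reduction, channels), nn.Sigmoid(),
        )

    def forward(self, x):                      # x: (B, C, 1, T)
        w = self.fc(x).view(x.size(0), -1, 1, 1)
        return x * w                           # reweight feature maps


class EEGNetBranch(nn.Module):
    """One EEGNet-style branch: temporal conv -> depthwise spatial conv."""
    def __init__(self, n_chans, f1=8, d=2, kern_len=64, attention=None):
        super().__init__()
        self.block = nn.Sequential(
            nn.Conv2d(1, f1, (1, kern_len), padding=(0, kern_len // 2), bias=False),
            nn.BatchNorm2d(f1),
            nn.Conv2d(f1, f1 * d, (n_chans, 1), groups=f1, bias=False),  # depthwise
            nn.BatchNorm2d(f1 * d),
            nn.ELU(),
            nn.AvgPool2d((1, 8)),
            nn.Dropout(0.5),
        )
        self.attention = attention or nn.Identity()

    def forward(self, x):                      # x: (B, 1, n_chans, T)
        return self.attention(self.block(x))


class MBMANetSketch(nn.Module):
    """Parallel branches with different kernels/attention, fused by concat."""
    def __init__(self, n_chans=22, n_times=1000, n_classes=4):
        super().__init__()
        self.branches = nn.ModuleList([
            EEGNetBranch(n_chans, kern_len=32, attention=SEAttention(16)),
            EEGNetBranch(n_chans, kern_len=64, attention=SEAttention(16)),
            EEGNetBranch(n_chans, kern_len=96),   # plain branch, no attention
        ])
        with torch.no_grad():                  # infer fused feature size
            feats = self._fuse(torch.zeros(1, 1, n_chans, n_times))
        self.classifier = nn.Linear(feats.size(1), n_classes)

    def _fuse(self, x):
        return torch.cat([b(x).flatten(1) for b in self.branches], dim=1)

    def forward(self, x):                      # x: (B, 1, n_chans, T)
        return self.classifier(self._fuse(x))


model = MBMANetSketch()
logits = model(torch.randn(8, 1, 22, 1000))    # 8 trials, 22 channels, 1000 samples
print(logits.shape)                            # torch.Size([8, 4])
```

In this sketch, each branch filters the same raw trial at a different temporal scale, and the attention modules reweight each branch's feature maps before the fused features reach the classifier; how closely this mirrors the paper's exact branch and attention design is an assumption.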
RESULTS:
The MBMANet model achieves a four-class accuracy of 83.18% and a kappa of 0.776 on the BCI Competition IV-2a dataset, outperforming eight other CNN-based decoding models. This consistently satisfactory performance across all nine subjects indicates that the proposed model is robust.
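For reference, the two reported metrics (accuracy and Cohen's kappa) can be computed from true and predicted class labels; a brief, generic illustration with scikit-learn on dummy labels, not the paper's evaluation script:

```python
# Generic computation of the two reported metrics on dummy 4-class labels.
from sklearn.metrics import accuracy_score, cohen_kappa_score

y_true = [0, 1, 2, 3, 0, 1, 2, 3, 0, 1]   # true motor-imagery classes
y_pred = [0, 1, 2, 3, 0, 1, 2, 1, 0, 3]   # model predictions

print(accuracy_score(y_true, y_pred))      # fraction of correct predictions
print(cohen_kappa_score(y_true, y_pred))   # chance-corrected agreement
```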
CONCLUSIONS:
The combination of multi-branch and multi-attention mechanisms empowers DL-based models to adaptively learn different EEG features, providing a feasible solution for dealing with data variability. It also gives the MBMANet model more accurate decoding of motion intentions and lower training costs, improving the MI-BCI's utility and robustness.
Full text:
1
Collection:
01-internacional
Database:
MEDLINE
Main subject:
Brain-Computer Interfaces
Limit:
Humans
Language:
En
Journal:
J Neurosci Methods
Year:
2024
Document type:
Article