DC-ASTGCN: EEG Emotion Recognition Based on Fusion Deep Convolutional and Adaptive Spatio-temporal Graph Convolutional Networks.
IEEE J Biomed Health Inform; PP. 2024 Sep 05.
Article in En | MEDLINE | ID: mdl-39236139
ABSTRACT
Thanks to advancements in artificial intelligence and brain-computer interface (BCI) research, emotion recognition techniques based on the electroencephalogram (EEG) have recently attracted increasing attention. The complexity of EEG data makes it challenging to classify emotions accurately by integrating time-, frequency-, and spatial-domain features. To address this challenge, this paper proposes a fusion model called DC-ASTGCN, which combines the strengths of a deep convolutional neural network (DCNN) and an adaptive spatio-temporal graph convolutional neural network (ASTGCN) to comprehensively analyze and understand EEG signals. The DCNN focuses on extracting frequency-domain and local spatial features from EEG signals to identify brain region activity patterns, while the ASTGCN, with its spatio-temporal attention mechanism and adaptive brain topology layer, reveals the functional connectivity between brain regions in different emotional states. This integration significantly enhances the model's ability to understand and recognize emotional states. Extensive experiments conducted on the DEAP and SEED datasets demonstrate that the DC-ASTGCN model outperforms existing state-of-the-art methods in terms of emotion recognition accuracy.
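To make the "adaptive brain topology" idea concrete, the sketch below shows one common way such a layer can be realized: a graph convolution over EEG electrodes whose adjacency matrix is a learnable parameter rather than a fixed montage. This is a minimal illustration under assumed layer sizes and names (AdaptiveGraphConv, adj_logits, the softmax normalization, and the DEAP-style 32-electrode example are all assumptions), not the authors' released DC-ASTGCN implementation, which additionally includes the DCNN branch and spatio-temporal attention.

```python
# Minimal sketch of an adaptive graph convolution layer with a learnable
# electrode-to-electrode adjacency matrix (hypothetical stand-in for the
# paper's adaptive brain topology layer).
import torch
import torch.nn as nn
import torch.nn.functional as F


class AdaptiveGraphConv(nn.Module):
    """Graph convolution whose adjacency matrix is learned end-to-end."""

    def __init__(self, num_nodes: int, in_features: int, out_features: int):
        super().__init__()
        # One learnable weight per electrode pair; softmax in forward()
        # turns these logits into a row-normalized soft adjacency.
        self.adj_logits = nn.Parameter(torch.randn(num_nodes, num_nodes))
        self.linear = nn.Linear(in_features, out_features)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, num_nodes, in_features), e.g. per-electrode band-power features.
        adj = F.softmax(self.adj_logits, dim=-1)   # learned connectivity, rows sum to 1
        x = torch.einsum("ij,bjf->bif", adj, x)    # aggregate features from "connected" electrodes
        return F.relu(self.linear(x))


if __name__ == "__main__":
    # Example: 32 electrodes (DEAP montage size), 5 frequency-band features each.
    layer = AdaptiveGraphConv(num_nodes=32, in_features=5, out_features=16)
    eeg_features = torch.randn(8, 32, 5)           # batch of 8 samples
    print(layer(eeg_features).shape)               # torch.Size([8, 32, 16])
```

Because the adjacency is trained jointly with the classifier, the layer can discover functional connectivity patterns that differ across emotional states instead of relying on a fixed physical electrode layout.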
Full text:
1
Collections:
01-internacional
Database:
MEDLINE
Language:
En
Journal:
IEEE J Biomed Health Inform
Publication year:
2024
Document type:
Article
Country of publication:
United States