Cross-modal Guiding Neural Network for Multimodal Emotion Recognition from EEG and Eye Movement Signals.
Article in English | MEDLINE | ID: mdl-38917288
ABSTRACT
Multimodal emotion recognition is attracting attention because integrating information from different sensory modalities can improve performance. Electroencephalogram (EEG) signals are considered objective indicators of emotion and provide precise insights, although their collection is complex. Eye movement signals, in contrast, are more susceptible to environmental and individual differences but are convenient to collect. Conventional emotion recognition methods typically use separate models for the different modalities, potentially overlooking their inherent connections. This study introduces a cross-modal guiding neural network designed to fully leverage the strengths of both modalities. The network includes a dual-branch feature extraction module that extracts features from the EEG and eye movement signals simultaneously, and a feature guidance module that uses the EEG features to direct eye movement feature extraction, reducing the impact of subjective factors. A feature reweighting module further explores emotion-related features within the eye movement signals, improving emotion classification accuracy. Experiments on both the SEED-IV dataset and our own collected dataset confirm the model's strong performance.
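For intuition only, the following is a minimal PyTorch sketch of how the described architecture might be wired together. The layer sizes, the gating formulation of the feature guidance module, and the squeeze-and-excitation-style reweighting are all assumptions; the abstract does not specify the actual design, and the feature dimensions (310 for EEG, 33 for eye movement, 4 emotion classes) merely mirror typical SEED-IV-style setups.

```python
import torch
import torch.nn as nn

class CrossModalGuidingNet(nn.Module):
    """Hypothetical sketch of the abstract's architecture.

    All dimensions and layer choices are assumptions, not the
    authors' actual design.
    """
    def __init__(self, eeg_dim=310, eye_dim=33, hidden=128, n_classes=4):
        super().__init__()
        # Dual-branch feature extraction: one encoder per modality.
        self.eeg_branch = nn.Sequential(nn.Linear(eeg_dim, hidden), nn.ReLU())
        self.eye_branch = nn.Sequential(nn.Linear(eye_dim, hidden), nn.ReLU())
        # Feature guidance (assumed form): EEG features produce a gate
        # that modulates the eye-movement features.
        self.guide = nn.Sequential(nn.Linear(hidden, hidden), nn.Sigmoid())
        # Feature reweighting (assumed form): channel-wise attention over
        # the eye features, in the spirit of squeeze-and-excitation.
        self.reweight = nn.Sequential(
            nn.Linear(hidden, hidden // 4), nn.ReLU(),
            nn.Linear(hidden // 4, hidden), nn.Sigmoid(),
        )
        self.classifier = nn.Linear(hidden * 2, n_classes)

    def forward(self, eeg, eye):
        f_eeg = self.eeg_branch(eeg)          # EEG features
        f_eye = self.eye_branch(eye)          # eye-movement features
        f_eye = f_eye * self.guide(f_eeg)     # EEG-guided gating
        f_eye = f_eye * self.reweight(f_eye)  # emotion-related reweighting
        return self.classifier(torch.cat([f_eeg, f_eye], dim=-1))

# Usage on random tensors shaped like per-sample feature vectors.
model = CrossModalGuidingNet()
logits = model(torch.randn(8, 310), torch.randn(8, 33))
print(logits.shape)  # torch.Size([8, 4])
```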

Full text: 1 Collection: 01-international Database: MEDLINE Language: English Journal: IEEE J Biomed Health Inform Year: 2024 Document type: Article Country of publication: United States
