E-MFNN: an emotion-multimodal fusion neural network framework for emotion recognition.
Guo, Zhuen; Yang, Mingqing; Lin, Li; Li, Jisong; Zhang, Shuyue; He, Qianbo; Gao, Jiaqi; Meng, Heling; Chen, Xinran; Tao, Yuehao; Yang, Chen.
Affiliations
  • Guo Z; School of Mechanical Engineering, Guizhou University, Guiyang, Guizhou, China.
  • Yang M; School of Mechanical Engineering, Guizhou University, Guiyang, Guizhou, China.
  • Lin L; School of Mechanical Engineering, Guizhou University, Guiyang, Guizhou, China.
  • Li J; School of Mechanical Engineering, Guizhou University, Guiyang, Guizhou, China.
  • Zhang S; University of North Alabama, Florence, AL, United States.
  • He Q; North Alabama International College of Engineering and Technology, Guizhou University, Guiyang, Guizhou, China.
  • Gao J; University of North Alabama, Florence, AL, United States.
  • Meng H; North Alabama International College of Engineering and Technology, Guizhou University, Guiyang, Guizhou, China.
  • Chen X; University of North Alabama, Florence, AL, United States.
  • Tao Y; North Alabama International College of Engineering and Technology, Guizhou University, Guiyang, Guizhou, China.
  • Yang C; School of Mechanical Engineering, Guizhou University, Guiyang, Guizhou, China.
PeerJ Comput Sci; 10: e1977, 2024.
Article in English | MEDLINE | ID: mdl-38660191
ABSTRACT
Emotion recognition is a pivotal research domain in computer and cognitive science. Recent advancements have led to various emotion recognition methods, leveraging data from diverse sources such as speech, facial expressions, electroencephalogram (EEG), electrocardiogram, and eye tracking (ET). This article introduces a novel emotion recognition framework, primarily targeting the analysis of users' psychological reactions and stimuli. The stimuli eliciting emotional responses are as critical as the responses themselves. Hence, our approach combines stimulus data with physical and physiological signals, pioneering a multimodal method for emotional cognition. Our proposed framework unites stimulus source data with physiological signals, aiming to enhance the accuracy and robustness of emotion recognition through data integration. We conducted an emotional cognition experiment to gather EEG and ET data while recording emotional responses. Building on this, we developed the Emotion-Multimodal Fusion Neural Network (E-MFNN), optimized for multimodal data fusion to process both stimulus and physiological data. We conducted extensive comparisons between our framework's outcomes and those from existing models, and also assessed various algorithmic approaches within our framework. This comparison underscores our framework's efficacy in multimodal emotion recognition. The source code is publicly available at https://figshare.com/s/8833d837871c78542b29.
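The abstract describes fusing stimulus-source features with physiological signals (EEG and ET) before classification. As a rough illustration of that idea, the following is a minimal sketch of feature-level fusion by concatenation followed by a toy linear scoring head. All names, dimensions, and weights here are hypothetical, not taken from the E-MFNN implementation.

```python
# Minimal sketch of feature-level (early) fusion across three modalities:
# stimulus, EEG, and eye tracking (ET). Purely illustrative; the actual
# E-MFNN architecture is a neural network described in the paper.

def fuse_features(stimulus, eeg, et):
    """Concatenate per-modality feature vectors into one fused vector."""
    return list(stimulus) + list(eeg) + list(et)

def linear_scores(fused, weights, bias):
    """Toy linear head: one score per emotion class over the fused vector."""
    return [sum(w * x for w, x in zip(row, fused)) + b
            for row, b in zip(weights, bias)]

# Hypothetical example: 2 stimulus + 3 EEG + 2 ET features -> 7 fused dims.
fused = fuse_features([0.1, 0.2], [0.3, 0.1, 0.4], [0.5, 0.6])
scores = linear_scores(fused, weights=[[1.0] * 7, [0.5] * 7], bias=[0.0, 0.1])
```

In a real fusion network, each modality would first pass through its own feature extractor, and the fused representation would feed learned layers rather than fixed weights; concatenation is just the simplest point of integration.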
Full text: 1 | Database: MEDLINE | Language: English | Journal: PeerJ Comput Sci | Year: 2024 | Document type: Article