Music emotion recognition based on temporal convolutional attention network using EEG.
Qiao, Yinghao; Mu, Jiajia; Xie, Jialan; Hu, Binghui; Liu, Guangyuan.
Affiliation
  • Qiao Y; School of Electronic and Information Engineering, Southwest University, Chongqing, China.
  • Mu J; Institute of Affective Computing and Information Processing, Southwest University, Chongqing, China.
  • Xie J; Chongqing Key Laboratory of Nonlinear Circuits and Intelligent Information Processing, Southwest University, Chongqing, China.
  • Hu B; School of Electronic and Information Engineering, Southwest University, Chongqing, China.
  • Liu G; Institute of Affective Computing and Information Processing, Southwest University, Chongqing, China.
Front Hum Neurosci ; 18: 1324897, 2024.
Article in English | MEDLINE | ID: mdl-38617132
ABSTRACT
Music is one of the primary means of evoking human emotion. However, the experience of music is subjective, making it difficult to determine which emotions a given piece of music triggers in a given individual. To correctly identify the emotions elicited by different types of music, we first created an electroencephalogram (EEG) dataset using four types of musical stimuli (fear, happiness, calm, and sadness). We then extracted differential entropy features from the EEG and built an emotion recognition model, CNN-SA-BiLSTM, which captures the temporal characteristics of the EEG; the global perception ability of the self-attention mechanism further improves recognition performance. Ablation experiments verified the effectiveness of the model. The method achieves classification accuracies of 93.45% and 96.36% in the valence and arousal dimensions, respectively. By applying the method to the publicly available DEAP EEG dataset, we evaluated its generalization and reliability. We also investigated the effects of individual EEG bands and multi-band combinations on music emotion recognition, and the results are consistent with related neuroscience studies. Compared with other representative music emotion recognition work, the method achieves better classification performance and provides a promising framework for future research on emotion recognition systems based on brain-computer interfaces.
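The abstract does not specify the exact architecture or hyperparameters, but the described pipeline (band-wise differential entropy features feeding a CNN, a self-attention layer, and a BiLSTM for valence/arousal classification) can be illustrated with a minimal sketch. Everything below is an assumption for illustration: the band definitions, sampling rate, layer sizes, head count, and the names differential_entropy and CNNSABiLSTM are hypothetical, and the differential entropy formula uses the standard Gaussian approximation DE = 0.5·log(2πe·σ²).

```python
# Minimal sketch (not the authors' code): band-wise differential entropy
# features followed by a CNN + self-attention + BiLSTM classifier.
# All layer sizes, band limits, and names are illustrative assumptions.
import numpy as np
import torch
import torch.nn as nn
from scipy.signal import butter, filtfilt

BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30), "gamma": (30, 45)}  # assumed bands

def differential_entropy(eeg, fs=250):
    """eeg: (channels, samples). Returns (channels, n_bands) DE features.
    Assumes each band-passed signal is approximately Gaussian, so
    DE = 0.5 * log(2 * pi * e * variance)."""
    feats = []
    for low, high in BANDS.values():
        b, a = butter(4, [low / (fs / 2), high / (fs / 2)], btype="band")
        filtered = filtfilt(b, a, eeg, axis=-1)
        feats.append(0.5 * np.log(2 * np.pi * np.e * np.var(filtered, axis=-1)))
    return np.stack(feats, axis=-1)

class CNNSABiLSTM(nn.Module):
    """CNN -> self-attention -> BiLSTM -> classifier (illustrative layout)."""
    def __init__(self, n_features, n_classes=2, hidden=64, heads=4):
        super().__init__()
        self.cnn = nn.Sequential(
            nn.Conv1d(n_features, hidden, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv1d(hidden, hidden, kernel_size=3, padding=1),
            nn.ReLU(),
        )
        self.attn = nn.MultiheadAttention(hidden, heads, batch_first=True)
        self.bilstm = nn.LSTM(hidden, hidden, batch_first=True, bidirectional=True)
        self.fc = nn.Linear(2 * hidden, n_classes)

    def forward(self, x):               # x: (batch, time windows, n_features)
        h = self.cnn(x.transpose(1, 2)).transpose(1, 2)  # local feature extraction
        h, _ = self.attn(h, h, h)       # global dependencies via self-attention
        h, _ = self.bilstm(h)           # bidirectional temporal modelling
        return self.fc(h[:, -1])        # logits for valence or arousal (binary here)

# Example: 32-channel EEG, DE over 4 bands -> 128 features per time window.
x = torch.randn(8, 10, 32 * len(BANDS))  # (batch, time windows, features)
print(CNNSABiLSTM(n_features=32 * len(BANDS))(x).shape)  # torch.Size([8, 2])
```

In this layout the convolutional stage learns local patterns over the DE feature vectors, the self-attention layer mixes information across all time windows, and the BiLSTM summarizes the sequence in both directions before classification, which mirrors the division of labor the abstract attributes to the CNN, self-attention, and BiLSTM components.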
Keywords

Full text: 1 Collection: 01-international Database: MEDLINE Language: English Journal: Front Hum Neurosci Year: 2024 Document type: Article Country of affiliation: China Country of publication: Switzerland