Insights from EEG analysis of evoked memory recalls using deep learning for emotion charting.
Sci Rep; 14(1): 17080, 2024 Jul 24.
Article in English | MEDLINE | ID: mdl-39048599
ABSTRACT
Affect recognition in a real-world, less constrained environment is the principal prerequisite for the industrial-level usefulness of this technology. Monitoring the psychological profile with smart, wearable electroencephalogram (EEG) sensors during daily activities, where emotions arise without external stimuli (e.g., memory-induced emotions), remains a challenging research gap in emotion recognition. This paper proposes a deep learning framework for improved memory-induced emotion recognition that combines a 1D-CNN and an LSTM as feature extractors with an Extreme Learning Machine (ELM) classifier. The proposed architecture, together with EEG preprocessing steps such as removing the average baseline signal from each sample and extracting the EEG rhythms (delta, theta, alpha, beta, and gamma), aims to capture repetitive and continuous patterns of memory-induced emotion, an area underexplored with deep learning techniques. This work analyzed EEG signals recorded with a wearable, ultra-mobile sports-cap EEG while participants recalled autobiographical emotional memories evoked by affect-denoting words, with self-annotation on the scales of valence and arousal. In extensive experiments on the same dataset, the proposed framework empirically outperforms existing techniques for the emerging area of memory-induced emotion recognition, reaching an accuracy of 65.6%. The rhythm-wise analysis achieved 65.5%, 52.1%, 65.1%, 64.6%, and 65.0% accuracy for the delta, theta, alpha, beta, and gamma bands, respectively, when classifying the four quadrants of valence and arousal. These results underscore the significant advancement achieved by the proposed method toward real-world memory-induced emotion recognition.
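The abstract does not give implementation details, so the following is a minimal, illustrative Python sketch of the kind of pipeline it describes: baseline removal and band-pass extraction of the EEG rhythms, a 1D-CNN followed by an LSTM as a feature extractor, and an Extreme Learning Machine classifier over the four valence-arousal quadrants. All function names, layer sizes, the sampling rate, the channel count, and other hyperparameters below are assumptions, not the authors' actual configuration.

# Illustrative sketch only: the paper's exact layer sizes, filter orders, and
# training details are not stated in the abstract; everything below is assumed.
import numpy as np
import torch
import torch.nn as nn
from scipy.signal import butter, filtfilt

# --- Preprocessing: remove average baseline, split EEG into rhythms ---------
BANDS = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13),
         "beta": (13, 30), "gamma": (30, 45)}

def extract_rhythm(eeg, band, fs=250):
    """Baseline-correct a (channels, samples) EEG epoch and band-pass it."""
    eeg = eeg - eeg.mean(axis=1, keepdims=True)   # remove average baseline
    low, high = BANDS[band]
    b, a = butter(4, [low / (fs / 2), high / (fs / 2)], btype="band")
    return filtfilt(b, a, eeg, axis=1)

# --- Feature extractor: 1D-CNN followed by LSTM -----------------------------
class CNNLSTMExtractor(nn.Module):
    def __init__(self, n_channels=8, cnn_out=32, lstm_hidden=64):
        super().__init__()
        self.cnn = nn.Sequential(
            nn.Conv1d(n_channels, cnn_out, kernel_size=7, padding=3),
            nn.ReLU(),
            nn.MaxPool1d(4),
        )
        self.lstm = nn.LSTM(cnn_out, lstm_hidden, batch_first=True)

    def forward(self, x):                 # x: (batch, channels, samples)
        z = self.cnn(x)                   # (batch, cnn_out, samples / 4)
        z = z.transpose(1, 2)             # (batch, time, features) for the LSTM
        _, (h, _) = self.lstm(z)
        return h[-1]                      # last hidden state as the feature vector

# --- Classifier: Extreme Learning Machine (random hidden layer + pseudo-inverse)
class ELM:
    def __init__(self, n_hidden=500, seed=0):
        self.n_hidden, self.rng = n_hidden, np.random.default_rng(seed)

    def fit(self, X, y, n_classes=4):
        self.W = self.rng.normal(size=(X.shape[1], self.n_hidden))
        self.b = self.rng.normal(size=self.n_hidden)
        H = np.tanh(X @ self.W + self.b)
        T = np.eye(n_classes)[y]               # one-hot targets (4 VA quadrants)
        self.beta = np.linalg.pinv(H) @ T      # closed-form output weights
        return self

    def predict(self, X):
        H = np.tanh(X @ self.W + self.b)
        return (H @ self.beta).argmax(axis=1)

# --- Toy end-to-end run on random data --------------------------------------
if __name__ == "__main__":
    fs, n_trials = 250, 16
    eeg = np.random.randn(n_trials, 8, fs * 4)          # 4-s epochs, 8 channels
    alpha = np.stack([extract_rhythm(e, "alpha", fs) for e in eeg])
    with torch.no_grad():
        feats = CNNLSTMExtractor()(torch.tensor(alpha, dtype=torch.float32)).numpy()
    labels = np.random.randint(0, 4, n_trials)          # 4 valence-arousal quadrants
    elm = ELM().fit(feats, labels)
    print("toy predictions:", elm.predict(feats))

The ELM is fit in closed form (random hidden weights, pseudo-inverse output weights), which is the defining trait of that classifier; how the authors actually combined the CNN and LSTM outputs before the ELM is not described in the abstract, so the sketch simply feeds the last LSTM hidden state forward.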
Full text: 1
Database: MEDLINE
Main subject: Mental Recall / Electroencephalography / Emotions / Deep Learning
Language: English
Publication year: 2024
Document type: Article