Multimodal emotion recognition using EEG and eye tracking data.
Article in En | MEDLINE | ID: mdl-25571125
This paper presents a new emotion recognition method which combines electroencephalograph (EEG) signals and pupillary response collected from an eye tracker. We select 15 emotional film clips of 3 categories (positive, neutral and negative). The EEG signals and eye tracking data of five participants are recorded simultaneously while they watch these videos. We extract emotion-relevant features from the EEG signals and eye tracking data of 12 experiments and build a fusion model to improve the performance of emotion recognition. The best average accuracies based on EEG signals and eye tracking data alone are 71.77% and 58.90%, respectively. We achieve average accuracies of 73.59% and 72.98% for the feature-level fusion strategy and the decision-level fusion strategy, respectively. These results show that both feature-level and decision-level fusion of EEG signals and eye tracking data can improve the performance of the emotion recognition model.
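A minimal sketch of the two fusion strategies contrasted in the abstract, not the authors' implementation: it assumes pre-extracted per-trial feature matrices for EEG and pupillary response, uses synthetic data and a linear SVM, and combines per-modality class probabilities with a simple sum rule for decision-level fusion; all of these choices are assumptions for illustration only.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

# Synthetic stand-ins for extracted features (assumed shapes, not the paper's data).
rng = np.random.default_rng(0)
n_trials = 60
eeg_feats = rng.normal(size=(n_trials, 310))   # hypothetical EEG feature vectors per trial
eye_feats = rng.normal(size=(n_trials, 30))    # hypothetical pupillary-response features per trial
y = rng.integers(0, 3, size=n_trials)          # 3 emotion classes: negative / neutral / positive

idx_tr, idx_te = train_test_split(np.arange(n_trials), test_size=0.3, random_state=0)

# Feature-level fusion: concatenate both modalities, train a single classifier.
fused = np.hstack([eeg_feats, eye_feats])
clf_fused = SVC(kernel="linear").fit(fused[idx_tr], y[idx_tr])
acc_feature = clf_fused.score(fused[idx_te], y[idx_te])

# Decision-level fusion: train one classifier per modality, then combine
# their class-probability outputs (sum rule, an assumed combination scheme).
clf_eeg = SVC(kernel="linear", probability=True).fit(eeg_feats[idx_tr], y[idx_tr])
clf_eye = SVC(kernel="linear", probability=True).fit(eye_feats[idx_tr], y[idx_tr])
proba = clf_eeg.predict_proba(eeg_feats[idx_te]) + clf_eye.predict_proba(eye_feats[idx_te])
acc_decision = np.mean(clf_eeg.classes_[proba.argmax(axis=1)] == y[idx_te])

print(f"feature-level fusion accuracy: {acc_feature:.2f}")
print(f"decision-level fusion accuracy: {acc_decision:.2f}")
```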

Full text: 1 Database: MEDLINE Main subject: Emotions Study type: Prognostic_studies Limits: Adult / Female / Humans / Male Language: En Journal: Annu Int Conf IEEE Eng Med Biol Soc Publication year: 2014 Document type: Article