Instance-representation transfer method based on joint distribution and deep adaptation for EEG emotion recognition.
Zhu, Lei; Yu, Fei; Huang, Aiai; Ying, Nanjiao; Zhang, Jianhai.
Affiliation
  • Zhu L; School of Automation, Hangzhou Dianzi University, Hangzhou, 310000, China. zhulei@hdu.edu.cn.
  • Yu F; School of Automation, Hangzhou Dianzi University, Hangzhou, 310000, China.
  • Huang A; School of Automation, Hangzhou Dianzi University, Hangzhou, 310000, China.
  • Ying N; School of Automation, Hangzhou Dianzi University, Hangzhou, 310000, China.
  • Zhang J; Center for Drug Inspection of Zhejiang Province, Hangzhou, 310000, China.
Med Biol Eng Comput ; 62(2): 479-493, 2024 Feb.
Article in En | MEDLINE | ID: mdl-37914959
Electroencephalogram (EEG) emotion recognition technology is essential for improving human-computer interaction. However, its practical application is limited by variability across subjects and recording sessions. Transfer learning has been widely studied and applied to address this issue, with existing work concentrating on either instance transfer or representation transfer. This paper proposes an emotion recognition method called Joint Distributed Instances Represent Transfer (JD-IRT), which comprises two core components: Joint Distribution Deep Adaptation (JDDA) and Instance-Representation Transfer (I-RT). Unlike common representation transfer methods, JDDA bridges the discrepancies of the marginal and conditional distributions simultaneously and combines multiple adaptation layers and kernels for deep domain adaptation. I-RT, in turn, uses instance transfer to select source-domain data that support better representation transfer. We conducted experiments on the SEED, SEED-IV, and SEED-V datasets and compared the method against other representative approaches. In cross-subject experiments, our approach achieved average accuracies of 83.21% on SEED, 52.12% on SEED-IV, and 60.17% on SEED-V; in cross-session experiments, the accuracies were 91.29% on SEED, 59.02% on SEED-IV, and 65.91% on SEED-V. These results demonstrate that the proposed approach improves the accuracy of EEG emotion recognition.
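The abstract describes JDDA as aligning marginal and conditional distributions simultaneously using multiple kernels. As a rough illustration only, and not the paper's actual implementation, the following minimal PyTorch sketch shows a joint multi-kernel maximum mean discrepancy (MMD) loss of that general form; the Gaussian bandwidth family, the weighting factor `lam`, and the use of target pseudo-labels for the conditional term are all assumptions, since the abstract does not specify these details.

```python
import torch

def multi_kernel(x, y, sigmas=(1.0, 2.0, 4.0, 8.0)):
    # Sum of Gaussian (RBF) kernels over an assumed bandwidth family.
    d2 = torch.cdist(x, y) ** 2                     # pairwise squared distances
    return sum(torch.exp(-d2 / (2.0 * s * s)) for s in sigmas)

def mmd2(xs, xt, sigmas=(1.0, 2.0, 4.0, 8.0)):
    # Biased estimate of squared multi-kernel MMD between two feature batches.
    return (multi_kernel(xs, xs, sigmas).mean()
            + multi_kernel(xt, xt, sigmas).mean()
            - 2.0 * multi_kernel(xs, xt, sigmas).mean())

def joint_mmd_loss(fs, ys, ft, yt_pseudo, n_classes, lam=1.0):
    """Marginal MMD plus class-conditional MMD. The conditional term uses
    target pseudo-labels, a common surrogate when target labels are absent."""
    loss = mmd2(fs, ft)                             # marginal alignment
    for c in range(n_classes):                      # conditional alignment per class
        fs_c, ft_c = fs[ys == c], ft[yt_pseudo == c]
        if len(fs_c) > 1 and len(ft_c) > 1:
            loss = loss + lam * mmd2(fs_c, ft_c)
    return loss
```

In a deep adaptation setup of the kind the abstract sketches, a loss like this would typically be applied to the features of one or more adaptation layers of the network and added to the classification loss on the labeled source data.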

Full text: 1 Database: MEDLINE Main subject: Electroencephalography / Emotions Limits: Humans Language: En Publication year: 2024 Document type: Article