eSEE-d: Emotional State Estimation Based on Eye-Tracking Dataset.
Skaramagkas, Vasileios; Ktistakis, Emmanouil; Manousos, Dimitris; Kazantzaki, Eleni; Tachos, Nikolaos S; Tripoliti, Evanthia; Fotiadis, Dimitrios I; Tsiknakis, Manolis.
Affiliation
  • Skaramagkas V; Institute of Computer Science, Foundation for Research and Technology Hellas (FORTH), GR-700 13 Heraklion, Greece.
  • Ktistakis E; Department of Electrical and Computer Engineering, Hellenic Mediterranean University, GR-710 04 Heraklion, Greece.
  • Manousos D; Institute of Computer Science, Foundation for Research and Technology Hellas (FORTH), GR-700 13 Heraklion, Greece.
  • Kazantzaki E; Laboratory of Optics and Vision, School of Medicine, University of Crete, GR-710 03 Heraklion, Greece.
  • Tachos NS; Institute of Computer Science, Foundation for Research and Technology Hellas (FORTH), GR-700 13 Heraklion, Greece.
  • Tripoliti E; Institute of Computer Science, Foundation for Research and Technology Hellas (FORTH), GR-700 13 Heraklion, Greece.
  • Fotiadis DI; Biomedical Research Institute, Foundation for Research and Technology Hellas (FORTH), GR-451 10 Ioannina, Greece.
  • Tsiknakis M; Department of Materials Science and Engineering, Unit of Medical Technology and Intelligent Information Systems, University of Ioannina, GR-451 10 Ioannina, Greece.
Brain Sci; 13(4), 2023 Mar 30.
Article in English | MEDLINE | ID: mdl-37190554
ABSTRACT
Affective state estimation is a research field that has gained increased attention from the research community in the last decade. Two of the main catalysts for this are the advancement of data analysis using artificial intelligence and the availability of high-quality video. Unfortunately, benchmarks and public datasets are limited, making the development of new methodologies and the implementation of comparative studies essential. The current work presents the eSEE-d database, a resource for emotional State Estimation based on Eye-tracking data. Eye movements of 48 participants were recorded as they watched 10 emotion-evoking videos, each followed by a neutral video. Participants rated four emotions (tenderness, anger, disgust, sadness) on a scale from 0 to 10, and these ratings were later translated into levels of emotional arousal and valence. Furthermore, each participant completed three self-assessment questionnaires. An extensive analysis of the participants' self-assessment questionnaire scores, as well as of their ratings during the experiments, is presented. Moreover, eye and gaze features were extracted from the low-level recorded eye metrics, and their correlations with the participants' ratings are investigated. Finally, we take on the challenge of classifying arousal and valence levels based solely on eye and gaze features, with promising results. In particular, the Deep Multilayer Perceptron (DMLP) network we developed achieved an accuracy of 92% in distinguishing positive from non-positive valence and 81% in distinguishing low from medium arousal. The dataset is made publicly available.
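To make the classification setup concrete, the following is a minimal sketch, not the authors' implementation, of binary valence classification from eye and gaze features with a multilayer perceptron. The feature layout, network shape, and synthetic data are all assumptions for illustration; it does not reproduce the published DMLP or its 92%/81% accuracies.

```python
# Hypothetical sketch of MLP-based valence classification from eye/gaze
# features, in the spirit of the DMLP described in the abstract.
# Feature columns and labels are synthetic placeholders, NOT eSEE-d data.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# Placeholder feature matrix: one row per trial, columns standing in for
# features such as fixation duration, saccade amplitude, blink rate,
# and pupil diameter. 480 rows = 48 participants x 10 videos (illustrative).
X = rng.normal(size=(480, 12))
y = rng.integers(0, 2, size=480)  # 1 = positive valence, 0 = non-positive

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=0
)

# Standardize features, then fit a small "deep" MLP with two hidden layers.
scaler = StandardScaler().fit(X_train)
clf = MLPClassifier(hidden_layer_sizes=(64, 32), activation="relu",
                    max_iter=500, random_state=0)
clf.fit(scaler.transform(X_train), y_train)

print("test accuracy:", accuracy_score(y_test, clf.predict(scaler.transform(X_test))))
```

In practice, the synthetic arrays would be replaced with per-trial eye and gaze features extracted from the eSEE-d recordings and with binary valence labels derived from the participants' ratings.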
Full text: 1 Collections: 01-international Database: MEDLINE Language: En Journal: Brain Sci Publication year: 2023 Document type: Article Country of affiliation: Greece
