The blenderFace method: video-based measurement of raw movement data during facial expressions of emotion using open-source software.
Zinkernagel, Axel; Alexandrowicz, Rainer W; Lischetzke, Tanja; Schmitt, Manfred.
Affiliation
  • Zinkernagel A; Personality, Psychological Assessment, and Psychological Methods, University of Koblenz-Landau, Fortstr. 7, 76829, Landau, Germany. zinkernagel@uni-landau.de.
  • Alexandrowicz RW; Abteilung für Angewandte Psychologie und Methodenforschung, Alpen-Adria-Universität Klagenfurt, Universitätsstraße 65-67, 9020, Klagenfurt am Wörthersee, Austria.
  • Lischetzke T; Personality, Psychological Assessment, and Psychological Methods, University of Koblenz-Landau, Fortstr. 7, 76829, Landau, Germany.
  • Schmitt M; Personality, Psychological Assessment, and Psychological Methods, University of Koblenz-Landau, Fortstr. 7, 76829, Landau, Germany.
Behav Res Methods; 51(2): 747-768, 2019 Apr.
Article in English | MEDLINE | ID: mdl-30076534
This article proposes an optical measurement of movement applied to data from video recordings of facial expressions of emotion. The approach adapts motion-capture techniques from the film industry, in which markers placed on the skin of the face are tracked with a pattern-matching algorithm. The method records and postprocesses raw facial movement data (coordinates per frame) of distinctly placed markers and is intended for use in facial expression research (e.g., microexpressions) in laboratory settings. Due to the explicit use of specifically placed, artificial markers, the procedure offers the simultaneous measurement of several emotionally relevant markers in a (psychometrically) objective and artifact-free way, even for facial regions without natural landmarks (e.g., the cheeks). In addition, the proposed procedure is fully based on open-source software and is transparent at every step of data processing. Two worked examples demonstrate the practicability of the proposed procedure: In Study 1 (N = 39), the participants were instructed to show the emotions happiness, sadness, disgust, and anger, and in Study 2 (N = 113), they were asked to present both a neutral face and the emotions happiness, disgust, and fear. Study 2 involved the simultaneous tracking of 14 markers for approximately 12 min per participant with a time resolution of 33 ms. The measured facial movements corresponded closely to the assumptions of established measurement instruments (EMFACS, FACSAID; Friesen & Ekman, 1983; Ekman & Hager, 2002). In addition, the measurement was found to be highly precise, with sub-second, sub-pixel, and sub-millimeter accuracy.
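The core idea described in the abstract, tracking a skin-attached marker across video frames with a pattern-matching algorithm and recording its raw (row, column) coordinates per frame, can be illustrated with a minimal sketch. This is not the authors' blenderFace implementation; it assumes a simple sum-of-squared-differences (SSD) template matcher over plain 2-D pixel arrays, with hypothetical function names `match_marker` and `track`.

```python
def match_marker(frame, template):
    """Return (row, col) where the template best matches the frame,
    using sum-of-squared-differences (SSD) pattern matching.

    frame and template are 2-D lists of grayscale pixel intensities.
    """
    th, tw = len(template), len(template[0])
    fh, fw = len(frame), len(frame[0])
    best_ssd, best_pos = float("inf"), (0, 0)
    # Exhaustively slide the template over every valid position.
    for r in range(fh - th + 1):
        for c in range(fw - tw + 1):
            ssd = sum((frame[r + i][c + j] - template[i][j]) ** 2
                      for i in range(th) for j in range(tw))
            if ssd < best_ssd:
                best_ssd, best_pos = ssd, (r, c)
    return best_pos

def track(frames, template):
    """Raw movement data: the marker's best-match position per frame."""
    return [match_marker(f, template) for f in frames]
```

In practice, tracking 14 markers simultaneously (as in Study 2) would repeat this per marker and per frame, yielding one coordinate time series per marker for postprocessing.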

Full text: 1 Collection: 01-international Database: MEDLINE Main subject: Video Recording / Software / Emotions / Facial Expression / Movement Limit: Humans Language: English Journal: Behav Res Methods Journal subject: Behavioral Sciences Year: 2019 Document type: Article Country of affiliation: Germany