Spontaneous facial expression in unscripted social interactions can be measured automatically.
Girard, Jeffrey M; Cohn, Jeffrey F; Jeni, Laszlo A; Sayette, Michael A; De la Torre, Fernando.
Affiliation
  • Girard JM; Department of Psychology, University of Pittsburgh, Pittsburgh, PA, 15260, USA. jmg174@pitt.edu.
  • Cohn JF; Department of Psychology, University of Pittsburgh, Pittsburgh, PA, 15260, USA.
  • Jeni LA; The Robotics Institute, Carnegie Mellon University, Pittsburgh, PA, USA.
  • Sayette MA; Department of Psychology, University of Pittsburgh, Pittsburgh, PA, 15260, USA.
  • De la Torre F; The Robotics Institute, Carnegie Mellon University, Pittsburgh, PA, USA.
Behav Res Methods ; 47(4): 1136-1147, 2015 Dec.
Article in En | MEDLINE | ID: mdl-25488104
ABSTRACT
Methods to assess individual facial actions have the potential to shed light on important behavioral phenomena ranging from emotion and social interaction to psychological disorders and health. However, manual coding of such actions is labor intensive and requires extensive training. To date, establishing reliable automated coding of unscripted facial actions has been a daunting challenge impeding the development of psychological theories and applications requiring facial expression assessment. It is therefore essential that automated coding systems be developed with enough precision and robustness to ease the burden of manual coding in challenging data involving variation in participant gender, ethnicity, head pose, speech, and occlusion. We report a major advance in automated coding of spontaneous facial actions during an unscripted social interaction involving three strangers. For each participant (n = 80, 47% women, 15% Nonwhite), 25 facial action units (AUs) were manually coded from video using the Facial Action Coding System. Twelve AUs occurred more than 3% of the time and were processed using automated FACS coding. Automated coding showed very strong reliability for the proportion of time that each AU occurred (mean intraclass correlation = 0.89), and the more stringent criterion of frame-by-frame reliability was moderate to strong (mean Matthews correlation coefficient = 0.61). With few exceptions, differences in AU detection related to gender, ethnicity, pose, and average pixel intensity were small. Fewer than 6% of frames could be coded manually but not automatically. These findings suggest automated FACS coding has progressed sufficiently to be applied to observational research in emotion and related areas of study.
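The frame-by-frame reliability criterion reported in the abstract is the Matthews correlation coefficient (MCC), which compares two binary label sequences (here, manual vs. automated codes of whether an AU is present in each video frame). A minimal sketch of how that per-AU statistic is computed is below; the function name and the toy frame data are illustrative assumptions, not taken from the study itself.

```python
import math

def matthews_corrcoef(manual, automated):
    """Matthews correlation coefficient between two binary label sequences,
    e.g. frame-by-frame manual vs. automated AU occurrence codes (1 = present)."""
    # Tally the 2x2 confusion matrix over frames.
    tp = sum(1 for m, a in zip(manual, automated) if m == 1 and a == 1)
    tn = sum(1 for m, a in zip(manual, automated) if m == 0 and a == 0)
    fp = sum(1 for m, a in zip(manual, automated) if m == 0 and a == 1)
    fn = sum(1 for m, a in zip(manual, automated) if m == 1 and a == 0)
    denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    # Convention: MCC is defined as 0 when any marginal is empty.
    return (tp * tn - fp * fn) / denom if denom else 0.0

# Toy example: 10 frames coded for one AU (illustrative data only)
manual    = [1, 1, 1, 0, 0, 0, 0, 1, 0, 0]
automated = [1, 1, 0, 0, 0, 0, 0, 1, 1, 0]
print(round(matthews_corrcoef(manual, automated), 3))  # → 0.583
```

Unlike raw percent agreement, MCC accounts for chance agreement and remains informative when an AU occurs in only a small fraction of frames, which is why it is the more stringent criterion than the intraclass correlation over overall proportions.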
Subjects
Keywords

Full text: 1 Database: MEDLINE Main subject: Emotions / Facial Expression / Interpersonal Relations Study type: Clinical_trials Limits: Adult / Female / Humans / Male Language: En Journal: Behav Res Methods Journal subject: BEHAVIORAL SCIENCES Publication year: 2015 Document type: Article Country of affiliation: United States