Results 1 - 20 of 1,261
1.
Proc Natl Acad Sci U S A ; 121(14): e2313665121, 2024 Apr 02.
Article in English | MEDLINE | ID: mdl-38530896

ABSTRACT

Facial emotion expressions play a central role in interpersonal interactions; these displays are used to predict and influence the behavior of others. Despite their importance, quantifying and analyzing the dynamics of brief facial emotion expressions remains an understudied methodological challenge. Here, we present a method that leverages machine learning and network modeling to assess the dynamics of facial expressions. Using video recordings of clinical interviews, we demonstrate the utility of this approach in a sample of 96 people diagnosed with psychotic disorders and 116 never-psychotic adults. Participants diagnosed with schizophrenia tended to move from neutral expressions to uncommon expressions (e.g., fear, surprise), whereas participants diagnosed with other psychoses (e.g., mood disorders with psychosis) moved toward expressions of sadness. This method has broad applications to the study of normal and altered expressions of emotion and can be integrated with telemedicine to improve psychiatric assessment and treatment.
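The transition-network idea can be illustrated with a minimal sketch (the labels, data, and function below are hypothetical, not the authors' pipeline): per-frame expression labels from video are reduced to a first-order network whose edge weights are empirical transition probabilities.

```python
from collections import Counter

def transition_probabilities(labels):
    """Estimate a first-order transition network from a sequence of
    per-frame expression labels (e.g., classifier output on video)."""
    pairs = Counter(zip(labels, labels[1:]))            # (from, to) -> count
    outgoing = Counter(src for src, _ in pairs.elements())
    return {(src, dst): n / outgoing[src] for (src, dst), n in pairs.items()}

# Hypothetical frame-by-frame labels from one interview:
seq = ["neutral", "neutral", "fear", "neutral", "sad", "sad", "neutral"]
probs = transition_probabilities(seq)
# probs[("neutral", "fear")] is the chance a neutral frame is followed by fear
```

Group-level hypotheses of the kind reported (e.g., more neutral-to-fear transitions in one diagnosis) can then be tested on these edge weights.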


Subjects
Psychotic Disorders, Schizophrenia, Adult, Humans, Facial Expression, Emotions, Schizophrenia/diagnosis, Fear
2.
Cereb Cortex ; 34(3)2024 03 01.
Article in English | MEDLINE | ID: mdl-38466112

ABSTRACT

Alexithymia is characterized by difficulties in emotional information processing. However, the underlying reasons for emotional processing deficits in alexithymia are not fully understood. The present study aimed to investigate the mechanism underlying emotional deficits in alexithymia. Using the Toronto Alexithymia Scale-20, we recruited college students with high alexithymia (n = 24) or low alexithymia (n = 24). Participants judged the emotional consistency of facial expressions and contextual sentences while their event-related potentials were recorded. Behaviorally, the high alexithymia group showed longer response times than the low alexithymia group in processing facial expressions. The event-related potential results showed that the high alexithymia group had more negative-going N400 amplitudes than the low alexithymia group in the incongruent condition. More negative N400 amplitudes were also associated with slower responses to facial expressions. Furthermore, machine learning analyses based on N400 amplitudes could distinguish the high alexithymia group from the low alexithymia group in the incongruent condition. Overall, these findings suggest worse facial emotion perception in the high alexithymia group, potentially due to difficulty in spontaneously activating emotion concepts. Our findings have important implications for affective science and for clinical intervention in alexithymia-related affective disorders.


Subjects
Affective Symptoms, Electroencephalography, Humans, Female, Male, Facial Expression, Evoked Potentials, Emotions
3.
Cereb Cortex ; 34(4)2024 Apr 01.
Article in English | MEDLINE | ID: mdl-38566513

ABSTRACT

The perception of facial expression plays a crucial role in social communication, and it is known to be influenced by various facial cues. Previous studies have reported both positive and negative biases toward overweight individuals. It is unclear whether facial cues, such as facial weight, bias facial expression perception. Combining psychophysics and event-related potential technology, the current study adopted a cross-adaptation paradigm to examine this issue. The psychophysical results of Experiments 1A and 1B revealed a bidirectional cross-adaptation effect between overweight and angry faces. Adapting to overweight faces decreased the likelihood of perceiving ambiguous emotional expressions as angry compared to adapting to normal-weight faces. Likewise, exposure to angry faces subsequently caused normal-weight faces to appear thinner. These findings were corroborated by bidirectional event-related potential results, showing that adaptation to overweight faces relative to normal-weight faces modulated the event-related potential responses of emotionally ambiguous facial expression (Experiment 2A); vice versa, adaptation to angry faces relative to neutral faces modulated the event-related potential responses of ambiguous faces in facial weight (Experiment 2B). Our study provides direct evidence associating overweight faces with facial expression, suggesting at least partly common neural substrates for the perception of overweight and angry faces.


Subjects
Facial Expression, Weight Prejudice, Humans, Overweight, Anger/physiology, Evoked Potentials/physiology, Emotions/physiology
4.
J Neurosci ; 43(23): 4291-4303, 2023 06 07.
Article in English | MEDLINE | ID: mdl-37142430

ABSTRACT

According to a classical view of face perception (Bruce and Young, 1986; Haxby et al., 2000), face identity and facial expression recognition are performed by separate neural substrates (ventral and lateral temporal face-selective regions, respectively). However, recent studies challenge this view, showing that expression valence can also be decoded from ventral regions (Skerry and Saxe, 2014; Li et al., 2019), and identity from lateral regions (Anzellotti and Caramazza, 2017). These findings could be reconciled with the classical view if regions specialized for one task (either identity or expression) contain a small amount of information for the other task (enough to enable above-chance decoding). In this case, we would expect representations in lateral regions to be more similar to representations in deep convolutional neural networks (DCNNs) trained to recognize facial expression than to representations in DCNNs trained to recognize face identity (the converse should hold for ventral regions). We tested this hypothesis by analyzing neural responses to faces varying in identity and expression. Representational dissimilarity matrices (RDMs) computed from human intracranial recordings (n = 11 adults; 7 females) were compared with RDMs from DCNNs trained to label either identity or expression. We found that RDMs from DCNNs trained to recognize identity correlated with the intracranial recordings more strongly in all regions tested, even in regions classically hypothesized to be specialized for expression. These results deviate from the classical view, suggesting that face-selective ventral and lateral regions contribute to the representation of both identity and expression.

SIGNIFICANCE STATEMENT: Previous work proposed that separate brain regions are specialized for the recognition of face identity and facial expression. However, identity and expression recognition mechanisms might instead share common brain regions. We tested these alternatives using deep neural networks and intracranial recordings from face-selective brain regions. Deep neural networks trained to recognize identity and networks trained to recognize expression both learned representations that correlate with the neural recordings. Identity-trained representations correlated with the intracranial recordings more strongly in all regions tested, including regions hypothesized to be expression-specialized under the classical view. These findings support the view that identity and expression recognition rely on common brain regions. This discovery may require reevaluation of the roles that the ventral and lateral neural pathways play in processing socially relevant stimuli.
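The RDM comparison described above can be sketched as follows; the data, the "layer" responses, and the use of Pearson correlation for the final comparison are illustrative assumptions, not the study's exact analysis.

```python
import math

def pearson(x, y):
    """Pearson correlation between two equal-length vectors."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def rdm(patterns):
    """Representational dissimilarity matrix: 1 - Pearson r between the
    response patterns evoked by each pair of stimuli."""
    n = len(patterns)
    return [[1 - pearson(patterns[i], patterns[j]) for j in range(n)]
            for i in range(n)]

def upper(m):
    """Flatten the upper triangle (excluding the diagonal)."""
    return [m[i][j] for i in range(len(m)) for j in range(i + 1, len(m))]

# Hypothetical stimulus-by-feature response patterns for a brain region
# and for a DCNN layer (3 face stimuli each):
brain = [[1.0, 0.2, 0.1], [0.9, 0.3, 0.2], [0.1, 1.0, 0.8]]
dcnn = [[0.8, 0.1, 0.0], [0.7, 0.2, 0.1], [0.0, 0.9, 0.9]]

# Brain-model similarity: correlate the two RDMs' upper triangles.
similarity = pearson(upper(rdm(brain)), upper(rdm(dcnn)))
```

Comparing this similarity across identity-trained and expression-trained networks, region by region, is the logic of the reported test.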


Subjects
Electrocorticography, Facial Recognition, Adult, Female, Humans, Brain, Neural Networks (Computer), Facial Recognition/physiology, Temporal Lobe/physiology, Brain Mapping, Magnetic Resonance Imaging/methods
5.
FASEB J ; 37(9): e23137, 2023 09.
Article in English | MEDLINE | ID: mdl-37566489

ABSTRACT

The anatomical underpinnings of primate facial expressions are essential to exploring their evolution. Traditionally, it has been accepted that the primate face exhibits a "scala naturae" morphocline, ranging from primitive to derived characteristics. At the primitive end, the face consists of undifferentiated muscular sheets, while at the derived end there is greater complexity, with more muscles and insertion points. Among these, the role of the human modiolus ("knoten" in German) has been emphasized. Recent studies have challenged this view by revealing significant complexity in the faces of several non-human primates, thereby rejecting the linear notion of facial evolution. However, our knowledge of the facial architecture of gorillas, the second closest living relatives of modern humans, remains a significant gap in the literature. Here, we present new findings based on dissection and histological analysis of one gorilla craniofacial specimen, alongside 30 human hemifaces. Our results indicate that while the number and overall arrangement of facial muscles in the gorilla are comparable to those of chimpanzees and modern humans, several orofacial features distinguish the gorilla's anatomy from that of hominins. Among these are the absence of a modiolus, the continuity of muscular fibers over the region of the mouth corner, the flat (uncurving) sheet of the orbicularis oris muscle, and the insertion of direct labial tractors both anterior and posterior to it. Collectively, the anatomical characteristics observed in the gorilla suggest that the complex anatomy of the hominin face should be considered synapomorphic (shared-derived) within the Pan-Homo clade.


Subjects
Hominidae, Animals, Gorilla gorilla/anatomy & histology, Facial Muscles/anatomy & histology, Facial Muscles/physiology, Face, Pan troglodytes/anatomy & histology
6.
Arch Sex Behav ; 53(1): 223-233, 2024 01.
Article in English | MEDLINE | ID: mdl-37626260

ABSTRACT

This study explored the facial expression stereotypes of adult men and women within the Chinese cultural context and investigated whether adult participants had facial expression stereotypes of children aged 6 and 10 years old. Three experiments were conducted with 156 adult Chinese university students. Experiment 1 explored whether adult participants had facial expression stereotypes of adult men and women. In Experiment 1a, the participants imagined a happy or angry adult face and stated the gender of the imagined face. In Experiment 1b, the participants were asked to quickly judge the gender of happy or angry adult faces, and their response time was recorded. Experiments 2 and 3 explored whether adults apply the stereotypes of adult men and women to 10-year-old and 6-year-old children. Experiment 1 revealed that the participants associated angry facial expressions with men and happy facial expressions with women. Experiment 2 showed that the participants associated angry facial expressions with 10-year-old boys and happy expressions with 10-year-old girls. Finally, Experiment 3 revealed that the participants associated happy facial expressions with 6-year-old girls but did not associate angry facial expressions with 6-year-old boys. These results showed that, within the Chinese cultural context, adults had gender-based facial expression stereotypes of adults and 10-year-old children; however, the adult participants did not have gender-based facial expression stereotypes of 6-year-old male children. This study has important implications for future research, as adults' perceptions of children are an important aspect of the study of social cognition in children.


Assuntos
Emoções , Expressão Facial , Adulto , Criança , Feminino , Humanos , Masculino , Emoções/fisiologia , Felicidade , Tempo de Reação , População do Leste Asiático
7.
BMC Psychiatry ; 24(1): 226, 2024 Mar 26.
Article in English | MEDLINE | ID: mdl-38532335

ABSTRACT

BACKGROUND: Patients with schizophrenia (SCZ) exhibit deficits in recognizing facial expressions with unambiguous valence. However, only a limited number of studies have examined how these patients fare in interpreting facial expressions with ambiguous valence (for example, surprise). Thus, we aimed to explore the influence of emotional background information on the recognition of ambiguous facial expressions in SCZ. METHODS: A 3 (emotion: negative, neutral, and positive) × 2 (group: healthy controls and SCZ) experimental design was adopted in the present study. The experimental materials consisted of 36 images of negative emotions, 36 images of neutral emotions, 36 images of positive emotions, and 36 images of surprised facial expressions. In each trial, a briefly presented surprised face was preceded by an affective image. Participants (36 SCZ and 36 healthy controls (HC)) were required to rate the emotional experience induced by the surprised facial expressions on a 9-point rating scale. The data were analyzed using analyses of variance (ANOVAs) and correlation analysis. RESULTS: First, the SCZ group reported a more positive emotional experience under the positive cued condition compared to the negative cued condition. Meanwhile, the HC group reported the strongest positive emotional experience in the positive cued condition, a moderate experience in the neutral cued condition, and the weakest in the negative cued condition. Second, the SCZ (vs. HC) group showed longer reaction times (RTs) for recognizing surprised facial expressions. The severity of schizophrenia symptoms in the SCZ group was negatively correlated with rating scores for emotional experience under the neutral and positive cued conditions. CONCLUSIONS: Recognition of surprised facial expressions was influenced by background information in both SCZ and HC, and by negative symptoms in SCZ. The present study indicates that the role of background information should be fully considered when examining the ability of patients with SCZ to recognize ambiguous facial expressions.


Subjects
Facial Recognition, Schizophrenia, Humans, Emotions, Recognition (Psychology), Facial Expression, China
8.
BMC Psychiatry ; 24(1): 184, 2024 Mar 06.
Article in English | MEDLINE | ID: mdl-38448877

ABSTRACT

BACKGROUND: Eye contact is a fundamental part of social interaction. In clinical studies, it has been observed that patients suffering from depression make less eye contact during interviews than healthy individuals, which could be a factor contributing to their impairments in social functioning. Similarly, results from mood induction studies with healthy persons indicate that attention to the eyes diminishes as a function of sad mood. The present screen-based eye-tracking study examined whether depressive symptoms in healthy individuals are associated with reduced visual attention to other persons' direct gaze during free viewing. METHODS: Gaze behavior of 44 individuals with depressive symptoms and 49 individuals with no depressive symptoms was analyzed in a free viewing task. Grouping was based on the Beck Depression Inventory, using the cut-off proposed by Hautzinger et al. (2006). Participants saw pairs of faces with direct gaze showing emotional or neutral expressions. One half of the face pairs was shown without face masks, and the other half with face masks. Participants' dwell times and first fixation durations were analyzed. RESULTS: In the case of unmasked facial expressions, participants with depressive symptoms looked at the eyes for shorter durations than individuals without symptoms, across all expression conditions. No group difference in first fixation duration on the eyes of masked and unmasked faces was observed. Individuals with depressive symptoms dwelled longer on the mouth region of unmasked faces. For masked faces, no significant group differences in dwell time on the eyes were found. Moreover, when specifically examining dwell time on the eyes of faces with an emotional expression, there were also no significant differences between groups. Overall, participants gazed significantly longer at the eyes in masked than in unmasked faces.
CONCLUSIONS: For faces without masks, our results suggest that depressiveness in healthy individuals goes along with less visual attention to other persons' eyes, but not with less visual attention to others' faces. When factors come into play that generally amplify attention to the eyes, such as face masks or emotional expressions, no relationship between depressiveness and visual attention to the eyes can be established.
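Dwell time of this kind is typically computed by accumulating the gaze samples that fall inside an area of interest (AOI); a minimal sketch with hypothetical coordinates and sampling rate (not the study's actual software):

```python
def dwell_time(samples, aoi, sample_rate_hz=60):
    """Total dwell time in seconds on an area of interest (AOI), given
    gaze samples as (x, y) screen coordinates and the AOI as an
    axis-aligned box (x_min, y_min, x_max, y_max)."""
    x0, y0, x1, y1 = aoi
    hits = sum(1 for x, y in samples if x0 <= x <= x1 and y0 <= y <= y1)
    return hits / sample_rate_hz

# Hypothetical eye-region AOI and a few gaze samples from a 60 Hz tracker:
eyes_aoi = (100, 50, 300, 120)
gaze = [(150, 80), (160, 85), (400, 300), (200, 100)]
t = dwell_time(gaze, eyes_aoi, sample_rate_hz=60)  # 3 of 4 samples hit
```

Group comparisons such as those above then operate on these per-participant, per-AOI totals.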


Subjects
Affect, Depression, Humans, Emotions, Health Status, Psychiatric Status Rating Scales
9.
J Exp Child Psychol ; 243: 105928, 2024 Jul.
Article in English | MEDLINE | ID: mdl-38643735

ABSTRACT

Previous studies have shown that adults exhibit the strongest attentional bias toward neutral infant faces when viewing faces with different expressions, with the attentional processing stage depending on stimulus presentation time. However, it is not clear how the temporal characteristics associated with this strongest effect change over time. Thus, we combined a free-viewing task with eye-tracking technology to measure adults' attentional bias toward infant and adult faces with happy, neutral, and sad expressions of the same face. Analysis of the total time course indicated that the strongest effect occurred during the strategic processing stage. However, analysis of the split time course revealed that sad infant faces first elicited adults' attentional bias at 0 to 500 ms, whereas the strongest effect of attentional bias toward neutral infant faces was observed at 1000 to 3000 ms, peaking at 1500 to 2000 ms. In addition, women and men did not differ in their responses to the different expressions. In summary, this study provides further evidence that adults' attentional bias toward infant faces across stages of attentional processing is modulated by expression. Specifically, during automatic processing, adults' attentional bias was directed toward sad infant faces, followed by a shift to neutral infant faces during strategic processing, which ultimately produced the strongest effect. These findings highlight that this strongest effect is dynamic and associated with a specific time window in strategic processing.


Assuntos
Viés de Atenção , Expressão Facial , Reconhecimento Facial , Humanos , Feminino , Masculino , Viés de Atenção/fisiologia , Adulto Jovem , Adulto , Reconhecimento Facial/fisiologia , Lactente , Tecnologia de Rastreamento Ocular , Atenção , Fatores de Tempo
10.
Proc Natl Acad Sci U S A ; 118(33)2021 08 17.
Article in English | MEDLINE | ID: mdl-34385326

ABSTRACT

The last two decades have established that a network of face-selective areas in the temporal lobe of macaque monkeys supports the visual processing of faces. Each area within the network contains a large fraction of face-selective cells. And each area encodes facial identity and head orientation differently. A recent brain-imaging study discovered an area outside of this network selective for naturalistic facial motion, the middle dorsal (MD) face area. This finding offers the opportunity to determine whether coding principles revealed inside the core network would generalize to face areas outside the core network. We investigated the encoding of static faces and objects, facial identity, and head orientation, dimensions which had been studied in multiple areas of the core face-processing network before, as well as facial expressions and gaze. We found that MD populations form a face-selective cluster with a degree of selectivity comparable to that of areas in the core face-processing network. MD encodes facial identity robustly across changes in head orientation and expression, it encodes head orientation robustly against changes in identity and expression, and it encodes expression robustly across changes in identity and head orientation. These three dimensions are encoded in a separable manner. Furthermore, MD also encodes the direction of gaze in addition to head orientation. Thus, MD encodes both structural properties (identity) and changeable ones (expression and gaze) and thus provides information about another animal's direction of attention (head orientation and gaze). MD contains a heterogeneous population of cells that establish a multidimensional code for faces.


Assuntos
Expressão Facial , Reconhecimento Facial/fisiologia , Fixação Ocular/fisiologia , Percepção Visual/fisiologia , Animais , Fenômenos Eletrofisiológicos , Humanos , Macaca mulatta , Imageamento por Ressonância Magnética , Masculino , Reconhecimento Visual de Modelos/fisiologia
11.
Cogn Emot ; 38(1): 187-197, 2024 Feb.
Article in English | MEDLINE | ID: mdl-37731376

ABSTRACT

This study investigated the emotional and behavioural effects of looming threats using both recalled (self-reported valence) and real-time response measurements (facial expressions). The looming bias refers to the tendency to underestimate the time of arrival of rapidly approaching (looming) stimuli, providing additional time for defensive reactions. While previous research has shown negative emotional responses to looming threats based on self-reports after stimulus exposure, facial expressions offer valuable insights into emotional experiences and non-verbal behaviour during stimulus exposure. A face reading experiment examined responses to threats in motion, considering stimulus direction (looming versus receding motion) and threat strength (more versus less threatening stimuli). We also explored the added value of facial expression recognition compared to self-reported valence. Results indicated that looming threats elicit more negative facial expressions than receding threats, supporting previous findings on the looming bias. Further, more (vs. less) threatening stimuli evoked more negative facial expressions, but only when the threats were looming rather than receding. Interestingly, facial expressions of valence and self-reported valence showed opposing results, suggesting the importance of incorporating facial expression recognition to understand defensive responses to looming threats more comprehensively.


Assuntos
Reconhecimento Facial , Medo , Humanos
12.
Sensors (Basel) ; 24(13)2024 Jun 26.
Article in English | MEDLINE | ID: mdl-39000930

ABSTRACT

Convolutional neural networks (CNNs) have made significant progress in the field of facial expression recognition (FER). However, due to challenges such as occlusion, lighting variations, and changes in head pose, facial expression recognition in real-world environments remains difficult. At the same time, methods based solely on CNNs rely heavily on local spatial features, lack global information, and struggle to balance computational complexity against recognition accuracy. Consequently, CNN-based models still fall short of addressing FER adequately. To address these issues, we propose a lightweight facial expression recognition method based on a hybrid vision transformer. This method captures multi-scale facial features through an improved attention module, achieving richer feature integration, enhancing the network's perception of key facial expression regions, and improving feature extraction. Additionally, to further enhance the model's performance, we designed a patch dropping (PD) module. This module emulates the attention allocation mechanism of the human visual system for local features, guiding the network to focus on the most discriminative features, reducing the influence of irrelevant features, and lowering computational costs. Extensive experiments demonstrate that our approach significantly outperforms other methods, achieving an accuracy of 86.51% on RAF-DB and nearly 70% on FER2013, with a model size of only 3.64 MB. These results demonstrate that our method provides a new perspective for the field of facial expression recognition.
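The patch-dropping idea, keeping only the most attended patch tokens, can be sketched as follows; the function, names, and numbers are illustrative and not the paper's actual PD module.

```python
def drop_patches(patch_tokens, attn_scores, keep_ratio=0.7):
    """Rank patch tokens by attention score and keep only the top
    keep_ratio fraction, discarding the least informative patches and
    reducing downstream computation."""
    k = max(1, int(len(patch_tokens) * keep_ratio))
    ranked = sorted(range(len(patch_tokens)),
                    key=lambda i: attn_scores[i], reverse=True)
    keep = sorted(ranked[:k])  # restore spatial (index) order
    return [patch_tokens[i] for i in keep]

# Hypothetical tokens and attention scores for a 5-patch toy image:
tokens = ["p0", "p1", "p2", "p3", "p4"]
scores = [0.05, 0.40, 0.10, 0.30, 0.15]
kept = drop_patches(tokens, scores, keep_ratio=0.6)  # keeps p1, p3, p4
```

Because later transformer layers then process fewer tokens, compute drops roughly in proportion to the fraction of patches removed.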


Assuntos
Expressão Facial , Redes Neurais de Computação , Humanos , Reconhecimento Facial Automatizado/métodos , Algoritmos , Processamento de Imagem Assistida por Computador/métodos , Face , Reconhecimento Automatizado de Padrão/métodos
13.
Sensors (Basel) ; 24(7)2024 Apr 04.
Article in English | MEDLINE | ID: mdl-38610510

ABSTRACT

The perception of sound greatly impacts users' emotional states, expectations, affective relationships with products, and purchase decisions. Consequently, assessing the perceived quality of sounds through jury testing is crucial in product design. However, the subjective nature of jurors' responses may limit the accuracy and reliability of jury test outcomes. This research explores the utility of facial expression analysis in jury testing to enhance response reliability and mitigate subjectivity. Several quantitative indicators validate the research hypothesis, such as the correlation between jurors' emotional responses and valence values, the accuracy of jury tests, and the disparities between jurors' questionnaire responses and the emotions measured by facial expression recognition (FER). Specifically, analysis of attention levels across states reveals a discernible decrease, with 70 percent of jurors exhibiting reduced attention in the 'distracted' state and 62 percent in the 'heavy-eyed' state. Regression analysis, in turn, shows that the correlation between jurors' valence and their choices in the jury test increases when only the data from attentive jurors are considered. This correlation highlights the potential of facial expression analysis as a reliable tool for assessing juror engagement. The findings suggest that integrating facial expression recognition can enhance the accuracy of jury testing in product design by providing a more dependable assessment of user responses and deeper insights into participants' reactions to auditory stimuli.


Assuntos
Reconhecimento Facial , Humanos , Reprodutibilidade dos Testes , Acústica , Som , Emoções
14.
Eur Eat Disord Rev ; 32(5): 917-929, 2024 Sep.
Article in English | MEDLINE | ID: mdl-38708578

ABSTRACT

OBJECTIVE: The study investigated interpersonal distance in patients with anorexia nervosa (AN), focussing on the role of other's facial expression and morphology, also assessing physiological and subjective responses. METHOD: Twenty-nine patients with AN and 30 controls (CTL) were exposed to virtual characters either with an angry, neutral, or happy facial expression or with an overweight, normal-weight, or underweight morphology presented either in the near or far space while we recorded electrodermal activity. Participants had to judge their preferred interpersonal distance with the characters and rated them in terms of valence and arousal. RESULTS: Unlike CTL, patients with AN exhibited heightened electrodermal activity for morphological stimuli only, when presented in the near space. They also preferred larger and smaller interpersonal distances with overweight and underweight characters respectively, although rating both negatively. Finally, and similar to CTL, they preferred larger interpersonal distance with angry than neutral or happy characters. DISCUSSION: Although patients with AN exhibited behavioural response to emotional stimuli similar to CTL, they lacked corresponding physiological response, indicating emotional blunting towards emotional social stimuli. Moreover, they showed distinct behavioural and physiological adjustments in response to body shape, confirming the specific emotional significance attached to body shape.


Assuntos
Anorexia Nervosa , Emoções , Expressão Facial , Humanos , Anorexia Nervosa/psicologia , Feminino , Adulto , Emoções/fisiologia , Adulto Jovem , Imagem Corporal/psicologia , Relações Interpessoais , Resposta Galvânica da Pele/fisiologia , Adolescente , Distância Psicológica
15.
Cogn Process ; 25(2): 229-239, 2024 May.
Article in English | MEDLINE | ID: mdl-38383909

ABSTRACT

Recent research shows that interoceptive sensitivity is associated with a more granular experience of emotions. These studies suggest that individuals who are sensitive to their interoceptive signals perceive somatic physiological changes better than their counterparts and can therefore discriminate among a wider and subtler range of emotions. Further, the perception of others' emotions may be based on our own emotional experiences. However, whether interoceptive sensitivity is related to the perception of others' emotions remains unclear. This study therefore examined the relationship between interoceptive sensitivity and emotional perception. Based on the model in which emotion perception comprises two processes, the categorization of facial expressions and approach-avoidance responses, this study examined both. The results showed no relationship between interoceptive sensitivity and the perception of emotion, suggesting that interoceptive sensitivity is related to the experience of emotion but does not affect the granularity of emotional perception. Future studies should empirically examine, from multiple angles, the role of the body in emotional perception from the perspective of interoceptive sensitivity.


Assuntos
Emoções , Expressão Facial , Interocepção , Percepção Social , Humanos , Emoções/fisiologia , Feminino , Interocepção/fisiologia , Adulto Jovem , Masculino , Adulto , Adolescente , Modelos Psicológicos , Reconhecimento Facial/fisiologia
16.
Ergonomics ; : 1-21, 2024 Jun 04.
Article in English | MEDLINE | ID: mdl-38832783

ABSTRACT

The affective experience generated when users play computer games can influence their attitude and preference towards the game. Existing evaluation methods depend mainly on subjective scales and physiological signals. However, these have limitations that should not be ignored (e.g. subjective scales are not objective, and physiological signals are complicated). In this paper, we 1) propose a novel method to assess user affective experience when playing single-player games based on pleasure-arousal-dominance (PAD) emotions, facial expressions, and gaze directions, and 2) build an artificial intelligence model to identify user preference. Fifty-four subjects participated in a basketball experiment with three difficulty levels. Their expressions, gaze directions, and subjective PAD emotions were collected and analysed. Experimental results showed that the expression intensities of angry, sad, and neutral, the yaw angle of gaze direction, and PAD emotions varied significantly across difficulty levels. In addition, the proposed model achieved better performance than other machine-learning algorithms on the collected dataset.


This paper considers the limitations of existing methods for assessing user affective experience when playing computer games. It demonstrates a novel approach using subjective emotion and objective facial cues to identify user affective experience and user preference for the game.

17.
Behav Res Methods ; 2024 Jan 25.
Article in English | MEDLINE | ID: mdl-38273072

ABSTRACT

Facial expressions are among the earliest behaviors infants use to express emotional states, and are crucial to preverbal social interaction. Manual coding of infant facial expressions, however, is laborious and limits replicability. Recent developments in computer vision have advanced automated facial expression analysis in adults, providing reproducible results at lower time investment. Baby FaceReader 9 is commercially available software for automated measurement of infant facial expressions, but it has received little validation. We compared Baby FaceReader 9 output to manual micro-coding of positive, negative, or neutral facial expressions in a longitudinal dataset of 58 infants at 4 and 8 months of age during naturalistic face-to-face interactions with the mother, father, and an unfamiliar adult. Baby FaceReader 9's global emotional valence formula yielded reasonable classification accuracy (AUC = .81) for discriminating manually coded positive from negative/neutral facial expressions; however, the discrimination of negative from neutral facial expressions was not reliable (AUC = .58). Automatically detected a priori action unit (AU) configurations for distinguishing positive from negative facial expressions based on the existing literature were also not reliable. A parsimonious approach using only automatically detected smiling (AU12) yielded good performance for discriminating positive from negative/neutral facial expressions (AUC = .86). Likewise, automatically detected brow lowering (AU3+AU4) reliably distinguished neutral from negative facial expressions (AUC = .79). These results provide initial support for the use of selected automatically detected individual facial actions to index positive and negative affect in young infants, but cast doubt on the accuracy of complex a priori formulas.
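The reported AUC values quantify how well a continuous score, such as AU12 smiling intensity, separates manually coded classes; a minimal sketch of the rank-based AUC definition on hypothetical data:

```python
def auc(scores, labels):
    """Area under the ROC curve via its rank interpretation: the
    probability that a randomly chosen positive example is scored above
    a randomly chosen negative one (ties count as 0.5)."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical per-clip AU12 (smiling) intensities with manual codes
# (1 = positive expression, 0 = negative/neutral):
intensities = [0.9, 0.3, 0.8, 0.35, 0.2, 0.1]
codes = [1, 1, 1, 0, 0, 0]
score = auc(intensities, codes)  # 8 of 9 positive-negative pairs ordered correctly
```

An AUC of .5 means chance-level discrimination, which is why the reported .58 for negative versus neutral is called unreliable.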

18.
J Headache Pain ; 25(1): 33, 2024 Mar 11.
Article in English | MEDLINE | ID: mdl-38462615

ABSTRACT

BACKGROUND: The present study used the Facial Action Coding System (FACS) to analyse changes in facial activity in individuals with migraine at rest, to determine whether facial expressions can convey information about pain during headache episodes. METHODS: Facial activity was video-recorded under calm, resting conditions in healthy controls (HC) and in patients with episodic migraine (EM) or chronic migraine (CM). FACS was used to analyse the recorded facial images, generating intensity scores for each of 20 action units (AUs). Each AU was then compared across groups and headache pain conditions. RESULTS: The study involved 304 participants: 46 HCs, 174 patients with EM, and 84 patients with CM. Elevated headache pain levels were associated with increased lid tightener activity and reduced mouth stretch. In the CM group, moderate to severe headache attacks exhibited decreased activation of the mouth stretch, alongside increased activation of the lid tightener, nose wrinkle, and cheek raiser, compared with mild headache attacks (all corrected p < 0.05). Notably, lid tightener activation was positively correlated with the Numeric Rating Scale (NRS) level of headache (p = 0.012). Moreover, lip corner depressor activation was indicative of depression severity (p < 0.001). CONCLUSION: Facial expressions, particularly lid tightener actions, served as inherent indicators of headache intensity in individuals with migraine, even at rest. This suggests that the proposed approach holds promise for objective evaluation of headaches, offering the benefits of real-time assessment and convenience for patients with migraine.
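Because 20 AUs were tested, the abstract's "all corrected p < 0.05" implies a multiple-comparison adjustment, though the method is not named. A minimal sketch, assuming a Bonferroni correction (one common choice; the p-values below are invented for illustration):

```python
# Hedged sketch: Bonferroni correction across 20 FACS action units.
# The correction method actually used in the study is not specified in the
# abstract, and these raw p-values are invented for illustration.

def bonferroni(p_values, alpha=0.05):
    """Multiply each p-value by the number of tests (capped at 1.0) and
    flag those that remain below alpha."""
    m = len(p_values)
    adjusted = [min(1.0, p * m) for p in p_values]
    return adjusted, [p_adj < alpha for p_adj in adjusted]

# Hypothetical raw p-values for 20 AUs (lid tightener, mouth stretch, ...):
raw = [0.0005, 0.001, 0.04] + [0.2] * 17

adjusted, significant = bonferroni(raw)
print(adjusted[:3])
print(significant[:3])
```

Note that a raw p of 0.04 would not survive this correction (0.04 × 20 = 0.8), which is why per-AU corrected thresholds are much stricter than the nominal 0.05.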


Subjects
Facial Expression, Migraine Disorders, Humans, Migraine Disorders/complications, Headache, Pain, Depression
19.
Medicina (Kaunas) ; 60(2)2024 Jan 28.
Article in English | MEDLINE | ID: mdl-38399511

ABSTRACT

Background and Objectives: No studies have reported pain-related corrugator muscle activity in people with pain. This study aimed to develop an objective pain assessment method based on corrugator muscle activity during pressure pain stimulation of skeletal muscle. Methods: Participants were 20 adults (mean ± SD age 22.0 ± 3.1 years) with chronic neck/shoulder pain. Surface electromyography (sEMG) of corrugator muscle activity was recorded at rest (baseline) and both without and with pressure pain stimulation applied to the most painful tender point in the shoulder. Participants rated the intensity of their neck/shoulder pain and the sensory and affective components of pain under pressure stimulation using a visual analogue scale (VAS). The integrated sEMG without and with pressure pain stimulation was expressed as a percentage of the baseline integrated sEMG (% corrugator activity); these percentages were compared, and their relationships with the sensory and affective pain VAS scores were evaluated. Results: Without pressure stimulation, chronic neck/shoulder pain did not increase corrugator muscle activity. The % corrugator activity with pressure pain stimulation was significantly higher than without stimulation (p < 0.01). Corrugator muscle activity under pressure stimulation correlated positively with the affective component of the pain VAS score (ρ = 0.465, p = 0.039), with a similar trend for the sensory component (ρ = 0.423, p = 0.063). Conclusions: Corrugator muscle activity increased with pressure pain stimulation of the tender point in adults with chronic neck/shoulder pain, whereas the chronic pain itself did not increase corrugator activity.
These findings suggest that corrugator muscle activity during pressure pain stimulation can serve as a useful objective indicator for assessing tender point sensitivity in painful skeletal muscle.
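The "% corrugator activity" measure described above is the integrated sEMG during a condition expressed as a percentage of the baseline integrated sEMG. A minimal sketch of that ratio (signal values are invented; real integrated EMG additionally involves band-pass filtering, artifact rejection, and a defined recording window):

```python
# Hedged sketch of the % corrugator activity ratio. Sample amplitudes are
# invented; actual sEMG processing (filtering, artifact rejection, window
# length) is more involved than this.

def integrated_semg(samples):
    """Integrate the rectified sEMG signal (sum of absolute amplitudes)."""
    return sum(abs(s) for s in samples)

def percent_activity(condition_samples, baseline_samples):
    """Integrated sEMG in a condition as a percentage of baseline."""
    return 100.0 * integrated_semg(condition_samples) / integrated_semg(baseline_samples)

baseline = [0.1, -0.2, 0.15, -0.1]    # rest
with_pain = [0.3, -0.5, 0.4, -0.35]   # pressure pain stimulation

print(round(percent_activity(with_pain, baseline), 1))  # 281.8 for these values
```

Values above 100% indicate greater corrugator activation than at rest, which is the pattern the study reports under pressure pain stimulation.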


Subjects
Neck Pain, Shoulder Pain, Adult, Humans, Adolescent, Young Adult, Shoulder Pain/etiology, Muscle, Skeletal/physiology, Neck, Electromyography
20.
Zhejiang Da Xue Xue Bao Yi Xue Ban ; 53(2): 254-260, 2024 Apr 25.
Article in English, Chinese | MEDLINE | ID: mdl-38650447

ABSTRACT

Attention deficit hyperactivity disorder (ADHD) is a chronic neurodevelopmental disorder characterized by inattention, hyperactivity-impulsivity, and working memory deficits. Social dysfunction is one of the major challenges faced by children with ADHD. Children with ADHD have been found to perform worse than typically developing children on facial expression recognition (FER) tasks. In general, children with ADHD show some difficulties in FER, although some studies report no significant differences in the accuracy of recognizing specific emotions compared with typically developing children. The neuropsychological mechanisms underlying these difficulties span three dimensions. First, neuroanatomically: compared with typically developing children, children with ADHD show smaller gray matter volume and surface area in the amygdala and medial prefrontal cortex, as well as reduced axon/cell density and volume in certain frontal white matter fiber tracts. Second, neurophysiologically: children with ADHD exhibit increased slow-wave activity in the electroencephalogram, and event-related potential studies reveal abnormalities in emotional regulation and in responses to angry faces. Third, psychologically: psychosocial stressors may influence FER abilities in children with ADHD, and sleep deprivation may significantly raise their recognition threshold for negative expressions such as sadness and anger. This article reviews research from the past three years on the FER abilities of children with ADHD, analysing the FER deficit along these neuroanatomical, neurophysiological, and psychological dimensions, with the aim of providing new perspectives for further research and clinical treatment of ADHD.


Subjects
Attention Deficit Disorder with Hyperactivity, Facial Expression, Humans, Attention Deficit Disorder with Hyperactivity/physiopathology, Attention Deficit Disorder with Hyperactivity/psychology, Child, Facial Recognition/physiology, Emotions