Results 1 - 20 of 15,414
1.
Sci Rep ; 14(1): 10371, 2024 05 06.
Article in English | MEDLINE | ID: mdl-38710806

ABSTRACT

Emotion is a human faculty that can influence an individual's quality of life in both positive and negative ways. The ability to distinguish different types of emotion can help researchers estimate a patient's current condition or the probability of future disease. Recognizing emotions from facial images is unreliable, however, because people can conceal their feelings by modifying their facial expressions. This has led researchers to consider electroencephalography (EEG) signals for more accurate emotion detection. However, the complexity of EEG recordings and of data analysis using conventional machine learning algorithms has caused inconsistent emotion recognition. Therefore, utilizing hybrid deep learning models and related techniques has become common, owing to their ability to analyze complicated data and achieve higher performance by integrating the diverse features of the models. At the same time, researchers prioritize models with fewer parameters that still achieve the highest average accuracy. This study improves the Convolutional Fuzzy Neural Network (CFNN) for emotion recognition from EEG signals to achieve a reliable detection system. Initially, pre-processing and feature extraction phases are applied to obtain noiseless and informative data. Then, a CFNN with a modified architecture is trained to classify emotions. Several parametric and comparative experiments were performed. The proposed model achieved reliable performance for emotion recognition, with average accuracies of 98.21% and 98.08% for valence (pleasantness) and arousal (intensity), respectively, outperforming state-of-the-art methods.
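The abstract does not specify the pre-processing and feature-extraction steps. As an illustrative sketch only (not the authors' pipeline), a single EEG channel can be split into fixed-length windows with simple statistical features computed per window; the window length and features here are arbitrary choices:

```python
from statistics import mean, variance

def epoch_features(signal, window_len):
    """Split a 1-D EEG channel into non-overlapping windows and
    compute simple per-window features (mean, sample variance)."""
    feats = []
    for start in range(0, len(signal) - window_len + 1, window_len):
        w = signal[start:start + window_len]
        feats.append((mean(w), variance(w)))
    return feats

# Toy signal: 8 samples, windows of 4
features = epoch_features([1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0], 4)
```

In a real pipeline these per-window features would feed the classifier after artifact removal and band filtering.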


Subjects
Electroencephalography , Emotions , Fuzzy Logic , Neural Networks, Computer , Humans , Electroencephalography/methods , Emotions/physiology , Male , Female , Adult , Algorithms , Young Adult , Signal Processing, Computer-Assisted , Deep Learning , Facial Expression
2.
PLoS One ; 19(5): e0302782, 2024.
Article in English | MEDLINE | ID: mdl-38713700

ABSTRACT

Parents with a history of childhood maltreatment may be more likely to respond inadequately to their child's emotional cues, such as crying or screaming, due to previous exposure to prolonged stress. While studies have investigated parents' physiological reactions to their children's vocal expressions of emotions, less attention has been given to their responses when perceiving children's facial expressions of emotions. The present study aimed to determine if viewing facial expressions of emotions in children induces cardiovascular changes in mothers (hypo- or hyper-arousal) and whether these differ as a function of childhood maltreatment. A total of 104 mothers took part in this study. Their experiences of childhood maltreatment were measured using the Childhood Trauma Questionnaire (CTQ). Participants' electrocardiogram signals were recorded during a task in which they viewed a landscape video (baseline) and images of children's faces expressing different intensities of emotion. Heart rate variability (HRV) was extracted from the recordings as an indicator of parasympathetic reactivity. Participants presented two profiles: one group of mothers had a decreased HRV when presented with images of children's facial expressions of emotions, while the other group's HRV increased. However, HRV change was not significantly different between the two groups. The interaction between HRV groups and the severity of maltreatment experienced was marginal. Results suggested that experiences of childhood emotional abuse were more common in mothers whose HRV increased during the task. Therefore, more severe childhood experiences of emotional abuse could be associated with mothers' cardiovascular hyperreactivity. Maladaptive cardiovascular responses could have a ripple effect, influencing how mothers react to their children's facial expressions of emotions. That reaction could affect the quality of their interaction with their child. Providing interventions that help parents regulate their physiological and behavioral responses to stress might be helpful, especially if they have experienced childhood maltreatment.
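The abstract does not say which HRV index was used. As a hedged illustration, one common time-domain index, RMSSD (root mean square of successive differences between adjacent RR intervals), can be computed like this:

```python
from math import sqrt

def rmssd(rr_intervals_ms):
    """Root mean square of successive differences between adjacent
    RR intervals (in ms) -- a standard time-domain HRV index that
    tracks parasympathetic (vagal) activity."""
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return sqrt(sum(d * d for d in diffs) / len(diffs))

# Toy RR series in milliseconds
hrv = rmssd([800, 810, 790, 805])
```

Higher RMSSD generally reflects greater parasympathetic reactivity; the published analysis may have used a different HRV metric.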


Subjects
Emotions , Facial Expression , Heart Rate , Mothers , Humans , Female , Adult , Heart Rate/physiology , Child , Emotions/physiology , Mothers/psychology , Emotional Abuse/psychology , Male , Electrocardiography , Child Abuse/psychology , Mother-Child Relations/psychology , Surveys and Questionnaires
3.
Sci Rep ; 14(1): 10607, 2024 05 08.
Article in English | MEDLINE | ID: mdl-38719866

ABSTRACT

Guilt is a negative emotion elicited by realizing one has caused actual or perceived harm to another person. One of guilt's primary functions is to signal that one is aware of the harm that was caused and regrets it, an indication that the harm will not be repeated. Verbal expressions of guilt are often deemed insufficient by observers when not accompanied by nonverbal signals such as facial expression, gesture, posture, or gaze. Some research has investigated isolated nonverbal expressions in guilt; however, none to date has explored multiple nonverbal channels simultaneously. This study explored facial expression, gesture, posture, and gaze during the real-time experience of guilt when response demands are minimal. Healthy adults completed a novel task involving watching videos designed to elicit guilt, as well as comparison emotions. During the video task, participants were continuously recorded to capture nonverbal behaviour, which was then analyzed via automated facial expression software. We found that while feeling guilt, individuals engaged less in several nonverbal behaviours than they did while experiencing the comparison emotions. This may reflect the highly social aspect of guilt, suggesting that an audience is required to prompt a guilt display, or may suggest that guilt does not have clear nonverbal correlates.


Subjects
Facial Expression , Guilt , Humans , Male , Female , Adult , Young Adult , Nonverbal Communication/psychology , Emotions/physiology , Gestures
4.
Sci Rep ; 14(1): 10491, 2024 05 07.
Article in English | MEDLINE | ID: mdl-38714729

ABSTRACT

Dogs (Canis lupus familiaris) are the domestically bred descendant of wolves (Canis lupus). However, selective breeding has profoundly altered facial morphologies of dogs compared to their wolf ancestors. We demonstrate that these morphological differences limit the abilities of dogs to successfully produce the same affective facial expressions as wolves. We decoded facial movements of captive wolves during social interactions involving nine separate affective states. We used linear discriminant analyses to predict affective states based on combinations of facial movements. The resulting confusion matrix demonstrates that specific combinations of facial movements predict nine distinct affective states in wolves; the first assessment of this many affective facial expressions in wolves. However, comparative analyses with kennelled rescue dogs revealed reduced ability to predict affective states. Critically, there was a very low predictive power for specific affective states, with confusion occurring between negative and positive states, such as Friendly and Fear. We show that the varying facial morphologies of dogs (specifically non-wolf-like morphologies) limit their ability to produce the same range of affective facial expressions as wolves. Confusion among positive and negative states could be detrimental to human-dog interactions, although our analyses also suggest dogs likely use vocalisations to compensate for limitations in facial communication.


Subjects
Domestication , Emotions , Facial Expression , Wolves , Animals , Wolves/physiology , Dogs , Emotions/physiology , Male , Female , Behavior, Animal/physiology , Humans
5.
Cereb Cortex ; 34(5)2024 May 02.
Article in English | MEDLINE | ID: mdl-38715407

ABSTRACT

Facial palsy can result in a serious complication known as facial synkinesis, causing both physical and psychological harm to the patients. There is growing evidence that patients with facial synkinesis have brain abnormalities, but the brain mechanisms and underlying imaging biomarkers remain unclear. Here, we employed functional magnetic resonance imaging (fMRI) to investigate brain function in 31 unilateral post facial palsy synkinesis patients and 25 healthy controls during different facial expression movements and at rest. Combining surface-based mass-univariate analysis and multivariate pattern analysis, we identified diffused activation and intrinsic connection patterns in the primary motor cortex and the somatosensory cortex on the patient's affected side. Further, we classified post facial palsy synkinesis patients from healthy subjects with favorable accuracy using the support vector machine based on both task-related and resting-state functional magnetic resonance imaging data. Together, these findings indicate the potential of the identified functional reorganizations to serve as neuroimaging biomarkers for facial synkinesis diagnosis.


Subjects
Facial Paralysis , Magnetic Resonance Imaging , Synkinesis , Humans , Magnetic Resonance Imaging/methods , Facial Paralysis/physiopathology , Facial Paralysis/diagnostic imaging , Facial Paralysis/complications , Male , Female , Synkinesis/physiopathology , Adult , Middle Aged , Young Adult , Facial Expression , Biomarkers , Motor Cortex/physiopathology , Motor Cortex/diagnostic imaging , Brain Mapping , Somatosensory Cortex/diagnostic imaging , Somatosensory Cortex/physiopathology , Brain/diagnostic imaging , Brain/physiopathology , Support Vector Machine
6.
J Psychiatry Neurosci ; 49(3): E145-E156, 2024.
Article in English | MEDLINE | ID: mdl-38692692

ABSTRACT

BACKGROUND: Neuroimaging studies have revealed abnormal functional interaction during the processing of emotional faces in patients with major depressive disorder (MDD), thereby enhancing our comprehension of the pathophysiology of MDD. However, it is unclear whether there is abnormal directional interaction among face-processing systems in patients with MDD. METHODS: A group of patients with MDD and a healthy control group underwent a face-matching task during functional magnetic resonance imaging. Dynamic causal modelling (DCM) analysis was used to investigate effective connectivity between 7 regions in the face-processing systems. We used a Parametric Empirical Bayes model to compare effective connectivity between patients with MDD and controls. RESULTS: We included 48 patients and 44 healthy controls in our analyses. Both groups showed higher accuracy and faster reaction time in the shape-matching condition than in the face-matching condition. However, no significant behavioural or brain activation differences were found between the groups. Using DCM, we found that, compared with controls, patients with MDD showed decreased self-connection in the right dorsolateral prefrontal cortex (DLPFC), amygdala, and fusiform face area (FFA) across task conditions; increased intrinsic connectivity from the right amygdala to the bilateral DLPFC, right FFA, and left amygdala, suggesting an increased intrinsic connectivity centred in the amygdala in the right side of the face-processing systems; both increased and decreased positive intrinsic connectivity in the left side of the face-processing systems; and comparable task modulation effect on connectivity. LIMITATIONS: Our study did not include longitudinal neuroimaging data, and there was limited region of interest selection in the DCM analysis. 
CONCLUSION: Our findings provide evidence for a complex pattern of alterations in the face-processing systems in patients with MDD, potentially involving the right amygdala to a greater extent. The results confirm some previous findings and highlight the crucial role of the regions on both sides of face-processing systems in the pathophysiology of MDD.


Subjects
Amygdala , Depressive Disorder, Major , Facial Recognition , Magnetic Resonance Imaging , Humans , Depressive Disorder, Major/physiopathology , Depressive Disorder, Major/diagnostic imaging , Male , Female , Adult , Facial Recognition/physiology , Amygdala/diagnostic imaging , Amygdala/physiopathology , Brain/diagnostic imaging , Brain/physiopathology , Neural Pathways/physiopathology , Neural Pathways/diagnostic imaging , Bayes Theorem , Young Adult , Brain Mapping , Facial Expression , Middle Aged , Reaction Time/physiology
7.
Sci Rep ; 14(1): 11686, 2024 05 22.
Article in English | MEDLINE | ID: mdl-38777852

ABSTRACT

Pain is rarely communicated alone, as it is often accompanied by emotions such as anger or sadness. Communicating these affective states involves shared representations. However, how an individual conceptually represents these combined states must first be tested. The objective of this study was to measure the interaction between pain and negative emotions on two types of facial representations of these states, namely visual (i.e., interactive virtual agents; VAs) and sensorimotor (i.e., one's production of facial configurations). Twenty-eight participants (15 women) read short written scenarios involving only pain or a combined experience of pain and a negative emotion (anger, disgust, fear, or sadness). They produced facial configurations representing these experiences on the faces of the VAs and on their face (own production or imitation of VAs). The results suggest that affective states related to a direct threat to the body (i.e., anger, disgust, and pain) share a similar facial representation, while those that present no immediate danger (i.e., fear and sadness) differ. Although visual and sensorimotor representations of these states provide congruent affective information, they are differently influenced by factors associated with the communication cycle. These findings contribute to our understanding of pain communication in different affective contexts.


Subjects
Emotions , Facial Expression , Pain , Humans , Female , Male , Pain/psychology , Pain/physiopathology , Adult , Emotions/physiology , Young Adult , Anger/physiology , Affect/physiology , Fear/psychology , Sadness/psychology
8.
Int J Med Inform ; 187: 105469, 2024 Jul.
Article in English | MEDLINE | ID: mdl-38723429

ABSTRACT

BACKGROUND: Human Emotion Recognition (HER) has been a popular field of study in the past years. Despite the great progress made so far, relatively little attention has been paid to the use of HER in autism. People with autism are known to face problems with daily social communication and the prototypical interpretation of emotional responses, which are most frequently exerted via facial expressions. This poses significant practical challenges to the application of regular HER systems, which are normally developed for and by neurotypical people. OBJECTIVE: This study reviews the literature on the use of HER systems in autism, particularly with respect to sensing technologies and machine learning methods, so as to identify existing barriers and possible future directions. METHODS: We conducted a systematic review of articles published between January 2011 and June 2023 according to the 2020 PRISMA guidelines. Manuscripts were identified through searching the Web of Science and Scopus databases. Manuscripts were included when they related to emotion recognition, used sensors and machine learning techniques, and involved children, young people, or adults with autism. RESULTS: The search yielded 346 articles. A total of 65 publications met the eligibility criteria and were included in the review. CONCLUSIONS: Studies predominantly used facial expression techniques as the emotion recognition method. Consequently, video cameras were the most widely used devices across studies, although a growing trend in the use of physiological sensors has been observed recently. Happiness, sadness, anger, fear, disgust, and surprise were most frequently addressed. Classical supervised machine learning techniques were primarily used at the expense of unsupervised approaches or more recent deep learning models. Studies focused on autism in a broad sense, but limited efforts have been directed towards more specific disorders of the spectrum. Privacy or security issues were seldom addressed, and if so, at a rather insufficient level of detail.


Subjects
Autistic Disorder , Emotions , Facial Expression , Machine Learning , Humans , Autistic Disorder/psychology , Child
9.
Cogn Sci ; 48(5): e13451, 2024 05.
Article in English | MEDLINE | ID: mdl-38742266

ABSTRACT

Anxiety shifts visual attention and perceptual mechanisms, preparing oneself to detect potentially threatening information more rapidly. Although this has been demonstrated for threat-related social stimuli, such as fearful expressions, it remains unexplored whether these effects encompass other social cues of danger, such as aggressive gestures/actions. To this end, we recruited a total of 65 participants and asked them to identify, as quickly and accurately as possible, potentially aggressive actions depicted by an agent. By introducing and manipulating the occurrence of electric shocks, we induced safe and threatening conditions. In addition, the association between electric shocks and aggression was also manipulated. Our results showed that participants had improved sensitivity, with no changes to criterion, when detecting aggressive gestures during threat compared to safe conditions. Furthermore, drift diffusion model analysis showed that under threat participants exhibited faster evidence accumulation toward the correct perceptual decision. Lastly, the relationship between threat source and aggression appeared not to impact any of the effects described above. Overall, our results indicate that the benefits gained from states of anxiety, such as increased sensitivity toward threat and greater evidence accumulation, are transposable to social stimuli capable of signaling danger other than facial expressions.
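The sensitivity and criterion measures reported here come from standard signal detection theory: d' = z(hit rate) - z(false-alarm rate), and c = -(z(H) + z(FA)) / 2. A minimal sketch (the rates are made-up examples, not the study's data):

```python
from statistics import NormalDist

def d_prime(hit_rate, false_alarm_rate):
    """Signal-detection sensitivity: z(H) - z(FA)."""
    z = NormalDist().inv_cdf
    return z(hit_rate) - z(false_alarm_rate)

def criterion(hit_rate, false_alarm_rate):
    """Response criterion c: -(z(H) + z(FA)) / 2.
    Zero means no bias toward 'yes' or 'no' responses."""
    z = NormalDist().inv_cdf
    return -(z(hit_rate) + z(false_alarm_rate)) / 2

# Hypothetical rates: 80% hits, 20% false alarms
sensitivity = d_prime(0.80, 0.20)
bias = criterion(0.80, 0.20)
```

A higher d' with unchanged c, as the abstract reports, means better discrimination of aggressive gestures without a shift in response bias.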


Subjects
Aggression , Fear , Humans , Aggression/psychology , Male , Female , Young Adult , Adult , Anxiety/psychology , Social Perception , Attention , Facial Expression , Cues , Electroshock
10.
BMC Psychol ; 12(1): 279, 2024 May 17.
Article in English | MEDLINE | ID: mdl-38755731

ABSTRACT

OBJECTIVE: Somatic symptom disorder (SSD) is characterized by one or more distressing or disabling somatic symptoms accompanied by an excessive amount of time, energy, and emotion devoted to the symptoms. These manifestations of SSD have been linked to alterations in the perception and appraisal of bodily signals. We hypothesized that SSD patients would exhibit changes in interoceptive accuracy (IA), particularly when emotional processing is involved. METHODS: Twenty-three patients with SSD and 20 healthy controls were recruited. IA was assessed using the heartbeat perception task. The task was performed in the absence of stimuli as well as in the presence of emotional interference, i.e., photographs of faces with an emotional expression. IA was examined for correlations with measures related to somatic symptoms, including resting-state heart rate variability (HRV). RESULTS: There was no significant difference in the absolute values of IA between patients with SSD and healthy controls, regardless of the condition. However, the degree of difference in IA without emotional interference and with neutral facial interference was greater in patients with SSD than in healthy controls (p = 0.039). The IA of patients with SSD also showed a significant correlation with low-frequency HRV (p = 0.004) and high-frequency HRV (p = 0.007). CONCLUSION: SSD patients showed more significant changes in IA when neutral facial interference was given. These results suggest that bodily awareness is more affected by emotionally ambiguous stimuli in SSD patients than in healthy controls.
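The abstract does not give the IA formula; heartbeat perception tasks commonly score accuracy as 1 - |recorded - counted| / recorded, averaged over trials (a Schandry-style score). A sketch under that assumption, with hypothetical trial data:

```python
def interoceptive_accuracy(trials):
    """Mean heartbeat-counting accuracy over (recorded, counted) trials:
    1 - |recorded - counted| / recorded, so 1.0 is a perfect count."""
    scores = [1 - abs(recorded - counted) / recorded
              for recorded, counted in trials]
    return sum(scores) / len(scores)

# Hypothetical trials: (ECG-recorded beats, participant's count)
ia = interoceptive_accuracy([(100, 90), (80, 80)])
```

The published task may use a different scoring variant; this shows only the general shape of the measure.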


Subjects
Emotions , Heart Rate , Interoception , Humans , Female , Male , Interoception/physiology , Adult , Heart Rate/physiology , Emotions/physiology , Middle Aged , Medically Unexplained Symptoms , Somatoform Disorders/psychology , Somatoform Disorders/physiopathology , Facial Expression
11.
PLoS One ; 19(5): e0302705, 2024.
Article in English | MEDLINE | ID: mdl-38758739

ABSTRACT

Neuropsychological research aims to unravel how diverse individuals' brains exhibit similar functionality when exposed to the same stimuli. The evocation of consistent responses when different subjects watch the same emotionally evocative stimulus has been observed through modalities like fMRI, EEG, physiological signals, and facial expressions. We refer to the quantification of these shared consistent signals across subjects at each time instant along the temporal dimension as Consistent Response Measurement (CRM). CRM is widely explored through fMRI, and occasionally with EEG, physiological signals, and facial expressions, using metrics like Inter-Subject Correlation (ISC). However, fMRI tools are expensive and constrained, while EEG and physiological signals are prone to facial artifacts and environmental conditions (such as temperature, humidity, and the health condition of subjects). In this research, facial expression videos are used as a cost-effective and flexible alternative for CRM, minimally affected by external conditions. By employing computer vision-based automated facial keypoint tracking, a new metric similar to ISC, called the Average t-statistic, is introduced. Unlike existing facial expression-based methodologies that measure CRM through secondary indicators like inferred emotions, keypoints, and ICA-based features, the Average t-statistic is closely associated with the direct measurement of consistent facial muscle movement using the Facial Action Coding System (FACS). This is evidenced in the DISFA dataset, where the time series of the Average t-statistic has a high correlation (R2 = 0.78) with a metric called AU consistency, which directly measures facial muscle movement through FACS coding of video frames. The simplicity of recording facial expressions with the automated Average t-statistic expands the applications of CRM, such as measuring engagement in online learning and customer interactions, and diagnosing outliers in healthcare conditions like stroke, autism, and depression. To promote further research, we have made the code repository publicly available.
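The exact definition of the Average t-statistic is in the paper, not the abstract. As a hedged sketch of the general idea, a one-sample t-statistic of a keypoint-movement signal can be computed across subjects independently at each video frame; the toy values below are illustrative:

```python
from math import sqrt
from statistics import mean, stdev

def framewise_t(values_by_frame):
    """For each frame, compute a one-sample t-statistic (against zero)
    of some facial-movement measure across subjects:
    t = mean / (sd / sqrt(n)). Large |t| suggests a response shared
    consistently across subjects at that instant."""
    ts = []
    for values in values_by_frame:
        n = len(values)
        ts.append(mean(values) / (stdev(values) / sqrt(n)))
    return ts

# Toy data: 2 frames, 3 subjects' movement values per frame
t_series = framewise_t([[1.0, 2.0, 3.0], [0.5, 1.5, 2.5]])
```

Averaging such a series over time gives one number per stimulus, which is the flavor of metric the abstract describes; the published version may differ in detail.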


Subjects
Emotions , Facial Expression , Humans , Emotions/physiology , Female , Male , Adult , Video Recording , Movement/physiology , Young Adult , Magnetic Resonance Imaging/methods , Electroencephalography/methods
12.
Autism Res ; 17(5): 934-946, 2024 May.
Article in English | MEDLINE | ID: mdl-38716802

ABSTRACT

Autistic people exhibit atypical use of prior information when processing simple perceptual stimuli; yet, it remains unclear whether and how these difficulties in using priors extend to complex social stimuli. Here, we compared autistic people without accompanying intellectual disability and nonautistic people in their ability to acquire an "emotional prior" of a facial expression and update this prior to a different facial expression of the same identity. Participants performed a two-interval same/different discrimination task between two facial expressions. To study the acquisition of the prior, we examined how discrimination was modified by the contraction of the perceived facial expressions toward the average of presented stimuli (i.e., regression to the mean). At first, facial expressions surrounded one average emotional prior (mostly sad or angry), and then the average switched (to mostly angry or sad, accordingly). Autistic people exhibited challenges in facial discrimination, and yet acquired the first prior, demonstrating typical regression-to-the-mean effects. However, unlike nonautistic people, autistic people did not update their perception to the second prior, suggesting they are less flexible in updating an acquired prior of emotional expressions. Our findings shed light on the perception of emotional expressions, one of the most pressing challenges in autism.
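The regression-to-the-mean effect described above is often modeled as a weighted average of the current stimulus and an acquired prior. A minimal sketch, assuming a simple linear contraction model (the weight is a hypothetical parameter, not one estimated in the study):

```python
def perceived(stimulus, prior_mean, weight_on_prior):
    """Contract a percept toward an acquired prior: the perceived
    value is a weighted average of the stimulus and the prior mean.
    weight_on_prior = 0 means veridical perception; larger values
    mean stronger pull toward the prior."""
    return (1 - weight_on_prior) * stimulus + weight_on_prior * prior_mean

# Hypothetical: emotional intensity 10 pulled toward a prior mean of 0
shifted = perceived(10.0, 0.0, 0.3)
```

Under this model, the finding that autistic participants acquired the first prior but did not update to the second corresponds to `prior_mean` staying fixed when the stimulus distribution switches.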


Subjects
Anger , Autistic Disorder , Facial Expression , Humans , Female , Male , Adult , Anger/physiology , Autistic Disorder/psychology , Young Adult , Learning/physiology , Social Perception , Adolescent , Emotions/physiology , Discrimination, Psychological/physiology
13.
Food Res Int ; 183: 114158, 2024 May.
Article in English | MEDLINE | ID: mdl-38760149

ABSTRACT

The elderly population holds significance among consumers because many of them experience alterations in taste and smell or suffer from physical disorders. These factors can lead to reduced food intake, malnutrition, and, consequently, serious health problems. Therefore, there is a need to develop tailored products for seniors, offering both nutrition and appealing foods with easily consumable textures. Among the various characteristics of food, appearance stands out as one of the most critical aspects influencing food preferences and choices. Surprisingly, there is limited knowledge about how food shape affects the holistic emotional responses of seniors. The objective of this study was to investigate the impact of food shape on the emotional responses of seniors. This exploration involved the use of explicit methods, such as self-reported questionnaires, and implicit methods, including the measurement of skin conductance responses and facial expressions, as well as their combination. To achieve this goal, we enlisted the participation of 50 individuals (54% women) from the senior population aged between 55 and 75 years. These participants evaluated two food products with identical sensory characteristics in terms of taste, texture, and flavor. However, these products differed in shape. We measured their degree of liking and emotional responses using a 7-point hedonic scale and EsSense25, in conjunction with galvanic skin response and facial expressions, which served as representatives of behavioural and physiological responses. The multivariate analysis allowed us to examine sample configurations by gender and establish associations between variables. The combination of implicit and explicit methods led to better discrimination of samples of the same category than the use of either method independently. Although both samples elicited equivalent liking perceptions, they evoked distinct emotional responses, measured at cognitive, physiological, and behavioural levels. In general, men and women experienced different emotions while observing, smelling, handling, or consuming both samples, both consciously and unconsciously. This newfound knowledge could be valuable when designing food products for this demographic. The ultimate goal is to engage consumers and enhance their enjoyment of the food experience by offering more visually appealing food options.


Subjects
Emotions , Food Preferences , Humans , Female , Male , Aged , Middle Aged , Food Preferences/physiology , Food Preferences/psychology , Facial Expression , Galvanic Skin Response/physiology , Taste , Surveys and Questionnaires
14.
Comput Methods Programs Biomed ; 250: 108195, 2024 Jun.
Article in English | MEDLINE | ID: mdl-38692251

ABSTRACT

BACKGROUND AND OBJECTIVE: Timely stroke treatment can limit brain damage and improve outcomes, which depends on early recognition of the symptoms. However, stroke cases are often missed by first-responder paramedics. One of the earliest external symptoms of stroke appears in facial expressions. METHODS: We propose a computerized analysis of facial expressions using action units to distinguish between post-stroke and healthy people. Action units enable analysis of subtle and specific facial movements and are interpretable with respect to the facial expressions. RGB videos from the Toronto Neuroface Dataset, recorded during standard orofacial examinations of 14 post-stroke (PS) participants and 11 healthy controls (HC), were used in this study. Action units were computed using XGBoost, which was trained using HC, and classified using regression analysis for each of the nine facial expressions. The analysis was performed without manual intervention. RESULTS: The results were evaluated using leave-one-out validation. The accuracy was 82% for Kiss and Spread, with the best sensitivity of 91% in the differentiation of PS and HC. The features corresponding to mouth muscles were most suitable. CONCLUSIONS: This pilot study has shown that our method can detect PS based on two simple facial expressions. However, this needs to be tested in real-world conditions, with people of different ethnicities and with smartphone use. The method has the potential for a computerized assessment of the videos by first responders using a smartphone to perform screening tests, which can facilitate the timely start of treatment.
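The leave-one-out validation mentioned in the results can be sketched generically: each sample is held out in turn, a model is fit on the rest, and accuracy is the fraction classified correctly. The sketch below uses a toy nearest-centroid classifier on made-up "action-unit" feature vectors, not the paper's XGBoost-plus-regression pipeline:

```python
from statistics import mean

def nearest_centroid_predict(train, x):
    """Classify x by the nearest class centroid (squared Euclidean)."""
    by_class = {}
    for vec, label in train:
        by_class.setdefault(label, []).append(vec)
    centroids = {lbl: [mean(dim) for dim in zip(*vecs)]
                 for lbl, vecs in by_class.items()}
    def dist2(a, b):
        return sum((u - v) ** 2 for u, v in zip(a, b))
    return min(centroids, key=lambda lbl: dist2(centroids[lbl], x))

def leave_one_out_accuracy(data):
    """Hold out each labeled sample in turn, train on the rest,
    and return the fraction predicted correctly."""
    correct = 0
    for i, (vec, label) in enumerate(data):
        rest = data[:i] + data[i + 1:]
        if nearest_centroid_predict(rest, vec) == label:
            correct += 1
    return correct / len(data)

# Toy, well-separated action-unit vectors for PS vs HC
data = [([0.1, 0.2], "HC"), ([0.2, 0.1], "HC"),
        ([0.9, 1.0], "PS"), ([1.0, 0.9], "PS")]
acc = leave_one_out_accuracy(data)
```

Leave-one-out is a natural choice here because the dataset (25 participants) is too small for a fixed train/test split.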


Subjects
Facial Expression , Stroke , Humans , Pilot Projects , Female , Male , Middle Aged , Aged , Case-Control Studies , Video Recording
15.
Sci Rep ; 14(1): 11571, 2024 05 21.
Article in English | MEDLINE | ID: mdl-38773125

ABSTRACT

This study delves into how the primary emotions of anger, happiness, sadness, and fear are expressed through drawings. Moving beyond the well-researched color-emotion link, it explores under-examined aspects like spatial concepts and drawing styles. Employing Python and OpenCV for objective analysis, we make a breakthrough by converting subjective perceptions into measurable data through 728 digital images from 182 university students. As the prominent color for each emotion, the majority of participants chose red for anger (73.11%), yellow for happiness (17.8%), blue for sadness (51.1%), and black for fear (40.7%). Happiness led with the highest saturation (68.52%) and brightness (75.44%) percentages, while fear recorded the lowest in both categories (47.33% saturation, 48.78% brightness). Fear, however, topped the color fill percentage (35.49%), with happiness at the lowest (25.14%). Tangible imagery prevailed (71.43-83.52%), with abstract styles peaking in fear representations (28.57%). Facial expressions were a common element (41.76-49.45%). The study achieved 81.3% predictive accuracy for anger, higher than the 71.3% overall average. Future research can build on these results by improving technological methods to quantify more aspects of drawing content. Investigating a more comprehensive array of emotions and examining factors influencing emotional drawing styles will further our understanding of visual-emotional communication.
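The saturation and brightness percentages reported here correspond to the S and V channels of the HSV color space. A minimal sketch of how mean saturation and brightness could be computed from RGB pixels, using the standard library's `colorsys` rather than the authors' OpenCV pipeline:

```python
import colorsys

def mean_saturation_brightness(pixels_rgb):
    """Average HSV saturation and value (brightness) over a list of
    RGB pixels given as 0-255 triples; both outputs are in [0, 1]."""
    sats, vals = [], []
    for r, g, b in pixels_rgb:
        _, s, v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
        sats.append(s)
        vals.append(v)
    n = len(pixels_rgb)
    return sum(sats) / n, sum(vals) / n

# Toy image: one pure-red pixel and one mid-gray pixel
sat, bright = mean_saturation_brightness([(255, 0, 0), (128, 128, 128)])
```

In OpenCV the equivalent would read pixels with `cv2.cvtColor(img, cv2.COLOR_BGR2HSV)`; the per-pixel averaging idea is the same.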


Subjects
Emotions , Facial Expression , Humans , Emotions/physiology , Male , Female , Young Adult , Happiness , Anger/physiology , Adult , Fear/psychology , Sadness
16.
Sci Rep ; 14(1): 11617, 2024 05 21.
Article in English | MEDLINE | ID: mdl-38773183

ABSTRACT

It has been argued that experiencing the pain of others motivates helping. Here, we investigate the contribution of somatic feelings while witnessing the pain of others onto costly helping decisions, by contrasting the choices and brain activity of participants that report feeling somatic feelings (self-reported mirror-pain synesthetes) against those that do not. Participants in fMRI witnessed a confederate receiving pain stimulations whose intensity they could reduce by donating money. The pain intensity could be inferred either from the facial expressions of the confederate in pain (Face condition) or from the kinematics of the pain-receiving hand (Hand condition). Our results show that self-reported mirror-pain synesthetes increase their donation more steeply, as the intensity of the observed pain increases, and their somatosensory brain activity (SII and the adjacent IPL) was more tightly associated with donation in the Hand condition. For all participants, activation in insula, SII, TPJ, pSTS, amygdala and MCC correlated with the trial by trial donation made in the Face condition, while SI and MTG activation was correlated with the donation in the Hand condition. These results further inform us about the role of somatic feelings while witnessing the pain of others in situations of costly helping.


Subjects
Magnetic Resonance Imaging , Pain , Humans , Female , Male , Adult , Pain/psychology , Pain/physiopathology , Young Adult , Brain/physiopathology , Brain/diagnostic imaging , Brain/physiology , Brain Mapping , Facial Expression , Helping Behavior , Hand/physiology
17.
Cereb Cortex ; 34(4)2024 Apr 01.
Article in English | MEDLINE | ID: mdl-38566513

ABSTRACT

The perception of facial expression plays a crucial role in social communication and is known to be influenced by various facial cues. Previous studies have reported both positive and negative biases toward overweight individuals, but it is unclear whether facial cues such as facial weight bias facial expression perception. Combining psychophysics and event-related potential (ERP) technology, the current study adopted a cross-adaptation paradigm to examine this issue. The psychophysical results of Experiments 1A and 1B revealed a bidirectional cross-adaptation effect between overweight and angry faces. Adapting to overweight faces decreased the likelihood of perceiving ambiguous emotional expressions as angry compared to adapting to normal-weight faces. Likewise, exposure to angry faces subsequently caused normal-weight faces to appear thinner. These findings were corroborated by bidirectional ERP results: adaptation to overweight faces relative to normal-weight faces modulated the ERP responses to emotionally ambiguous facial expressions (Experiment 2A); vice versa, adaptation to angry faces relative to neutral faces modulated the ERP responses to faces of ambiguous weight (Experiment 2B). Our study provides direct evidence associating overweight faces with facial expression, suggesting at least partly common neural substrates for the perception of overweight and angry faces.
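Adaptation aftereffects like those in Experiments 1A and 1B are typically quantified as a shift in the point of subjective equality (PSE), the stimulus level judged "angry" 50% of the time, between adaptation conditions. A minimal sketch follows, using made-up response proportions and simple linear interpolation in place of a full psychometric-function fit; all numbers are hypothetical:

```python
def pse(levels, p_angry):
    """Point of subjective equality: the stimulus level at which the
    proportion of 'angry' responses crosses 0.5, found by linear
    interpolation between the two bracketing levels."""
    pairs = list(zip(levels, p_angry))
    for (x0, y0), (x1, y1) in zip(pairs, pairs[1:]):
        if y0 <= 0.5 <= y1:
            return x0 + (0.5 - y0) * (x1 - x0) / (y1 - y0)
    raise ValueError("proportions never cross 0.5")

# Hypothetical morph levels (0 = clearly neutral, 1 = clearly angry)
levels = [0.0, 0.2, 0.4, 0.6, 0.8, 1.0]
p_after_normal     = [0.05, 0.15, 0.40, 0.70, 0.90, 0.98]  # baseline adaptor
p_after_overweight = [0.02, 0.10, 0.25, 0.55, 0.85, 0.95]  # fewer 'angry'
shift = pse(levels, p_after_overweight) - pse(levels, p_after_normal)
# A positive shift means more 'angry' signal is needed after adapting
# to overweight faces, i.e. fewer ambiguous faces are judged as angry.
```

With these toy proportions the PSE shifts by 0.1 morph units in the direction the abstract describes.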


Subjects
Facial Expression , Weight Prejudice , Humans , Overweight , Anger/physiology , Evoked Potentials/physiology , Emotions/physiology
18.
PLoS One ; 19(4): e0301896, 2024.
Article in English | MEDLINE | ID: mdl-38598520

ABSTRACT

This study investigates whether humans recognize different emotions conveyed only by the kinematics of a single moving geometrical shape, and how this competence unfolds during development from childhood to adulthood. To this aim, animations in which a shape moved according to happy, fearful, or neutral cartoons were shown, in a forced-choice paradigm, to 7- and 10-year-old children and adults. Accuracy and response times were recorded, and the movement of the mouse while participants selected a response was tracked. Results showed that 10-year-old children and adults recognize happiness and fear when conveyed solely by different kinematics, with an advantage for fearful stimuli. Fearful stimuli were also accurately identified by 7-year-olds, together with neutral stimuli, while at this age the accuracy for happiness was not significantly different from chance. Overall, the results demonstrate that emotions can be identified from single-point motion alone during both childhood and adulthood. Moreover, motion contributes to the comprehension of emotions to varying degrees, with fear recognized earlier in development and more readily even later on, when all emotions are accurately labeled.


Subjects
Emotions , Facial Expression , Adult , Child , Humans , Biomechanical Phenomena , Emotions/physiology , Fear , Happiness
19.
BMC Psychiatry ; 24(1): 307, 2024 Apr 23.
Article in English | MEDLINE | ID: mdl-38654234

ABSTRACT

BACKGROUND: Obstructive sleep apnea-hypopnea syndrome (OSAHS) is a chronic breathing disorder characterized by recurrent upper airway obstruction during sleep. Although previous studies have shown a link between OSAHS and depressive mood, the neurobiological mechanisms underlying mood disorders in OSAHS patients remain poorly understood. This study aims to investigate the emotion-processing mechanism in OSAHS patients with depressive mood using event-related potentials (ERPs). METHODS: Seventy-four OSAHS patients were divided into depressive-mood and non-depressive-mood groups according to their Self-rating Depression Scale (SDS) scores. Patients underwent overnight polysomnography and completed various cognitive and emotional questionnaires. They were shown facial images displaying positive, neutral, and negative emotions and asked to identify the emotion category while their visual evoked potentials were simultaneously recorded. RESULTS: The two groups did not differ significantly in age, BMI, or years of education, but showed significant differences in slow-wave sleep ratio (P = 0.039) and in ESS (P = 0.006), MMSE (P < 0.001), and MoCA (P = 0.043) scores. No significant difference was found in accuracy or response time on emotional face recognition between the two groups. N170 latency in the depressive group was significantly longer than in the non-depressive group at the bilateral parieto-occipital lobes (P = 0.014 and 0.007), while no significant difference in N170 amplitude was found; neither P300 amplitude nor latency differed significantly between the two groups. Furthermore, N170 amplitude at PO7 was positively correlated with the arousal index and negatively correlated with MoCA scores (both P < 0.01). CONCLUSION: OSAHS patients with depressive mood exhibit increased N170 latency and impaired facial emotion recognition ability. Special attention to depressive mood among OSAHS patients is warranted, given its implications for patient care.
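N170 peak latency, the measure compared between groups above, is conventionally taken as the time of the most negative amplitude within a post-stimulus search window. A minimal sketch with a synthetic waveform; the 130-200 ms window and the 10 ms sampling step are illustrative assumptions, not the study's parameters:

```python
def n170_latency(erp, times, window=(130, 200)):
    """Peak latency of the N170: time (ms) of the most negative
    amplitude within the search window. Window bounds are
    illustrative; labs choose them from grand-average waveforms."""
    candidates = [(amp, t) for amp, t in zip(erp, times)
                  if window[0] <= t <= window[1]]
    amp, t = min(candidates)  # tuple order: most negative amplitude wins
    return t

# Synthetic waveform sampled every 10 ms with a trough at 170 ms.
times = list(range(0, 400, 10))
erp = [0.0] * len(times)
erp[times.index(160)] = -3.0
erp[times.index(170)] = -6.0   # N170 trough
erp[times.index(180)] = -3.0
lat = n170_latency(erp, times)
```

A longer `lat` in one group than another, as reported for the depressive group, would indicate delayed structural encoding of faces.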


Subjects
Depression , Emotions , Sleep Apnea, Obstructive , Humans , Male , Middle Aged , Sleep Apnea, Obstructive/physiopathology , Sleep Apnea, Obstructive/psychology , Sleep Apnea, Obstructive/complications , Depression/physiopathology , Depression/psychology , Depression/complications , Female , Adult , Emotions/physiology , Polysomnography , Evoked Potentials/physiology , Electroencephalography , Facial Recognition/physiology , Evoked Potentials, Visual/physiology , Facial Expression
20.
PLoS One ; 19(4): e0290590, 2024.
Article in English | MEDLINE | ID: mdl-38635525

ABSTRACT

Spontaneous smiles in response to politicians can serve as an implicit barometer for gauging electorate preferences. However, it is unclear whether a subtle Duchenne smile, an authentic expression involving coactivation of the zygomaticus major (ZM) and orbicularis oculi (OO) muscles, would be elicited while reading about a favored politician smiling, indicating a more positive disposition and political endorsement. From an embodied simulation perspective, we investigated whether written descriptions of a politician's smile would trigger morphologically different smiles in readers depending on shared or opposing political orientation. In a controlled laboratory reading task, participants were presented with subject-verb phrases describing left- and right-wing politicians smiling or frowning. Concurrently, their facial muscular reactions were measured via electromyography (EMG) at three facial muscles: the ZM and OO, coactive during Duchenne smiles, and the corrugator supercilii (CS), involved in frowning. We found that participants responded with a Duchenne smile, detected at the ZM and OO muscles, when exposed to portrayals of smiling politicians of the same political orientation, and reported more positive emotions toward them. In contrast, when reading about outgroup politicians smiling, there was weaker activation of the ZM muscle and no activation of the OO muscle, suggesting a weak non-Duchenne smile, and reported emotions toward outgroup politicians were significantly more negative. An enhanced frown response in the CS was also found for ingroup compared to outgroup politicians' frown expressions. The present findings suggest that a politician's smile may go a long way toward influencing electorates through both non-verbal and verbal pathways, adding another layer to our understanding of how language and social information shape embodied effects in a highly nuanced manner. Implications for verbal communication in the political context are discussed.
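Distinguishing Duchenne from non-Duchenne smiles in EMG data of the kind described above hinges on whether the OO muscle is coactive with the ZM. A toy thresholding rule, not the study's actual analysis, might look like this; the activation criterion (mean rectified signal above twice baseline) and all sample values are hypothetical:

```python
from statistics import mean

def classify_smile(zm, oo, baseline_zm, baseline_oo, k=2.0):
    """Toy rule: a muscle counts as active when its mean rectified EMG
    exceeds k times its baseline level. ZM + OO coactivation ->
    Duchenne smile; ZM alone -> non-Duchenne smile; otherwise no smile.
    (Illustrative thresholding, not the study's analysis pipeline.)"""
    zm_active = mean(map(abs, zm)) > k * baseline_zm
    oo_active = mean(map(abs, oo)) > k * baseline_oo
    if zm_active and oo_active:
        return "Duchenne"
    if zm_active:
        return "non-Duchenne"
    return "no smile"

# Hypothetical rectified EMG samples (arbitrary units), baselines = 1.0:
label = classify_smile([3.0, -3.0, 3.0], [2.5, -2.5, 2.5], 1.0, 1.0)
```

Under this rule the sample trial is labeled "Duchenne", whereas the same ZM activity with quiet OO would be "non-Duchenne", mirroring the ingroup/outgroup contrast the abstract reports.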


Subjects
Frailty , Smiling , Humans , Smiling/physiology , Reading , Facial Expression , Emotions/physiology , Facial Muscles/physiology , Eyelids