Results 1 - 20 of 15,604
1.
Nature; 622(7981): 130-138, 2023 Oct.
Article in English | MEDLINE | ID: mdl-37730990

ABSTRACT

Deep brain stimulation (DBS) of the subcallosal cingulate (SCC) can provide long-term symptom relief for treatment-resistant depression (TRD)1. However, achieving stable recovery is unpredictable2, typically requiring trial-and-error stimulation adjustments due to individual recovery trajectories and subjective symptom reporting3. We currently lack objective brain-based biomarkers to guide clinical decisions by distinguishing natural transient mood fluctuations from situations requiring intervention. To address this gap, we used a new device enabling simultaneous electrophysiological recording and SCC DBS in ten TRD participants (ClinicalTrials.gov identifier NCT01984710). At the study endpoint of 24 weeks, 90% of participants demonstrated robust clinical response, and 70% achieved remission. Using SCC local field potentials available from six participants, we deployed an explainable artificial intelligence approach to identify SCC local field potential changes indicating the patient's current clinical state. This biomarker is distinct from transient stimulation effects, sensitive to therapeutic adjustments, and accurate at capturing individual recovery states. Variable recovery trajectories are predicted by the degree of preoperative damage to the structural integrity and functional connectivity within the targeted white matter treatment network, and are matched by objective facial expression changes detected using data-driven video analysis. Our results demonstrate the utility of objective biomarkers in the management of personalized SCC DBS and provide new insight into the relationship between multifaceted (functional, anatomical and behavioural) features of TRD pathology, motivating further research into causes of variability in depression treatment.
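
As a rough illustration of the biomarker idea (not the authors' actual pipeline, which the abstract does not specify): spectral band-power features from SCC LFP epochs can feed an interpretable linear classifier whose weights indicate which bands drive the estimated clinical state. Sampling rate, bands, and labels below are illustrative stand-ins.

```python
import numpy as np
from scipy.signal import welch
from sklearn.linear_model import LogisticRegression

FS = 250  # assumed LFP sampling rate (Hz)
BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13),
         "beta": (13, 30), "gamma": (30, 50)}

def band_power_features(lfp_epochs):
    """lfp_epochs: (n_epochs, n_samples) array -> (n_epochs, n_bands) log band powers."""
    freqs, psd = welch(lfp_epochs, fs=FS, nperseg=FS * 2, axis=-1)
    feats = [psd[:, (freqs >= lo) & (freqs < hi)].mean(axis=-1)
             for lo, hi in BANDS.values()]
    return np.log(np.column_stack(feats))  # log power is more Gaussian-like

# X: weekly LFP recordings; y: 1 = "sick" state, 0 = "stable response" (labels illustrative)
rng = np.random.default_rng(0)
X = band_power_features(rng.standard_normal((120, FS * 10)))
y = rng.integers(0, 2, size=120)

clf = LogisticRegression(max_iter=1000).fit(X, y)
# Linear weights give a first-pass "explanation": which bands drive the state estimate.
for band, w in zip(BANDS, clf.coef_[0]):
    print(f"{band}: {w:+.3f}")
```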


Subject(s)
Deep Brain Stimulation; Depression; Depressive Disorder, Major; Humans; Artificial Intelligence; Biomarkers; Deep Brain Stimulation/methods; Depression/physiopathology; Depression/therapy; Depressive Disorder, Major/physiopathology; Depressive Disorder, Major/therapy; Electrophysiology; Treatment Outcome; Local Field Potential Measurement; White Matter; Limbic Lobe/physiology; Limbic Lobe/physiopathology; Facial Expression
2.
Nature; 589(7841): 251-257, 2021 Jan.
Article in English | MEDLINE | ID: mdl-33328631

ABSTRACT

Understanding the degree to which human facial expressions co-vary with specific social contexts across cultures is central to the theory that emotions enable adaptive responses to important challenges and opportunities1-6. Concrete evidence linking social context to specific facial expressions is sparse and is largely based on survey-based approaches, which are often constrained by language and small sample sizes7-13. Here, by applying machine-learning methods to real-world, dynamic behaviour, we ascertain whether naturalistic social contexts (for example, weddings or sporting competitions) are associated with specific facial expressions14 across different cultures. In two experiments using deep neural networks, we examined the extent to which 16 types of facial expression occurred systematically in thousands of contexts in 6 million videos from 144 countries. We found that each kind of facial expression had distinct associations with a set of contexts that were 70% preserved across 12 world regions. Consistent with these associations, regions varied in how frequently different facial expressions were produced as a function of which contexts were most salient. Our results reveal fine-grained patterns in human facial expressions that are preserved across the modern world.
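
A toy sketch of the core quantity in this design: an expression-by-context association matrix per world region, with cross-region correlation as a preservation score. All data below are simulated stand-ins for the deep-network annotations.

```python
import numpy as np

rng = np.random.default_rng(1)
n_expr, n_ctx, n_regions = 16, 30, 12  # 16 expression types; context count illustrative

# Simulated expression-context co-occurrence counts per region
shared = rng.random((n_expr, n_ctx))            # common structure across regions
counts = [shared + 0.3 * rng.random((n_expr, n_ctx)) for _ in range(n_regions)]

def association(mat):
    """Row-normalize counts: P(context | expression) for one region."""
    return mat / mat.sum(axis=1, keepdims=True)

assoc = [association(c).ravel() for c in counts]

# Pairwise Pearson correlations between regions ~ "preservation" of associations
corr = np.corrcoef(np.stack(assoc))
off_diag = corr[~np.eye(n_regions, dtype=bool)]
print(f"mean cross-region correlation: {off_diag.mean():.2f}")
```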


Subject(s)
Culture; Emotions; Facial Expression; Internationality; Ceremonial Behavior; Deep Learning; Geographic Mapping; Humans; Popular Culture; Translations
3.
Proc Natl Acad Sci U S A; 121(14): e2313665121, 2024 Apr 02.
Article in English | MEDLINE | ID: mdl-38530896

ABSTRACT

Facial emotion expressions play a central role in interpersonal interactions; these displays are used to predict and influence the behavior of others. Despite their importance, quantifying and analyzing the dynamics of brief facial emotion expressions remains an understudied methodological challenge. Here, we present a method that leverages machine learning and network modeling to assess the dynamics of facial expressions. Using video recordings of clinical interviews, we demonstrate the utility of this approach in a sample of 96 people diagnosed with psychotic disorders and 116 never-psychotic adults. Participants diagnosed with schizophrenia tended to move from neutral expressions to uncommon expressions (e.g., fear, surprise), whereas participants diagnosed with other psychoses (e.g., mood disorders with psychosis) moved toward expressions of sadness. This method has broad applications to the study of normal and altered expressions of emotion and can be integrated with telemedicine to improve psychiatric assessment and treatment.
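
The network-modeling step can be pictured as estimating a transition matrix over frame-wise expression labels. A minimal sketch follows; the label set and details are illustrative, not the authors' exact implementation.

```python
import numpy as np

EXPRESSIONS = ["neutral", "happy", "sad", "fear", "surprise", "anger"]
IDX = {e: i for i, e in enumerate(EXPRESSIONS)}

def transition_matrix(frame_labels):
    """Estimate P(next expression | current expression) from a per-frame label sequence."""
    T = np.zeros((len(EXPRESSIONS), len(EXPRESSIONS)))
    for a, b in zip(frame_labels, frame_labels[1:]):
        if a != b:                      # count only changes between expressions
            T[IDX[a], IDX[b]] += 1
    row_sums = T.sum(axis=1, keepdims=True)
    return np.divide(T, row_sums, out=np.zeros_like(T), where=row_sums > 0)

# e.g., labels produced by a frame-wise expression classifier on an interview video
seq = ["neutral", "neutral", "fear", "neutral", "surprise", "neutral", "sad"]
T = transition_matrix(seq)
print(T[IDX["neutral"]])  # where do expressions move from neutral?
```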


Subject(s)
Psychotic Disorders; Schizophrenia; Adult; Humans; Facial Expression; Emotions; Schizophrenia/diagnosis; Fear
4.
Proc Natl Acad Sci U S A; 120(14): e2211966120, 2023 Apr 04.
Article in English | MEDLINE | ID: mdl-36972456

ABSTRACT

The face is a defining feature of our individuality, crucial for our social interactions. But what happens when the face connected to the self is radically altered or replaced? We address the plasticity of self-face recognition in the context of facial transplantation. While the acquisition of a new face following facial transplantation is a medical fact, the experience of a new identity is an unexplored psychological outcome. We traced the changes in self-face recognition before and after facial transplantation to understand if and how the transplanted face gradually comes to be perceived and recognized as the recipient's own new face. Neurobehavioral evidence documents a strong representation of the pre-injury appearance before the operation; following transplantation, the recipient incorporates the new face into his self-identity. The acquisition of this new facial identity is supported by neural activity in medial frontal regions that are considered to integrate psychological and perceptual aspects of the self.


Subject(s)
Facial Recognition; Facial Transplantation; Face; Individuality; Pattern Recognition, Visual; Facial Expression
5.
Proc Natl Acad Sci U S A; 120(8): e2212735120, 2023 Feb 21.
Article in English | MEDLINE | ID: mdl-36787369

ABSTRACT

Faces in motion reveal a plethora of information through visual dynamics. Faces can move in complex patterns while transforming facial shape, e.g., during the generation of different emotional expressions. While motion and shape processing have been studied extensively in separate research enterprises, much less is known about their conjunction during biological motion. Here, we took advantage of the discovery in brain-imaging studies of an area in the dorsal portion of the macaque monkey superior temporal sulcus (STS), the middle dorsal face area (MD), with selectivity for naturalistic face motion. To gain mechanistic insights into the coding of facial motion, we recorded single-unit activity from MD, testing whether and how MD cells encode face motion. The MD population was highly sensitive to naturalistic facial motion and facial shape. Some MD cells responded only to the conjunction of facial shape and motion, others were selective for facial shape even without movement, and yet others were suppressed by facial motion. We found that this heterogeneous MD population transforms face motion into a higher dimensional activity space, a representation that would allow for high sensitivity to relevant small-scale movements. Indeed, we show that many MD cells carry such sensitivity for eye movements. We further found that MD cells encode motion of head, mouth, and eyes in a separable manner, requiring the use of multiple reference frames. Thus, MD is a bona fide face-motion area that uses highly heterogeneous cell populations to create codes capturing even complex facial motion trajectories.


Subject(s)
Brain Mapping; Magnetic Resonance Imaging; Animals; Facial Expression; Photic Stimulation; Temporal Lobe; Macaca
6.
J Neurosci; 44(6), 2024 Feb 07.
Article in English | MEDLINE | ID: mdl-37963766

ABSTRACT

The ventrolateral prefrontal cortex (VLPFC) shows robust activation during the perception of faces and voices. However, little is known about what categorical features of social stimuli drive neural activity in this region. Since perception of identity and expression are critical social functions, we examined whether neural responses to naturalistic stimuli were driven by these two categorical features in the prefrontal cortex. We recorded single neurons in the VLPFC while two male rhesus macaques (Macaca mulatta) viewed short audiovisual videos of unfamiliar conspecifics making expressions of aggressive, affiliative, and neutral valence. Of the 285 neurons responsive to the audiovisual stimuli, 111 showed a significant effect (two-way ANOVA) of identity, expression, or their interaction in their stimulus-related firing rates; however, decoding of expression and identity using single-unit firing rates yielded poor accuracy. Interestingly, when decoding from pseudo-populations of recorded neurons, the accuracy for both expression and identity increased with population size, suggesting that the population transmitted information relevant to both variables. Principal components analysis of mean population activity across time revealed that population responses to the same identity followed similar trajectories in the response space, facilitating segregation from other identities. Our results suggest that identity is a critical feature of social stimuli that dictates the structure of population activity in the VLPFC during the perception of vocalizations and their corresponding facial expressions. These findings enhance our understanding of the role of the VLPFC in social behavior.
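
A minimal sketch of pseudo-population decoding, showing how accuracy can rise with population size even when single units decode poorly. Firing rates and tuning below are simulated; the cell count matches the abstract but nothing else is taken from the paper.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
n_trials, n_neurons = 200, 111
identities = rng.integers(0, 4, size=n_trials)       # 4 conspecific identities (illustrative)

# Simulated firing rates with weak identity tuning per neuron
tuning = rng.normal(0, 0.3, size=(4, n_neurons))
rates = rng.poisson(5 + np.exp(tuning[identities]))  # (n_trials, n_neurons)

# Decoding accuracy as a function of pseudo-population size
for size in (1, 10, 50, n_neurons):
    cells = rng.choice(n_neurons, size=size, replace=False)
    acc = cross_val_score(LogisticRegression(max_iter=1000),
                          rates[:, cells], identities, cv=5).mean()
    print(f"{size:3d} neurons: {acc:.2f}")
```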


Subject(s)
Prefrontal Cortex; Social Behavior; Animals; Male; Macaca mulatta; Prefrontal Cortex/physiology; Neurons/physiology; Facial Expression
7.
Bioinformatics; 40(Suppl 1): i110-i118, 2024 Jun 28.
Article in English | MEDLINE | ID: mdl-38940144

ABSTRACT

Artificial intelligence (AI) is increasingly used in genomics research and practice, and generative AI has garnered significant recent attention. In clinical applications of generative AI, aspects of the underlying datasets can impact results, and confounders should be studied and mitigated. One example involves the facial expressions of people with genetic conditions. Stereotypically, Williams (WS) and Angelman (AS) syndromes are associated with a "happy" demeanor, including a smiling expression. Clinical geneticists may be more likely to identify these conditions in images of smiling individuals. To study the impact of facial expression, we analyzed publicly available facial images of approximately 3500 individuals with genetic conditions. Using a deep learning (DL) image classifier, we found that WS and AS images with non-smiling expressions had significantly lower prediction probabilities for the correct syndrome labels than those with smiling expressions. This was not seen for 22q11.2 deletion and Noonan syndromes, which are not associated with a smiling expression. To further explore the effect of facial expressions, we computationally altered the facial expressions for these images. We trained HyperStyle, a GAN-inversion technique compatible with StyleGAN2, to determine the vector representations of our images. Then, following the concept of InterfaceGAN, we edited these vectors to recreate the original images in a phenotypically accurate way but with a different facial expression. Through online surveys and an eye-tracking experiment, we examined how altered facial expressions affect the performance of human experts. Overall, we found that facial expression affects diagnostic accuracy variably across genetic conditions.
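
The InterfaceGAN step is essentially linear: fit a hyperplane separating smiling from non-smiling latent codes and move a code along its unit normal. A self-contained sketch with stand-in vectors follows; the real pipeline operates on StyleGAN2 W-space codes obtained via HyperStyle, and the generator call is omitted.

```python
import numpy as np
from sklearn.svm import LinearSVC

rng = np.random.default_rng(3)
d = 512                                   # StyleGAN2-style latent dimensionality

# Stand-in latent codes labeled by a smile attribute (1 = smiling, 0 = not)
true_dir = rng.standard_normal(d)
true_dir /= np.linalg.norm(true_dir)
W = rng.standard_normal((1000, d))
labels = (W @ true_dir + 0.3 * rng.standard_normal(1000) > 0).astype(int)

# InterfaceGAN: the separating hyperplane's unit normal is the editing direction
svm = LinearSVC(C=1.0, max_iter=10000).fit(W, labels)
n = svm.coef_[0] / np.linalg.norm(svm.coef_[0])

w = rng.standard_normal(d)                # latent code of one inverted face image
w_non_smiling = w - 2.0 * n               # step away from "smiling"; step size tuned by eye
# generator.synthesize(w_non_smiling) would render the edited face (generator not shown)
```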


Subject(s)
Facial Expression; Humans; Deep Learning; Artificial Intelligence; Genetics, Medical/methods; Williams Syndrome/genetics
8.
Mol Psychiatry; 29(5): 1501-1509, 2024 May.
Article in English | MEDLINE | ID: mdl-38278993

ABSTRACT

Biased emotion processing has been suggested to underlie the etiology and maintenance of depression. Neuroimaging studies have shown mood-congruent alterations in amygdala activity in patients with acute depression, even during early, automatic stages of emotion processing. However, due to a lack of prospective studies over periods longer than 8 weeks, it is unclear whether these neurofunctional abnormalities represent a persistent correlate of depression even in remission. In this prospective case-control study, we aimed to examine brain functional correlates of automatic emotion processing in the long-term course of depression. In a naturalistic design, n = 57 patients with acute major depressive disorder (MDD) and n = 37 healthy controls (HC) were assessed with functional magnetic resonance imaging (fMRI) at baseline and after 2 years. Patients were divided into two subgroups according to their course of illness during the study period (n = 37 relapse, n = 20 no-relapse). During fMRI, participants underwent an affective priming task that assessed emotion processing of subliminally presented sad and happy compared to neutral face stimuli. A group × time × condition (3 × 2 × 2) ANOVA was performed for the amygdala as region-of-interest (ROI). At baseline, there was a significant group × condition interaction, resulting from amygdala hyperactivity to sad primes in patients with MDD compared to HC, whereas no difference between groups emerged for happy primes. In both patient subgroups, amygdala hyperactivity to sad primes persisted after 2 years, regardless of relapse or remission at follow-up. The results suggest that amygdala hyperactivity during automatic processing of negative stimuli persists during remission and represents a trait rather than a state marker of depression. Enduring neurofunctional abnormalities may reflect a consequence of or a vulnerability to depression.


Subject(s)
Amygdala; Depressive Disorder, Major; Emotions; Magnetic Resonance Imaging; Humans; Amygdala/physiopathology; Male; Female; Adult; Magnetic Resonance Imaging/methods; Depressive Disorder, Major/physiopathology; Emotions/physiology; Case-Control Studies; Middle Aged; Prospective Studies; Facial Expression; Depression/physiopathology; Brain Mapping/methods; Subliminal Stimulation
9.
Brain; 147(9): 3018-3031, 2024 Sep 03.
Article in English | MEDLINE | ID: mdl-38365267

ABSTRACT

Simulation theories predict that the observation of others' expressions modulates neural activity in the same centres controlling their production. This hypothesis has been developed by two models, postulating that the visual input is directly projected either to the motor system for action recognition (motor resonance) or to emotional/interoceptive regions for emotional contagion and social synchronization (emotional resonance). Here we investigated the role of frontal/insular regions in the processing of observed emotional expressions by combining intracranial recording, electrical stimulation and effective connectivity. First, we intracranially recorded from prefrontal, premotor or anterior insular regions of 44 patients during the passive observation of emotional expressions, finding widespread modulations in prefrontal/insular regions (anterior cingulate cortex, anterior insula, orbitofrontal cortex and inferior frontal gyrus) and motor territories (Rolandic operculum and inferior frontal junction). Subsequently, we electrically stimulated the activated sites, finding that (i) in the anterior cingulate cortex and anterior insula, the stimulation elicited emotional/interoceptive responses, as predicted by the 'emotional resonance' model; (ii) in the Rolandic operculum it evoked face/mouth sensorimotor responses, in line with the 'motor resonance' model; and (iii) all other regions were unresponsive or revealed functions unrelated to the processing of facial expressions. Finally, we traced the effective connectivity to sketch a network-level description of these regions, finding that the anterior cingulate cortex and the anterior insula are reciprocally interconnected while the Rolandic operculum is part of the parieto-frontal circuits and poorly connected with the former. These results support the hypothesis that the pathways hypothesized by the 'emotional resonance' and the 'motor resonance' models work in parallel, differing in terms of spatio-temporal fingerprints, reactivity to electrical stimulation and connectivity patterns.


Subject(s)
Emotions; Facial Expression; Humans; Emotions/physiology; Male; Female; Adult; Young Adult; Middle Aged; Brain Mapping/methods; Electric Stimulation; Insular Cortex/diagnostic imaging; Insular Cortex/physiology; Magnetic Resonance Imaging/methods
10.
Cereb Cortex; 34(2), 2024 Jan 31.
Article in English | MEDLINE | ID: mdl-38252995

ABSTRACT

Automatic emotion counter-regulation refers to an unintentional attentional shift away from the current emotional state and toward information of the opposite valence. It is a useful emotion regulation skill that prevents the escalation of the current emotional state. However, the cognitive mechanisms of emotion counter-regulation are not fully understood. Using a randomization approach, this study investigated how automatic emotion counter-regulation impacted attentional inhibition of emotional stimuli, an important aspect of emotion processing closely associated with emotion regulation and mental health. Forty-six university students were randomly assigned to an emotion counter-regulation group and a control group. The former group watched an anger-inducing video to evoke automatic emotion counter-regulation of anger, while the latter group watched an emotionally neutral video. Next, both groups completed a negative priming task of facial expressions with EEG recorded. In the emotion counter-regulation group, we observed enhanced attentional inhibition of angry, but not happy, faces, as indicated by a prolonged response time, a larger N2, and a smaller P3 in response to angry versus happy stimuli. These patterns were not observed in the control group, supporting the role of elicited counter-regulation of anger in producing these response modulations.
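
For readers unfamiliar with ERP quantification, component amplitudes such as N2 and P3 are typically mean voltages of the trial-averaged waveform within a time window. A minimal sketch with simulated epochs and assumed windows (the study's exact windows and electrodes are not given here):

```python
import numpy as np

FS = 500                                   # assumed EEG sampling rate (Hz)
t = np.arange(-0.2, 0.8, 1 / FS)           # epoch time axis, stimulus onset at 0 s

def mean_amplitude(epochs, window):
    """Average ERP amplitude in a time window; epochs: (n_trials, n_samples) in microvolts."""
    erp = epochs.mean(axis=0)              # average across trials -> ERP waveform
    mask = (t >= window[0]) & (t < window[1])
    return erp[mask].mean()

rng = np.random.default_rng(4)
angry = rng.standard_normal((80, t.size))  # stand-ins for artifact-free epochs
happy = rng.standard_normal((80, t.size))

# Assumed windows: N2 ~200-350 ms, P3 ~350-600 ms (exact windows are study-specific)
for name, win in [("N2", (0.20, 0.35)), ("P3", (0.35, 0.60))]:
    print(name, mean_amplitude(angry, win) - mean_amplitude(happy, win))
```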


Subject(s)
Emotional Regulation; Humans; Anger/physiology; Attention/physiology; Emotions/physiology; Facial Expression; Happiness
11.
Cereb Cortex; 34(3), 2024 Mar 01.
Article in English | MEDLINE | ID: mdl-38466112

ABSTRACT

Alexithymia is characterized by difficulties in emotional information processing. However, the underlying reasons for emotional processing deficits in alexithymia are not fully understood. The present study aimed to investigate the mechanism underlying emotional deficits in alexithymia. Using the Toronto Alexithymia Scale-20, we recruited college students with high alexithymia (n = 24) or low alexithymia (n = 24). Participants judged the emotional consistency of facial expressions and contextual sentences while their event-related potentials were recorded. Behaviorally, the high alexithymia group showed longer response times than the low alexithymia group in processing facial expressions. The event-related potential results showed that the high alexithymia group had more negative-going N400 amplitudes than the low alexithymia group in the incongruent condition. More negative N400 amplitudes were also associated with slower responses to facial expressions. Furthermore, machine learning analyses based on N400 amplitudes could distinguish the high alexithymia group from the low alexithymia group in the incongruent condition. Overall, these findings suggest worse facial emotion perception in the high alexithymia group, potentially due to difficulty in spontaneously activating emotion concepts. Our findings have important implications for the affective science and clinical intervention of alexithymia-related affective disorders.
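
A sketch of the machine-learning step as described: cross-validated classification of high- versus low-alexithymia participants from N400 amplitudes. Group sizes match the abstract; amplitudes and electrode count are simulated assumptions.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(5)
# One N400 amplitude per participant and electrode in the incongruent condition (simulated)
n400_high = rng.normal(-6.0, 2.0, size=(24, 9))   # high-alexithymia group: more negative
n400_low = rng.normal(-4.5, 2.0, size=(24, 9))    # low-alexithymia group
X = np.vstack([n400_high, n400_low])
y = np.array([1] * 24 + [0] * 24)

clf = make_pipeline(StandardScaler(), LogisticRegression())
acc = cross_val_score(clf, X, y, cv=StratifiedKFold(5, shuffle=True, random_state=0))
print(f"cross-validated accuracy: {acc.mean():.2f}")
```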


Subject(s)
Affective Symptoms; Electroencephalography; Humans; Female; Male; Facial Expression; Evoked Potentials; Emotions
12.
Cereb Cortex; 34(4), 2024 Apr 01.
Article in English | MEDLINE | ID: mdl-38566513

ABSTRACT

The perception of facial expression plays a crucial role in social communication, and it is known to be influenced by various facial cues. Previous studies have reported both positive and negative biases toward overweight individuals, but it is unclear whether facial cues such as facial weight bias facial expression perception. Combining psychophysics and event-related potential technology, the current study adopted a cross-adaptation paradigm to examine this issue. The psychophysical results of Experiments 1A and 1B revealed a bidirectional cross-adaptation effect between overweight and angry faces. Adapting to overweight faces decreased the likelihood of perceiving ambiguous emotional expressions as angry compared to adapting to normal-weight faces. Likewise, exposure to angry faces subsequently caused normal-weight faces to appear thinner. These findings were corroborated by bidirectional event-related potential results, showing that adaptation to overweight faces relative to normal-weight faces modulated the event-related potential responses to emotionally ambiguous facial expressions (Experiment 2A); vice versa, adaptation to angry faces relative to neutral faces modulated the event-related potential responses to faces of ambiguous weight (Experiment 2B). Our study provides direct evidence associating overweight faces with facial expression, suggesting at least partly common neural substrates for the perception of overweight and angry faces.
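
Such adaptation effects are usually quantified as a shift in the point of subjective equality (PSE) of a fitted psychometric function. A minimal sketch with simulated response proportions (the morph continuum and parameter values are assumptions):

```python
import numpy as np
from scipy.optimize import curve_fit

def psychometric(x, pse, slope):
    """P("angry") as a logistic function of morph level x."""
    return 1.0 / (1.0 + np.exp(-slope * (x - pse)))

morph = np.linspace(0, 1, 7)                       # happy (0) -> angry (1) morph continuum
# Simulated proportion-"angry" responses after each adaptor
p_after_normal = psychometric(morph, 0.50, 10) + 0.02
p_after_overweight = psychometric(morph, 0.58, 10) - 0.02  # fewer "angry" responses

(pse_n, _), _ = curve_fit(psychometric, morph, p_after_normal, p0=[0.5, 8])
(pse_o, _), _ = curve_fit(psychometric, morph, p_after_overweight, p0=[0.5, 8])
print(f"PSE shift after overweight adaptation: {pse_o - pse_n:+.3f}")
```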


Subject(s)
Facial Expression; Weight Prejudice; Humans; Overweight; Anger/physiology; Evoked Potentials/physiology; Emotions/physiology
13.
Cereb Cortex; 34(1), 2024 Jan 14.
Article in English | MEDLINE | ID: mdl-37943770

ABSTRACT

Empathic function, which is primarily manifested by facial imitation, is believed to play a pivotal role in interpersonal emotion regulation for mood reinstatement. To explore this association and its neural substrates, we performed a questionnaire survey (study 1) to identify the relationship between empathy and interpersonal emotion regulation, and a task-based fMRI study (study 2) to explore how facial imitation, as a fundamental component of empathic processes, promotes the interpersonal emotion regulation effect. Study 1 showed that affective empathy was positively correlated with interpersonal emotion regulation. Study 2 showed weaker negative emotions in the facial imitation interpersonal emotion regulation condition (subjects imitated the experimenter's smile while following the interpersonal emotion regulation guidance) than in the normal interpersonal emotion regulation condition (subjects followed the interpersonal emotion regulation guidance) and the Watch condition. The mirror neuron system (e.g., inferior frontal gyrus and inferior parietal lobe) and the empathy network exhibited greater activation in facial imitation interpersonal emotion regulation than in normal interpersonal emotion regulation. Moreover, facial imitation interpersonal emotion regulation, compared with normal interpersonal emotion regulation, exhibited increased functional coupling from the mirror neuron system to empathic and affective networks during interpersonal emotion regulation. Furthermore, the connectivity between the right orbital inferior frontal gyrus and the Rolandic operculum mediated the association between the accuracy of facial imitation and the interpersonal emotion regulation effect. These results show that the interpersonal emotion regulation effect can be enhanced by the target's facial imitation through increased functional coupling from the mirror neuron system to empathic and affective neural networks.


Subject(s)
Emotional Regulation; Humans; Brain Mapping/methods; Imitative Behavior/physiology; Magnetic Resonance Imaging/methods; Empathy; Functional Neuroimaging; Emotions/physiology; Facial Expression
14.
Cereb Cortex; 34(7), 2024 Jul 03.
Article in English | MEDLINE | ID: mdl-38990517

ABSTRACT

Aberrations in non-verbal social cognition have been reported to coincide with major depressive disorder, yet little is known about the role of the eyes. To fill this gap, the present study explores whether and, if so, how reading the language of the eyes is altered in depression. For this purpose, patients and person-by-person matched typically developing individuals were administered the Emotions in Masked Faces task and a modified Reading the Mind in the Eyes Test, both of which contained a comparable amount of visual information. To achieve group homogeneity, we focused on females, as major depressive disorder displays a gender-specific profile. The findings show that facial masks selectively affect inferring emotions: recognition of sadness and anger is more heavily compromised in major depressive disorder than in typically developing controls, whereas recognition of fear, happiness, and neutral expressions remains unhindered. Disgust, the forgotten emotion of psychiatry, is the least recognizable emotion in both groups. On the Reading the Mind in the Eyes Test, patients exhibit lower accuracy on positive expressions than their typically developing peers but do not differ on negative items. In both depressive and typically developing individuals, the ability to recognize emotions behind a mask and performance on the Reading the Mind in the Eyes Test are linked in processing speed, but not in recognition accuracy. The outcome provides a blueprint for understanding the complexities of reading the language of the eyes within and beyond the COVID-19 pandemic.


Subject(s)
Depressive Disorder, Major; Emotions; Facial Expression; Humans; Female; Adult; Emotions/physiology; Depressive Disorder, Major/psychology; Depressive Disorder, Major/physiopathology; Young Adult; Facial Recognition/physiology; Middle Aged; COVID-19/psychology; Reading
15.
Cereb Cortex; 34(1), 2024 Jan 14.
Article in English | MEDLINE | ID: mdl-38112625

ABSTRACT

The involvement of the human amygdala in facial mimicry remains a matter of debate. We investigated neural activity in the human amygdala during a task in which an imitation task was separated in time from an observation task involving facial expressions. Neural activity in the amygdala was measured using functional magnetic resonance imaging in 18 healthy individuals and using intracranial electroencephalography in six patients with medically refractory epilepsy. The functional magnetic resonance imaging experiment showed that mimicry of negative and positive expressions activated the amygdala more than mimicry of non-emotional facial movements. In the intracranial electroencephalography experiment, time-frequency analysis revealed emotion-related amygdala activity during mimicry as a significant neural oscillation in the high gamma band. Furthermore, spectral event analysis of individual-trial intracranial electroencephalography data revealed that the sustained gamma band oscillation originated from an increased number and longer duration of neural events in the amygdala. Based on these findings, we conclude that during facial mimicry, visual information from expressions and feedback from facial movements are combined in the amygdalar nuclei. Given the timing differences with which this information reaches the amygdala, responses to facial movements are likely to modulate rather than initiate affective processing in human participants.
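
High gamma power is commonly extracted by band-pass filtering and taking the Hilbert envelope. A minimal sketch; the sampling rate and the 70-150 Hz band are assumptions, since the abstract does not give them.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

FS = 1000  # assumed iEEG sampling rate (Hz)

def high_gamma_power(x, lo=70.0, hi=150.0):
    """Band-pass an amygdala iEEG trace and return its instantaneous power envelope."""
    b, a = butter(4, [lo / (FS / 2), hi / (FS / 2)], btype="bandpass")
    filtered = filtfilt(b, a, x)               # zero-phase filtering
    return np.abs(hilbert(filtered)) ** 2      # squared analytic-signal amplitude

rng = np.random.default_rng(6)
trace = rng.standard_normal(5 * FS)            # stand-in for one mimicry trial
power = high_gamma_power(trace)
print(power.mean())
```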


Subject(s)
Electrocorticography; Imitative Behavior; Humans; Emotions/physiology; Amygdala/diagnostic imaging; Amygdala/physiology; Magnetic Resonance Imaging/methods; Hemodynamics; Facial Expression; Brain Mapping/methods
16.
Cereb Cortex; 34(6), 2024 Jun 04.
Article in English | MEDLINE | ID: mdl-38884282

ABSTRACT

Humanoid robots have been designed to look more and more like humans to meet social demands. How do people empathize with humanoid robots, which look like humans but are essentially different from them? We addressed this issue by examining subjective feelings, electrophysiological activity, and functional magnetic resonance imaging signals during the perception of pain and neutral expressions on faces that were recognized as patients or humanoid robots. We found that healthy adults reported decreased feelings of understanding and sharing of humanoid robots' pain compared to patients' pain. Moreover, humanoid robot (vs. patient) identities reduced long-latency electrophysiological responses and blood oxygenation level-dependent signals in the left temporoparietal junction in response to pain (vs. neutral) expressions. Furthermore, we showed evidence that humanoid robot identities inhibited a causal input from the right ventral lateral prefrontal cortex to the left temporoparietal junction, contrasting with the opposite effect produced by patient identities. These results suggest a neural model of the modulation of empathy by humanoid robot identity through interactions between the cognitive and affective empathy networks, which provides a neurocognitive basis for understanding human-robot interactions.


Subject(s)
Brain Mapping; Brain; Empathy; Magnetic Resonance Imaging; Robotics; Humans; Empathy/physiology; Male; Female; Magnetic Resonance Imaging/methods; Adult; Young Adult; Brain/diagnostic imaging; Brain/physiology; Brain Mapping/methods; Multimodal Imaging/methods; Electroencephalography; Facial Expression; Pain/psychology; Pain/diagnostic imaging; Pain/physiopathology
17.
Cereb Cortex; 34(5), 2024 May 02.
Article in English | MEDLINE | ID: mdl-38715407

ABSTRACT

Facial palsy can result in a serious complication known as facial synkinesis, causing both physical and psychological harm to patients. There is growing evidence that patients with facial synkinesis have brain abnormalities, but the brain mechanisms and underlying imaging biomarkers remain unclear. Here, we employed functional magnetic resonance imaging (fMRI) to investigate brain function in 31 patients with unilateral post-facial-palsy synkinesis and 25 healthy controls during different facial expression movements and at rest. Combining surface-based mass-univariate analysis and multivariate pattern analysis, we identified diffuse activation and intrinsic connection patterns in the primary motor cortex and the somatosensory cortex on the patients' affected side. Further, we distinguished post-facial-palsy synkinesis patients from healthy subjects with favorable accuracy using a support vector machine trained on both task-related and resting-state functional magnetic resonance imaging data. Together, these findings indicate the potential of the identified functional reorganization to serve as a neuroimaging biomarker for facial synkinesis diagnosis.
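
A minimal sketch of the classification step: a cross-validated linear SVM separating patients from controls on per-subject fMRI feature vectors. Group sizes follow the abstract; feature values and dimensionality are simulated assumptions.

```python
import numpy as np
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(7)
# One feature vector per subject, e.g. vertex-wise activation or connectivity values
X = np.vstack([rng.normal(0.2, 1.0, size=(31, 200)),   # 31 patients (simulated)
               rng.normal(0.0, 1.0, size=(25, 200))])  # 25 healthy controls (simulated)
y = np.array([1] * 31 + [0] * 25)

clf = make_pipeline(StandardScaler(), SVC(kernel="linear", C=1.0))
acc = cross_val_score(clf, X, y, cv=StratifiedKFold(5, shuffle=True, random_state=0))
print(f"cross-validated accuracy: {acc.mean():.2f}")
```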


Subject(s)
Facial Paralysis; Magnetic Resonance Imaging; Synkinesis; Humans; Magnetic Resonance Imaging/methods; Facial Paralysis/physiopathology; Facial Paralysis/diagnostic imaging; Facial Paralysis/complications; Male; Female; Synkinesis/physiopathology; Adult; Middle Aged; Young Adult; Facial Expression; Biomarkers; Motor Cortex/physiopathology; Motor Cortex/diagnostic imaging; Brain Mapping; Somatosensory Cortex/diagnostic imaging; Somatosensory Cortex/physiopathology; Brain/diagnostic imaging; Brain/physiopathology; Support Vector Machine
18.
Proc Natl Acad Sci U S A; 119(17): e2115228119, 2022 Apr 26.
Article in English | MEDLINE | ID: mdl-35446619

ABSTRACT

The diversity of human faces and the contexts in which they appear give rise to an expansive stimulus space over which people infer psychological traits (e.g., trustworthiness or alertness) and other attributes (e.g., age or adiposity). Machine learning methods, in particular deep neural networks, provide expressive feature representations of face stimuli, but the correspondence between these representations and various human attribute inferences is difficult to determine because the former are high-dimensional vectors produced via black-box optimization algorithms. Here we combine deep generative image models with over 1 million judgments to model inferences of more than 30 attributes over a comprehensive latent face space. The predictive accuracy of our model approaches human interrater reliability, which simulations suggest would not have been possible with fewer faces, fewer judgments, or lower-dimensional feature representations. Our model can be used to predict and manipulate inferences with respect to arbitrary face photographs or to generate synthetic photorealistic face stimuli that evoke impressions tuned along the modeled attributes.
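
In spirit, such an attribute model maps latent face codes to judgments with a regularized linear readout, whose gradient also gives a manipulation direction. A sketch with simulated latents and ratings; the paper's actual model and dimensionality are richer than this.

```python
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(8)
d, n_faces = 128, 5000                      # latent dimensionality and sample size (illustrative)
Z = rng.standard_normal((n_faces, d))       # latent codes of generated faces

# Simulated mean "trustworthiness" ratings with a linear signal plus rater noise
beta = rng.standard_normal(d)
ratings = Z @ beta + rng.normal(0, 3.0, size=n_faces)

model = Ridge(alpha=10.0).fit(Z, ratings)

# Manipulate an arbitrary face: move its latent code along the attribute gradient
direction = model.coef_ / np.linalg.norm(model.coef_)
z = rng.standard_normal(d)
z_more_trustworthy = z + 1.5 * direction    # decode with the generative model to render
print(model.predict([z]), model.predict([z_more_trustworthy]))
```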


Subject(s)
Facial Expression; Judgment; Attitude; Face; Humans; Social Perception; Trust
19.
Proc Natl Acad Sci U S A; 119(45): e2201380119, 2022 Nov 08.
Article in English | MEDLINE | ID: mdl-36322724

ABSTRACT

Emotional communication relies on a mutual understanding, between expresser and viewer, of facial configurations that broadcast specific emotions. However, we do not know whether people share a common understanding of how emotional states map onto facial expressions. This is because expressions exist in a high-dimensional space too large to explore in conventional experimental paradigms. Here, we address this by adapting genetic algorithms and combining them with photorealistic three-dimensional avatars to efficiently explore the high-dimensional expression space. A total of 336 people used these tools to generate facial expressions that represent happiness, fear, sadness, and anger. We found substantial variability in the expressions generated via our procedure, suggesting that different people associate different facial expressions with the same emotional state. We then examined whether variability in the facial expressions created could account for differences in performance on standard emotion recognition tasks by asking people to categorize different test expressions. We found that emotion categorization performance was explained by the extent to which test expressions matched the expressions generated by each individual. Our findings reveal the breadth of variability in people's representations of facial emotions, even among typical adult populations. This has profound implications for the interpretation of responses to emotional stimuli, which may reflect individual differences in the emotional category people attribute to a particular facial expression, rather than differences in the brain mechanisms that produce emotional responses.
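
A genetic algorithm over an expression-parameter space can be sketched in a few lines: the participant's ratings of rendered avatars act as the fitness function. Here a stand-in fitness replaces the human rater, and crossover is omitted for brevity; all sizes are illustrative.

```python
import numpy as np

rng = np.random.default_rng(9)
D, POP, GENS = 20, 40, 30          # expression parameters per avatar, population, generations

def fitness(pop, participant_rating):
    """In the real task a participant rates rendered avatars; here a stand-in function."""
    return participant_rating(pop)

target = rng.standard_normal(D)                      # hidden "ideal fear face" of one observer
rate = lambda pop: -np.linalg.norm(pop - target, axis=1)

pop = rng.standard_normal((POP, D))                  # random initial expressions
for _ in range(GENS):
    scores = fitness(pop, rate)
    parents = pop[np.argsort(scores)[-POP // 2:]]    # selection: keep the top half
    kids = parents[rng.integers(0, len(parents), POP - len(parents))]
    kids = kids + 0.2 * rng.standard_normal(kids.shape)   # mutation
    pop = np.vstack([parents, kids])

best = pop[np.argmax(fitness(pop, rate))]
print(np.linalg.norm(best - target))                 # converges toward the observer's ideal
```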


Subject(s)
Facial Recognition; Individuality; Adult; Humans; Facial Expression; Emotions/physiology; Anger/physiology; Algorithms
20.
J Neurosci; 43(8): 1405-1413, 2023 Feb 22.
Article in English | MEDLINE | ID: mdl-36690451

ABSTRACT

Rapid detection of a threat or its symbol (e.g., a fearful face), whether visible or invisible, is critical for human survival. This function is suggested to be enabled by a subcortical pathway to the amygdala that is independent of the cortex. However, conclusive electrophysiological evidence in humans is scarce. Here, we explored whether the amygdala can rapidly encode invisible fearful faces. We recorded intracranial electroencephalogram (iEEG) responses in the amygdala of human participants (both sexes) to faces with fearful, happy, and neutral emotions rendered invisible by backward masking. We found that a short-latency intracranial event-related potential (iERP) in the amygdala, beginning 88 ms post-stimulus onset, was preferentially evoked by invisible fearful faces relative to invisible happy or neutral faces. The rapid iERP exhibited selectivity to the low spatial frequency (LSF) component of the fearful faces. Time-frequency iEEG analyses further identified a rapid amygdala response preferentially for LSF fearful faces in the low gamma frequency band, beginning 45 ms post-stimulus onset. In contrast, these rapid responses to invisible fearful faces were absent in cortical regions, including early visual areas, the fusiform gyrus, and the parahippocampal gyrus. These findings provide direct evidence for the existence of a subcortical pathway specific for rapid fear detection in the amygdala and demonstrate that the subcortical pathway can function without conscious awareness and under minimal influence from cortical areas.

SIGNIFICANCE STATEMENT: Automatic detection of biologically relevant stimuli, such as threats or dangers, has remarkable survival value. Here, we provide direct intracranial electrophysiological evidence that the human amygdala preferentially responds to fearful faces at a rapid speed, despite the faces being invisible. This rapid, fear-selective response is restricted to faces containing low spatial frequency information transmitted by magnocellular neurons and does not appear in cortical regions. These results support the existence of a rapid subcortical pathway, independent of cortical pathways, to the human amygdala.
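
Low-spatial-frequency (LSF) face stimuli are typically produced by low-pass filtering. A minimal sketch using a Gaussian filter as an approximation; studies usually specify a cycles-per-image cutoff, which the abstract does not give, so the sigma below is an assumption.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def low_spatial_frequency(img, sigma=8.0):
    """Keep only the low-SF content of a face image via Gaussian low-pass filtering."""
    return gaussian_filter(img.astype(float), sigma=sigma)

def high_spatial_frequency(img, sigma=8.0):
    """Residual high-SF content: original minus its low-pass version."""
    img = img.astype(float)
    return img - gaussian_filter(img, sigma=sigma)

rng = np.random.default_rng(10)
face = rng.random((256, 256))      # stand-in for a grayscale fearful-face image
lsf, hsf = low_spatial_frequency(face), high_spatial_frequency(face)
print(lsf.std(), hsf.std())        # LSF image is smooth; HSF keeps edges and fine detail
```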


Subject(s)
Fear; Magnetic Resonance Imaging; Male; Female; Humans; Fear/physiology; Emotions/physiology; Happiness; Amygdala/physiology; Facial Expression