Results 1 - 20 of 85
1.
Proc Natl Acad Sci U S A ; 120(21): e2214327120, 2023 05 23.
Article in English | MEDLINE | ID: mdl-37186822

ABSTRACT

Delusions of control in schizophrenia are characterized by the striking feeling that one's actions are controlled by external forces. Here we tested qualitative predictions inspired by Bayesian causal inference models, which suggest that such misattributions of agency should lead to decreased intentional binding. Intentional binding refers to the phenomenon that subjects perceive a compression of time between their intentional actions and consequent sensory events. We demonstrate that patients with delusions of control perceived less self-agency in our intentional binding task. This effect was accompanied by significant reductions of intentional binding compared to healthy controls and patients without delusions. Furthermore, the strength of delusions of control correlated tightly with decreases in intentional binding. Our study validated a critical prediction of Bayesian accounts of intentional binding, namely that a pathological reduction of the prior likelihood of a causal relation between one's actions and consequent sensory events (here captured by delusions of control) should lead to reduced intentional binding. Moreover, our study highlights the importance of an intact perception of temporal contiguity between actions and their effects for the sense of agency.


Subject(s)
Schizophrenia, Time Perception, Humans, Psychomotor Performance, Bayes Theorem, Emotions, Intention, Perception
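The Bayesian account tested in this abstract can be sketched as a toy model (all parameter values below are hypothetical, not the study's data): the perceived action-effect interval is pulled toward a short expected interval in proportion to the prior probability of a causal link, so a weakened causal prior, as in delusions of control, produces less temporal compression, i.e. less binding.

```python
def perceived_interval(actual_ms, expected_ms, p_causal):
    """Toy Bayesian binding model: the percept mixes the actual
    action-effect interval with the interval expected under a causal
    link, weighted by the prior probability of that link."""
    return p_causal * expected_ms + (1.0 - p_causal) * actual_ms

actual, expected = 250.0, 100.0                               # ms, hypothetical
control = perceived_interval(actual, expected, p_causal=0.8)  # strong causal prior
patient = perceived_interval(actual, expected, p_causal=0.3)  # weakened prior
binding_control = actual - control                            # ~120 ms of compression
binding_patient = actual - patient                            # ~45 ms of compression
```

With these invented numbers, the strong prior compresses the interval by roughly 120 ms but the weakened prior by only about 45 ms, mirroring the reported reduction of binding with stronger delusions of control.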
2.
Cereb Cortex ; 33(8): 4319-4333, 2023 04 04.
Article in English | MEDLINE | ID: mdl-36137568

ABSTRACT

Evidence is accumulating that oral contraceptive (OC) use modulates various socio-affective behaviors, including empathic abilities. Endogenous and synthetic sex hormones, such as estrogens and progestogens, bind to receptor sites in brain regions (i.e., frontal, limbic, and cerebellar) involved in socio-affective processing. Therefore, the aim of this study was to investigate the role of OC use in empathy. In a cross-sectional functional magnetic resonance imaging study, women in different hormonal states, including OC use (n = 46) or being naturally cycling in the early follicular (fNC: n = 37) or peri-ovulatory phase (oNC: n = 28), performed a visual, sentence-based empathy task. Behaviorally, OC users had lower empathy ratings than oNC women. Congruently, whole-brain analysis revealed significantly larger task-related activation of several brain regions, including the left dorsomedial prefrontal gyrus (dmPFG), left precentral gyrus, and left temporoparietal junction, in oNC compared to OC women. In OC users, the activity of the left dmPFG and precentral gyrus was negatively associated with behavioral and self-reported affective empathy. Furthermore, empathy-related region-of-interest analysis indicated negative associations of brain activation with synthetic hormone levels in OC women. Overall, this multimodal, cross-sectional investigation of empathy suggests a role of OC intake especially in affective empathy and highlights the importance of including synthetic hormone levels in OC-related analyses.


Subject(s)
Oral Contraceptives, Empathy, Humans, Female, Magnetic Resonance Imaging, Cross-Sectional Studies, Gonadal Steroid Hormones
3.
J Neural Transm (Vienna) ; 130(4): 585-596, 2023 04.
Article in English | MEDLINE | ID: mdl-36808307

ABSTRACT

Laughter plays an important role in group formation, signaling social belongingness by indicating a positive or negative social intention towards the receiver. In adults without autism, the intention of laughter can be correctly differentiated without further contextual information. In autism spectrum disorder (ASD), however, differences in the perception and interpretation of social cues represent a key characteristic of the disorder. Studies suggest that these differences are associated with hypoactivation and altered connectivity among key nodes of the social perception network. How laughter, as a multimodal nonverbal social cue, is perceived and processed neurobiologically in association with autistic traits has not been assessed previously. We investigated differences in social intention attribution, neurobiological activation, and connectivity during audiovisual laughter perception in association with the degree of autistic traits in adults [N = 31, Mage (SD) = 30.7 (10.0) years, nfemale = 14]. An attenuated tendency to attribute positive social intention to laughter was found with increasing autistic traits. Neurobiologically, autistic trait scores were associated with decreased activation in the right inferior frontal cortex during laughter perception and with attenuated connectivity between the bilateral fusiform face area and bilateral inferior and lateral frontal, superior temporal, mid-cingulate, and inferior parietal cortices. With increasing ASD symptoms, the results support hypoactivity during social cue processing and hypoconnectivity between socioemotional face processing nodes and higher-order multimodal processing regions related to emotion identification and attribution of social intention. Furthermore, the results underscore the importance of specifically including signals of positive social intention in future studies in ASD.


Subject(s)
Autism Spectrum Disorder, Autistic Disorder, Laughter, Adult, Humans, Female, Brain Mapping/methods, Intention, Magnetic Resonance Imaging/methods, Social Perception
4.
Hum Brain Mapp ; 41(2): 353-361, 2020 02 01.
Article in English | MEDLINE | ID: mdl-31642167

ABSTRACT

Laughter is a multifaceted signal, which can convey social acceptance facilitating social bonding as well as social rejection inflicting social pain. In the current study, we addressed the neural correlates of social intent attribution to auditory or visual laughter within an fMRI study to identify brain areas showing linear increases of activation with social intent ratings. Negative social intent attributions were associated with activation increases within the medial prefrontal cortex/anterior cingulate cortex (mPFC/ACC). Interestingly, negative social intent attributions of auditory laughter were represented more rostrally than those of visual laughter within this area. Our findings corroborate the role of the mPFC/ACC as a key node for processing "social pain", with distinct modality-specific subregions. Other brain areas that showed an increase of activation included the bilateral inferior frontal gyrus and right superior/middle temporal gyrus (STG/MTG) for visually presented laughter, and the bilateral STG for auditorily presented laughter, with no overlap across modalities. Similarly, positive social intent attributions were linked to hemodynamic responses within the right inferior parietal lobe and right middle frontal gyrus, but there was no overlap of activity for visual and auditory laughter. Our findings demonstrate that social intent attribution to auditory and visual laughter is located in neighboring, but spatially distinct, neural structures.


Subject(s)
Auditory Perception/physiology, Brain Mapping, Cingulate Gyrus/physiology, Laughter, Prefrontal Cortex/physiopathology, Social Perception, Temporal Lobe/physiology, Theory of Mind/physiology, Visual Perception/physiology, Adult, Female, Cingulate Gyrus/diagnostic imaging, Humans, Intention, Magnetic Resonance Imaging, Male, Middle Aged, Prefrontal Cortex/diagnostic imaging, Temporal Lobe/diagnostic imaging, Young Adult
5.
Neuroimage ; 197: 450-456, 2019 08 15.
Article in English | MEDLINE | ID: mdl-31075391

ABSTRACT

Voices and faces are the most common sources of threat in social anxiety (SA), in which the fear of negative evaluation and social exclusion is the central element. SA itself is distributed along a spectrum in the general population, and its clinical manifestation, termed social anxiety disorder, is one of the most common anxiety disorders. While heightened cerebral responses to angry or contemptuous facial or vocal expressions are well documented, it remains unclear whether the brain of socially anxious individuals is generally more sensitive to voices and faces. Using functional magnetic resonance imaging, we investigated how SA affects the cerebral processing of voices and faces as compared to various other stimulus types in a study population with greatly varying SA (N = 50, 26 female). While cerebral voice-sensitivity correlated positively with SA in the left temporal voice area (TVA) and the left amygdala, an association of face-sensitivity and SA was observed in the right fusiform face area (FFA) and the face processing area of the right posterior superior temporal sulcus (pSTSFA). These results demonstrate that the increase of cerebral responses associated with social anxiety is not limited to facial or vocal expressions of social threat; rather, the respective sensory and emotion processing structures are also generally tuned to voices and faces.


Subject(s)
Anxiety Disorders/physiopathology, Anxiety/physiopathology, Auditory Perception/physiology, Brain/physiopathology, Visual Perception/physiology, Adult, Facial Expression, Female, Humans, Magnetic Resonance Imaging, Male, Voice, Young Adult
6.
J Neural Transm (Vienna) ; 126(9): 1175-1185, 2019 09.
Article in English | MEDLINE | ID: mdl-30498952

ABSTRACT

Attention biases towards threat signals have been linked to the etiology and symptomatology of social anxiety disorder (SAD). Dysfunction of the dorsolateral prefrontal cortex (dlPFC) may contribute to attention biases in anxious individuals. The aim of this study was to investigate the feasibility of near-infrared spectroscopy (NIRS) neurofeedback (NF) training targeting the dlPFC and its effects on threat-related attention biases of individuals with SAD. Twelve individuals with SAD participated in the NIRS-NF training, which lasted 6-8 weeks and comprised a total of 15 sessions. NF performance increased significantly, while the attention bias towards threat-related stimuli and SAD symptom severity decreased after the training. Both the individual increase in neurofeedback performance and the individual decrease in SAD symptom severity correlated with decreased responses to social threat signals in the cerebral attention system. Thus, this pilot study not only demonstrates that NIRS-based NF is feasible in SAD patients, but also suggests that it may be a promising method to investigate the causal role of the dlPFC in attention biases in SAD. Its effectiveness as a treatment tool might be examined in future studies.


Subject(s)
Attentional Bias, Facial Recognition, Fear, Neurofeedback/methods, Social Phobia/therapy, Prefrontal Cortex, Social Perception, Near-Infrared Spectroscopy, Adult, Attentional Bias/physiology, Facial Recognition/physiology, Fear/physiology, Feasibility Studies, Female, Humans, Male, Social Phobia/physiopathology, Pilot Projects, Prefrontal Cortex/physiopathology, Treatment Outcome, Young Adult
7.
Compr Psychiatry ; 88: 22-28, 2019 01.
Article in English | MEDLINE | ID: mdl-30466014

ABSTRACT

OBJECTIVE: The negative symptom domain remains a major treatment challenge. A valid self-report measure could assist clinicians and researchers in identifying patients with a relevant subjective burden. The Motivation and Pleasure - Self Report (MAP-SR) derives from the CAINS and is supposed to reflect the "amotivation" factor of negative symptoms. We evaluated different aspects of the scale's reliability and validity. This is the first factorial analysis as well as the first analysis of its test-retest reliability. METHODS: We assessed three samples of subjects with schizophrenia or schizoaffective disorder (n = 93) on a broad spectrum of related domains. RESULTS: We explored 3-, 2-, and 1-factor solutions (explaining 50.93, 44.85, and 36.18% of the variance, respectively). The factor "pleasure and hedonic activity" consists of eight items and was the most robust; the factors "social motivation" and "motivation for work" were problematic. Test-retest reliability of the scale was adequate (rS = 0.63, p = .005). Neither the MAP-SR nor the "pleasure and hedonic activities" factor is associated with the PANSS negative symptom scale. There are significant associations with the observer-rated CAINS-MAP scale, experiences of pleasure, and social cognition, but none with functional outcome. Discriminant validity could not be established with regard to depression and extrapyramidal symptoms. CONCLUSIONS: We found that the MAP-SR is adequate for assessing anhedonia but less suitable for assessing motivation. Therefore, we propose using the "pleasure and hedonic activity" scale to cover the "anhedonia" subdomain. We think the "motivation" part of the instrument requires reconstruction.


Subject(s)
Anhedonia/physiology, Motivation/physiology, Pleasure/physiology, Schizophrenia/diagnosis, Schizophrenic Psychology, Self Report/standards, Adult, Female, Humans, Male, Middle Aged, Psychiatric Status Rating Scales/standards, Reproducibility of Results, Schizophrenia/epidemiology, Self-Assessment (Psychology)
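The test-retest reliability reported above (rS = 0.63) is a Spearman rank correlation between the first and second administration of the scale. A minimal stdlib-only implementation (Pearson correlation computed on average ranks; the example scores below are invented, not the study's data):

```python
def ranks(xs):
    """Average ranks (1-based), with ties sharing their mean rank."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1                     # extend over a run of tied values
        avg_rank = (i + j) / 2 + 1
        for k in range(i, j + 1):
            r[order[k]] = avg_rank
        i = j + 1
    return r

def spearman(x, y):
    """Spearman's rho: Pearson correlation of the rank vectors."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

time1 = [42, 35, 50, 28, 47]  # invented scores, first administration
time2 = [40, 30, 49, 31, 45]  # invented scores, retest
rho = spearman(time1, time2)  # ≈ 0.9 for these invented values
```

Because only ranks enter the computation, the coefficient is robust to the ordinal (non-interval) character of self-report sum scores.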
8.
Hum Brain Mapp ; 39(8): 3419-3427, 2018 08.
Article in English | MEDLINE | ID: mdl-29682814

ABSTRACT

Major depressive disorder (MDD) is characterized by biased emotion perception. In the auditory domain, MDD patients have been shown to exhibit attenuated processing of positive emotions expressed by speech melody (prosody). So far, no neuroimaging studies examining the neural basis of altered processing of emotional prosody in MDD are available. In this study, we addressed this issue by examining the emotion bias in MDD during the evaluation of happy, neutral, and angry prosodic stimuli on a five-point Likert scale while participants underwent functional magnetic resonance imaging (fMRI). As expected, MDD patients rated happy prosody as less intense than healthy controls (HC) did. At the neural level, stronger activation in the middle superior temporal gyrus (STG) and the amygdala was found in all participants when processing emotional as compared to neutral prosody. MDD patients exhibited increased activation of the amygdala during prosody processing irrespective of valence, while no significant group differences were found for the STG, indicating that altered processing of prosodic emotions in MDD occurs in the amygdala rather than in auditory areas. Concurring with the valence-specific behavioral effect of attenuated evaluation of positive prosodic stimuli, activation within the left amygdala of MDD patients correlated with ratings of happy, but not neutral or angry, prosody. Our study provides first insights into the neural basis of the reduced experience of positive information and an abnormally increased amygdala activity during prosody processing in MDD.


Subject(s)
Brain/diagnostic imaging, Brain/physiopathology, Major Depressive Disorder/diagnostic imaging, Major Depressive Disorder/physiopathology, Emotions/physiology, Speech Perception/physiology, Adult, Brain Mapping, Major Depressive Disorder/drug therapy, Female, Humans, Judgment/physiology, Magnetic Resonance Imaging, Male, Neural Pathways/diagnostic imaging, Neural Pathways/physiopathology
9.
J Neural Transm (Vienna) ; 123(8): 937-47, 2016 08.
Article in English | MEDLINE | ID: mdl-27094176

ABSTRACT

People diagnosed with autism spectrum disorder (ASD) characteristically present with severe difficulties in interpreting everyday social signals. It is currently assumed that these difficulties have neurobiological correlates in altered activation of, and connectivity in and between, regions of the social perception network suggested to govern the processing of social cues. In this study, we conducted functional magnetic resonance imaging (fMRI)-based activation and connectivity analyses focusing on face-, voice-, and audiovisual-processing brain regions as the most important subareas of the social perception network. Results revealed alterations in connectivity among regions involved in the processing of social stimuli in ASD subjects compared to typically developed (TD) controls: specifically, reduced connectivity between the left temporal voice area (TVA) and the superior and medial frontal gyrus. These alterations in connectivity, moreover, correlated with the severity of autistic traits: correlation analysis indicated that connectivity between the left TVA and the limbic lobe, anterior cingulate, and medial frontal gyrus, as well as between the right TVA and the frontal lobe, anterior cingulate, limbic lobe, and caudate, decreased with increasing symptom severity. As these frontal regions are understood to play an important role in interpreting and mentalizing social signals, the observed underconnectivity might play a role in the social impairments seen in ASD.


Subject(s)
Autism Spectrum Disorder/pathology, Autism Spectrum Disorder/psychology, Brain Mapping, Cues (Psychology), Frontal Lobe/physiopathology, Neural Pathways/physiology, Social Perception, Adult, Autism Spectrum Disorder/diagnostic imaging, Brain Mapping/methods, Facial Expression, Female, Frontal Lobe/diagnostic imaging, Head Movements, Humans, Computer-Assisted Image Processing, Magnetic Resonance Imaging, Male, Neural Pathways/diagnostic imaging, Oxygen/blood, Physical Stimulation, Young Adult
10.
J Neural Transm (Vienna) ; 123(8): 961-70, 2016 08.
Article in English | MEDLINE | ID: mdl-26850439

ABSTRACT

This study examined the identification of emotional information in facial expression, prosody, and their combination in 23 adult patients with combined-type attention-deficit/hyperactivity disorder (ADHD) versus 31 healthy controls (HC) matched for gender, age, and education. We employed a stimulus set that was carefully balanced for valence as well as recognizability of the expressed emotions, as determined in an independent sample of HC, to avoid potential biases due to different levels of task difficulty. ADHD patients were characterized by impaired recognition of all employed categories (neutral, happiness, eroticism, disgust, anger). Basic cognitive functions assessed by neuropsychological testing, such as sustained attention, constancy of alertness, and verbal intelligence, partially explained the lower recognition rates. Removal of the correlated variance by means of regression analyses did not abolish the lower performance in ADHD, indicating deficits in social cognition independent of these neuropsychological factors (p < 0.05). Lower performance correlated with self-rated emotional intelligence (r = 0.38, p < 0.05), indicating that adults with ADHD are aware of their problems in emotion perception. ADHD patients could partly compensate for their deficit in unimodal emotion perception through audiovisual integration, as revealed by larger gains in emotion recognition accuracy during bimodal presentation (p < 0.05) compared to HC. These behavioral results can serve as a foundation for future neuroimaging studies and point towards sensory-specific regions rather than audiovisual integration areas in the perception of emotional information in adult ADHD.


Subject(s)
Attention Deficit Disorder with Hyperactivity/physiopathology, Attention Deficit Disorder with Hyperactivity/psychology, Emotions/physiology, Facial Expression, Adolescent, Adult, Case-Control Studies, Female, Humans, Male, Photic Stimulation, Psychometrics, Self Report, Social Behavior, Verbal Behavior/physiology, Young Adult
11.
BMC Psychiatry ; 16: 218, 2016 07 07.
Article in English | MEDLINE | ID: mdl-27388011

ABSTRACT

BACKGROUND: Impaired interpretation of nonverbal emotional cues in patients with schizophrenia has been reported in several studies, and a clinical relevance of these deficits for social functioning has been assumed. However, it is unclear to what extent the impairments depend on specific emotions or specific channels of nonverbal communication. METHODS: Here, the effect of cue modality and emotional category on the accuracy of emotion recognition was evaluated in 21 patients with schizophrenia and compared to a healthy control group (n = 21). To this end, dynamic stimuli comprising speakers of both genders in three different sensory modalities (auditory, visual, and audiovisual) and five emotional categories (happy, alluring, neutral, angry, and disgusted) were used. RESULTS: Patients with schizophrenia were impaired in emotion recognition in comparison to the control group across all stimuli. Considering specific emotions, the most severe deficits were revealed in the recognition of alluring stimuli and the least severe in the recognition of disgusted stimuli. Regarding cue modality, the extent of the impairment in emotion recognition did not differ significantly between auditory and visual cues across all emotional categories. However, patients with schizophrenia showed significantly more severe disturbances for vocal as compared to facial cues when sexual interest was expressed (alluring stimuli), whereas more severe disturbances for facial as compared to vocal cues were observed when happiness or anger was expressed. CONCLUSION: Our results confirm that perceptual impairments can be observed for vocal as well as facial cues conveying various social and emotional connotations. The observed differences in severity, with the most severe deficits for alluring expressions, might be related to specific difficulties in recognizing the complex social-emotional information of interpersonal intentions as compared to "basic" emotional states. Therefore, future studies evaluating the perception of nonverbal cues should consider a broader range of social and emotional signals beyond basic emotions, including attitudes and interpersonal intentions. Identifying specific domains of social perception particularly prone to misunderstandings in patients with schizophrenia might allow for a refinement of interventions aiming at improving social functioning.


Subject(s)
Emotions, Nonverbal Communication/psychology, Recognition (Psychology), Schizophrenic Psychology, Acoustic Stimulation, Adult, Aged, Case-Control Studies, Cues (Psychology), Facial Expression, Female, Humans, Male, Middle Aged, Photic Stimulation, Young Adult
12.
Cereb Cortex ; 24(6): 1460-73, 2014 Jun.
Article in English | MEDLINE | ID: mdl-23382516

ABSTRACT

Emotional information can be conveyed by verbal and nonverbal cues, with the latter often suggested to exert a greater influence in shaping our perceptions of others. The present functional magnetic resonance imaging study sought to explore attentional biases toward nonverbal signals by investigating the interaction of verbal and nonverbal cues. The results underline previous suggestions of a "nonverbal dominance" in emotion communication by evidencing implicit effects of nonverbal cues on emotion judgments even when attention is directed away from nonverbal signals and focused on verbal cues. Attentional biases toward nonverbal signals appeared to be reflected in increasing activation of the dorsolateral prefrontal cortex (DLPFC), assumed to reflect increasing difficulty in suppressing nonverbal cues during task conditions that required shifting attention away from nonverbal signals. Besides the DLPFC, the results suggest that the right amygdala plays a role in attention control mechanisms related to the processing of emotional cues. Analyses conducted to determine the cerebral correlates of the individual ability to shift attention between verbal and nonverbal sources of information indicated that higher task-switching abilities are associated with up-regulation of right amygdala activation during explicit judgments of nonverbal cues, whereas difficulties in task-switching are related to down-regulation.


Subject(s)
Attention/physiology, Auditory Perception/physiology, Brain/physiology, Emotions, Speech Perception/physiology, Visual Perception/physiology, Adult, Amygdala/physiology, Brain Mapping, Cues (Psychology), Executive Function/physiology, Facial Expression, Female, Humans, Judgment/physiology, Magnetic Resonance Imaging, Male, Neuropsychological Tests, Prefrontal Cortex/physiology
13.
Cogn Emot ; 28(3): 452-69, 2014 Apr.
Article in English | MEDLINE | ID: mdl-24151963

ABSTRACT

Results from studies on gender differences in emotion recognition vary depending on the types of emotion and the sensory modalities used for stimulus presentation, which makes comparability between studies problematic. This study investigated emotion recognition in healthy participants (N = 84; 40 males; ages 20 to 70 years) using dynamic stimuli portrayed by speakers of both genders in three different sensory modalities (auditory, visual, audio-visual) and five emotional categories. The participants were asked to categorise the stimuli on the basis of their nonverbal emotional content (happy, alluring, neutral, angry, and disgusted). Hit rates and category selection biases were analysed. Women were found to be more accurate in the recognition of emotional prosody. This effect was partially mediated by hearing loss at the frequency of 8,000 Hz. Moreover, there was a gender-specific selection bias for alluring stimuli: men, as compared to women, chose "alluring" more often when a stimulus was presented by a woman rather than by a man.


Subject(s)
Auditory Perception, Emotions, Recognition (Psychology), Sex Characteristics, Visual Perception, Adult, Affect, Aged, Arousal, Attention, Female, Humans, Male, Short-Term Memory, Middle Aged, Young Adult
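The hit rates and category selection biases analysed in this study can be derived from a confusion matrix of presented versus chosen categories; a minimal sketch (the matrix below is invented for illustration, not the study's data):

```python
def hit_rates_and_biases(confusion, categories):
    """confusion[true][chosen] = response count.
    Hit rate: proportion of correct choices per presented category.
    Selection bias: how often a category is chosen relative to how
    often it is presented (1.0 = unbiased)."""
    hit, bias = {}, {}
    for c in categories:
        presented = sum(confusion[c].values())
        chosen = sum(confusion[t][c] for t in categories)
        hit[c] = confusion[c][c] / presented
        bias[c] = chosen / presented
    return hit, bias

# Invented two-category example: each category presented 10 times.
confusion = {
    "happy":    {"happy": 8, "alluring": 2},
    "alluring": {"happy": 4, "alluring": 6},
}
hit, bias = hit_rates_and_biases(confusion, ["happy", "alluring"])
# "happy" is chosen 12 times but presented 10 times, so bias["happy"] > 1:
# a raw hit-rate comparison would overstate accuracy for "happy".
```

Separating hit rate from selection bias matters here because a response bias toward one category (e.g. men choosing "alluring" more often for female speakers) inflates its apparent hit rate without reflecting better discrimination.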
14.
Neuroimage ; 76: 45-56, 2013 Aug 01.
Article in English | MEDLINE | ID: mdl-23507387

ABSTRACT

The aim of this study was to delineate the areas along the right superior temporal sulcus (STS) involved in the processing of faces, voices, and face-voice integration using established functional magnetic resonance imaging (fMRI) localizers, and to assess their structural connectivity profile with diffusion tensor imaging (DTI). We combined this approach with an fMRI adaptation design in which the participants judged emotions in facial expressions and prosody, and demonstrated response habituation in the orbitofrontal cortex (OFC) that occurred irrespective of the sensory modality. These functional data were in line with DTI findings showing separable fiber projections of the three STS modules converging in the OFC: through the external capsule for the voice area, through the dorsal superior longitudinal fasciculus (SLF) for the face area, and through the ventral SLF for the audiovisual integration area. The OFC was structurally connected with the supplementary motor area (SMA), and activation in these two areas correlated with faster stimulus evaluation during repetition priming. Based on these structural and functional properties, we propose that the OFC is part of the extended system for the perception of emotional information in faces and voices and constitutes a neural interface linking sensory areas with brain regions implicated in the generation of behavioral responses.


Subject(s)
Auditory Perception/physiology, Brain Mapping, Cerebral Cortex/physiology, Neural Pathways/physiology, Visual Pattern Recognition/physiology, Diffusion Magnetic Resonance Imaging, Face, Female, Humans, Computer-Assisted Image Interpretation, Magnetic Resonance Imaging, Male, Voice, Young Adult
15.
Cereb Cortex ; 22(1): 191-200, 2012 Jan.
Article in English | MEDLINE | ID: mdl-21625012

ABSTRACT

We determined the location, functional response profile, and structural fiber connections of auditory areas with voice- and emotion-sensitive activity using functional magnetic resonance imaging (fMRI) and diffusion tensor imaging. Bilateral regions responding to emotional voices were consistently found in the superior temporal gyrus, posterolateral to the primary auditory cortex. Event-related fMRI showed stronger responses in these areas to voices expressing anger, sadness, joy, and relief, relative to voices with neutral prosody. Their neural responses were primarily driven by prosodic arousal, irrespective of valence. Probabilistic fiber tracking revealed direct structural connections of these "emotional voice areas" (EVA) with the ipsilateral medial geniculate body, which is the major input source of early auditory cortex, as well as with the ipsilateral inferior frontal gyrus (IFG) and inferior parietal lobe (IPL). In addition, vocal emotions (compared with neutral prosody) increased the functional coupling of the EVA with the ipsilateral IFG but not the IPL. These results provide new insights into the neural architecture of the human voice processing system and support a crucial involvement of the IFG in the recognition of vocal emotions, whereas the IPL may subserve distinct auditory spatial functions, consistent with distinct anatomical substrates for the processing of "what" and "where" information within the auditory pathways.


Subject(s)
Auditory Pathways/blood supply, Brain Mapping, Brain/blood supply, Brain/physiology, Emotions/physiology, Voice/physiology, Acoustic Stimulation, Adult, Analysis of Variance, Arousal, Auditory Pathways/physiology, Auditory Perception/physiology, Diffusion Tensor Imaging, Female, Humans, Computer-Assisted Image Processing, Magnetic Resonance Imaging, Male, Nerve Fibers/physiology, Oxygen/blood, Young Adult
16.
Cogn Emot ; 27(5): 783-99, 2013.
Article in English | MEDLINE | ID: mdl-23134564

ABSTRACT

Emotional communication uses verbal and nonverbal means. In the case of conflicting signals, nonverbal information is assumed to have the stronger impact. It is unclear, however, whether perceptual nonverbal dominance varies between individuals and whether it is linked to emotional intelligence. Using audiovisual stimulus material comprising verbal and nonverbal emotional cues that were varied independently, perceptual nonverbal dominance profiles and their relation to emotional intelligence were examined. Nonverbal dominance was found in every participant, ranging from 55% to 100%. Moreover, emotional intelligence, particularly the ability to understand emotions, correlated positively with nonverbal dominance. Furthermore, higher overall emotional intelligence as well as a higher ability to understand emotions were linked to smaller reaction time differences between emotionally incongruent and congruent stimuli. The association between perceptual nonverbal dominance and emotional intelligence, and more specifically the ability to understand emotions, might reflect an adaptive process driven by the experience of higher authenticity in nonverbal cues.


Subject(s)
Emotional Intelligence, Nonverbal Communication/psychology, Social Perception, Adult, Cues (Psychology), Facial Expression, Female, Humans, Male, Reaction Time
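A per-participant nonverbal dominance score of the kind reported in this abstract (55 to 100%) can be read as the percentage of incongruent trials on which the emotion judgment follows the nonverbal rather than the verbal channel. A hypothetical sketch (trial tuples and emotion labels invented for illustration):

```python
def nonverbal_dominance(trials):
    """trials: list of (verbal_emotion, nonverbal_emotion, judged_emotion)
    tuples, restricted to incongruent trials (verbal != nonverbal).
    Returns the percentage of judgments that follow the nonverbal channel."""
    follows_nonverbal = sum(1 for verbal, nonverbal, judged in trials
                            if judged == nonverbal)
    return 100.0 * follows_nonverbal / len(trials)

# Invented incongruent trials: the judgment follows the nonverbal cue
# in 3 of 4 cases, giving a dominance score of 75%.
trials = [
    ("happy", "angry", "angry"),
    ("angry", "happy", "happy"),
    ("happy", "angry", "happy"),
    ("neutral", "happy", "happy"),
]
score = nonverbal_dominance(trials)  # 75.0
```

A score of 50% would indicate no channel preference; the reported floor of 55% means even the least nonverbally dominant participant leaned toward the nonverbal cue.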
17.
Front Psychiatry ; 14: 1151665, 2023.
Article in English | MEDLINE | ID: mdl-37168084

ABSTRACT

Introduction: Deficits in emotional perception are common in autistic people, but it remains unclear to what extent these perceptual impairments are linked to specific sensory modalities, specific emotions, or multisensory facilitation. Methods: This study aimed to investigate uni- and bimodal perception of emotional cues as well as multisensory facilitation in autistic (n = 18, mean age: 36.72 years, SD: 11.36) compared to non-autistic (n = 18, mean age: 36.41 years, SD: 12.18) people using auditory, visual, and audiovisual stimuli. Results: Lower identification accuracy and longer response times were revealed in high-functioning autistic people. These differences were independent of modality and emotion and showed large effect sizes (Cohen's d = 0.8-1.2). Furthermore, multisensory facilitation of response time was observed in non-autistic people but was absent in autistic people, whereas no group differences were found in multisensory facilitation of accuracy. Discussion: These findings suggest that the auditory and visual components of audiovisual stimuli are processed more separately in autistic individuals (with temporal demands equivalent to those required for processing the respective unimodal cues), but still with a similar relative improvement in accuracy, whereas earlier integrative multimodal merging of stimulus properties seems to occur in non-autistic individuals.
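The effect sizes reported above (Cohen's d 0.8-1.2) are standardized mean differences; a minimal sketch using the pooled standard deviation (the sample values below are illustrative, not the study's data):

```python
from statistics import mean, variance

def cohens_d(group_a, group_b):
    """Cohen's d: difference of group means divided by the pooled
    (Bessel-corrected) standard deviation of the two groups."""
    na, nb = len(group_a), len(group_b)
    pooled_var = ((na - 1) * variance(group_a)
                  + (nb - 1) * variance(group_b)) / (na + nb - 2)
    return (mean(group_a) - mean(group_b)) / pooled_var ** 0.5

# e.g. response times (s) in two groups, invented numbers
autistic = [2.1, 2.4, 2.7, 2.9, 3.1]
non_autistic = [1.8, 2.0, 2.2, 2.3, 2.5]
d = cohens_d(autistic, non_autistic)  # a large effect (> 0.8)
```

By convention, d around 0.2 is considered small, 0.5 medium, and 0.8 or above large, so the reported range of 0.8-1.2 corresponds to uniformly large group differences.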

18.
Neuroimage ; 61(3): 738-47, 2012 Jul 02.
Article in English | MEDLINE | ID: mdl-22516367

ABSTRACT

Emotional communication is essential for successful social interactions. Emotional information can be expressed at verbal and nonverbal levels. If the verbal message contradicts the nonverbal expression, the nonverbal information is usually perceived as more authentic, revealing the "true feelings" of the speaker. The present fMRI study investigated the cerebral integration of verbal (sentences expressing the emotional state of the speaker) and nonverbal (facial expressions and tone of voice) emotional signals using ecologically valid audiovisual stimulus material. More specifically, cerebral activation associated with the relative impact of nonverbal information on judging the affective state of a speaker (individual nonverbal dominance index, INDI) was investigated. Perception of nonverbally expressed emotions was associated with bilateral activation within the amygdala, fusiform face area (FFA), temporal voice area (TVA), and posterior temporal cortex, as well as in the midbrain and left inferior orbitofrontal cortex (OFC)/left insula. Verbally conveyed emotions were linked to increased responses bilaterally in the TVA. Furthermore, the INDI correlated with responses in the left amygdala elicited by both nonverbal and verbal emotional stimuli. The INDI also correlated with activation within the medial OFC during the processing of communicative signals. These results suggest that individuals with a higher degree of nonverbal dominance have an increased sensitivity not only to nonverbal stimuli but to emotional stimuli in general.


Subject(s)
Brain/physiology , Cues , Dominance, Cerebral/physiology , Emotions/physiology , Acoustic Stimulation , Adult , Amygdala/physiology , Brain Mapping , Cerebrovascular Circulation/physiology , Communication , Expressed Emotion , Facial Expression , Female , Humans , Image Processing, Computer-Assisted , Magnetic Resonance Imaging , Male , Photic Stimulation , Prefrontal Cortex/physiology , Speech , Temporal Lobe/physiology , Voice/physiology , Young Adult
19.
Sci Rep ; 12(1): 5613, 2022 04 04.
Article in English | MEDLINE | ID: mdl-35379847

ABSTRACT

It has been shown that the acoustical signal of posed laughter can convey affective information to the listener. However, because posed and spontaneous laughter differ in a number of significant aspects, it is unclear whether affective communication generalises to spontaneous laughter. To answer this question, we created a stimulus set of 381 spontaneous laughter audio recordings, produced by 51 different speakers, representing different types of laughter. In Experiment 1, 159 participants were presented with these audio recordings without any further information about the situational context of the speakers and asked to classify the laughter sounds. Results showed that joyful, tickling, and schadenfreude laughter could be classified significantly above chance level. In Experiment 2, 209 participants were presented with a subset of 121 laughter recordings correctly classified in Experiment 1 and asked to rate the laughter on four emotional dimensions, i.e., arousal, dominance, sender's valence, and receiver-directed valence. Results showed that laughter types differed significantly in their ratings on all dimensions. Joyful laughter and tickling laughter both showed a positive sender's valence and receiver-directed valence, with tickling laughter showing particularly high arousal. Schadenfreude laughter had a negative receiver-directed valence and high dominance, thus providing empirical evidence for the existence of a dark side in spontaneous laughter. The present results suggest that, with the evolution of human social communication, laughter diversified from the former play signal of non-human primates into a much more fine-grained signal that can serve a multitude of social functions in order to regulate group structure and hierarchy.


Subject(s)
Laughter , Voice , Animals , Arousal , Emotions/physiology , Humans , Laughter/physiology , Laughter/psychology , Sensation
20.
Sci Rep ; 12(1): 7117, 2022 05 03.
Article in English | MEDLINE | ID: mdl-35505233

ABSTRACT

Human nonverbal social signals are transmitted to a large extent by vocal and facial cues. The prominent importance of these cues is reflected in specialized cerebral regions which preferentially respond to these stimuli, e.g., the temporal voice area (TVA) for human voices and the fusiform face area (FFA) for human faces. However, it has remained unknown whether corresponding specializations exist during resting state, i.e., in the absence of any cues, and if so, whether these representations share neural substrates across sensory modalities. In the present study, resting state functional connectivity (RSFC) as well as voice- and face-preferential activations were analysed from functional magnetic resonance imaging (fMRI) data sets of 60 healthy individuals. Data analysis comprised seed-based analyses using the TVA and FFA as regions of interest (ROIs) as well as multi-voxel pattern analyses (MVPA). Using the face- and voice-preferential responses of the FFA and TVA as regressors, we identified several correlated clusters during resting state, spread across frontal, temporal, parietal and occipital regions. Using these regions as seeds, characteristic and distinct network patterns were apparent, with a predominantly convergent pattern for the bilateral TVAs, whereas a largely divergent pattern was observed for the bilateral FFAs. One region in the anterior medial frontal cortex displayed a maximum of supramodal convergence of informative connectivity patterns reflecting the voice- and face-preferential responses of both TVAs and the right FFA, pointing to shared neural resources in supramodal voice and face processing. The association of individual voice- and face-preferential neural activity with resting state connectivity patterns may support the perspective of a network function of the brain beyond the activation of specialized regions.
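The seed-based analysis described above can be illustrated in principle with a minimal numpy sketch: take the mean resting-state time course of a seed ROI and correlate it with every voxel's time course to obtain a connectivity map. The data here are simulated noise and the seed definition is hypothetical; this is not the study's pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated resting-state data: 200 time points x 500 voxels
n_timepoints, n_voxels = 200, 500
data = rng.standard_normal((n_timepoints, n_voxels))

# Hypothetical seed ROI (e.g., a TVA-like cluster): voxels 0-9
seed_ts = data[:, :10].mean(axis=1)

# Seed-based connectivity: Pearson correlation of the seed time
# course with every voxel's time course (via z-scored dot product)
data_z = (data - data.mean(axis=0)) / data.std(axis=0)
seed_z = (seed_ts - seed_ts.mean()) / seed_ts.std()
connectivity_map = data_z.T @ seed_z / n_timepoints  # one r per voxel

print(connectivity_map.shape)
```

In a real analysis the voxels would come from masked, preprocessed fMRI images and the map would be thresholded for significance; the algebra, however, is just this per-voxel Pearson correlation.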


Subject(s)
Facial Recognition , Voice , Brain/physiology , Brain Mapping , Facial Recognition/physiology , Humans , Magnetic Resonance Imaging