Results 1 - 20 of 45
1.
Mol Psychiatry ; 28(3): 1079-1089, 2023 03.
Article in English | MEDLINE | ID: mdl-36653677

ABSTRACT

There is limited convergence in neuroimaging investigations into volumes of subcortical brain regions in social anxiety disorder (SAD). The inconsistent findings may arise from variations in methodological approaches across studies, including sample selection based on age and clinical characteristics. The ENIGMA-Anxiety Working Group initiated a global mega-analysis to determine whether differences in subcortical volumes can be detected in adults and adolescents with SAD relative to healthy controls. Volumetric data from 37 international samples with 1115 SAD patients and 2775 controls were obtained from ENIGMA-standardized protocols for image segmentation and quality assurance. Linear mixed-effects analyses were adjusted for comparisons across seven subcortical regions in each hemisphere using family-wise error (FWE)-correction. Mixed-effects d effect sizes were calculated. In the full sample, SAD patients showed smaller bilateral putamen volume than controls (left: d = -0.077, pFWE = 0.037; right: d = -0.104, pFWE = 0.001), and a significant interaction between SAD and age was found for the left putamen (r = -0.034, pFWE = 0.045). Smaller bilateral putamen volumes (left: d = -0.141, pFWE < 0.001; right: d = -0.158, pFWE < 0.001) and larger bilateral pallidum volumes (left: d = 0.129, pFWE = 0.006; right: d = 0.099, pFWE = 0.046) were detected in adult SAD patients relative to controls, but no volumetric differences were apparent in adolescent SAD patients relative to controls. Comorbid anxiety disorders and age of SAD onset were additional determinants of SAD-related volumetric differences in subcortical regions. To conclude, subtle volumetric alterations in subcortical regions in SAD were detected. Heterogeneity in age and clinical characteristics may partly explain inconsistencies in previous findings. The association between alterations in subcortical volumes and SAD illness progression deserves further investigation, especially from adolescence into adulthood.
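As an illustration of the kind of model described above (a sketch, not the ENIGMA-Anxiety group's actual pipeline), the code below fits a per-region linear mixed-effects model with scanning site as a random intercept, applies a Bonferroni-style family-wise error correction across the 14 region-by-hemisphere tests, and derives a crude d-type effect size. All file and column names are assumptions.

```python
# Hedged sketch: per-region mixed-effects group comparison of subcortical volume
# (SAD vs. controls), site as random intercept, FWE correction across 14 tests.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

def test_region(df: pd.DataFrame, volume_col: str, n_tests: int = 14):
    """Mixed model: volume ~ diagnosis + age + sex + ICV, random intercept per site."""
    model = smf.mixedlm(
        f"{volume_col} ~ diagnosis + age + sex + icv",  # columns are illustrative
        data=df,
        groups=df["site"],
    )
    fit = model.fit(reml=True)
    beta = fit.params["diagnosis"]                        # group difference estimate
    p_fwe = min(fit.pvalues["diagnosis"] * n_tests, 1.0)  # Bonferroni-style FWE
    d = beta / np.sqrt(fit.scale)                         # crude d-type effect size
    return beta, d, p_fwe

# usage with a hypothetical table of segmented volumes
# df = pd.read_csv("subcortical_volumes.csv")  # diagnosis coded 0 = control, 1 = SAD
# print(test_region(df, "putamen_left"))
```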


Subject(s)
Phobia, Social , Adult , Adolescent , Humans , Magnetic Resonance Imaging/methods , Brain , Anxiety , Neuroimaging/methods
2.
J Neural Transm (Vienna) ; 130(4): 585-596, 2023 04.
Article in English | MEDLINE | ID: mdl-36808307

ABSTRACT

Laughter plays an important role in group formation, signaling social belongingness by indicating a positive or negative social intention towards the receiver. In adults without autism, the intention of laughter can be correctly differentiated without further contextual information. In autism spectrum disorder (ASD), however, differences in the perception and interpretation of social cues represent a key characteristic of the disorder. Studies suggest that these differences are associated with hypoactivation and altered connectivity among key nodes of the social perception network. How laughter, as a multimodal nonverbal social cue, is perceived and processed neurobiologically in association with autistic traits has not been assessed previously. We investigated differences in social intention attribution, neurobiological activation, and connectivity during audiovisual laughter perception in association with the degree of autistic traits in adults [N = 31, mean age (SD) = 30.7 (10.0) years, 14 female]. An attenuated tendency to attribute positive social intention to laughter was found with increasing autistic traits. Neurobiologically, autistic trait scores were associated with decreased activation in the right inferior frontal cortex during laughter perception and with attenuated connectivity between the bilateral fusiform face area and bilateral inferior and lateral frontal, superior temporal, mid-cingulate and inferior parietal cortices. With increasing autistic traits, results thus support hypoactivity during social cue processing and hypoconnectivity between socioemotional face processing nodes and higher-order multimodal processing regions related to emotion identification and attribution of social intention. Furthermore, results reflect the importance of specifically including signals of positive social intention in future studies in ASD.


Subject(s)
Autism Spectrum Disorder , Autistic Disorder , Laughter , Adult , Humans , Female , Brain Mapping/methods , Intention , Magnetic Resonance Imaging/methods , Social Perception
3.
J Psychiatry Neurosci ; 46(6): E663-E674, 2021.
Article in English | MEDLINE | ID: mdl-34916236

ABSTRACT

BACKGROUND: Social anxiety disorder is characterized by intense fear and avoidance of social interactions and scrutiny by others. Although alterations in attentional control seem to play a central role in the psychopathology of social anxiety disorder, the neural underpinnings in prefrontal brain regions have not yet been fully clarified. METHODS: The present study used functional MRI in participants (age 18-50 yr) with social anxiety disorder (n = 42, 31 female) and without (n = 58, 33 female). It investigated the interrelation of the effects of social anxiety disorder and early-life adversity (a main environmental risk factor of social anxiety disorder) on brain activity during an attentional control task. We applied DNA methylation analysis to determine whether epigenetic modulation in the gene encoding the glucocorticoid receptor, NR3C1, might play a mediating role in this process. RESULTS: We identified 2 brain regions in the left and medial prefrontal cortex that exhibited an interaction effect of social anxiety disorder and early-life adversity. In participants with low levels of early-life adversity, neural activity in response to disorder-related stimuli was increased in association with social anxiety disorder. In participants with high levels of early-life adversity, neural activity was increased only in participants without social anxiety disorder. NR3C1 DNA methylation partly mediated the effect of social anxiety disorder on brain activity as a function of early-life adversity. LIMITATIONS: The absence of behavioural correlates associated with social anxiety disorder limited functional interpretation of the results. CONCLUSION: These findings demonstrate that the neurobiological processes that underlie social anxiety disorder might be fundamentally different depending on experiences of early-life adversity. Long-lasting effects of early-life adversity might be encoded in NR3C1 DNA methylation and entail alterations in social anxiety disorder-related activity patterns in the neural network of attentional control.
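A minimal sketch of the mediation logic referred to above (methylation as a mediator of the diagnosis effect on ROI activation), assuming a simple per-participant table; the authors' actual moderated-mediation model is not reproduced here, and adversity-dependent effects could be examined by stratifying the sample or adding interaction terms. Variable names are assumptions.

```python
# Hedged sketch: indirect effect a*b (diagnosis -> methylation -> activation)
# with a percentile bootstrap confidence interval.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

def indirect_effect(df: pd.DataFrame) -> float:
    # path a: diagnosis -> methylation; path b: methylation -> activation (adjusted for diagnosis)
    a = smf.ols("methylation ~ diagnosis", data=df).fit().params["diagnosis"]
    b = smf.ols("activation ~ methylation + diagnosis", data=df).fit().params["methylation"]
    return a * b

def bootstrap_ci(df: pd.DataFrame, n_boot: int = 5000, seed: int = 0):
    rng = np.random.default_rng(seed)
    n = len(df)
    est = []
    for _ in range(n_boot):
        idx = rng.integers(0, n, size=n)          # resample participants with replacement
        est.append(indirect_effect(df.iloc[idx]))
    return np.percentile(est, [2.5, 97.5])

# df = pd.read_csv("mediation_table.csv")  # hypothetical: diagnosis (0/1), methylation, activation
# print(indirect_effect(df), bootstrap_ci(df))
```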


Subject(s)
Adverse Childhood Experiences , Phobia, Social , Adolescent , Adult , Anxiety , Brain/diagnostic imaging , DNA Methylation , Female , Humans , Male , Middle Aged , Phobia, Social/diagnostic imaging , Young Adult
4.
Hum Brain Mapp ; 41(2): 353-361, 2020 02 01.
Article in English | MEDLINE | ID: mdl-31642167

ABSTRACT

Laughter is a multifaceted signal, which can convey social acceptance facilitating social bonding as well as social rejection inflicting social pain. In the current fMRI study, we addressed the neural correlates of social intent attribution to auditory or visual laughter, identifying brain areas showing linear increases of activation with social intent ratings. Negative social intent attributions were associated with activation increases within the medial prefrontal cortex/anterior cingulate cortex (mPFC/ACC). Interestingly, negative social intent attributions of auditory laughter were represented more rostrally than those of visual laughter within this area. Our findings corroborate the role of the mPFC/ACC as a key node for processing "social pain" with distinct modality-specific subregions. Other brain areas that showed an increase of activation included the bilateral inferior frontal gyrus and right superior/middle temporal gyrus (STG/MTG) for visually presented laughter and the bilateral STG for auditorily presented laughter, with no overlap across modalities. Similarly, positive social intent attributions were linked to hemodynamic responses within the right inferior parietal lobe and right middle frontal gyrus, but there was no overlap of activity for visual and auditory laughter. Our findings demonstrate that social intent attribution to auditory and visual laughter is located in neighboring, but spatially distinct neural structures.
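A hedged sketch of how a parametric-modulation first-level analysis of this kind is commonly set up in nilearn; the study's exact model specification is not given here, and the file name, event table and rating column are hypothetical. Each laughter event contributes a main regressor plus a regressor weighted by the trial-wise social-intent rating (mean-centred), so the contrast on the modulated regressor identifies voxels whose response increases linearly with the rating.

```python
# Hedged sketch of a parametric-modulation GLM with nilearn (names are assumptions).
import pandas as pd
from nilearn.glm.first_level import FirstLevelModel

events = pd.read_csv("laughter_events.csv")    # hypothetical columns: onset, duration, rating
events["rating"] -= events["rating"].mean()    # mean-centre the parametric modulator

main = events.assign(trial_type="laughter", modulation=1.0)
modulated = events.assign(trial_type="laughter_x_rating", modulation=events["rating"])
design_events = pd.concat([main, modulated])[["onset", "duration", "trial_type", "modulation"]]

glm = FirstLevelModel(t_r=2.0, hrf_model="spm", smoothing_fwhm=6)
glm = glm.fit("sub-01_task-laughter_bold.nii.gz", events=design_events)
zmap = glm.compute_contrast("laughter_x_rating", output_type="z_score")
```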


Subject(s)
Auditory Perception/physiology , Brain Mapping , Gyrus Cinguli/physiology , Laughter , Prefrontal Cortex/physiopathology , Social Perception , Temporal Lobe/physiology , Theory of Mind/physiology , Visual Perception/physiology , Adult , Female , Gyrus Cinguli/diagnostic imaging , Humans , Intention , Magnetic Resonance Imaging , Male , Middle Aged , Prefrontal Cortex/diagnostic imaging , Temporal Lobe/diagnostic imaging , Young Adult
5.
Neuroimage ; 197: 450-456, 2019 08 15.
Article in English | MEDLINE | ID: mdl-31075391

ABSTRACT

Voices and faces are the most common sources of threat in social anxiety (SA), where the fear of negative evaluation and social exclusion is the central element. SA is distributed along a spectrum in the general population, and its clinical manifestation, social anxiety disorder, is one of the most common anxiety disorders. While heightened cerebral responses to angry or contemptuous facial or vocal expressions are well documented, it remains unclear whether the brain of socially anxious individuals is generally more sensitive to voices and faces. Using functional magnetic resonance imaging, we investigated how SA affects the cerebral processing of voices and faces as compared to various other stimulus types in a study population with greatly varying SA (N = 50, 26 female). While cerebral voice-sensitivity correlated positively with SA in the left temporal voice area (TVA) and the left amygdala, an association of face-sensitivity and SA was observed in the right fusiform face area (FFA) and the face processing area of the right posterior superior temporal sulcus (pSTS-FA). These results demonstrate that the increase of cerebral responses associated with social anxiety is not limited to facial or vocal expressions of social threat but that the respective sensory and emotion processing structures are also generally tuned to voices and faces.
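At its core, the across-participant brain-behaviour association described above reduces to correlating an ROI's sensitivity contrast with a questionnaire score. A minimal sketch under assumed file and column names:

```python
# Minimal sketch, not the published analysis: correlate a TVA voice-sensitivity
# contrast (voices > other sounds) with the social anxiety score across participants.
import pandas as pd
from scipy.stats import pearsonr

df = pd.read_csv("roi_contrasts.csv")   # hypothetical columns: subject, tva_voice_contrast, sa_score
r, p = pearsonr(df["tva_voice_contrast"], df["sa_score"])
print(f"TVA voice sensitivity vs. social anxiety: r = {r:.2f}, p = {p:.3f}")
```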


Subject(s)
Anxiety Disorders/physiopathology , Anxiety/physiopathology , Auditory Perception/physiology , Brain/physiopathology , Visual Perception/physiology , Adult , Facial Expression , Female , Humans , Magnetic Resonance Imaging , Male , Voice , Young Adult
6.
J Neural Transm (Vienna) ; 126(9): 1175-1185, 2019 09.
Article in English | MEDLINE | ID: mdl-30498952

ABSTRACT

Attention biases towards threat signals have been linked to the etiology and symptomatology of social anxiety disorder (SAD). Dysfunction of the dorsolateral prefrontal cortex (dlPFC) may contribute to attention biases in anxious individuals. The aim of this study was to investigate the feasibility of near-infrared spectroscopy (NIRS) neurofeedback (NF) training targeting the dlPFC and its effects on threat-related attention biases of individuals with SAD. Twelve individuals with SAD participated in the NIRS-NF training, which lasted 6-8 weeks and comprised a total of 15 sessions. NF performance increased significantly, while the attention bias towards threat-related stimuli and SAD symptom severity decreased after the training. Both the individual increase in neurofeedback performance and the individual decrease in SAD symptom severity were correlated with decreased responses to social threat signals in the cerebral attention system. Thus, this pilot study not only demonstrates that NIRS-based NF is feasible in SAD patients but also suggests that it may be a promising method to investigate the causal role of the dlPFC in attention biases in SAD. Its effectiveness as a treatment tool might be examined in future studies.


Subject(s)
Attentional Bias , Facial Recognition , Fear , Neurofeedback/methods , Phobia, Social/therapy , Prefrontal Cortex , Social Perception , Spectroscopy, Near-Infrared , Adult , Attentional Bias/physiology , Facial Recognition/physiology , Fear/physiology , Feasibility Studies , Female , Humans , Male , Phobia, Social/physiopathology , Pilot Projects , Prefrontal Cortex/physiopathology , Treatment Outcome , Young Adult
7.
J Neural Transm (Vienna) ; 123(8): 937-47, 2016 08.
Article in English | MEDLINE | ID: mdl-27094176

ABSTRACT

People diagnosed with autism spectrum disorder (ASD) characteristically present with severe difficulties in interpreting every-day social signals. These difficulties are currently assumed to have neurobiological correlates in altered activation of, and connectivity between, regions of the social perception network thought to govern the processing of social cues. In this study, we conducted functional magnetic resonance imaging (fMRI)-based activation and connectivity analyses focusing on face-, voice-, and audiovisual-processing brain regions as the most important subareas of the social perception network. Results revealed alterations in connectivity among regions involved in the processing of social stimuli in ASD subjects compared to typically developed (TD) controls, specifically a reduced connectivity between the left temporal voice area (TVA) and the superior and medial frontal gyrus. Alterations in connectivity, moreover, were correlated with the severity of autistic traits: correlation analysis indicated that the connectivity between the left TVA and the limbic lobe, anterior cingulate and the medial frontal gyrus, as well as between the right TVA and the frontal lobe, anterior cingulate, limbic lobe and the caudate, decreased with increasing symptom severity. As these frontal regions are understood to play an important role in interpreting and mentalizing social signals, the observed underconnectivity might be construed as playing a role in social impairments in ASD.
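A hedged sketch of a standard seed-to-voxel connectivity analysis of the kind referenced above, following common nilearn practice rather than the authors' exact pipeline; the seed coordinate, sphere radius and file names are illustrative and not taken from the paper.

```python
# Hedged sketch: seed-to-voxel connectivity map for a left TVA seed (illustrative MNI coordinate).
from nilearn.maskers import NiftiSpheresMasker, NiftiMasker

func_img = "sub-01_task-social_bold.nii.gz"            # hypothetical preprocessed run
seed_masker = NiftiSpheresMasker(seeds=[(-58, -20, 2)], radius=8, standardize=True)
brain_masker = NiftiMasker(standardize=True)

seed_ts = seed_masker.fit_transform(func_img)           # shape (n_scans, 1)
brain_ts = brain_masker.fit_transform(func_img)         # shape (n_scans, n_voxels)

# voxelwise Pearson correlation with the seed time course (z-scored series)
corr = brain_ts.T @ seed_ts / seed_ts.shape[0]
conn_map = brain_masker.inverse_transform(corr.T)
conn_map.to_filename("sub-01_TVA_connectivity.nii.gz")
```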


Subject(s)
Autism Spectrum Disorder/pathology , Autism Spectrum Disorder/psychology , Brain Mapping , Cues , Frontal Lobe/physiopathology , Neural Pathways/physiology , Social Perception , Adult , Autism Spectrum Disorder/diagnostic imaging , Brain Mapping/methods , Facial Expression , Female , Frontal Lobe/diagnostic imaging , Head Movements , Humans , Image Processing, Computer-Assisted , Magnetic Resonance Imaging , Male , Neural Pathways/diagnostic imaging , Oxygen/blood , Physical Stimulation , Young Adult
8.
J Neural Transm (Vienna) ; 123(8): 961-70, 2016 08.
Article in English | MEDLINE | ID: mdl-26850439

ABSTRACT

This study examined identification of emotional information in facial expression, prosody, and their combination in 23 adult patients with combined attention deficit-hyperactivity disorder (ADHD) versus 31 healthy controls (HC) matched for gender, age, and education. We employed a stimulus set which was carefully balanced for valence as well as recognizability of the expressed emotions, as determined in an independent sample of HC, to avoid potential biases due to different levels of task difficulty. ADHD patients were characterized by impaired recognition of all employed categories (neutral, happiness, eroticism, disgust, anger). Basic cognitive functions as assessed by neuropsychological testing, such as sustained attention, constancy of alertness, and verbal intelligence, partially explained lower recognition rates. Removal of the correlated variance by means of regression analyses did not abolish lower performance in ADHD, indicating deficits in social cognition independent of these neuropsychological factors (p < 0.05). Lower performance correlated with self-rated emotional intelligence (r = 0.38, p < 0.05), indicating that adults with ADHD are aware of their problems in emotion perception. ADHD patients could partly compensate for their deficit in unimodal emotion perception through audiovisual integration, as revealed by larger gains in emotion recognition accuracy during bimodal presentation (p < 0.05) compared to HC. These behavioral results can serve as a foundation for future neuroimaging studies and point towards sensory-specific regions rather than audiovisual integration areas in the perception of emotional information in adult ADHD.
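The covariate-adjustment step described above (testing whether the group effect on recognition accuracy survives removal of variance shared with basic cognitive measures) can be sketched as two nested OLS models; variable names are assumptions, not the authors' data files.

```python
# Minimal sketch: group difference in emotion-recognition accuracy, unadjusted
# and adjusted for neuropsychological covariates.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("emotion_recognition.csv")  # hypothetical: group (0 = HC, 1 = ADHD), accuracy,
                                             # sustained_attention, alertness, verbal_iq
unadjusted = smf.ols("accuracy ~ group", data=df).fit()
adjusted = smf.ols("accuracy ~ group + sustained_attention + alertness + verbal_iq",
                   data=df).fit()
print(unadjusted.params["group"], adjusted.params["group"], adjusted.pvalues["group"])
```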


Subject(s)
Attention Deficit Disorder with Hyperactivity/physiopathology , Attention Deficit Disorder with Hyperactivity/psychology , Emotions/physiology , Facial Expression , Adolescent , Adult , Case-Control Studies , Female , Humans , Male , Photic Stimulation , Psychometrics , Self Report , Social Behavior , Verbal Behavior/physiology , Young Adult
9.
Cogn Emot ; 28(3): 452-69, 2014 Apr.
Article in English | MEDLINE | ID: mdl-24151963

ABSTRACT

Results from studies on gender differences in emotion recognition vary, depending on the types of emotion and the sensory modalities used for stimulus presentation. This makes comparability between different studies problematic. This study investigated emotion recognition of healthy participants (N = 84; 40 males; ages 20 to 70 years), using dynamic stimuli, displayed by two genders in three different sensory modalities (auditory, visual, audio-visual) and five emotional categories. The participants were asked to categorise the stimuli on the basis of their nonverbal emotional content (happy, alluring, neutral, angry, and disgusted). Hit rates and category selection biases were analysed. Women were found to be more accurate in recognition of emotional prosody. This effect was partially mediated by hearing loss for the frequency of 8,000 Hz. Moreover, there was a gender-specific selection bias for alluring stimuli: Men, as compared to women, chose "alluring" more often when a stimulus was presented by a woman as compared to a man.


Subject(s)
Auditory Perception , Emotions , Recognition, Psychology , Sex Characteristics , Visual Perception , Adult , Affect , Aged , Arousal , Attention , Female , Humans , Male , Memory, Short-Term , Middle Aged , Young Adult
10.
Neuroimage ; 76: 45-56, 2013 Aug 01.
Article in English | MEDLINE | ID: mdl-23507387

ABSTRACT

It was the aim of this study to delineate the areas along the right superior temporal sulcus (STS) for processing of faces, voices, and face-voice integration using established functional magnetic resonance imaging (fMRI) localizers and to assess their structural connectivity profile with diffusion tensor imaging (DTI). We combined this approach with an fMRI adaptation design during which the participants judged emotions in facial expressions and prosody and demonstrated response habituation in the orbitofrontal cortex (OFC) which occurred irrespective of the sensory modality. These functional data were in line with DTI findings showing separable fiber projections of the three different STS modules converging in the OFC which run through the external capsule for the voice area, through the dorsal superior longitudinal fasciculus (SLF) for the face area and through the ventral SLF for the audiovisual integration area. The OFC was structurally connected with the supplementary motor area (SMA) and activation in these two areas was correlated with faster stimulus evaluation during repetition priming. Based on these structural and functional properties, we propose that the OFC is part of the extended system for perception of emotional information in faces and voices and constitutes a neural interface linking sensory areas with brain regions implicated in generation of behavioral responses.


Subject(s)
Auditory Perception/physiology , Brain Mapping , Cerebral Cortex/physiology , Neural Pathways/physiology , Pattern Recognition, Visual/physiology , Diffusion Magnetic Resonance Imaging , Face , Female , Humans , Image Interpretation, Computer-Assisted , Magnetic Resonance Imaging , Male , Voice , Young Adult
11.
Cereb Cortex ; 22(1): 191-200, 2012 Jan.
Article in English | MEDLINE | ID: mdl-21625012

ABSTRACT

We determined the location, functional response profile, and structural fiber connections of auditory areas with voice- and emotion-sensitive activity using functional magnetic resonance imaging (fMRI) and diffusion tensor imaging. Bilateral regions responding to emotional voices were consistently found in the superior temporal gyrus, posterolateral to the primary auditory cortex. Event-related fMRI showed stronger responses in these areas to voices expressing anger, sadness, joy, and relief, relative to voices with neutral prosody. Their neural responses were primarily driven by prosodic arousal, irrespective of valence. Probabilistic fiber tracking revealed direct structural connections of these "emotional voice areas" (EVA) with the ipsilateral medial geniculate body, which is the major input source of early auditory cortex, as well as with the ipsilateral inferior frontal gyrus (IFG) and inferior parietal lobe (IPL). In addition, vocal emotions (compared with neutral prosody) increased the functional coupling of EVA with the ipsilateral IFG but not IPL. These results provide new insights into the neural architecture of the human voice processing system and support a crucial involvement of IFG in the recognition of vocal emotions, whereas IPL may subserve distinct auditory spatial functions, consistent with distinct anatomical substrates for the processing of "how" and "where" information within the auditory pathways.


Subject(s)
Auditory Pathways/blood supply , Brain Mapping , Brain/blood supply , Brain/physiology , Emotions/physiology , Voice/physiology , Acoustic Stimulation , Adult , Analysis of Variance , Arousal , Auditory Pathways/physiology , Auditory Perception/physiology , Diffusion Tensor Imaging , Female , Humans , Image Processing, Computer-Assisted , Magnetic Resonance Imaging , Male , Nerve Fibers/physiology , Oxygen/blood , Young Adult
12.
Cogn Emot ; 27(5): 783-99, 2013.
Article in English | MEDLINE | ID: mdl-23134564

ABSTRACT

Emotional communication uses verbal and nonverbal means. In case of conflicting signals, nonverbal information is assumed to have a stronger impact. It is unclear, however, whether perceptual nonverbal dominance varies between individuals and whether it is linked to emotional intelligence. Using audiovisual stimulus material comprising verbal and nonverbal emotional cues that were varied independently, perceptual nonverbal dominance profiles and their relations to emotional intelligence were examined. Nonverbal dominance was found in every participant, ranging from 55 to 100%. Moreover, emotional intelligence, particularly the ability to understand emotions, correlated positively with nonverbal dominance. Furthermore, higher overall emotional intelligence as well as a higher ability to understand emotions were linked to smaller reaction time differences between emotionally incongruent and congruent stimuli. The association between perceptual nonverbal dominance and emotional intelligence, and more specifically the ability to understand emotions, might reflect an adaptive process driven by the experience of higher authenticity in nonverbal cues.
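A minimal sketch of how an individual nonverbal dominance score of this kind could be computed from trial-level judgements on incongruent audiovisual stimuli (verbal and nonverbal cues conflict): the score is the percentage of such trials on which the judgement followed the nonverbal cue. The file and column names are assumptions, not the authors' materials.

```python
# Minimal sketch: per-participant nonverbal dominance (% of incongruent trials
# judged according to the nonverbal cue).
import pandas as pd

trials = pd.read_csv("incongruent_trials.csv")  # hypothetical: subject, verbal_emotion,
                                                # nonverbal_emotion, judged_emotion
nvd = (trials.assign(follows_nonverbal=trials["judged_emotion"] == trials["nonverbal_emotion"])
             .groupby("subject")["follows_nonverbal"]
             .mean() * 100)
print(nvd.describe())   # e.g. a range from roughly 55% to 100% would match the report
```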


Subject(s)
Emotional Intelligence , Nonverbal Communication/psychology , Social Perception , Adult , Cues , Facial Expression , Female , Humans , Male , Reaction Time
13.
Front Psychol ; 14: 1213792, 2023.
Article in English | MEDLINE | ID: mdl-37637902

ABSTRACT

A number of case studies describing hypnotherapy in the treatment of anxiety disorder patients have already been published. Only a few randomized controlled trials (RCTs) have investigated the efficacy of hypnotherapy, and these focused mainly on symptoms rather than specific mental disorders. The goal of this study was to investigate whether hypnotherapy (HT) was superior to a waitlist control group (WL) in the reduction of agoraphobia-related symptoms. Further goals were to report the feasibility of hypnotherapy as well as attrition and completion rates, and to detect (epi-)genetic variables that might play a role in treatment outcome. This pilot study was based on a monocentric two-armed randomized controlled rater-blind clinical trial that was conducted between 2018 and 2020 with a waitlist control group. A total of 36 patients diagnosed with agoraphobia were randomized to either HT or WL. Patients in HT received individual outpatient hypnotherapy comprising 8 to 12 sessions over a period of 3 months. Patients in WL received HT after 3 months. Agoraphobia-related symptoms were assessed at baseline, after the treatment, and 3 months later in both groups with a clinician rating. The primary hypothesis concerning the difference between groups in the individual percentage symptom reduction could be confirmed in the intention-to-treat sample, but not in the per-protocol sample. Additionally, we applied repeated-measures analyses of variance and found a higher symptom decrease in HT compared with WL patients in three of the five imputed datasets. The dropout rate was low, and satisfaction with the treatment was high. HT patients experienced a strong symptom reduction after receiving hypnotherapy. WL patients improved slightly during the waiting period. The COMT Val108/158Met genotype had an effect on the agoraphobia-related symptoms as well as on COMT DNA methylation levels. This is the first study to indicate that hypnotherapy performed better than a waitlist control group regarding the reduction in anxiety symptoms in an RCT. Future studies should confirm the efficacy of hypnotherapy and compare the treatment with a standard treatment for anxiety disorders in a larger trial. Future studies should also investigate whether hypnotic susceptibility is associated with COMT Val108/158Met genotype and could predict treatment success for HT. Clinical trial registration: https://classic.clinicaltrials.gov/ct2/show/NCT03684577, identifier: NCT03684577.
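A hedged sketch of one common way to analyse such a two-arm trial with repeated assessments: a group-by-time mixed model for the clinician-rated outcome with a random intercept per patient, where the interaction term asks whether symptoms decline more under hypnotherapy than on the waitlist. This is not the trial's prespecified analysis, and all column names are assumptions.

```python
# Hedged sketch: group-by-time mixed model for agoraphobia symptom scores.
import pandas as pd
import statsmodels.formula.api as smf

long = pd.read_csv("agoraphobia_long.csv")   # hypothetical: patient, group (HT/WL),
                                             # time (baseline/post/followup), score
fit = smf.mixedlm("score ~ C(group) * C(time)", data=long,
                  groups=long["patient"]).fit()
print(fit.summary())   # inspect the group-by-time interaction terms
```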

14.
Front Psychiatry ; 14: 1125553, 2023.
Article in English | MEDLINE | ID: mdl-37181876

ABSTRACT

Social anxiety disorder (SAD) is a psychiatric disorder characterized by severe fear in social situations and avoidance of them. Multiple genetic as well as environmental factors contribute to the etiopathology of SAD. One of the main risk factors for SAD is stress, especially during early periods of life (early life adversity; ELA). ELA leads to structural and regulatory alterations contributing to disease vulnerability. This includes the dysregulation of the immune response. However, the molecular link between ELA and the risk for SAD in adulthood remains largely unclear. Evidence is emerging that long-lasting changes of gene expression patterns play an important role in the biological mechanisms linking ELA and SAD. Therefore, we conducted a transcriptome study of SAD and ELA by performing RNA sequencing on peripheral blood samples. Analyzing differential gene expression between individuals suffering from SAD with high or low levels of ELA and healthy individuals with high or low levels of ELA, 13 significantly differentially expressed genes (DEGs) were identified with respect to SAD, while no significant differences in expression were identified with respect to ELA. The most significant DEG was MAPK3 (p = 0.003), which was upregulated in the SAD group compared to control individuals. In contrast, weighted gene co-expression network analysis (WGCNA) identified only modules significantly associated with ELA (p ≤ 0.05), not with SAD. Furthermore, analyzing interaction networks of the genes from the ELA-associated modules and the SAD-related MAPK3 revealed complex interactions of those genes. Gene functional enrichment analyses indicate a role of signal transduction pathways as well as inflammatory responses, supporting an involvement of the immune system in the association of ELA and SAD. In conclusion, we did not identify a direct molecular link between ELA and adult SAD by transcriptional changes. However, our data indicate an indirect association of ELA and SAD mediated by the interaction of genes involved in immune-related signal transduction.
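The differential-expression screen described above boils down to a per-gene test followed by multiple-testing correction. The sketch below shows that generic logic on already-normalized expression values; the published analysis will have relied on dedicated RNA-seq tooling, and all file and column names are assumptions.

```python
# Hedged sketch: per-gene group comparison with Benjamini-Hochberg FDR correction.
import pandas as pd
from scipy.stats import mannwhitneyu
from statsmodels.stats.multitest import multipletests

expr = pd.read_csv("normalized_expression.csv", index_col=0)   # hypothetical: genes x samples
groups = pd.read_csv("phenotypes.csv", index_col=0)["group"]   # "SAD" or "control", per sample

pvals = expr.apply(
    lambda g: mannwhitneyu(g[groups == "SAD"], g[groups == "control"]).pvalue,
    axis=1,
)
rejected, p_adj, _, _ = multipletests(pvals, alpha=0.05, method="fdr_bh")
print(expr.index[rejected])   # candidate differentially expressed genes
```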

15.
Neuroimage ; 61(3): 738-47, 2012 Jul 02.
Article in English | MEDLINE | ID: mdl-22516367

ABSTRACT

Emotional communication is essential for successful social interactions. Emotional information can be expressed at verbal and nonverbal levels. If the verbal message contradicts the nonverbal expression, usually the nonverbal information is perceived as being more authentic, revealing the "true feelings" of the speaker. The present fMRI study investigated the cerebral integration of verbal (sentences expressing the emotional state of the speaker) and nonverbal (facial expressions and tone of voice) emotional signals using ecologically valid audiovisual stimulus material. More specifically, cerebral activation associated with the relative impact of nonverbal information on judging the affective state of a speaker (individual nonverbal dominance index, INDI) was investigated. Perception of nonverbally expressed emotions was associated with bilateral activation within the amygdala, fusiform face area (FFA), temporal voice area (TVA), and the posterior temporal cortex as well as in the midbrain and left inferior orbitofrontal cortex (OFC)/left insula. Verbally conveyed emotions were linked to increased responses bilaterally in the TVA. Furthermore, the INDI correlated with responses in the left amygdala elicited by nonverbal and verbal emotional stimuli. Correlation of the INDI with the activation within the medial OFC was observed during the processing of communicative signals. These results suggest that individuals with a higher degree of nonverbal dominance have an increased sensitivity not only to nonverbal but to emotional stimuli in general.


Subject(s)
Brain/physiology , Cues , Dominance, Cerebral/physiology , Emotions/physiology , Acoustic Stimulation , Adult , Amygdala/physiology , Brain Mapping , Cerebrovascular Circulation/physiology , Communication , Expressed Emotion , Facial Expression , Female , Humans , Image Processing, Computer-Assisted , Magnetic Resonance Imaging , Male , Photic Stimulation , Prefrontal Cortex/physiology , Speech , Temporal Lobe/physiology , Voice/physiology , Young Adult
16.
Article in English | MEDLINE | ID: mdl-33551283

ABSTRACT

BACKGROUND: Deficits in emotion recognition have been repeatedly documented in patients diagnosed with attention-deficit/hyperactivity disorder (ADHD), but their neural basis is unknown so far. METHODS: In the current study, adult patients with ADHD (n = 44) and healthy control subjects (n = 43) underwent functional magnetic resonance imaging during explicit emotion recognition of stimuli expressing affective information in face, voice, or face-voice combinations. The employed experimental paradigm allowed us to delineate areas for processing audiovisual information based on their functional activation profile, including the bilateral posterior superior temporal gyrus/middle temporal gyrus, amygdala, medial prefrontal cortex, and precuneus, as well as the right posterior thalamus. RESULTS: As expected, unbiased hit rates for correct classification of the expressed emotions were lower in patients with ADHD than in healthy control subjects irrespective of the presented sensory modality. This deficit at a behavioral level was accompanied by lower activation in patients with ADHD versus healthy control subjects in the cortex adjacent to the right superior temporal gyrus/middle temporal gyrus and the right posterior thalamus, which represent key areas for processing socially relevant signals and their integration across modalities. A cortical region adjacent to the right posterior superior temporal gyrus was the only brain region that showed a significant correlation between brain activation and emotion identification performance. CONCLUSIONS: Altogether, these results provide the first evidence for a potential neural substrate of the observed impairments in emotion recognition in adults with ADHD.
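The unbiased hit rates mentioned above (commonly attributed to Wagner, 1993) correct raw accuracy for response biases: for each emotion category, the squared number of correct classifications is divided by the product of the number of stimuli in that category and the number of times that category was chosen as a response. A minimal sketch of the computation from a confusion matrix:

```python
# Minimal sketch: unbiased hit rates from a confusion matrix of emotion judgements.
import numpy as np
import pandas as pd

def unbiased_hit_rates(confusion: pd.DataFrame) -> pd.Series:
    """confusion: rows = presented emotion, columns = chosen emotion (counts)."""
    correct = np.diag(confusion.values).astype(float)
    presented = confusion.sum(axis=1).values   # row totals: stimuli per category
    chosen = confusion.sum(axis=0).values      # column totals: responses per category
    return pd.Series(correct**2 / (presented * chosen), index=confusion.index)

# toy example with two categories
cm = pd.DataFrame([[8, 2], [4, 6]], index=["happy", "angry"], columns=["happy", "angry"])
print(unbiased_hit_rates(cm))
```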


Subject(s)
Attention Deficit Disorder with Hyperactivity , Adult , Brain , Brain Mapping , Emotions/physiology , Humans , Magnetic Resonance Imaging
17.
Sci Rep ; 12(1): 7117, 2022 05 03.
Article in English | MEDLINE | ID: mdl-35505233

ABSTRACT

Human nonverbal social signals are transmitted to a large extent by vocal and facial cues. The prominent importance of these cues is reflected in specialized cerebral regions which preferentially respond to these stimuli, e.g. the temporal voice area (TVA) for human voices and the fusiform face area (FFA) for human faces. However, it has remained unknown whether corresponding specializations exist during resting state, i.e. in the absence of any cues, and if so, whether these representations share neural substrates across sensory modalities. In the present study, resting state functional connectivity (RSFC) as well as voice- and face-preferential activations were analysed from functional magnetic resonance imaging (fMRI) datasets of 60 healthy individuals. Data analysis comprised seed-based analyses using the TVA and FFA as regions of interest (ROIs) as well as multi-voxel pattern analyses (MVPA). Using the face- and voice-preferential responses of the FFA and TVA as regressors, we identified several correlating clusters during resting state spread across frontal, temporal, parietal and occipital regions. Using these regions as seeds, characteristic and distinct network patterns were apparent, with a predominantly convergent pattern for the bilateral TVAs, whereas a largely divergent pattern was observed for the bilateral FFAs. One region in the anterior medial frontal cortex displayed a maximum of supramodal convergence of informative connectivity patterns reflecting voice- and face-preferential responses of both TVAs and the right FFA, pointing to shared neural resources in supramodal voice and face processing. The association of individual voice- and face-preferential neural activity with resting state connectivity patterns may support the perspective of a network function of the brain beyond an activation of specialized regions.


Subject(s)
Facial Recognition , Voice , Brain/physiology , Brain Mapping , Facial Recognition/physiology , Humans , Magnetic Resonance Imaging
18.
Neuroimage ; 58(1): 259-68, 2011 Sep 01.
Article in English | MEDLINE | ID: mdl-21689767

ABSTRACT

While several studies have focused on identifying common brain mechanisms governing the decoding of emotional speech melody, interindividual variations in the cerebral processing of prosodic information have, in comparison, received only little attention to date. Although differences in personality among individuals have, for instance, been shown to modulate emotional brain responses, personality influences on the neural basis of prosody decoding have not yet been investigated systematically. Thus, the present study aimed at delineating relationships between interindividual differences in personality and hemodynamic responses evoked by emotional speech melody. To determine personality-dependent modulations of brain reactivity, fMRI activation patterns during the processing of emotional speech cues were acquired from 24 healthy volunteers and subsequently correlated with individual trait measures of extraversion and neuroticism obtained for each participant. Whereas correlation analysis did not indicate any link between brain activation and extraversion, strong positive correlations between measures of neuroticism and hemodynamic responses of the right amygdala, the left postcentral gyrus as well as medial frontal structures including the right anterior cingulate cortex emerged, suggesting that brain mechanisms mediating the decoding of emotional speech melody may vary depending on differences in neuroticism among individuals. Observed trait-specific modulations are discussed in the light of processing biases as well as differences in emotion control or task strategies which may be associated with the personality trait of neuroticism.


Subject(s)
Brain/physiology , Emotions/physiology , Personality/physiology , Adult , Attention/physiology , Cerebrovascular Circulation/physiology , Cues , Extraversion, Psychological , Female , Humans , Image Processing, Computer-Assisted , Individuality , Magnetic Resonance Imaging , Male , Neurotic Disorders/psychology , Personality Tests , Psychomotor Performance/physiology , Reaction Time/physiology , Semantics , Speech/physiology , Young Adult
19.
Transl Psychiatry ; 11(1): 104, 2021 02 04.
Article in English | MEDLINE | ID: mdl-33542190

ABSTRACT

Social anxiety disorder (SAD) is a psychiatric disorder characterized by extensive fear in social situations. Multiple genetic and environmental factors are known to contribute to its pathogenesis. One of the main environmental risk factors is early life adversity (ELA). Evidence is emerging that epigenetic mechanisms such as DNA methylation might play an important role in the biological mechanisms underlying SAD and ELA. To investigate the relationship between ELA, DNA methylation, and SAD, we performed an epigenome-wide association study for SAD and ELA examining DNA from whole blood of a cohort of 143 individuals using DNA methylation arrays. We identified two differentially methylated regions (DMRs) associated with SAD located within the genes SLC43A2 and TNXB. This was the first epigenome-wide association study for SAD; notably, both genes have previously been associated with panic disorder. Further, we identified two DMRs associated with ELA within the SLC17A3 promoter region and the SIAH3 gene, as well as several DMRs that were associated with the interaction of SAD and ELA. Of these, the regions within C2CD2L and MRPL28 showed the largest difference in DNA methylation. Lastly, we found that two DMRs were associated with both the severity of social anxiety and ELA; however, neither of them was found to mediate the contribution of ELA to SAD later in life. Future studies are needed to replicate our findings in independent cohorts and to investigate the biological pathways underlying these effects.


Subject(s)
Adverse Childhood Experiences , Phobia, Social , DNA Methylation , Epigenesis, Genetic , Epigenome , Humans , Phobia, Social/genetics
20.
Neuroimage ; 53(4): 1264-71, 2010 Dec.
Article in English | MEDLINE | ID: mdl-20600991

ABSTRACT

Laughter is highly relevant for social interaction in human beings and non-human primates. In humans as well as in non-human primates laughter can be induced by tickling. Human laughter, however, has further diversified and encompasses emotional laughter types with various communicative functions, e.g. joyful and taunting laughter. Here, it was evaluated if this evolutionary diversification of ecological functions is associated with distinct cerebral responses underlying laughter perception. Functional MRI revealed a double-dissociation of cerebral responses during perception of tickling laughter and emotional laughter (joy and taunt) with higher activations in the anterior rostral medial frontal cortex (arMFC) when emotional laughter was perceived, and stronger responses in the right superior temporal gyrus (STG) during appreciation of tickling laughter. Enhanced activation of the arMFC for emotional laughter presumably reflects increasing demands on social cognition processes arising from the greater social salience of these laughter types. Activation increase in the STG for tickling laughter may be linked to the higher acoustic complexity of this laughter type. The observed dissociation of cerebral responses for emotional laughter and tickling laughter was independent of task-directed focusing of attention. These findings support the postulated diversification of human laughter in the course of evolution from an unequivocal play signal to laughter with distinct emotional contents subserving complex social functions.


Subject(s)
Auditory Perception/physiology , Brain Mapping , Brain/physiology , Laughter/physiology , Adult , Brain/anatomy & histology , Emotions/physiology , Female , Humans , Image Interpretation, Computer-Assisted , Magnetic Resonance Imaging , Male