1.
Neuroimage Clin ; 27: 102301, 2020.
Article in English | MEDLINE | ID: mdl-32604020

ABSTRACT

Neurofeedback (NF), a training tool aimed at enhancing neural self-regulation, has been suggested as a complementary treatment option for neuropsychiatric disorders. Despite its potential as a neurobiological intervention directly targeting neural alterations underlying clinical symptoms, the efficacy of NF for the treatment of mental disorders has recently been questioned by negative findings obtained in randomized controlled trials (e.g., Cortese et al., 2016). One possible reason for insufficient group effects of NF training vs. placebo is the high rate of participants who fail to self-regulate brain activity by NF ("non-learners"). Another is the application of standardized NF protocols that are not adjusted to individual differences in pathophysiology. Against this background, we have summarized information on factors determining training and treatment success to provide a basis for the development of individualized training protocols and/or clinical indications. The present systematic review included 25 reports investigating predictors of NF training outcome in healthy individuals as well as in patients affected by mental disorders or epilepsy. We selected these studies based on searches in EBSCOhost using combinations of the keywords "neurofeedback" and "predictor/predictors". We defined "NF training" as any NF application comprising at least two sessions. The best available evidence exists for neurophysiological baseline parameters; among these, the target parameters of the respective training seem to be of particular importance. However, particularities of the different experimental designs and outcome criteria restrict the interpretability of some of the information we extracted. Therefore, further research is needed to gain more profound knowledge about predictors of NF outcome.


Subject(s)
Brain-Computer Interfaces , Brain/pathology , Mental Disorders/physiopathology , Neurofeedback , Brain/physiology , Electroencephalography/methods , Humans , Learning/physiology , Neurofeedback/methods
2.
Neuropsychologia ; 96: 175-183, 2017 02.
Article in English | MEDLINE | ID: mdl-28095313

ABSTRACT

Social anxiety disorder (SAD) is characterized by negatively biased perception of social cues and deficits in emotion regulation. While negatively biased perception is thought to maintain social anxiety, emotion regulation represents an ability necessary to overcome both biased perception and social anxiety. Here, we used laughter as a social threat in a functional magnetic resonance imaging (fMRI) study to identify cerebral mediators linking SAD with attention and interpretation biases and their modification through cognitive emotion regulation in the form of reappraisal. We found that reappraisal abolished the negative laughter interpretation bias in SAD and that this process was directly mediated through activation patterns of the left dorsolateral prefrontal cortex (DLPFC), which serves as a cerebral pivot between biased social perception and its normalization through reappraisal. Connectivity analyses revealed reduced prefrontal control over threat-processing sensory cortices (here: the temporal voice area) during cognitive emotion regulation in SAD. Our results indicate a central role for the left DLPFC in SAD, which might represent a valuable target for future research on interventions aiming either to directly modulate cognitive emotion regulation in SAD or to evaluate its potential as a physiological marker for psychotherapeutic interventions relying on emotion regulation.


Subject(s)
Laughter , Phobia, Social/pathology , Phobia, Social/psychology , Prefrontal Cortex/physiopathology , Social Perception , Acoustic Stimulation , Adult , Analysis of Variance , Bias , Cues , Female , Functional Laterality , Humans , Image Processing, Computer-Assisted , Magnetic Resonance Imaging , Male , Oxygen/blood , Phobia, Social/diagnostic imaging , Prefrontal Cortex/diagnostic imaging , Psychiatric Status Rating Scales , Psychometrics , Reaction Time/physiology , Young Adult
3.
J Neural Eng ; 13(6): 066021, 2016 12.
Article in English | MEDLINE | ID: mdl-27841159

ABSTRACT

OBJECTIVE: Electroencephalographic (EEG) brain-computer interfaces (BCIs) hold promise in restoring communication for patients with amyotrophic lateral sclerosis (ALS) in the completely locked-in state. However, these patients cannot use existing EEG-based BCIs, arguably because such systems rely on brain processes that are impaired in the late stages of ALS. In this work, we introduce a novel BCI designed for patients in late stages of ALS based on high-level cognitive processes that are less likely to be affected by the disease. APPROACH: We trained two ALS patients via EEG-based neurofeedback to use self-regulation of theta or gamma oscillations in the precuneus for basic communication. Because there is a tight connection between the precuneus and consciousness, precuneus oscillations are arguably generated by high-level cognitive processes, which are less likely to be affected by ALS than processes linked to the peripheral nervous system. MAIN RESULTS: Both patients learned to self-regulate their precuneus oscillations and achieved stable online decoding accuracy over the course of disease progression. One patient achieved a mean online decoding accuracy of 70.55% in a binary decision task across 26 training sessions; the other achieved 59.44% across 16 training sessions. We provide empirical evidence that these oscillations were cortical in nature and originated from the intersection of the precuneus, cuneus, and posterior cingulate. SIGNIFICANCE: Our results establish that ALS patients can employ self-regulation of precuneus oscillations for communication. Such a BCI is likely to remain available to ALS patients as long as their consciousness supports communication.
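To make the decoding step concrete, the sketch below shows one plausible way to turn a precuneus-channel EEG trial into a binary decision via band-power self-regulation. The sampling rate, band limits, window lengths, and threshold rule are illustrative assumptions, not the authors' actual pipeline.

```python
import numpy as np
from scipy.signal import welch

FS = 250            # sampling rate in Hz (assumed)
THETA = (4, 8)      # theta band limits in Hz (assumed)

def band_power(signal, fs, band):
    """Mean power spectral density within a frequency band (Welch estimate)."""
    freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[mask].mean()

def decode_trial(trial, baseline_power, fs=FS, band=THETA):
    """Binary decision: did the user up-regulate band power relative to baseline?"""
    return band_power(trial, fs, band) > baseline_power

# Toy usage with synthetic single-channel data (5-s baseline and trial windows).
rng = np.random.default_rng(0)
baseline = band_power(rng.standard_normal(5 * FS), FS, THETA)
trial = rng.standard_normal(5 * FS)
print("decoded 'yes'" if decode_trial(trial, baseline) else "decoded 'no'")
```

Online decoding accuracy, as reported in the abstract, would then simply be the fraction of trials decoded in agreement with the instructed target.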


Subject(s)
Amyotrophic Lateral Sclerosis/physiopathology , Amyotrophic Lateral Sclerosis/rehabilitation , Brain-Computer Interfaces , Communication Aids for Disabled , Electroencephalography , Parietal Lobe/physiopathology , Algorithms , Artifacts , Cognition , Gamma Rhythm , Humans , Magnetic Resonance Imaging , Male , Middle Aged , Neurofeedback , Psychomotor Performance , Theta Rhythm
4.
PLoS One ; 8(5): e63441, 2013.
Article in English | MEDLINE | ID: mdl-23667619

ABSTRACT

Laughter is an ancient signal of social communication among humans and non-human primates. Laughter types with complex social functions (e.g., taunt and joy) presumably evolved from the unequivocal and reflex-like social bonding signal of tickling laughter already present in non-human primates. Here, we investigated the modulations of cerebral connectivity associated with different laughter types as well as the effects of attention shifts between implicit and explicit processing of social information conveyed by laughter, using functional magnetic resonance imaging (fMRI). Complex social laughter types and tickling laughter were found to modulate connectivity in two distinguishable but partially overlapping parts of the laughter perception network, irrespective of task instructions. Connectivity changes, presumably related to the higher acoustic complexity of tickling laughter, occurred between areas in the prefrontal cortex and the auditory association cortex, potentially reflecting higher demands on acoustic analysis associated with increased information load on auditory attention, working memory, evaluation, and response selection processes. In contrast, the higher degree of socio-relational information in complex social laughter types was linked to increases of connectivity between auditory association cortices, the right dorsolateral prefrontal cortex, and brain areas associated with mentalizing, as well as areas in the visual associative cortex. These modulations might reflect automatic analysis of acoustic features, attention direction to informative aspects of the laughter signal, and the retention of those in working memory during evaluation processes. These processes may be associated with visual imagery supporting the formation of inferences about the intentions of our social counterparts. Here, the right dorsolateral prefrontal cortex appears as a network node potentially linking the functions of auditory and visual associative sensory cortices with those of the mentalizing-associated anterior mediofrontal cortex during the decoding of social information in laughter.


Subject(s)
Auditory Cortex/metabolism , Laughter/physiology , Mental Processes/physiology , Nerve Net/physiology , Prefrontal Cortex/physiology , Social Behavior , Acoustic Stimulation , Adult , Brain Mapping , Humans , Image Processing, Computer-Assisted , Magnetic Resonance Imaging
5.
Brain Behav Immun ; 27(1): 33-7, 2013 Jan.
Article in English | MEDLINE | ID: mdl-23010451

ABSTRACT

Hashimoto's thyroiditis (HT) can occasionally co-occur with an encephalopathy associated with autoimmune thyroid disease. Recently, we found an increased occurrence of weaknesses in sustained attention and response inhibition in a subgroup of euthyroid patients with HT, as measured by the d2 attention test. Previous studies in healthy subjects and patients with brain lesions demonstrated a pivotal role for the left inferior frontal gyrus (LIFG) in these skills. Therefore, we studied the association between performance in the d2 test and grey matter (GM) density of the LIFG in 13 euthyroid patients with HT compared to a control group of 12 euthyroid patients with other thyroid diseases. A significant correlation between GM density and d2 test total score was detected for the opercular part of the LIFG in patients with HT (p<0.001), but not in the control group (p=0.94). The regression in patients with HT was significantly stronger than in the control group (p=0.02). Moreover, among participants who scored in the lower third on the d2 attention test, GM density was significantly reduced in HT patients compared with control patients (p<0.05). It can be concluded that in HT, performance in the d2 test correlated with GM density of the LIFG. Particularly low achievement was associated with reduced GM density of this brain region, suggesting an influence of autoimmune processes on the frontal cortex in this disease. This could be due to as yet unknown antibodies affecting brain morphology or to an influence of the thyroid antibodies themselves.
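The group comparison of regression slopes reported here can be illustrated with a minimal sketch using an interaction term; the data file and column names are hypothetical, and the paper's actual voxel-based morphometry statistics are more involved.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical table: one row per participant with LIFG grey-matter density,
# d2 test total score, and group membership ("HT" or "control").
df = pd.read_csv("gm_density_d2.csv")

# The d2_total:C(group) interaction tests whether the GM-density/d2-score
# slope differs between groups, analogous to the reported p = 0.02 contrast.
model = smf.ols("gm_density ~ d2_total * C(group)", data=df).fit()
print(model.summary())
```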


Subject(s)
Attention , Frontal Lobe , Hashimoto Disease , Mental Disorders , Nerve Fibers, Unmyelinated/pathology , Adult , Autoantibodies/blood , Case-Control Studies , Female , Frontal Lobe/pathology , Frontal Lobe/physiopathology , Hashimoto Disease/complications , Hashimoto Disease/pathology , Hashimoto Disease/physiopathology , Humans , Image Processing, Computer-Assisted , Magnetic Resonance Imaging , Male , Mental Disorders/complications , Mental Disorders/pathology , Mental Disorders/physiopathology , Middle Aged , Neuropsychological Tests , Thyroid Diseases/pathology , Thyroid Diseases/physiopathology , Thyroid Function Tests , Young Adult
6.
Cereb Cortex ; 22(1): 191-200, 2012 Jan.
Article in English | MEDLINE | ID: mdl-21625012

ABSTRACT

We determined the location, functional response profile, and structural fiber connections of auditory areas with voice- and emotion-sensitive activity using functional magnetic resonance imaging (fMRI) and diffusion tensor imaging. Bilateral regions responding to emotional voices were consistently found in the superior temporal gyrus, posterolateral to the primary auditory cortex. Event-related fMRI showed stronger responses in these areas to voices expressing anger, sadness, joy, and relief, relative to voices with neutral prosody. Their neural responses were primarily driven by prosodic arousal, irrespective of valence. Probabilistic fiber tracking revealed direct structural connections of these "emotional voice areas" (EVA) with the ipsilateral medial geniculate body, which is the major input source of the early auditory cortex, as well as with the ipsilateral inferior frontal gyrus (IFG) and inferior parietal lobe (IPL). In addition, vocal emotions (compared with neutral prosody) increased the functional coupling of EVA with the ipsilateral IFG but not the IPL. These results provide new insights into the neural architecture of the human voice processing system and support a crucial involvement of the IFG in the recognition of vocal emotions, whereas the IPL may subserve distinct auditory spatial functions, consistent with distinct anatomical substrates for the processing of "what" and "where" information within the auditory pathways.


Subject(s)
Auditory Pathways/blood supply , Brain Mapping , Brain/blood supply , Brain/physiology , Emotions/physiology , Voice/physiology , Acoustic Stimulation , Adult , Analysis of Variance , Arousal , Auditory Pathways/physiology , Auditory Perception/physiology , Diffusion Tensor Imaging , Female , Humans , Image Processing, Computer-Assisted , Magnetic Resonance Imaging , Male , Nerve Fibers/physiology , Oxygen/blood , Young Adult
7.
Hum Brain Mapp ; 31(7): 979-91, 2010 Jul.
Article in English | MEDLINE | ID: mdl-19937724

ABSTRACT

Multimodal integration of nonverbal social signals is essential for successful social interaction. Previous studies have implicated the posterior superior temporal sulcus (pSTS) in the perception of social signals such as nonverbal emotional signals as well as in social cognitive functions like mentalizing/theory of mind. In the present study, we evaluated the relationships between trait emotional intelligence (EI) and fMRI activation patterns in individual subjects during the multimodal perception of nonverbal emotional signals from voice and face. Trait EI was linked to hemodynamic responses in the right pSTS, an area which also exhibits a distinct sensitivity to human voices and faces. Within all other regions known to subserve the perceptual audiovisual integration of human social signals (i.e., amygdala, fusiform gyrus, thalamus), no such linked responses were observed. This functional difference in the network for the audiovisual perception of human social signals indicates a specific contribution of the pSTS as a possible interface between the perception of social information and social cognition.


Subject(s)
Auditory Perception/physiology , Brain/physiology , Emotional Intelligence/physiology , Emotions , Social Perception , Visual Perception/physiology , Acoustic Stimulation , Adult , Brain/blood supply , Brain Mapping , Cerebrovascular Circulation , Facial Expression , Female , Humans , Magnetic Resonance Imaging , Male , Neural Pathways/physiology , Neuropsychological Tests , Photic Stimulation , Temporal Lobe/physiology , Voice
8.
Neuroreport ; 20(15): 1356-60, 2009 Oct 07.
Article in English | MEDLINE | ID: mdl-19696688

ABSTRACT

The role of the amygdala in processing acoustic information of affective value is still under debate. Using event-related functional MRI (fMRI), we showed increased amygdalar responses to various emotions (anger, fear, happiness, eroticism) expressed by prosody, a means of communication bound to language and consequently unique to humans. The smallest signal increases were found for fearful prosody, a finding that could not be explained by rapid response habituation to stimuli of this emotional category, challenging classical theories about the fear specificity of the human amygdala. Our results converge with earlier neuroimaging evidence on emotional vocalizations, and these neurobiological similarities suggest that the two forms of communication might have common evolutionary roots.


Subject(s)
Amygdala/physiology , Emotions/physiology , Habituation, Psychophysiologic/physiology , Language , Social Behavior , Speech Perception/physiology , Acoustic Stimulation , Adult , Animals , Brain Mapping , Female , Humans , Language Tests , Magnetic Resonance Imaging , Male , Vocalization, Animal/physiology
9.
Neuropsychologia ; 47(14): 3059-66, 2009 Dec.
Article in English | MEDLINE | ID: mdl-19596021

ABSTRACT

Successful social interaction relies on the multimodal integration of non-verbal emotional signals. The neural correlates of this function, along with those underlying the processing of human faces and voices, have been linked to the superior temporal sulcus (STS) in previous neuroimaging studies. Yet, it has recently been demonstrated that this structure consists of several anatomically defined sections, including a trunk section as well as two separate terminal branches, and exhibits pronounced spatial variability across subjects. Using functional magnetic resonance imaging (fMRI), we demonstrated that the neural representations of the audiovisual integration of non-verbal emotional signals, voice sensitivity, and face sensitivity are located in different parts of the STS, with maximum voice sensitivity in the trunk section and maximum face sensitivity in the posterior terminal ascending branch. The audiovisual integration area for emotional signals is located at the bifurcation of the STS, at an overlap of voice- and face-sensitive regions. In summary, our findings evidence a functional subdivision of the STS into modules subserving different aspects of social communication, here exemplified by human voices and faces and the audiovisual integration of emotional signals from these sources, and suggest a possible interaction of the underlying voice- and face-sensitive neuronal populations during the formation of the audiovisual emotional percept.


Subject(s)
Emotions/physiology , Face , Magnetic Resonance Imaging , Temporal Lobe/blood supply , Temporal Lobe/physiology , Voice , Acoustic Stimulation/methods , Adolescent , Adult , Analysis of Variance , Brain Mapping , Female , Humans , Image Processing, Computer-Assisted/methods , Male , Oxygen/blood , Photic Stimulation/methods , Self Concept , Sensitivity and Specificity , Sex Factors , Young Adult
10.
Dement Geriatr Cogn Disord ; 27(2): 117-32, 2009.
Article in English | MEDLINE | ID: mdl-19182479

ABSTRACT

OBJECTIVE: We investigated healthy controls (HCs), and patients with mild cognitive impairment (MCI) and early Alzheimer's disease (AD) to identify neuronal correlates of clock time representation and changes resulting from neurodegenerative processes using functional magnetic resonance imaging. METHODS: Two clock-specific tasks demanding conceptual knowledge of clock hands, i.e. a minute hand and an hour hand task, were compared with a semantic control task. RESULTS: We observed that the minute hand task provoked a stronger activation of areas in the parietal lobes known to be involved in spatial mental imagery, while the semantic task primarily activated regions of the superior temporal lobes associated with verbal conceptual knowledge. The performance of the MCI group did not differ from that of the HC group, but additional activation was found in several brain regions. Decreased activation was detected during the minute hand task in the right middle temporal gyrus. Patients with early AD showed deteriorated performance in both clock tasks along with reduced activation in the occipital lobes and the left fusiform gyrus. Additional activation was detected in the precuneus. CONCLUSIONS: The fusiform gyrus might be crucial for the visual-semantic retrieval of clock time representation. In patients with early AD, access to this visual-semantic knowledge appears to be reduced.


Subject(s)
Alzheimer Disease/psychology , Cerebral Cortex/physiopathology , Cognition Disorders/psychology , Neuropsychological Tests , Psychomotor Performance/physiology , Aged , Alzheimer Disease/physiopathology , Analysis of Variance , Cognition Disorders/physiopathology , Female , Humans , Image Processing, Computer-Assisted , Magnetic Resonance Imaging , Male , Nerve Net/physiopathology , Reaction Time/physiology
11.
J Cogn Neurosci ; 21(7): 1255-68, 2009 Jul.
Article in English | MEDLINE | ID: mdl-18752404

ABSTRACT

We investigated the functional characteristics of brain regions implicated in the processing of speech melody by presenting words spoken in either neutral or angry prosody during a functional magnetic resonance imaging experiment using a factorial habituation design. Subjects judged either affective prosody or word class for these vocal stimuli, which could be heard for either the first, second, or third time. Voice-sensitive temporal cortices, as well as the amygdala, insula, and mediodorsal thalami, responded more strongly to angry than to neutral prosody. These stimulus-driven effects were not influenced by the task, suggesting that these brain structures are automatically engaged during processing of emotional information in the voice and operate relatively independently of cognitive demands. By contrast, the right middle temporal gyrus and the bilateral orbito-frontal cortices (OFC) responded more strongly during emotion classification than during word classification, but were also sensitive to anger expressed by the voices, suggesting that some perceptual aspects of prosody are also encoded within these regions subserving explicit processing of vocal emotion. The bilateral OFC showed a selective modulation by emotion and repetition, with particularly pronounced responses to angry prosody during the first presentation only, indicating a critical role of the OFC in the detection of vocal information that is both novel and behaviorally relevant. These results converge with previous findings obtained for angry faces and suggest a general involvement of the OFC in the recognition of anger irrespective of the sensory modality. Taken together, our study reveals that different aspects of voice stimuli and perceptual demands modulate distinct areas involved in the processing of emotional prosody.


Subject(s)
Brain Mapping , Brain/physiology , Emotions/physiology , Linguistics , Speech Perception/physiology , Acoustic Stimulation/methods , Adult , Analysis of Variance , Arousal/physiology , Auditory Pathways/blood supply , Auditory Pathways/physiology , Brain/blood supply , Female , Functional Laterality/physiology , Habituation, Psychophysiologic , Humans , Image Processing, Computer-Assisted/methods , Judgment/physiology , Magnetic Resonance Imaging/methods , Male , Oxygen/blood , Reaction Time/physiology , Voice , Young Adult
12.
Neuroimage ; 39(2): 885-93, 2008 Jan 15.
Article in English | MEDLINE | ID: mdl-17964813

ABSTRACT

The human brain has a preference for processing emotionally salient stimuli. In the auditory modality, emotional prosody can induce such involuntary biasing of processing resources. To investigate the neural correlates underlying the automatic processing of emotional information in the voice, words spoken in neutral, happy, erotic, angry, and fearful prosody were presented in a passive-listening functional magnetic resonance imaging (fMRI) experiment. Hemodynamic responses in the right mid superior temporal gyrus (STG) were significantly stronger for all emotional than for neutral intonations. To disentangle the contributions of basic acoustic features and emotional arousal to this activation, the relation between event-related responses and these parameters was evaluated by means of regression analyses. A significant linear dependency between hemodynamic responses of the right mid STG and mean intensity, mean fundamental frequency, variability of fundamental frequency, duration, and arousal of the stimuli was observed. While none of the acoustic parameters alone explained the stronger responses of the right mid STG to emotional relative to neutral prosody, this stronger responsiveness was abolished by correcting either for arousal or for the conjoint effect of the acoustic parameters. In conclusion, our results demonstrate that the right mid STG is sensitive to various emotions conveyed by prosody, an effect driven by a combination of acoustic features that express the emotional arousal in the speaker's voice.
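The logic of the reported regression analyses can be sketched as follows, assuming trial-wise response estimates; all variable names are illustrative rather than taken from the study.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical table: right mid STG response per stimulus, an emotional (1)
# vs. neutral (0) indicator, rated arousal, and the acoustic parameters.
df = pd.read_csv("stg_responses.csv")

base = smf.ols("bold ~ emotional", data=df).fit()            # uncorrected
aro = smf.ols("bold ~ emotional + arousal", data=df).fit()   # arousal-corrected
aco = smf.ols("bold ~ emotional + intensity + f0_mean + f0_var + duration",
              data=df).fit()                                 # acoustics-corrected

# The abstract's claim corresponds to the 'emotional' coefficient being
# significant in the first model but not in the two corrected models.
for name, m in [("base", base), ("arousal", aro), ("acoustics", aco)]:
    print(name, "beta:", round(m.params["emotional"], 3),
          "p:", round(m.pvalues["emotional"], 4))
```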


Subject(s)
Arousal/physiology , Auditory Perception/physiology , Emotions/physiology , Mental Processes/physiology , Acoustic Stimulation , Adult , Cerebrovascular Circulation/physiology , Female , Humans , Image Processing, Computer-Assisted , Magnetic Resonance Imaging , Male , Oxygen/blood , Reading , Regression Analysis
13.
Neuroimage ; 37(4): 1445-56, 2007 Oct 01.
Article in English | MEDLINE | ID: mdl-17659885

ABSTRACT

In a natural environment, non-verbal emotional communication is multimodal (e.g., speech melody, facial expression) and multifaceted with regard to the variety of expressed emotions. Understanding these communicative signals and integrating them into a common percept is paramount to successful social behaviour. While many previous studies have focused on the neurobiology of emotional communication in the auditory or visual modality alone, far less is known about the multimodal integration of auditory and visual non-verbal emotional information. The present study investigated this process using event-related fMRI. Behavioural data revealed that audiovisual presentation of non-verbal emotional information resulted in a significant increase in correctly classified stimuli when compared with visual and auditory stimulation. This behavioural gain was paralleled by enhanced activation in the bilateral posterior superior temporal gyrus (pSTG) and right thalamus when contrasting audiovisual to auditory and visual conditions. Further, a characteristic of these brain regions, substantiating their role in the emotional integration process, is a linear relationship between the gain in classification accuracy and the strength of the BOLD response during the bimodal condition. Additionally, enhanced effective connectivity between audiovisual integration areas and associative auditory and visual cortices was observed during audiovisual stimulation, offering further insight into the neural process accomplishing multimodal integration. Finally, we were able to document an enhanced sensitivity of the putative integration sites to stimuli with emotional non-verbal content as compared to neutral stimuli.


Subject(s)
Emotions/physiology , Facial Expression , Social Perception , Voice , Adult , Cerebrovascular Circulation/physiology , Electrophysiology , Evoked Potentials/physiology , Female , Functional Laterality/physiology , Humans , Image Processing, Computer-Assisted , Magnetic Resonance Imaging , Male , Oxygen/blood , Photic Stimulation , Temporal Lobe/blood supply , Temporal Lobe/physiology , Thalamus/blood supply , Thalamus/physiology
14.
Neuroreport ; 17(3): 249-53, 2006 Feb 27.
Article in English | MEDLINE | ID: mdl-16462592

ABSTRACT

Functional magnetic resonance imaging was used to investigate hemodynamic responses to adjectives pronounced in happy and angry intonations of varying emotional intensity. In separate sessions, participants judged the emotional valence of either intonation or semantics. To disentangle effects of emotional prosodic intensity from confounding acoustic parameters, mean and variability of volume and fundamental frequency of each stimulus were included as nuisance variables in the statistical models. A linear dependency between hemodynamic responses and emotional intensity of happy and angry intonations was found in the bilateral superior temporal sulcus during both tasks, indicating that increases of hemodynamic responses in this region are elicited by both positive and negative prosody independent of low-level acoustic properties and task instructions.


Subject(s)
Auditory Cortex/physiology , Emotions/physiology , Speech Perception/physiology , Acoustic Stimulation/methods , Adult , Analysis of Variance , Auditory Cortex/blood supply , Brain Mapping , Female , Humans , Image Processing, Computer-Assisted/methods , Magnetic Resonance Imaging/methods , Male , Oxygen/blood
15.
Hum Brain Mapp ; 27(9): 707-14, 2006 Sep.
Article in English | MEDLINE | ID: mdl-16411179

ABSTRACT

Emotional information can be conveyed by various means of communication, such as propositional content, speech intonation, facial expression, and gestures. Prior studies have demonstrated that inputs from one modality can alter perception in another modality. To evaluate the impact of emotional intonation on ratings of emotional faces, a behavioral study was first carried out. Second, functional magnetic resonance imaging (fMRI) was used to identify brain regions that mediate crossmodal effects of emotional prosody on judgments of facial expressions. In the behavioral study, subjects rated fearful and neutral facial expressions as more fearful when accompanied by a fearful voice than the same facial expressions without a concomitant auditory stimulus, whereas no such influence on the rating of faces was found for happy voices. In the fMRI experiment, this shift in the rating of facial expressions in the presence of a fearfully spoken sentence was correlated with the hemodynamic response in the left amygdala extending into the periamygdaloid cortex, which suggests that crossmodal effects on cognitive judgments of emotional information are mediated via these neuronal structures. Furthermore, significantly stronger activations were found in the mid-portion of the right fusiform gyrus during judgment of facial expressions in the presence of fearful as compared to happy intonations, indicating that enhanced processing of faces within this region can be induced by the presence of threat-related information perceived via the auditory modality. Presumably, these increased extrastriate activations correspond to enhanced alertness, whereas responses within the left amygdala modulate the cognitive evaluation of emotional facial expressions.


Subject(s)
Brain Mapping , Brain/physiology , Emotions/physiology , Facial Expression , Judgment/physiology , Voice/physiology , Acoustic Stimulation , Adolescent , Adult , Female , Humans , Image Processing, Computer-Assisted , Magnetic Resonance Imaging , Male , Perception
16.
Neuroreport ; 16(3): 239-42, 2005 Feb 28.
Article in English | MEDLINE | ID: mdl-15706227

ABSTRACT

To investigate the lateralization of duration and pitch discrimination processing, with emphasis on the influence of task difficulty, we used event-related functional magnetic resonance imaging. Seventeen healthy volunteers performed paired auditory discrimination tasks at varying levels of difficulty. Analysis of lateralization effects revealed leftward lateralization within the insular and temporal cortex under both conditions. Moreover, parametric analysis of haemodynamic responses showed increasing activation within the right temporal cortex correlated with increasing accuracy of stimulus discrimination. Thus, acoustically highly distinct stimuli seem to be predominantly processed within the right hemisphere, whereas the detection of slight signal differences might be linked to the left hemisphere. In conclusion, we found evidence for preferential involvement of the right hemisphere in holistic feature processing within the auditory domain.
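A common way to quantify the lateralization effects described here is a lateralization index over homologous regions of interest, LI = (L - R) / (L + R); the sketch below assumes per-subject activation estimates and is not the authors' exact procedure.

```python
import numpy as np

def lateralization_index(left, right):
    """LI in [-1, 1]: positive values indicate leftward dominance."""
    left, right = np.asarray(left, float), np.asarray(right, float)
    return (left - right) / (left + right)

# Toy usage: hypothetical insular-cortex activation estimates for 17 subjects.
rng = np.random.default_rng(1)
li = lateralization_index(rng.uniform(1.0, 3.0, 17), rng.uniform(0.5, 2.5, 17))
print(f"mean LI = {li.mean():.2f}")  # > 0 would match the reported leftward bias
```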


Subject(s)
Cerebral Cortex/physiology , Functional Laterality/physiology , Pitch Discrimination/physiology , Time Perception/physiology , Acoustic Stimulation , Adolescent , Adult , Brain Mapping , Cerebral Cortex/blood supply , Dose-Response Relationship, Radiation , Female , Humans , Magnetic Resonance Imaging/methods , Male , Oxygen/blood , Task Performance and Analysis , Time Factors
17.
Magn Reson Med ; 50(6): 1296-301, 2003 Dec.
Article in English | MEDLINE | ID: mdl-14648578

ABSTRACT

In vivo longitudinal relaxation times of N-acetyl compounds (NA), choline-containing substances (Cho), creatine (Cr), myo-inositol (mI), and tissue water were measured at 1.5 and 3 T using a point-resolved spectroscopy (PRESS) sequence with short echo time (TE). T1 values were determined in six different brain regions: the occipital gray matter (GM), occipital white matter (WM), motor cortex, frontoparietal WM, thalamus, and cerebellum. The T1 relaxation times of water protons were 26-38% longer at 3 T than at 1.5 T. Significantly longer metabolite T1 values at 3 T (11-36%) were found for NA, Cho, and Cr in the motor cortex, frontoparietal WM, and thalamus. The amounts of GM, WM, and cerebrospinal fluid (CSF) within the voxel were determined by segmentation of a 3D image data set. No influence of tissue composition on metabolite T1 values was found, while the longitudinal relaxation times of water protons were strongly correlated with the relative GM content.
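As an illustration of how such T1 values can be obtained, the sketch below fits the saturation-recovery model S(TR) = S0 * (1 - exp(-TR/T1)) to signal measurements at several repetition times; the acquisition values are invented for the example and do not reproduce the paper's protocol.

```python
import numpy as np
from scipy.optimize import curve_fit

def sat_recovery(tr, s0, t1):
    """Saturation-recovery signal model: S(TR) = S0 * (1 - exp(-TR / T1))."""
    return s0 * (1.0 - np.exp(-tr / t1))

tr = np.array([0.5, 1.0, 2.0, 4.0, 8.0])   # repetition times in seconds
t1_true = 1.3                              # plausible value for GM water at 3 T
signal = sat_recovery(tr, 100.0, t1_true)
signal += np.random.default_rng(2).normal(0.0, 1.0, tr.size)  # measurement noise

# Nonlinear least-squares fit of (S0, T1) to the noisy measurements.
(s0_fit, t1_fit), _ = curve_fit(sat_recovery, tr, signal, p0=(90.0, 1.0))
print(f"fitted T1 = {t1_fit:.2f} s (true value {t1_true} s)")
```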


Subject(s)
Aspartic Acid/analogs & derivatives , Brain Chemistry , Magnetic Resonance Spectroscopy , Adult , Aspartic Acid/analysis , Cerebellum/chemistry , Choline/analysis , Creatine/analysis , Female , Frontal Lobe/chemistry , Humans , Male , Motor Cortex/chemistry , Occipital Lobe/chemistry , Parietal Lobe/chemistry , Thalamus/chemistry