Results 1 - 20 of 201
1.
J Stroke Cerebrovasc Dis ; 29(7): 104827, 2020 Jul.
Article in English | MEDLINE | ID: mdl-32386850

ABSTRACT

Cortical deafness is an extremely rare clinical entity that originates mainly from bilateral cortical lesions in the primary auditory cortex. Its main clinical manifestation is sudden bilateral loss of hearing. Diagnosis is difficult due to its rarity and its similarity to other language and communication disorders, such as Wernicke's aphasia, auditory agnosia or verbal deafness. Herein, we present a case report of a young woman with a sudden bilateral loss of auditory comprehension. Initially, a psychiatric origin of the disorder was considered, but the persistence of the symptoms led to the diagnosis of cortical deafness secondary to bilateral ischemic lesions in both temporal lobes. Progressive improvement occurred, and three months after the initial presentation she exhibited pure verbal deafness. Cortical deafness usually has a poor functional prognosis, with limited therapeutic options. Rehabilitation and speech therapy are recommended to improve patients' chances of regaining communication skills.


Subject(s)
Auditory Cortex/blood supply; Auditory Perception; Hearing Loss, Bilateral/etiology; Hearing Loss, Central/etiology; Hearing; Stroke/complications; Adult; Female; Hearing Loss, Bilateral/diagnosis; Hearing Loss, Bilateral/physiopathology; Hearing Loss, Bilateral/rehabilitation; Hearing Loss, Central/diagnosis; Hearing Loss, Central/physiopathology; Hearing Loss, Central/rehabilitation; Humans; Recovery of Function; Stroke/diagnosis; Stroke/physiopathology; Stroke/therapy; Stroke Rehabilitation; Treatment Outcome
2.
J Neurosci ; 36(4): 1416-28, 2016 Jan 27.
Article in English | MEDLINE | ID: mdl-26818527

ABSTRACT

Functional and anatomical studies have clearly demonstrated that auditory cortex is populated by multiple subfields. However, functional characterization of those fields has been largely the domain of animal electrophysiology, limiting the extent to which human and animal research can inform each other. In this study, we used high-resolution functional magnetic resonance imaging to characterize human auditory cortical subfields using a variety of low-level acoustic features in the spectral and temporal domains. Specifically, we show that topographic gradients of frequency preference, or tonotopy, extend along two axes in human auditory cortex, thus reconciling historical accounts of a tonotopic axis oriented medial to lateral along Heschl's gyrus and more recent findings emphasizing tonotopic organization along the anterior-posterior axis. Contradictory findings regarding topographic organization according to temporal modulation rate in acoustic stimuli, or "periodotopy," are also addressed. Although isolated subregions show a preference for high rates of amplitude-modulated white noise (AMWN) in our data, large-scale "periodotopic" organization was not found. Organization by AM rate was correlated with dominant pitch percepts in AMWN in many regions. In short, our data expose early auditory cortex chiefly as a frequency analyzer, and spectral frequency, as imposed by the sensory receptor surface in the cochlea, seems to be the dominant feature governing large-scale topographic organization across human auditory cortex. SIGNIFICANCE STATEMENT: In this study, we examine the nature of topographic organization in human auditory cortex with fMRI. Topographic organization by spectral frequency (tonotopy) extended in two directions: medial to lateral, consistent with early neuroimaging studies, and anterior to posterior, consistent with more recent reports. Large-scale organization by rates of temporal modulation (periodotopy) was correlated with confounding spectral content of amplitude-modulated white-noise stimuli. Together, our results suggest that the organization of human auditory cortex is driven primarily by its response to spectral acoustic features, and large-scale periodotopy spanning across multiple regions is not supported. This fundamental information regarding the functional organization of early auditory cortex will inform our growing understanding of speech perception and the processing of other complex sounds.
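The tonotopic mapping described here boils down to assigning each voxel the stimulus frequency that drives it most strongly. The sketch below is not code from the study; the response matrix is synthetic, standing in for per-frequency GLM estimates, and serves only to illustrate voxelwise best-frequency assignment.

```python
import numpy as np

rng = np.random.default_rng(0)

center_freqs_hz = np.array([200, 400, 800, 1600, 3200, 6400])  # tone conditions
n_voxels = 5000

# Simulate voxels with Gaussian tuning on a log-frequency axis plus noise;
# in a real analysis, `responses` would hold one GLM beta per frequency condition.
true_bf = rng.choice(center_freqs_hz, size=n_voxels)
log_f = np.log2(center_freqs_hz)
tuning = np.exp(-0.5 * ((log_f[None, :] - np.log2(true_bf)[:, None]) / 0.8) ** 2)
responses = tuning + 0.3 * rng.standard_normal((n_voxels, center_freqs_hz.size))

# Best frequency = condition evoking the largest response in each voxel.
best_freq = center_freqs_hz[np.argmax(responses, axis=1)]

# Projecting best_freq onto the cortical surface and tracing low-high-low
# gradients along and across Heschl's gyrus is what reveals the tonotopic axes.
print("fraction assigned their simulated best frequency:",
      np.mean(best_freq == true_bf).round(2))
```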


Subject(s)
Auditory Cortex/physiology; Auditory Perception/physiology; Brain Mapping; Acoustics; Adult; Auditory Cortex/blood supply; Female; Humans; Image Processing, Computer-Assisted; Magnetic Resonance Imaging; Male; Neuropsychological Tests; Oxygen/blood; Sound; Spectrum Analysis; Time Factors; Young Adult
3.
Hum Brain Mapp ; 38(3): 1140-1154, 2017 03.
Article in English | MEDLINE | ID: mdl-27790786

ABSTRACT

A tonotopic organization of the human auditory cortex (AC) has been reliably found by neuroimaging studies. However, a full characterization and parcellation of the AC is still lacking. In this study, we employed pseudo-continuous arterial spin labeling (pCASL) to map tonotopy and voice-selective regions using, for the first time, cerebral blood flow (CBF). We demonstrated the feasibility of CBF-based tonotopy and found a good agreement with BOLD signal-based tonotopy, despite the lower contrast-to-noise ratio of CBF. Quantitative perfusion mapping of baseline CBF showed a region of high perfusion centered on Heschl's gyrus and corresponding to the main high-low-high frequency gradients, co-located to the presumed primary auditory core and suggesting baseline CBF as a novel marker for AC parcellation. Furthermore, susceptibility-weighted imaging was employed to investigate the tissue specificity of CBF and BOLD signal and the possible venous bias of BOLD-based tonotopy. For BOLD-only active voxels, we found a higher percentage of vein contamination than for CBF-only active voxels. Taken together, we demonstrated that both baseline and stimulus-induced CBF provide an alternative fMRI approach to the standard BOLD signal to study auditory processing and delineate the functional organization of the auditory cortex. Hum Brain Mapp 38:1140-1154, 2017. © 2016 Wiley Periodicals, Inc.


Subject(s)
Auditory Cortex/blood supply; Auditory Cortex/diagnostic imaging; Brain Mapping; Cerebrovascular Circulation/physiology; Acoustic Stimulation; Adult; Arteries; Female; Humans; Image Processing, Computer-Assisted; Magnetic Resonance Imaging; Male; Oxygen/blood; Spectrum Analysis; Spin Labels; Time Factors
4.
Eur J Neurosci ; 43(6): 773-81, 2016 Mar.
Article in English | MEDLINE | ID: mdl-26751256

ABSTRACT

Linguistic units such as phonemes and syllables are important for speech perception. How the brain encodes these units is not well understood. Many neuroimaging studies have found distinct representations of consonant-vowel syllables that shared one phoneme and differed in the other phoneme (e.g. /ba/ and /da/), but it is unclear whether this discrimination ability is due to the neural coding of phonemes or syllables. We combined functional magnetic resonance imaging with multivariate pattern analysis to explore this question. Subjects listened to nine Mandarin syllables in consonant-vowel form. We successfully decoded phonemes from the syllables based on the blood oxygenation level-dependent signals in the superior temporal gyrus (STG). Specifically, a classifier trained on the cortical patterns elicited by one set of syllables containing two given phonemes could distinguish the cortical patterns elicited by other syllables containing those same phonemes. The results indicated that phonemes have unique representations in the STG. In addition, there was a categorical effect, i.e., the cortical patterns of consonants were similar to one another, as were the cortical patterns of vowels. Further analysis showed that phonemes exhibited stronger encoding specificity in the mid-STG than in the anterior STG.
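The cross-syllable decoding logic in this abstract can be illustrated with a short multivariate-pattern-analysis sketch: train a classifier on patterns from syllables drawn from some vowel contexts and test it on syllables with a held-out vowel, so that above-chance transfer reflects a phoneme-level rather than syllable-level code. The data below are simulated multivoxel patterns, not the study's data.

```python
import numpy as np
from sklearn.metrics import accuracy_score
from sklearn.svm import LinearSVC

rng = np.random.default_rng(1)
n_voxels, n_trials = 200, 40

# Each phoneme contributes its own spatial signature to a syllable's pattern.
consonant_sig = {"b": rng.standard_normal(n_voxels), "d": rng.standard_normal(n_voxels)}
vowel_sig = {v: rng.standard_normal(n_voxels) for v in "aiu"}

def simulate(consonant, vowel, n):
    """Simulated single-trial multivoxel patterns for one CV syllable."""
    base = consonant_sig[consonant] + vowel_sig[vowel]
    return base + 2.0 * rng.standard_normal((n, n_voxels))

# Train on /ba, bi, da, di/; test on the held-out vowel context /bu, du/.
X_train = np.vstack([simulate(c, v, n_trials) for c in "bd" for v in "ai"])
y_train = np.repeat(["b", "b", "d", "d"], n_trials)
X_test = np.vstack([simulate(c, "u", n_trials) for c in "bd"])
y_test = np.repeat(["b", "d"], n_trials)

clf = LinearSVC(max_iter=5000).fit(X_train, y_train)
print("cross-syllable consonant decoding accuracy:",
      accuracy_score(y_test, clf.predict(X_test)))
```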


Subject(s)
Auditory Cortex/physiology; Brain Mapping; Oxygen/blood; Speech Perception; Adult; Auditory Cortex/blood supply; Humans; Linguistics; Male
5.
Cereb Cortex ; 25(11): 4248-58, 2015 Nov.
Article in English | MEDLINE | ID: mdl-25577576

ABSTRACT

In spatial perception, visual information has higher acuity than auditory information, and we often misperceive sound-source locations when spatially disparate visual stimuli are presented simultaneously. Ventriloquists make good use of this auditory illusion. In this study, we investigated neural substrates of the ventriloquism effect to understand the neural mechanism of multimodal integration. This study was performed in two steps. First, we investigated how sound locations were represented in the auditory cortex. Second, we investigated how simultaneous presentation of spatially disparate visual stimuli affects neural processing of sound locations. Based on the population rate code hypothesis that assumes monotonic sensitivity to sound azimuth across populations of broadly tuned neurons, we expected a monotonic increase of blood oxygenation level-dependent (BOLD) signals for more contralateral sounds. Consistent with this hypothesis, we found that BOLD signals in the posterior superior temporal gyrus increased monotonically as a function of sound azimuth. We also observed attenuation of the monotonic azimuthal sensitivity by spatially disparate visual stimuli. The alteration of the neural pattern was considered to reflect the neural mechanism of the ventriloquism effect. Our findings indicate that conflicting audiovisual spatial information of an event is associated with an attenuation of neural processing of auditory spatial localization.
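The population rate-code prediction tested here, a monotonic increase of the BOLD signal with sound azimuth and its attenuation under audiovisual conflict, can be expressed as a simple slope comparison. The sketch below uses simulated values purely to illustrate the analysis logic.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Sound azimuths (degrees); positive values are contralateral to the region.
azimuth_deg = np.tile([-40.0, -20.0, 0.0, 20.0, 40.0], 20)

def simulate_bold(slope_per_deg):
    """Percent signal change rising linearly with azimuth, plus trial noise."""
    return 0.2 + slope_per_deg * azimuth_deg + 0.1 * rng.standard_normal(azimuth_deg.size)

bold_auditory = simulate_bold(0.004)   # monotonic azimuthal sensitivity
bold_conflict = simulate_bold(0.001)   # attenuated by a spatially disparate visual stimulus

for label, y in [("auditory only", bold_auditory), ("audiovisual conflict", bold_conflict)]:
    fit = stats.linregress(azimuth_deg, y)
    print(f"{label}: slope = {fit.slope:.4f} %/deg, p = {fit.pvalue:.1e}")
```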


Subject(s)
Auditory Cortex/physiology; Brain Mapping; Sound Localization/physiology; Space Perception/physiology; Acoustic Stimulation; Adult; Auditory Cortex/blood supply; Female; Functional Laterality; Humans; Image Processing, Computer-Assisted; Magnetic Resonance Imaging; Male; Oxygen/blood; Photic Stimulation; Psychoacoustics; Reaction Time/physiology; Young Adult
6.
J Neurosci ; 34(13): 4548-57, 2014 Mar 26.
Article in English | MEDLINE | ID: mdl-24672000

ABSTRACT

Selective attention to relevant sound properties is essential for everyday listening situations. It enables the formation of different perceptual representations of the same acoustic input and underlies flexible and goal-dependent behavior. Here, we investigated the role of the human auditory cortex in forming behavior-dependent representations of sounds. We used single-trial fMRI and analyzed cortical responses collected while subjects listened to the same speech sounds (vowels /a/, /i/, and /u/) spoken by different speakers (boy, girl, male) and performed a delayed-match-to-sample task on either speech sound or speaker identity. Univariate analyses showed a task-specific activation increase in the right superior temporal gyrus/sulcus (STG/STS) during speaker categorization and in the right posterior temporal cortex during vowel categorization. Beyond regional differences in activation levels, multivariate classification of single-trial responses demonstrated that the success with which single speakers and vowels can be decoded from auditory cortical activation patterns depends on task demands and the subjects' behavioral performance. Speaker/vowel classification relied on distinct but overlapping regions across the (right) mid-anterior STG/STS (speakers) and bilateral mid-posterior STG/STS (vowels), as well as the superior temporal plane including Heschl's gyrus/sulcus. The task dependency of speaker/vowel classification demonstrates that the informative fMRI response patterns reflect the top-down enhancement of behaviorally relevant sound representations. Furthermore, our findings suggest that successful selection, processing, and retention of task-relevant sound properties rely on the joint encoding of information across early and higher-order regions of the auditory cortex.


Subject(s)
Auditory Cortex/physiology; Phonetics; Speech Perception/physiology; Acoustic Stimulation/methods; Adult; Auditory Cortex/blood supply; Brain Mapping; Female; Functional Laterality; Humans; Image Processing, Computer-Assisted; Magnetic Resonance Imaging; Male; Oxygen; Psychoacoustics; Sound Spectrography; Young Adult
7.
J Neurosci ; 34(24): 8072-82, 2014 Jun 11.
Article in English | MEDLINE | ID: mdl-24920613

ABSTRACT

The neural mechanisms that produce hallucinations and other psychotic symptoms remain unclear. Previous research suggests that deficits in predictive signals for learning, such as prediction error signals, may underlie psychotic symptoms, but the mechanism by which such deficits produce psychotic symptoms remains to be established. We used model-based fMRI to study sensory prediction errors in human patients with schizophrenia who report daily auditory verbal hallucinations (AVHs) and sociodemographically matched healthy control subjects. We manipulated participants' expectations for hearing speech at different periods within a speech decision-making task. Patients activated a voice-sensitive region of the auditory cortex while they experienced AVHs in the scanner and displayed a concomitant deficit in prediction error signals in a similar portion of auditory cortex. This prediction error deficit correlated strongly with increased activity during silence and with reduced volumes of the auditory cortex, two established neural phenotypes of AVHs. Furthermore, patients with more severe AVHs had more deficient prediction error signals and greater activity during silence within the region of auditory cortex where groups differed, regardless of the severity of psychotic symptoms other than AVHs. Our findings suggest that deficient predictive coding accounts for the resting hyperactivity in sensory cortex that leads to hallucinations.
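Model-based fMRI of the kind described here derives a trial-by-trial prediction-error regressor from a learning model and convolves it with a hemodynamic response function. The abstract does not specify the model, so the sketch below uses a Rescorla-Wagner update and a simplified double-gamma HRF purely as illustrative stand-ins.

```python
import numpy as np
from scipy.stats import gamma

rng = np.random.default_rng(3)

# Trial outcomes: 1 = speech presented, 0 = not presented.
outcomes = rng.binomial(1, 0.7, size=60)

# Rescorla-Wagner expectation and trial-wise prediction errors (illustrative model).
alpha, expectation = 0.2, 0.5
prediction_errors = np.zeros(outcomes.size)
for t, outcome in enumerate(outcomes):
    prediction_errors[t] = outcome - expectation
    expectation += alpha * prediction_errors[t]

# Place each prediction error at its trial onset and convolve with a simplified
# double-gamma HRF to obtain the parametric regressor for the GLM.
tr, trial_spacing_s = 2.0, 8.0
n_scans = int(outcomes.size * trial_spacing_s / tr) + 10
stick = np.zeros(n_scans)
onsets = (np.arange(outcomes.size) * trial_spacing_s / tr).astype(int)
stick[onsets] = prediction_errors

t_hrf = np.arange(0, 32, tr)
hrf = gamma.pdf(t_hrf, 6) - gamma.pdf(t_hrf, 16) / 6
regressor = np.convolve(stick, hrf)[:n_scans]
print("prediction-error regressor length (scans):", regressor.size)
```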


Subject(s)
Auditory Cortex/physiopathology; Hallucinations/etiology; Schizophrenia/complications; Schizophrenia/diagnosis; Speech Perception/physiology; Acoustic Stimulation; Adult; Auditory Cortex/blood supply; Brain Mapping; Case-Control Studies; Decision Making; Female; Humans; Image Processing, Computer-Assisted; Male; Middle Aged; Neuropsychological Tests; Oxygen/blood; Predictive Value of Tests; Schizophrenic Psychology; Time Factors
8.
Nature ; 457(7228): 475-9, 2009 Jan 22.
Article in English | MEDLINE | ID: mdl-19158795

ABSTRACT

Haemodynamic signals underlying functional brain imaging (for example, functional magnetic resonance imaging (fMRI)) are assumed to reflect metabolic demand generated by local neuronal activity, with equal increases in haemodynamic signal implying equal increases in the underlying neuronal activity. Few studies have compared neuronal and haemodynamic signals in alert animals to test for this assumed correspondence. Here we present evidence that brings this assumption into question. Using a dual-wavelength optical imaging technique that independently measures cerebral blood volume and oxygenation, continuously, in alert behaving monkeys, we find two distinct components to the haemodynamic signal in the alert animals' primary visual cortex (V1). One component is reliably predictable from neuronal responses generated by visual input. The other component, of almost comparable strength, is a hitherto unknown signal that entrains to task structure independently of visual input or of standard neural predictors of haemodynamics. This latter component shows predictive timing, with increases of cerebral blood volume in anticipation of trial onsets even in darkness. This trial-locked haemodynamic signal could be due to an accompanying V1 arterial pumping mechanism, closely matched in time, with peaks of arterial dilation entrained to predicted trial onsets. These findings (tested in two animals) challenge the current understanding of the link between brain haemodynamics and local neuronal activity. They also suggest the existence of a novel preparatory mechanism in the brain that brings additional arterial blood to cortex in anticipation of expected tasks.
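Dual-wavelength intrinsic optical imaging of this kind separates blood-volume and oxygenation signals by inverting absorbance changes at two wavelengths through the extinction coefficients of oxy- and deoxyhemoglobin. The sketch below shows that inversion; the coefficient values are illustrative placeholders, not calibrated constants from the study.

```python
import numpy as np

# Rows: imaging wavelengths; columns: [HbO, HbR] extinction coefficients.
# These numbers are placeholders chosen only to make the example concrete.
extinction = np.array([
    [1.0, 3.0],   # deoxyhemoglobin-weighted ("oximetry") wavelength
    [2.0, 2.0],   # isosbestic wavelength, sensitive to total hemoglobin (blood volume)
])

# Measured fractional absorbance changes at the two wavelengths for one pixel.
delta_od = np.array([0.012, 0.020])

# Invert the 2x2 system to recover oxy- and deoxyhemoglobin concentration changes.
delta_hbo, delta_hbr = np.linalg.solve(extinction, delta_od)
delta_hbt = delta_hbo + delta_hbr   # total hemoglobin ~ cerebral blood volume
print(f"dHbO = {delta_hbo:.4f}, dHbR = {delta_hbr:.4f}, dHbT (volume) = {delta_hbt:.4f}")
```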


Subject(s)
Cerebrovascular Circulation; Hemodynamics; Macaca mulatta/physiology; Neurons/physiology; Visual Cortex/blood supply; Visual Cortex/physiology; Acoustic Stimulation; Animals; Auditory Cortex/blood supply; Auditory Cortex/cytology; Auditory Cortex/physiology; Brain Mapping; Darkness; Fixation, Ocular/physiology; Magnetic Resonance Imaging; Models, Neurological; Oxygen Consumption/physiology; Photic Stimulation; Reproducibility of Results; Time Factors; Visual Cortex/cytology
9.
Hum Brain Mapp ; 35(7): 3188-98, 2014 Jul.
Article in English | MEDLINE | ID: mdl-24142547

ABSTRACT

Our understanding of cerebral blood flow (CBF) in the healthy developing brain has been limited due to the invasiveness of methods historically available for CBF measurement. Clinically based studies using radioactive tracers with children have focused on resting-state CBF. Yet potential age-related changes in flow during stimulation may affect the blood oxygenation level-dependent (BOLD) response used to investigate cognitive neurodevelopment. This study used noninvasive arterial spin labeling magnetic resonance imaging to compare resting-state and stimulus-driven CBF between typically developing 8-year-olds, 12-year-olds, and adults. Further, we acquired functional CBF and BOLD images simultaneously to examine their relationship during sensory stimulation. Analyses revealed age-related CBF differences during rest; the youngest group showed greater CBF than 12-year-olds or adults. During stimulation of the auditory cortex, younger children also showed a greater absolute increase in CBF than adults. However, the magnitude of CBF response above baseline was comparable between groups. Similarly, the amplitude of the BOLD response was stable across age. The combination of the 8-year-olds' elevated CBF, both at rest and in response to stimulation, without a corresponding elevation in the BOLD response suggests that other physiological factors contributing to the BOLD effect, such as metabolic processes that are likewise elevated in this period, may offset the increased CBF in these children. Thus, CBF measurements reveal maturational differences in the hemodynamics underlying the BOLD effect in children despite the resemblance of the BOLD response between children and adults.


Subject(s)
Aging; Auditory Cortex/blood supply; Auditory Cortex/growth & development; Oxygen/blood; Rest/physiology; Acoustic Stimulation; Adolescent; Adult; Child; Female; Humans; Image Processing, Computer-Assisted; Magnetic Resonance Imaging; Male; Young Adult
10.
J Neurosci ; 32(28): 9626-38, 2012 Jul 11.
Article in English | MEDLINE | ID: mdl-22787048

ABSTRACT

The developing brain responds to the environment by using statistical correlations in input to guide functional and structural changes; that is, the brain displays neuroplasticity. Experience shapes brain development throughout life, but neuroplasticity is variable from one brain system to another. How does the early loss of a sensory modality affect this complex process? We examined cross-modal neuroplasticity in anatomically defined subregions of Heschl's gyrus, the site of human primary auditory cortex, in congenitally deaf humans by measuring the fMRI signal change in response to spatially coregistered visual, somatosensory, and bimodal stimuli. In the Heschl's gyrus of deaf participants, signal change for somatosensory and bimodal stimuli was greater than in hearing participants. Visual responses in Heschl's gyrus, larger in deaf than in hearing participants, were smaller than those elicited by somatosensory stimulation. In contrast to Heschl's gyrus, in the superior-temporal cortex visual signal was comparable to somatosensory signal. In addition, deaf adults perceived bimodal stimuli differently; in contrast to hearing adults, they were susceptible to a double-flash visual illusion induced by two touches to the face. Somatosensory and bimodal signal change in rostrolateral Heschl's gyrus predicted the strength of the visual illusion in the deaf adults, in line with the interpretation that the illusion is a functional consequence of the altered cross-modal organization observed in deaf auditory cortex. Our results demonstrate that congenital and profound deafness alters how vision and somatosensation are processed in primary auditory cortex.


Subject(s)
Auditory Cortex/blood supply; Brain Mapping; Deafness/pathology; Illusions/physiology; Magnetic Resonance Imaging; Acoustic Stimulation; Adult; Analysis of Variance; Auditory Cortex/pathology; Deafness/congenital; Female; Humans; Image Processing, Computer-Assisted; Male; Middle Aged; Models, Biological; Oxygen/blood; Photic Stimulation; Psychophysics; Young Adult
11.
J Neurosci ; 32(38): 13273-80, 2012 Sep 19.
Article in English | MEDLINE | ID: mdl-22993443

ABSTRACT

The formation of new sound categories is fundamental to everyday goal-directed behavior. Categorization requires the abstraction of discrete classes from continuous physical features as required by context and task. Electrophysiology in animals has shown that learning to categorize novel sounds alters their spatiotemporal neural representation at the level of early auditory cortex. However, functional magnetic resonance imaging (fMRI) studies have so far not yielded insight into the effects of category learning on sound representations in human auditory cortex. This may be due to the use of overlearned speech-like categories and fMRI subtraction paradigms, leading to insufficient sensitivity to distinguish the responses to learning-induced, novel sound categories. Here, we used fMRI pattern analysis to investigate changes in human auditory cortical response patterns induced by category learning. We created complex novel sound categories and analyzed distributed activation patterns during passive listening to a sound continuum before and after category learning. We show that sound categories could be successfully decoded from early auditory areas only after training and that learning-induced pattern changes were specific to the category-distinctive sound feature (i.e., pitch). Notably, the similarity between fMRI response patterns for the sound continuum mirrored the sigmoid shape of the behavioral category identification function. Our results indicate that perceptual representations of novel sound categories emerge from neural changes at early levels of the human auditory processing hierarchy.
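The pattern-analysis result described here, category structure emerging in the similarity of multivoxel responses along a sound continuum, can be illustrated by comparing within- and between-category pattern correlations before and after learning. The patterns below are simulated for illustration only.

```python
import numpy as np

rng = np.random.default_rng(4)
n_steps, n_voxels = 8, 300

# Learned category of each continuum step (boundary at the midpoint).
category = (np.arange(n_steps) >= n_steps // 2).astype(int)

step_sig = rng.standard_normal((n_steps, n_voxels))   # stimulus-specific pattern per step
cat_sig = rng.standard_normal((2, n_voxels))          # shared pattern per learned category

patterns_pre = step_sig.copy()                        # before training: no category structure
patterns_post = step_sig + 1.5 * cat_sig[category]    # after training: category signal added

def within_minus_between(patterns):
    """Mean within-category minus mean between-category pattern correlation."""
    corr = np.corrcoef(patterns)
    same = category[:, None] == category[None, :]
    off_diag = ~np.eye(n_steps, dtype=bool)
    return corr[same & off_diag].mean() - corr[~same].mean()

print("within - between similarity, before training:", round(within_minus_between(patterns_pre), 3))
print("within - between similarity, after training: ", round(within_minus_between(patterns_post), 3))
```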


Subject(s)
Auditory Cortex/physiology; Auditory Perception/physiology; Brain Mapping; Learning/physiology; Sound; Acoustic Stimulation/classification; Adult; Analysis of Variance; Auditory Cortex/blood supply; Female; Humans; Image Processing, Computer-Assisted; Magnetic Resonance Imaging; Male; Normal Distribution; Oxygen/blood; Psychoacoustics; Spectrum Analysis; Young Adult
12.
J Neurosci ; 32(12): 4260-70, 2012 Mar 21.
Article in English | MEDLINE | ID: mdl-22442088

ABSTRACT

We compared brain structure and function in two subgroups of 21 stroke patients with either moderate or severe chronic speech comprehension impairment. Both groups had damage to the supratemporal plane; however, the severe group suffered greater damage to two unimodal auditory areas: primary auditory cortex and the planum temporale. The effects of this damage were investigated using fMRI while patients listened to speech and speech-like sounds. Pronounced changes in connectivity were found in both groups in undamaged parts of the auditory hierarchy. Compared to controls, moderate patients had significantly stronger feedback connections from planum temporale to primary auditory cortex bilaterally, while in severe patients this connection was significantly weaker in the undamaged right hemisphere. This suggests that predictive feedback mechanisms compensate in moderately affected patients but not in severely affected patients. The key pathomechanism in humans with persistent speech comprehension impairments may be impaired feedback connectivity to unimodal auditory areas.


Subject(s)
Auditory Cortex; Brain Mapping; Speech Disorders/etiology; Speech Disorders/pathology; Speech Perception/physiology; Stroke/complications; Acoustic Stimulation/methods; Adult; Aged; Aged, 80 and over; Auditory Cortex/blood supply; Auditory Cortex/pathology; Auditory Cortex/physiopathology; Auditory Pathways/blood supply; Auditory Pathways/pathology; Auditory Pathways/physiopathology; Comprehension; Female; Humans; Image Processing, Computer-Assisted; Magnetic Resonance Imaging; Male; Middle Aged; Models, Statistical; Nonlinear Dynamics; Oxygen/blood
13.
J Cogn Neurosci ; 25(7): 1062-77, 2013 Jul.
Article in English | MEDLINE | ID: mdl-23410032

ABSTRACT

This study investigates the functional neuroanatomy of harmonic music perception with fMRI. We presented short pieces of Western classical music to nonmusicians. The ending of each piece was systematically manipulated in the following four ways: Standard Cadence (expected resolution), Deceptive Cadence (moderate deviation from expectation), Modulated Cadence (strong deviation from expectation but remaining within the harmonic structure of Western tonal music), and Atonal Cadence (strongest deviation from expectation by leaving the harmonic structure of Western tonal music). Music, compared with baseline, broadly recruited regions of the bilateral superior temporal gyrus (STG) and the right inferior frontal gyrus (IFG). Parametric regressors scaled to the degree of deviation from harmonic expectancy identified regions sensitive to expectancy violation. Areas within the basal ganglia (BG) were significantly modulated by expectancy violation, indicating a previously unappreciated role in harmonic processing. Expectancy violation also recruited bilateral cortical regions in the IFG and anterior STG, previously associated with syntactic processing in other domains. The posterior STG was not significantly modulated by expectancy. Granger causality mapping found functional connectivity between IFG, anterior STG, posterior STG, and the BG during music perception. Our results imply that the IFG, anterior STG, and BG are recruited for higher-order harmonic processing, whereas the posterior STG is recruited for basic pitch and melodic processing.


Subject(s)
Auditory Cortex/physiology; Auditory Perception/physiology; Brain Mapping; Corpus Striatum/physiology; Music; Acoustic Stimulation; Analysis of Variance; Auditory Cortex/blood supply; Corpus Striatum/blood supply; Female; Humans; Image Processing, Computer-Assisted; Magnetic Resonance Imaging; Male; Oxygen; Photic Stimulation
14.
J Cogn Neurosci ; 25(5): 730-42, 2013 May.
Article in English | MEDLINE | ID: mdl-23249352

ABSTRACT

Psychophysical experiments show that auditory change detection can be disturbed in situations in which listeners have to monitor complex auditory input. We made use of this change deafness effect to segregate the neural correlates of physical change in auditory input from brain responses related to conscious change perception in an fMRI experiment. Participants listened to two successively presented complex auditory scenes, which consisted of six auditory streams, and had to decide whether scenes were identical or whether the frequency of one stream was changed between presentations. Our results show that physical changes in auditory input, independent of successful change detection, are represented at the level of auditory cortex. Activations related to conscious change perception, independent of physical change, were found in the insula and the ACC. Moreover, our data provide evidence for significant effective connectivity between auditory cortex and the insula in the case of correctly detected auditory changes, but not for missed changes. This underlines the importance of the insula/anterior cingulate network for conscious change detection.


Subject(s)
Auditory Cortex/physiology; Brain Mapping; Pitch Perception/physiology; Sound Localization/physiology; Acoustic Stimulation; Adult; Analysis of Variance; Auditory Cortex/blood supply; Auditory Pathways/blood supply; Auditory Pathways/physiology; Female; Functional Laterality; Humans; Image Processing, Computer-Assisted; Magnetic Resonance Imaging; Male; Oxygen/blood; Psychophysics; Reaction Time/physiology; Time Factors; Young Adult
15.
J Cogn Neurosci ; 25(9): 1553-62, 2013 Sep.
Article in English | MEDLINE | ID: mdl-23647558

ABSTRACT

In the visual modality, perceptual demand on a goal-directed task has been shown to modulate the extent to which irrelevant information can be disregarded at a sensory-perceptual stage of processing. In the auditory modality, the effect of perceptual demand on neural representations of task-irrelevant sounds is unclear. We compared simultaneous ERPs and fMRI responses associated with task-irrelevant sounds across parametrically modulated perceptual task demands in a dichotic-listening paradigm. Participants performed a signal detection task in one ear (Attend ear) while ignoring task-irrelevant syllable sounds in the other ear (Ignore ear). Results revealed modulation of syllable processing by auditory perceptual demand in an ROI in middle left superior temporal gyrus and in negative ERP activity 130-230 msec post stimulus onset. Increasing the perceptual demand in the Attend ear was associated with a reduced neural response in both fMRI and ERP to task-irrelevant sounds. These findings are in support of a selection model whereby ongoing perceptual demands modulate task-irrelevant sound processing in auditory cortex.


Subject(s)
Auditory Cortex/physiology; Auditory Perception/physiology; Brain Mapping; Evoked Potentials, Auditory/physiology; Sound; Acoustic Stimulation; Adult; Analysis of Variance; Auditory Cortex/blood supply; Dichotic Listening Tests; Electroencephalography; Female; Functional Laterality; Humans; Image Processing, Computer-Assisted; Magnetic Resonance Imaging; Male; Oxygen/blood; Psychoacoustics; Reaction Time/physiology; Young Adult
16.
Cereb Cortex ; 22(4): 745-53, 2012 Apr.
Article in English | MEDLINE | ID: mdl-21709174

ABSTRACT

Human neuroimaging studies have identified a region of auditory cortex, lateral Heschl's gyrus (HG), that shows a greater response to iterated ripple noise (IRN) than to a Gaussian noise control. Based in part on results using IRN as a pitch-evoking stimulus, it has been argued that lateral HG is a general "pitch center." However, IRN contains slowly varying spectrotemporal modulations, unrelated to pitch, that are not found in the control stimulus. Hence, it is possible that the cortical response to IRN is driven in part by these modulations. The current study reports the first attempt to control for these modulations. This was achieved using a novel type of stimulus that was generated by processing IRN to remove the fine temporal structure (and thus the pitch) but leave the slowly varying modulations. This "no-pitch IRN" stimulus is referred to as IRNo. Results showed a widespread response to the spectrotemporal modulations across auditory cortex. When IRN was contrasted with IRNo rather than with Gaussian noise, the apparent effect of pitch was no longer statistically significant. Our findings raise the possibility that a cortical response unrelated to pitch could previously have been erroneously attributed to pitch coding.
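Iterated ripple noise itself is generated by a standard delay-and-add procedure, which is what introduces both the pitch and the slow spectrotemporal modulations at issue here. A minimal numpy sketch of that procedure follows; the IRNo control involves additional processing (removal of fine temporal structure) that is not reproduced.

```python
import numpy as np

def make_irn(duration_s=1.0, fs=44100, delay_s=1 / 125, gain=1.0, n_iterations=16, seed=0):
    """Iterated ripple noise via delay-and-add; a pitch is heard near 1/delay_s Hz."""
    rng = np.random.default_rng(seed)
    signal = rng.standard_normal(int(duration_s * fs))
    delay_samples = int(round(delay_s * fs))
    for _ in range(n_iterations):
        delayed = np.zeros_like(signal)
        delayed[delay_samples:] = signal[:-delay_samples]
        signal = signal + gain * delayed          # one delay-and-add iteration
    return signal / np.max(np.abs(signal))        # normalize to avoid clipping

irn = make_irn()                                                     # pitch near 125 Hz
noise_control = np.random.default_rng(1).standard_normal(irn.size)  # Gaussian noise control
print("IRN samples:", irn.size)
```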


Subject(s)
Auditory Cortex/blood supply; Auditory Cortex/physiology; Brain Mapping; Discrimination, Psychological; Noise; Pitch Perception/physiology; Acoustic Stimulation; Adult; Analysis of Variance; Auditory Pathways/blood supply; Auditory Pathways/physiology; Female; Functional Laterality; Humans; Image Processing, Computer-Assisted; Magnetic Resonance Imaging; Male; Middle Aged; Normal Distribution; Oxygen; Psychoacoustics; Young Adult
17.
Cereb Cortex ; 22(1): 132-43, 2012 Jan.
Article in English | MEDLINE | ID: mdl-21613467

ABSTRACT

There is an increasing interest to integrate electrophysiological and hemodynamic measures for characterizing spatial and temporal aspects of cortical processing. However, an informative combination of responses that have markedly different sensitivities to the underlying neural activity is not straightforward, especially in complex cognitive tasks. Here, we used parametric stimulus manipulation in magnetoencephalography (MEG) and functional magnetic resonance imaging (fMRI) recordings on the same subjects, to study effects of noise on processing of spoken words and environmental sounds. The added noise influenced MEG response strengths in the bilateral supratemporal auditory cortex, at different times for the different stimulus types. Specifically for spoken words, the effect of noise on the electrophysiological response was remarkably nonlinear. Therefore, we used the single-subject MEG responses to construct parametrization for fMRI data analysis and obtained notably higher sensitivity than with conventional stimulus-based parametrization. fMRI results showed that partly different temporal areas were involved in noise-sensitive processing of words and environmental sounds. These results indicate that cortical processing of sounds in background noise is stimulus specific in both timing and location and provide a new functionally meaningful platform for combining information obtained with electrophysiological and hemodynamic measures of brain function.


Subject(s)
Auditory Cortex/blood supply; Auditory Cortex/physiology; Auditory Perception/physiology; Brain Mapping; Evoked Potentials, Auditory/physiology; Acoustic Stimulation; Adult; Analysis of Variance; Environment; Female; Humans; Image Processing, Computer-Assisted; Magnetic Resonance Imaging; Magnetoencephalography; Male; Noise; Oxygen/blood; Reaction Time/physiology; Sound; Vocabulary
18.
J Neurosci ; 31(4): 1479-88, 2011 Jan 26.
Article in English | MEDLINE | ID: mdl-21273432

ABSTRACT

In natural environments, a sound can be heard as stable despite the presence of other occasionally louder sounds. For example, when a portion of a voice is replaced by masking noise, the interrupted voice may still appear illusorily continuous. Previous research found that continuity illusions of simple interrupted sounds, such as tones, are accompanied by weaker activity in the primary auditory cortex (PAC) during the interruption than veridical discontinuity percepts of these sounds. Here, we studied whether continuity illusions of more natural and more complex sounds also emerge from this mechanism. We used psychophysics and functional magnetic resonance imaging in humans to simultaneously measure continuity ratings and blood oxygenation level-dependent activity to vowels that were partially replaced by masking noise. Consistent with previous results on tone continuity illusions, we found that listeners' reports of more salient vowel continuity illusions were associated with weaker activity in auditory cortex (compared with reports of veridical discontinuity percepts of physically identical stimuli). In contrast to the reduced activity to tone continuity illusions in PAC, this reduction was localized in the right anterolateral Heschl's gyrus, a region that corresponds more closely to non-primary auditory cortex. Our findings suggest that the ability to hear sounds of differing complexity as stable during other, louder sounds may be attributable to a common suppressive mechanism that operates at different levels of sound representation in auditory cortex.


Subject(s)
Auditory Cortex/physiology; Noise; Pitch Discrimination; Adult; Auditory Cortex/blood supply; Female; Humans; Illusions; Magnetic Resonance Imaging; Male; Oxygen/blood; Psychophysics; Young Adult
19.
J Neurosci ; 31(35): 12638-43, 2011 Aug 31.
Article in English | MEDLINE | ID: mdl-21880924

ABSTRACT

Hearing loss is one of the most common complaints in adults over the age of 60 and a major contributor to difficulties in speech comprehension. To examine the effects of hearing ability on the neural processes supporting spoken language processing in humans, we used functional magnetic resonance imaging to monitor brain activity while older adults with age-normal hearing listened to sentences that varied in their linguistic demands. Individual differences in hearing ability predicted the degree of language-driven neural recruitment during auditory sentence comprehension in bilateral superior temporal gyri (including primary auditory cortex), thalamus, and brainstem. In a second experiment, we examined the relationship of hearing ability to cortical structural integrity using voxel-based morphometry, demonstrating a significant linear relationship between hearing ability and gray matter volume in primary auditory cortex. Together, these results suggest that even moderate declines in peripheral auditory acuity lead to a systematic downregulation of neural activity during the processing of higher-level aspects of speech, and may also contribute to loss of gray matter volume in primary auditory cortex. More generally, these findings support a resource-allocation framework in which individual differences in sensory ability help define the degree to which brain regions are recruited in service of a particular task.


Subject(s)
Auditory Cortex/physiopathology; Comprehension/physiology; Hearing Loss/pathology; Language; Speech/physiology; Aged; Audiometry/methods; Auditory Cortex/blood supply; Brain Mapping; Female; Hearing/physiology; Hearing Loss/physiopathology; Humans; Image Processing, Computer-Assisted/methods; Individuality; Magnetic Resonance Imaging/methods; Male; Middle Aged; Oxygen/blood
20.
J Cogn Neurosci ; 24(9): 1896-907, 2012 Sep.
Article in English | MEDLINE | ID: mdl-22640390

ABSTRACT

Frequency modulation (FM) is an acoustic feature of nearly all complex sounds. Directional FM sweeps are especially pervasive in speech, music, animal vocalizations, and other natural sounds. Although the existence of FM-selective cells in the auditory cortex of animals has been documented, evidence in humans remains equivocal. Here we used multivariate pattern analysis to identify cortical selectivity for direction of a multitone FM sweep. This method distinguishes one pattern of neural activity from another within the same ROI, even when overall level of activity is similar, allowing for direct identification of FM-specialized networks. Standard contrast analysis showed that despite robust activity in auditory cortex, no clusters of activity were associated with up versus down sweeps. Multivariate pattern analysis classification, however, identified two brain regions as selective for FM direction, the right primary auditory cortex on the supratemporal plane and the left anterior region of the superior temporal gyrus. These findings are the first to directly demonstrate existence of FM direction selectivity in the human auditory cortex.
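A multitone FM sweep of the kind used as a stimulus here is a set of simultaneous components whose frequencies glide upward or downward over the sweep duration, with phase obtained by integrating the frequency trajectory. The sketch below synthesizes such sweeps with illustrative parameter values, not those of the study.

```python
import numpy as np

def multitone_fm_sweep(direction="up", duration_s=0.5, fs=44100,
                       start_freqs=(300.0, 600.0, 1200.0), octaves=1.0):
    """Sum of tone components gliding together up or down in log frequency."""
    t = np.arange(int(duration_s * fs)) / fs
    sign = 1.0 if direction == "up" else -1.0
    sweep = np.zeros_like(t)
    for f0 in start_freqs:
        freq = f0 * 2.0 ** (sign * octaves * t / duration_s)  # log-frequency glide
        phase = 2.0 * np.pi * np.cumsum(freq) / fs            # integrate frequency over time
        sweep += np.sin(phase)
    return sweep / len(start_freqs)

up_sweep = multitone_fm_sweep("up")
down_sweep = multitone_fm_sweep("down")
print("samples per sweep:", up_sweep.size)
```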


Subject(s)
Acoustic Stimulation; Auditory Cortex/physiology; Brain Mapping; Neural Pathways/physiology; Pattern Recognition, Physiological/physiology; Adult; Auditory Cortex/blood supply; Auditory Threshold; Female; Humans; Image Processing, Computer-Assisted; Magnetic Resonance Imaging; Male; Neural Pathways/blood supply; Oxygen/blood; Psychoacoustics; Reaction Time; Sound Localization/physiology; Young Adult