ABSTRACT
Language and music are two human-unique capacities whose relationship remains debated. Some have argued for overlap in processing mechanisms, especially for structure processing. Such claims often concern the inferior frontal component of the language system located within "Broca's area." However, others have failed to find overlap. Using a robust individual-subject fMRI approach, we examined the responses of language brain regions to music stimuli, and probed the musical abilities of individuals with severe aphasia. Across 4 experiments, we obtained a clear answer: music perception does not engage the language system, and judgments about music structure are possible even in the presence of severe damage to the language network. In particular, the language regions' responses to music are generally low, often below the fixation baseline, and never exceed responses elicited by nonmusic auditory conditions, like animal sounds. Furthermore, the language regions are not sensitive to music structure: they show low responses to both intact and structure-scrambled music, and to melodies with vs. without structural violations. Finally, in line with past patient investigations, individuals with aphasia, who cannot judge sentence grammaticality, perform well on melody well-formedness judgments. Thus, the mechanisms that process structure in language do not appear to process music, including music syntax.
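For readers who want to see the shape of such an analysis, the following is a minimal sketch (not the authors' code) of the individual-subject functional-ROI logic: language regions are defined per subject from a localizer contrast, and their mean responses are then compared across music and non-music conditions. All array names, shapes, and the top-voxel threshold are illustrative assumptions.

```python
# Minimal sketch of subject-specific fROI response extraction.
# Inputs and the 10% threshold are assumptions, not the authors' pipeline.
import numpy as np

def froi_condition_responses(localizer_t, condition_psc, top_percent=10.0):
    """Pick each subject's top localizer voxels, then average their
    percent-signal-change (PSC) in each condition of interest.

    localizer_t   : (n_subjects, n_voxels) localizer contrast map
                    (e.g., sentences > nonwords), estimated from separate runs
    condition_psc : dict condition -> (n_subjects, n_voxels) PSC maps
    """
    n_subjects, n_voxels = localizer_t.shape
    k = max(1, int(n_voxels * top_percent / 100.0))
    out = {}
    for cond, psc in condition_psc.items():
        vals = np.empty(n_subjects)
        for s in range(n_subjects):
            froi = np.argsort(localizer_t[s])[-k:]   # subject-specific fROI
            vals[s] = psc[s, froi].mean()            # mean response in that fROI
        out[cond] = vals
    return out

# Per-subject responses to music vs. non-music auditory conditions can then be
# compared against each other and against the fixation baseline (PSC = 0).
```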
Subjects
Aphasia, Music, Humans, Broca's Area, Language, Magnetic Resonance Imaging, Brain Mapping, Perception
ABSTRACT
Prior studies have observed selective neural responses in the adult human auditory cortex to music and speech that cannot be explained by the differing lower-level acoustic properties of these stimuli. Does infant cortex exhibit similarly selective responses to music and speech shortly after birth? To answer this question, we attempted to collect functional magnetic resonance imaging (fMRI) data from 45 sleeping infants (2.0 to 11.9 weeks old) while they listened to monophonic instrumental lullabies and infant-directed speech produced by a mother. To match acoustic variation between music and speech sounds, we (1) recorded music from instruments with a spectral range similar to that of female infant-directed speech, (2) used a novel excitation-matching algorithm to match the cochleagrams of music and speech stimuli, and (3) synthesized "model-matched" stimuli that were matched in spectrotemporal modulation statistics to (yet perceptually distinct from) music or speech. Of the 36 infants we collected usable data from, 19 had significant activations to sounds overall compared to scanner noise. In these infants, we observed a set of voxels in non-primary auditory cortex (NPAC), but not in Heschl's Gyrus, that responded significantly more to music than to each of the other three stimulus types (but not significantly more strongly than to the background scanner noise). In contrast, our planned analyses did not reveal voxels in NPAC that responded more to speech than to model-matched speech, although other unplanned analyses did. These preliminary findings suggest that music selectivity arises within the first month of life. A video abstract of this article can be viewed at https://youtu.be/c8IGFvzxudk.
RESEARCH HIGHLIGHTS:
Responses to music, speech, and control sounds matched for the spectrotemporal modulation statistics of each sound were measured from 2- to 11-week-old sleeping infants using fMRI.
Auditory cortex was significantly activated by these stimuli in 19 out of 36 sleeping infants.
Selective responses to music compared to the three other stimulus classes were found in non-primary auditory cortex but not in nearby Heschl's Gyrus.
Selective responses to speech were not observed in planned analyses but were observed in unplanned, exploratory analyses.
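The music-selectivity criterion described above (voxels responding more to music than to each of the other three stimulus classes) can be illustrated with a short sketch. This is an assumption about the analysis logic, not the authors' pipeline, and it omits the cross-validation and multiple-comparisons steps a real analysis would require.

```python
# Illustrative voxelwise selectivity test: a voxel counts as music-selective
# only if its response to music exceeds its response to every other class.
import numpy as np
from scipy import stats

def music_selective_voxels(responses, alpha=0.05):
    """responses: dict class_name -> (n_samples, n_voxels) response estimates,
    with classes 'music', 'speech', 'mm_music', 'mm_speech' (model-matched).
    Names are illustrative placeholders."""
    music = responses["music"]
    n_voxels = music.shape[1]
    selective = np.ones(n_voxels, dtype=bool)
    for other in ("speech", "mm_music", "mm_speech"):
        t, p = stats.ttest_rel(music, responses[other], axis=0)
        # require music > other, significantly, for every comparison (one-tailed)
        selective &= (t > 0) & (p / 2 < alpha)
    return selective
```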
Subjects
Auditory Cortex, Music, Speech Perception, Adult, Humans, Infant, Female, Acoustic Stimulation, Auditory Perception/physiology, Auditory Cortex/physiology, Noise, Magnetic Resonance Imaging, Speech Perception/physiology
ABSTRACT
Ventral visual cortex contains specialized regions for particular object categories, but little is known about how these regions interact during object recognition. Here we examine how the face-selective fusiform gyrus (FG) and the scene-selective parahippocampal cortex (PHC) interact with each other and with the rest of the brain during different visual tasks. To assess these interactions, we developed a novel approach for identifying patterns of connectivity associated with specific task sets, independent of stimulus-evoked responses. We tested whether this "background connectivity" between the FG and PHC was modulated when subjects engaged in face and scene processing tasks. In contrast to what would be predicted from biased competition or intrinsic activity accounts, we found that the strength of FG-PHC background connectivity depended on which category was task relevant: connectivity increased when subjects attended to scenes (irrespective of whether a competing face was present) and decreased when subjects attended to faces (irrespective of competing scenes). We further discovered that posterior occipital cortex was correlated selectively with the FG during face tasks and the PHC during scene tasks. These results suggest that category specificity exists not only in which regions respond most strongly but also in how these and other regions interact.
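A minimal sketch of the "background connectivity" idea, under simplifying assumptions: stimulus-evoked variance is removed from each region's timeseries with a task regression, and the residuals of the two regions are then correlated separately within each attention condition. The variable names and design-matrix construction below are illustrative, not the authors' code.

```python
# Sketch of background connectivity between two ROIs (e.g., FG and PHC).
import numpy as np

def background_connectivity(fg_ts, phc_ts, design, condition_blocks):
    """fg_ts, phc_ts  : (n_timepoints,) ROI timeseries
    design           : (n_timepoints, n_regressors) task/nuisance regressors
                       (e.g., HRF-convolved stimulus predictors)
    condition_blocks : dict condition -> boolean (n_timepoints,) mask of
                       timepoints belonging to that task set
    """
    X = np.column_stack([design, np.ones(len(fg_ts))])   # add intercept

    def residualize(y):
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        return y - X @ beta                               # evoked variance removed

    fg_res, phc_res = residualize(fg_ts), residualize(phc_ts)
    # correlate residuals within each task condition (e.g., attend-face, attend-scene)
    return {cond: np.corrcoef(fg_res[mask], phc_res[mask])[0, 1]
            for cond, mask in condition_blocks.items()}
```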
Subjects
Brain Mapping, Concept Formation/physiology, Parahippocampal Gyrus/physiology, Pattern Recognition, Visual/physiology, Visual Cortex/physiology, Analysis of Variance, Face, Female, Humans, Image Processing, Computer-Assisted, Linear Models, Magnetic Resonance Imaging, Male, Neuropsychological Tests, Oxygen/blood, Parahippocampal Gyrus/blood supply, Photic Stimulation, Reaction Time/physiology, Recognition (Psychology), Time Factors, Visual Cortex/blood supply, Visual Pathways/physiology, Young Adult
ABSTRACT
Faces activate specific brain regions in fMRI, including the fusiform gyrus (FG) and the posterior superior temporal sulcus (pSTS). The fact that the FG and pSTS are frequently co-activated suggests that they may interact synergistically in a distributed face processing network. Alternatively, the functions implemented by these regions may be encapsulated from each other. It has proven difficult to evaluate these two accounts during visual processing of face stimuli. However, if the FG and pSTS interact during face processing, the substrate for such interactions may be apparent in a correlation of the BOLD timeseries from these two regions during periods of rest when no faces are present. To examine face-specific resting correlations, we developed a new partial functional connectivity approach in which we removed variance from the FG that was shared with other category-selective and control regions. The remaining face-specific FG resting variance was then used to predict resting signals throughout the brain. In two experiments, we observed face-specific resting functional connectivity between FG and pSTS, and importantly, these correlations overlapped precisely with the face-specific pSTS region obtained from independent localizer runs. Additional region-of-interest and pattern analyses confirmed that the FG-pSTS resting correlations were face-specific. These findings support a model in which face processing is distributed among a finite number of connected, but nevertheless face-specialized regions. The discovery of category-specific interactions in the absence of visual input suggests that resting networks may provide a latent foundation for task processing.
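A minimal sketch of the partial functional connectivity approach, assuming resting-run ROI timeseries have already been extracted: variance that the FG shares with other category-selective and control regions is regressed out, and the face-specific FG residual is then correlated with every other voxel's timeseries. Names and shapes are illustrative assumptions, not the authors' implementation.

```python
# Sketch of face-specific (partial) resting connectivity seeded from the FG.
import numpy as np

def face_specific_connectivity(fg_ts, other_roi_ts, brain_ts):
    """fg_ts     : (n_timepoints,) resting timeseries of the FG seed
    other_roi_ts : (n_timepoints, n_rois) timeseries of other category-selective
                   and control regions whose shared variance is removed
    brain_ts     : (n_timepoints, n_voxels) whole-brain timeseries
    returns      : (n_voxels,) correlation map of the face-specific FG residual
    """
    X = np.column_stack([other_roi_ts, np.ones(len(fg_ts))])
    beta, *_ = np.linalg.lstsq(X, fg_ts, rcond=None)
    fg_resid = fg_ts - X @ beta                      # face-specific FG variance
    # Pearson correlation of the residual with each voxel's timeseries
    z = (fg_resid - fg_resid.mean()) / fg_resid.std()
    B = (brain_ts - brain_ts.mean(0)) / brain_ts.std(0)
    return (B.T @ z) / len(z)
```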