2.
Elife ; 11, 2022 09 07.
Article in English | MEDLINE | ID: mdl-36070354

ABSTRACT

The ventral occipito-temporal cortex (VOTC) reliably encodes auditory categories in people born blind using a representational structure partially similar to the one found in vision (Mattioni et al., 2020). Here, using a combination of uni- and multivoxel analyses applied to fMRI data, we extend our previous findings, comprehensively investigating how early and late acquired blindness impact the cortical regions coding for the deprived and the remaining senses. First, we show an enhanced univariate response to sounds in part of the occipital cortex of both blind groups, concomitant with reduced auditory responses in temporal regions. We then reveal that the representation of the sound categories in the occipital and temporal regions is more similar in blind than in sighted subjects. What could drive this enhanced similarity? The multivoxel encoding of the 'human voice' category that we observed in the temporal cortex of all sighted and blind groups is enhanced in occipital regions in the blind groups, suggesting that the representation of vocal information is more similar between the occipital and temporal regions in blind than in sighted individuals. We additionally show that blindness does not affect the encoding of the acoustic properties of our sounds (e.g. pitch, harmonicity) in occipital and temporal regions but instead selectively alters the categorical coding of the voice category itself. These results suggest a functionally congruent interplay between the reorganization of occipital and temporal regions following visual deprivation, across the lifespan.
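
As an illustrative sketch only (not the authors' actual pipeline), the representational-similarity logic behind this kind of comparison can be expressed by correlating the representational dissimilarity matrices (RDMs) of two regions of interest; `occ_patterns` and `temp_patterns` below are hypothetical category-by-voxel response arrays filled with random data.

```python
# Minimal sketch: compare representational geometry between two ROIs.
# `occ_patterns` and `temp_patterns` stand in for (n_categories x n_voxels)
# arrays of condition-wise beta estimates for one subject (hypothetical names).
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
occ_patterns = rng.normal(size=(8, 200))    # e.g. 8 sound categories
temp_patterns = rng.normal(size=(8, 350))

# Representational dissimilarity matrices (1 - Pearson correlation)
occ_rdm = pdist(occ_patterns, metric="correlation")
temp_rdm = pdist(temp_patterns, metric="correlation")

# Second-order similarity between the two regions' geometries
rho, p = spearmanr(occ_rdm, temp_rdm)
print(f"occipital-temporal RDM similarity: rho = {rho:.2f}, p = {p:.3f}")
```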


Subjects
Blindness, Temporal Lobe, Acoustic Stimulation, Humans, Occipital Lobe/diagnostic imaging, Occipital Lobe/physiology, Sound, Temporal Lobe/diagnostic imaging, Temporal Lobe/physiology
3.
J Neurosci ; 42(23): 4652-4668, 2022 06 08.
Article in English | MEDLINE | ID: mdl-35501150

ABSTRACT

hMT+/V5 is a region in the middle occipitotemporal cortex that responds preferentially to visual motion in sighted people. In cases of early visual deprivation, hMT+/V5 enhances its response to moving sounds. Whether hMT+/V5 contains information about motion directions, and whether the functional enhancement observed in the blind is motion specific or also involves sound source location, remains unresolved. Moreover, the impact of this cross-modal reorganization of hMT+/V5 on the regions typically supporting auditory motion processing, like the human planum temporale (hPT), remains equivocal. We used a combined functional and diffusion-weighted MRI approach and individual in-ear recordings to study the impact of early blindness on the brain networks supporting spatial hearing in male and female humans. Whole-brain univariate analysis revealed that the anterior portion of hMT+/V5 responded to moving sounds in sighted and blind people, while the posterior portion was selective to moving sounds only in blind participants. Multivariate decoding analysis revealed that the presence of motion direction and sound position information was higher in hMT+/V5 and lower in hPT in the blind group. While both groups showed axis-of-motion organization in hMT+/V5 and hPT, this organization was reduced in the hPT of blind people. Diffusion-weighted MRI revealed that the strength of hMT+/V5-hPT connectivity did not differ between groups, whereas the microstructure of the connections was altered by blindness. Our results suggest that the axis-of-motion organization of hMT+/V5 does not depend on visual experience, but that congenital blindness alters the response properties of occipitotemporal networks supporting spatial hearing in the sighted. SIGNIFICANCE STATEMENT: Spatial hearing helps living organisms navigate their environment. This is arguably even more true for people born blind. How does blindness affect the brain network supporting auditory motion and sound source location? Our results show that the presence of motion direction and sound position information was higher in hMT+/V5 and lower in human planum temporale in blind relative to sighted people, and that this functional reorganization is accompanied by microstructural (but not macrostructural) alterations in their connections. These findings suggest that blindness alters cross-modal responses between connected areas that share the same computational goals.
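
A minimal sketch of the kind of multivariate decoding analysis described here, assuming hypothetical trial-by-voxel patterns from a single ROI; it is not the authors' implementation, only an illustration of cross-validated direction classification with a linear SVM.

```python
# Minimal sketch: cross-validated decoding of auditory motion direction
# from ROI voxel patterns. `patterns` and `directions` are synthetic
# placeholders for trial-wise betas and their labels.
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score, StratifiedKFold

rng = np.random.default_rng(1)
patterns = rng.normal(size=(80, 120))                         # trial-wise ROI patterns
directions = np.repeat(["left", "right", "up", "down"], 20)   # 4 motion directions

clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
acc = cross_val_score(clf, patterns, directions, cv=cv)
print(f"decoding accuracy: {acc.mean():.2f} (chance = 0.25)")
```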


Subjects
Brain Mapping, Motion Perception, Auditory Perception/physiology, Blindness, Female, Humans, Magnetic Resonance Imaging/methods, Male, Motion Perception/physiology
4.
J Neurosci ; 41(11): 2393-2405, 2021 03 17.
Article in English | MEDLINE | ID: mdl-33514674

ABSTRACT

In humans, the occipital middle-temporal region (hMT+/V5) specializes in the processing of visual motion, while the planum temporale (hPT) specializes in auditory motion processing. It has been hypothesized that these regions might communicate directly to achieve fast and optimal exchange of multisensory motion information. Here we investigated, for the first time in humans (male and female), the presence of direct white matter connections between visual and auditory motion-selective regions using a combined fMRI and diffusion MRI approach. We found evidence supporting the potential existence of direct white matter connections between individually and functionally defined hMT+/V5 and hPT. We show that projections between hMT+/V5 and hPT do not overlap with large white matter bundles, such as the inferior longitudinal fasciculus and the inferior frontal occipital fasciculus. Moreover, we did not find evidence suggesting the presence of projections between the fusiform face area and hPT, supporting the functional specificity of hMT+/V5-hPT connections. Finally, the potential presence of hMT+/V5-hPT connections was corroborated in a large sample of participants (n = 114) from the Human Connectome Project. Together, this study provides a first indication of potential direct occipitotemporal projections between hMT+/V5 and hPT, which may support the exchange of motion information between functionally specialized auditory and visual regions. SIGNIFICANCE STATEMENT: Perceiving and integrating moving signals across the senses is arguably one of the most important perceptual skills for the survival of living organisms. To create a unified representation of movement, the brain must integrate motion information from the separate senses. Our study provides support for the potential existence of direct connections between motion-selective regions in the occipital/visual (hMT+/V5) and temporal/auditory (hPT) cortices in humans. This connection could represent the structural scaffolding for the rapid and optimal exchange and integration of multisensory motion information. These findings suggest the existence of computationally specific pathways that allow information flow between areas that share a similar computational goal.
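
The tractography itself requires dedicated diffusion-MRI tooling, but the final counting step, identifying streamlines that touch both functionally defined ROIs, can be sketched as below; the masks, coordinates, and streamlines are synthetic placeholders, not the study's data.

```python
# Minimal sketch: count candidate streamlines linking two functionally
# defined ROIs, given streamlines as arrays of voxel coordinates
# (a real pipeline would obtain them from dMRI tractography, e.g. DIPY).
import numpy as np

def connects(streamline, roi_a, roi_b):
    """True if a streamline has at least one point inside each ROI mask."""
    idx = np.round(streamline).astype(int)
    in_a = roi_a[idx[:, 0], idx[:, 1], idx[:, 2]].any()
    in_b = roi_b[idx[:, 0], idx[:, 1], idx[:, 2]].any()
    return in_a and in_b

shape = (50, 60, 50)
hmt_mask = np.zeros(shape, bool); hmt_mask[40:45, 10:15, 20:25] = True
hpt_mask = np.zeros(shape, bool); hpt_mask[30:35, 30:35, 25:30] = True

rng = np.random.default_rng(2)
streamlines = [rng.uniform(0, 49, size=(100, 3)) for _ in range(500)]
n_conn = sum(connects(s, hmt_mask, hpt_mask) for s in streamlines)
print(f"{n_conn} of {len(streamlines)} streamlines touch both ROIs")
```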


Subjects
Auditory Perception/physiology, Motion Perception/physiology, Nerve Net/physiology, Visual Perception/physiology, Adult, Animals, Brain Mapping, Connectome, Diffusion Tensor Imaging, Facial Recognition/physiology, Female, Humans, Image Processing, Computer-Assisted, Magnetic Resonance Imaging, Male, Nerve Net/diagnostic imaging, Occipital Lobe/diagnostic imaging, Occipital Lobe/physiology, Temporal Lobe/diagnostic imaging, Temporal Lobe/physiology, Visual Cortex/diagnostic imaging, Visual Cortex/physiology, White Matter/diagnostic imaging, White Matter/physiology, Young Adult
5.
Curr Biol ; 30(12): 2289-2299.e8, 2020 06 22.
Article in English | MEDLINE | ID: mdl-32442465

ABSTRACT

The human occipito-temporal region hMT+/V5 is well known for processing visual motion direction. Here, we demonstrate that hMT+/V5 also represents the direction of auditory motion in a format partially aligned with the one used to code visual motion. We show that auditory and visual motion directions can be reliably decoded in individually localized hMT+/V5 and that motion directions in one modality can be predicted from the activity patterns elicited by the other modality. Despite this shared motion-direction information across the senses, however, vision and audition produce overall opposite voxel-wise responses in hMT+/V5. Our results reveal a multifaceted representation of multisensory motion signals in hMT+/V5 and have broader implications for how we conceive the division of sensory labor between brain regions dedicated to a specific perceptual function.
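
A minimal sketch of cross-modal decoding as described above, training on patterns from one modality and testing on the other; the variable names and random data are assumptions for illustration, not the authors' code.

```python
# Minimal sketch: cross-modal decoding of motion direction, training on one
# modality and testing on the other. Real inputs would be trial-wise betas
# from individually localized hMT+/V5; here they are synthetic.
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(3)
n_per_dir, n_vox = 20, 150
dirs = np.repeat(["left", "right", "up", "down"], n_per_dir)
visual_patterns = rng.normal(size=(len(dirs), n_vox))
auditory_patterns = rng.normal(size=(len(dirs), n_vox))

clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
clf.fit(visual_patterns, dirs)                # train on vision
acc_va = clf.score(auditory_patterns, dirs)   # test on audition
clf.fit(auditory_patterns, dirs)              # and the reverse
acc_av = clf.score(visual_patterns, dirs)
print(f"vision->audition: {acc_va:.2f}, audition->vision: {acc_av:.2f} (chance = 0.25)")
```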


Subjects
Auditory Perception/physiology, Motion Perception/physiology, Temporal Lobe/physiology, Visual Perception/physiology, Adult, Female, Humans, Male, Young Adult
6.
Elife ; 9, 2020 02 28.
Article in English | MEDLINE | ID: mdl-32108572

ABSTRACT

Is vision necessary for the development of the categorical organization of the Ventral Occipito-Temporal Cortex (VOTC)? We used fMRI to characterize VOTC responses to eight categories presented acoustically in sighted and early blind individuals, and visually in a separate sighted group. We observed that VOTC reliably encodes sound categories in sighted and blind people using a representational structure and connectivity partially similar to those found in vision. Sound categories were, however, more reliably encoded in the blind than in the sighted group, using a representational format closer to the one found in vision. Crucially, VOTC in blind people represents the categorical membership of sounds rather than their acoustic features. Our results suggest that sounds trigger categorical responses in the VOTC of congenitally blind and sighted people that partially match the topography and functional profile of the visual response, despite qualitative nuances in the categorical organization of VOTC between modalities and groups.


The world is full of rich and dynamic visual information. To avoid information overload, the human brain groups inputs into categories such as faces, houses, or tools. A part of the brain called the ventral occipito-temporal cortex (VOTC) helps categorize visual information. Specific parts of the VOTC prefer different types of visual input; for example, one part may tend to respond more to faces, whilst another may prefer houses. However, it is not clear how the VOTC characterizes information. One idea is that similarities between certain types of visual information may drive how information is organized in the VOTC. For example, looking at faces requires using central vision, while looking at houses requires using peripheral vision. Furthermore, all faces have a roundish shape while houses tend to have a more rectangular shape. Another possibility, however, is that the categorization of different inputs cannot be explained by vision alone, and is also driven by higher-level aspects of each category. For instance, how humans use or interact with something may also influence how an input is categorized. If categories are established depending (at least partially) on these higher-level aspects, rather than purely through visual likeness, it is likely that the VOTC would respond similarly to both sounds and images representing these categories. Now, Mattioni et al. have tested how individuals with and without sight respond to eight different categories of information to find out whether or not categorization is driven purely by visual likeness. Each category was presented to participants using sounds while measuring their brain activity. In addition, a group of participants who could see were also presented with the categories visually. Mattioni et al. then compared what happened in the VOTC of the three groups (sighted people presented with sounds, blind people presented with sounds, and sighted people presented with images) in response to each category. The experiment revealed that the VOTC organizes both auditory and visual information in a similar way. However, there were more similarities between the way blind people categorized auditory information and how sighted people categorized visual information than between how sighted people categorized each type of input. Mattioni et al. also found that the region of the VOTC that responds to inanimate objects massively overlapped across the three groups, whereas the part of the VOTC that responds to living things was more variable. These findings suggest that the way that the VOTC organizes information is, at least partly, independent from vision. The experiments also provide some information about how the brain reorganizes in people who are born blind. Further studies may reveal how differences in the VOTC of people with and without sight affect regions typically associated with auditory categorization, and potentially explain how the brain reorganizes in people who become blind later in life.
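
One way to quantify the reported overlap of category-preferring voxels across groups is a Dice coefficient between thresholded maps; the sketch below uses synthetic binary masks as stand-ins for real group-level VOTC maps.

```python
# Minimal sketch: overlap of a category-preferring region across two groups,
# quantified with the Dice coefficient. The masks are random placeholders;
# real masks would come from thresholded group-level contrasts in VOTC.
import numpy as np

def dice(mask_a, mask_b):
    """Dice overlap between two binary masks."""
    inter = np.logical_and(mask_a, mask_b).sum()
    return 2.0 * inter / (mask_a.sum() + mask_b.sum())

rng = np.random.default_rng(4)
votc_shape = (30, 40, 30)
sighted_visual = rng.random(votc_shape) > 0.8   # voxels preferring objects (group A)
blind_auditory = rng.random(votc_shape) > 0.8   # voxels preferring objects (group B)

print(f"Dice overlap: {dice(sighted_visual, blind_auditory):.2f}")
```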


Subjects
Auditory Perception, Blindness/physiopathology, Occipital Lobe/physiopathology, Temporal Lobe/physiopathology, Acoustic Stimulation, Case-Control Studies, Humans
7.
J Neurosci ; 39(12): 2208-2220, 2019 03 20.
Article in English | MEDLINE | ID: mdl-30651333

ABSTRACT

The ability to compute the location and direction of sounds is a crucial perceptual skill for efficiently interacting with dynamic environments. How the human brain implements spatial hearing is, however, poorly understood. In our study, we used fMRI to characterize the brain activity of male and female humans listening to sounds moving left, right, up, and down, as well as static sounds. Whole-brain univariate results contrasting moving and static sounds varying in their location revealed a robust functional preference for auditory motion in bilateral human planum temporale (hPT). Using independently localized hPT, we show that this region contains information about auditory motion directions and, to a lesser extent, sound source locations. Moreover, hPT showed an axis-of-motion organization reminiscent of the functional organization of the middle-temporal cortex (hMT+/V5) for vision. Importantly, whereas motion direction and location rely on partially shared pattern geometries in hPT, as demonstrated by successful cross-condition decoding, the responses elicited by static and moving sounds were nonetheless significantly distinct. Altogether, our results demonstrate that hPT codes for auditory motion and location, but that the underlying neural computation linked to motion processing is more reliable and partially distinct from the one supporting sound source location. SIGNIFICANCE STATEMENT: Compared with what we know about visual motion, little is known about how the brain implements spatial hearing. Our study reveals that motion directions and sound source locations can be reliably decoded in the human planum temporale (hPT) and that they rely on partially shared pattern geometries. Our study therefore sheds important new light on how computing the location or direction of sounds is implemented in the human auditory cortex, by showing that those two computations rely on partially shared neural codes. Furthermore, our results show that the neural representation of moving sounds in hPT follows a "preferred axis of motion" organization, reminiscent of the coding mechanisms typically observed in the occipital middle-temporal cortex (hMT+/V5) region for computing visual motion.
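
The univariate preference for moving over static sounds can be illustrated, at the ROI level, as a paired comparison of subject-wise betas; the values below are simulated, and this simplified contrast is only a sketch of the general approach, not the study's whole-brain analysis.

```python
# Minimal sketch: ROI-level test of a univariate preference for moving over
# static sounds, comparing subject-wise mean betas (simulated values; real
# betas would come from a first-level GLM in independently localized hPT).
import numpy as np
from scipy.stats import ttest_rel

rng = np.random.default_rng(5)
n_subjects = 16
beta_moving = rng.normal(loc=1.2, scale=0.5, size=n_subjects)
beta_static = rng.normal(loc=0.7, scale=0.5, size=n_subjects)

t, p = ttest_rel(beta_moving, beta_static)
print(f"moving > static: t({n_subjects - 1}) = {t:.2f}, p = {p:.4f}")
```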


Subjects
Auditory Cortex/physiology, Sound Localization/physiology, Acoustic Stimulation, Adult, Brain Mapping, Female, Humans, Magnetic Resonance Imaging, Male, Models, Neurological, Young Adult
8.
Elife ; 7, 2018 01 17.
Article in English | MEDLINE | ID: mdl-29338838

ABSTRACT

The occipital cortex of early blind individuals (EB) activates during speech processing, challenging the notion of a hard-wired neurobiology of language. But at what stage of speech processing do occipital regions participate in EB? Here we demonstrate that parieto-occipital regions in EB enhance their synchronization to acoustic fluctuations in human speech in the theta range (corresponding to syllabic rate), irrespective of speech intelligibility. Crucially, enhanced synchronization to the intelligibility of speech was selectively observed in the primary visual cortex in EB, suggesting that this region is at the interface between speech perception and comprehension. Moreover, EB showed overall enhanced functional connectivity between temporal and occipital cortices that was sensitive to speech intelligibility, as well as altered directionality of this connectivity compared with the sighted group. These findings suggest that the occipital cortex of the blind adopts an architecture that allows the tracking of speech material, and therefore does not fully abstract from the reorganized sensory inputs it receives.
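
A minimal sketch of theta-band cerebro-acoustic coherence, assuming a cortical time course and a speech amplitude envelope sampled at the same rate; the signals below are synthetic, and the 4-8 Hz band follows the abstract's definition of the theta range as the syllabic rate.

```python
# Minimal sketch: coherence between a cortical signal and the speech envelope
# in the theta band (4-8 Hz). Both signals are synthetic; real data would be
# source-reconstructed MEG/EEG time courses and an extracted speech envelope.
import numpy as np
from scipy.signal import coherence

fs = 200.0                                   # sampling rate in Hz
t = np.arange(0, 60, 1 / fs)                 # 60 s of signal
rng = np.random.default_rng(6)
envelope = np.sin(2 * np.pi * 5 * t) + rng.normal(scale=0.5, size=t.size)
brain = 0.6 * envelope + rng.normal(scale=1.0, size=t.size)

f, coh = coherence(envelope, brain, fs=fs, nperseg=int(4 * fs))
theta = (f >= 4) & (f <= 8)
print(f"mean theta-band coherence: {coh[theta].mean():.2f}")
```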


Subjects
Blindness, Cortical Synchronization, Neurons/physiology, Occipital Lobe/physiology, Speech Perception, Adult, Aged, Female, Humans, Male, Middle Aged, Young Adult
9.
Cortex ; 83: 271-9, 2016 10.
Article in English | MEDLINE | ID: mdl-27622641

ABSTRACT

Several studies suggest that serial order in working memory (WM) is grounded in space. For a list of ordered items held in WM, items at the beginning of the list are associated with the left side of space and items at the end of the list with the right side. This suggests that maintaining items in verbal WM is performed in close analogy to writing these items down on a physical whiteboard for later consultation (the Mental Whiteboard Hypothesis). What drives this spatial mapping of ordered series in WM remains poorly understood. In the present study we tested whether visual experience is instrumental in establishing the link between serial order in WM and spatial processing. We tested early blind (EB), late blind (LB), and sighted individuals in an auditory WM task. Replicating previous studies, left-key responses were faster for early items in the list, whereas later items facilitated right-key responses in the sighted group. The same effect was observed in LB individuals. In contrast, EB participants did not show any association between space and serial position in WM. These results suggest that early visual experience plays a critical role in linking ordered items in WM to spatial representations. The analogical spatial structure of WM may depend in part on the actual experience of using spatially organized devices (e.g., notes, whiteboards) to offload WM. These practices are largely unavailable to EB individuals, who instead rely on mnemonic devices that are less spatially organized (e.g., recordings, vocal notes). The way we habitually organize information in the external world may bias the way we organize information in our WM.
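
The position-space association described here can be summarized as a congruency effect on reaction times (early-left and late-right responses versus the reverse pairings); the sketch below uses simulated trials and hypothetical condition labels, not the study's data.

```python
# Minimal sketch: response-side by list-position congruency effect on RTs
# (simulated trials, RTs in ms).
import numpy as np

rng = np.random.default_rng(7)
n = 200
position = rng.choice(["early", "late"], size=n)     # item position in the list
side = rng.choice(["left", "right"], size=n)         # response key
rt = rng.normal(650, 60, size=n)
rt[(position == "early") & (side == "left")] -= 25   # simulated congruency benefit
rt[(position == "late") & (side == "right")] -= 25

def mean_rt(p, s):
    return rt[(position == p) & (side == s)].mean()

congruent = (mean_rt("early", "left") + mean_rt("late", "right")) / 2
incongruent = (mean_rt("early", "right") + mean_rt("late", "left")) / 2
print(f"congruency effect: {incongruent - congruent:.1f} ms")
```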


Subjects
Auditory Perception/physiology, Blindness/physiopathology, Memory, Short-Term/physiology, Speech/physiology, Adult, Humans, Reaction Time/physiology
10.
Front Hum Neurosci ; 9: 656, 2015.
Article in English | MEDLINE | ID: mdl-26733842

ABSTRACT

Recent studies have revealed that human position sense relies on a massively distorted representation of hand size and shape. By comparing the judged locations of landmarks on an occluded hand, Longo and Haggard (2010) constructed implicit perceptual maps of represented hand structure, showing large underestimation of finger length and overestimation of hand width. Here, we investigated the contribution of two potential sources of distortion to such effects: perceptual distortions reflecting spatial warping of the representation of bodily tissue itself, perhaps reflecting distortions of somatotopic cortical maps, and conceptual distortions reflecting mistaken beliefs about the locations of different landmarks within the body. In Experiment 1 we compared distorted hand maps with a task in which participants explicitly judged the location of their knuckles in a hand silhouette. The results revealed that conceptual distortions are responsible for at least part of the underestimation of finger length, but cannot explain the overestimation of hand width. Experiment 2 compared distortions of the participant's own hand, based on position sense, with those of a prosthetic hand, based on visual memory. Underestimation of finger length was found for both hands, providing further evidence that it reflects a conceptual distortion. In contrast, overestimation of hand width was specific to the representation of the participant's own hand, confirming that it reflects a perceptual distortion. Together, these results suggest that distorted body representations do not reflect a single underlying cause. Rather, both perceptual and conceptual distortions contribute to the overall configuration of the hand representation.
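
The basic distortion indices, judged relative to actual finger length and hand width, can be computed directly from landmark coordinates; the numbers below are invented for illustration and do not reproduce the study's measurements.

```python
# Minimal sketch: distortion indices from an implicit hand-map task, comparing
# judged with actual landmark positions (hypothetical 2-D coordinates in mm).
import numpy as np

# Actual and judged (x, y) positions of a knuckle and fingertip, plus the two
# knuckles bounding the hand's width.
actual = {"knuckle": np.array([0.0, 0.0]), "tip": np.array([0.0, 80.0]),
          "knuckle_left": np.array([-35.0, 0.0]), "knuckle_right": np.array([35.0, 0.0])}
judged = {"knuckle": np.array([2.0, 1.0]), "tip": np.array([1.0, 55.0]),
          "knuckle_left": np.array([-45.0, 2.0]), "knuckle_right": np.array([42.0, 1.0])}

def length(a, b):
    return np.linalg.norm(a - b)

finger_ratio = length(judged["tip"], judged["knuckle"]) / length(actual["tip"], actual["knuckle"])
width_ratio = (length(judged["knuckle_left"], judged["knuckle_right"])
               / length(actual["knuckle_left"], actual["knuckle_right"]))
print(f"judged/actual finger length: {finger_ratio:.2f} (underestimation < 1)")
print(f"judged/actual hand width:   {width_ratio:.2f} (overestimation > 1)")
```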

11.
Acta Psychol (Amst) ; 153: 60-5, 2014 Nov.
Article in English | MEDLINE | ID: mdl-25305592

ABSTRACT

The use of position sense to perceive the external spatial location of the body requires that immediate proprioceptive afferent signals be combined with stored representations of body size and shape. Longo and Haggard (2010) developed a method to isolate and measure this representation, in which participants judge the location of several landmarks on their occluded hand. The relative locations of the judgements are used to construct a perceptual map of hand shape. Studies using this paradigm have revealed large, and highly stereotyped, distortions of the hand, which is represented as wider than it actually is and with shortened fingers. Previous studies using this paradigm have cued participants to respond by giving verbal labels of the knuckles and fingertips. A recent study has shown differential effects of verbal and tactile cueing of localisation judgements about bodily landmarks (Cardinali et al., 2011). The present study therefore investigated implicit hand maps measured through localisation judgements made in response to verbal labels and to tactile stimuli applied to the same landmarks. The characteristic set of distortions of hand size and shape was clearly apparent in both conditions, indicating that the distortions reported previously are not an artefact of the use of verbal cues. However, there were also differences in the magnitude of distortions between conditions, suggesting that the use of verbal cues may alter the representation of the body underlying position sense.
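
Comparing the size of the distortion between verbal and tactile cueing conditions reduces, in the simplest case, to a paired test on per-participant ratios; the values below are simulated, and a paired t-test is only one reasonable choice, not necessarily the analysis used in the paper.

```python
# Minimal sketch: compare the finger-length underestimation (judged/actual
# ratio) between verbal and tactile cueing conditions with a paired t-test
# on simulated per-participant values.
import numpy as np
from scipy.stats import ttest_rel

rng = np.random.default_rng(8)
n_participants = 24
ratio_verbal = rng.normal(loc=0.70, scale=0.08, size=n_participants)
ratio_tactile = rng.normal(loc=0.78, scale=0.08, size=n_participants)

t, p = ttest_rel(ratio_verbal, ratio_tactile)
print(f"verbal vs tactile cueing: t({n_participants - 1}) = {t:.2f}, p = {p:.3f}")
```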


Subjects
Body Image, Hand/physiology, Proprioception/physiology, Space Perception/physiology, Touch Perception/physiology, Adolescent, Adult, Aged, Cues, Female, Humans, Male, Middle Aged, Young Adult