ABSTRACT
It has been proposed that the auditory cortex in deaf humans might undergo task-specific reorganization. However, evidence remains scarce, as previous experiments used only two very specific tasks (temporal processing and face perception) in the visual modality. Here, congenitally deaf/hard-of-hearing and hearing women and men were enrolled in an fMRI experiment as we sought to fill this evidence gap in two ways. First, we compared activation evoked by a temporal processing task performed in two different modalities, visual and tactile. Second, we contrasted this task with a perceptually similar task focused on the spatial dimension. Additional control conditions consisted of passive stimulus observation. In line with the task-specificity hypothesis, the auditory cortex in the deaf was activated by temporal processing in both the visual and tactile modalities. This effect was selective for temporal processing relative to spatial discrimination. However, spatial processing also led to significant auditory cortex recruitment, which, unlike temporal processing, occurred even during passive stimulus observation. We conclude that auditory cortex recruitment in the deaf and hard of hearing might involve an interplay between task-selective and pluripotential mechanisms of cross-modal reorganization. Our results open several avenues for investigating the full complexity of cross-modal plasticity.

SIGNIFICANCE STATEMENT
Previous studies suggested that the auditory cortex in the deaf may change input modality (sound to vision) while keeping its function (e.g., rhythm processing). We investigated this hypothesis by asking deaf or hard-of-hearing and hearing adults to discriminate between temporally and spatially complex sequences in the visual and tactile modalities. The results show that such function-specific brain reorganization, as previously demonstrated in the visual modality, also occurs for tactile processing.
On the other hand, they also show that for some stimuli (spatial), the auditory cortex activates automatically, suggesting a takeover by a different kind of cognitive function. The observed differences in the processing of sequences might thus result from an interplay of task-specific and pluripotent plasticity.
Subject(s)
Auditory Cortex/physiology , Hearing Disorders , Touch Perception/physiology , Visual Perception/physiology , Female , Humans , Magnetic Resonance Imaging , Male , Neuronal Plasticity/physiology , Photic Stimulation/methods , Physical Stimulation/methods , Spatial Processing/physiology , Time Perception/physiology

ABSTRACT
There is strong evidence that neuronal bases for language processing are remarkably similar for sign and spoken languages. However, as meanings and linguistic structures of sign languages are coded in movement and space and decoded through vision, differences are also present, predominantly in occipitotemporal and parietal areas, such as superior parietal lobule (SPL). Whether the involvement of SPL reflects domain-general visuospatial attention or processes specific to sign language comprehension remains an open question. Here we conducted two experiments to investigate the role of SPL and the laterality of its engagement in sign language lexical processing. First, using unique longitudinal and between-group designs we mapped brain responses to sign language in hearing late learners and deaf signers. Second, using transcranial magnetic stimulation (TMS) in both groups we tested the behavioural relevance of SPL's engagement and its lateralisation during sign language comprehension. SPL activation in hearing participants was observed in the right hemisphere before and bilaterally after the sign language course. Additionally, after the course hearing learners exhibited greater activation in the occipital cortex and left SPL than deaf signers. TMS applied to the right SPL decreased accuracy in both hearing learners and deaf signers. Stimulation of the left SPL decreased accuracy only in hearing learners. Our results suggest that right SPL might be involved in visuospatial attention while left SPL might support phonological decoding of signs in non-proficient signers.