Results 1 - 12 of 12
1.
Neuroimage ; 209: 116411, 2020 04 01.
Article in English | MEDLINE | ID: mdl-31857205

ABSTRACT

Deaf late signers provide a unique perspective on the impact of impoverished early language exposure on the neurobiology of language: insights that cannot be gained from research with hearing people alone. Here we contrast the effect of age of sign language acquisition in hearing and congenitally deaf adults to examine the potential impact of impoverished early language exposure on the neural systems supporting a language learnt later in life. We collected fMRI data from deaf and hearing proficient users (N = 52) of British Sign Language (BSL), who learnt BSL either early (native) or late (after the age of 15 years) whilst they watched BSL sentences or strings of meaningless nonsense signs. There was a main effect of age of sign language acquisition (late > early) across deaf and hearing signers in the occipital segment of the left intraparietal sulcus. This finding suggests that late learners of sign language may rely on visual processing more than early learners, when processing both linguistic and nonsense sign input - regardless of hearing status. Region-of-interest analyses in the posterior superior temporal cortices (STC) showed an effect of age of sign language acquisition that was specific to deaf signers. In the left posterior STC, activation in response to signed sentences was greater in deaf early signers than deaf late signers. Importantly, responses in the left posterior STC in hearing early and late signers did not differ, and were similar to those observed in deaf early signers. These data lend further support to the argument that robust early language experience, whether signed or spoken, is necessary for left posterior STC to show a 'native-like' response to a later learnt language.


Subjects
Brain Mapping , Deafness/physiopathology , Language Development , Language , Neuronal Plasticity/physiology , Sign Language , Temporal Lobe/physiology , Adult , Age Factors , Deafness/congenital , Humans , Magnetic Resonance Imaging , Middle Aged , Pattern Recognition, Visual/physiology , Temporal Lobe/diagnostic imaging , Temporal Lobe/physiopathology , Young Adult
2.
J Neurosci ; 37(39): 9564-9573, 2017 09 27.
Article in English | MEDLINE | ID: mdl-28821674

ABSTRACT

To investigate how hearing status, sign language experience, and task demands influence functional responses in the human superior temporal cortices (STC), we collected fMRI data from deaf and hearing participants (male and female), who either acquired sign language early or late in life. Our stimuli in all tasks were pictures of objects. We varied the linguistic and visuospatial processing demands in three different tasks that involved decisions about (1) the sublexical (phonological) structure of the British Sign Language (BSL) signs for the objects, (2) the semantic category of the objects, and (3) the physical features of the objects. Neuroimaging data revealed that in participants who were deaf from birth, STC showed increased activation during visual processing tasks. Importantly, this differed across hemispheres. Right STC was consistently activated regardless of the task, whereas left STC was sensitive to task demands. Significant activation was detected in the left STC only for the BSL phonological task. This task, we argue, placed greater demands on visuospatial processing than the other two tasks. In hearing signers, enhanced activation was absent in both left and right STC during all three tasks. Lateralization analyses demonstrated that the effect of deafness was more task-dependent in the left than the right STC, whereas it was more task-independent in the right than the left STC. These findings indicate how the absence of auditory input from birth leads to dissociable and altered functions of left and right STC in deaf participants.
SIGNIFICANCE STATEMENT Those born deaf can offer unique insights into neuroplasticity, in particular in regions of superior temporal cortex (STC) that primarily respond to auditory input in hearing people. Here we demonstrate that in those deaf from birth the left and the right STC have altered and dissociable functions. The right STC was activated regardless of demands on visual processing.
In contrast, the left STC was sensitive to the demands of visuospatial processing. Furthermore, hearing signers, with the same sign language experience as the deaf participants, did not activate the STCs. Our data advance current understanding of neural plasticity by determining the differential effects that hearing status and task demands can have on left and right STC function.
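Lateralization analyses of the kind described in this abstract are commonly summarised with a laterality index comparing left- and right-hemisphere activation. As an illustrative sketch only (the abstract does not specify the study's actual formula or pipeline), a widely used index is LI = (L − R) / (L + R):

```python
# Illustrative laterality index (LI), a common convention in fMRI work:
#   LI = (L - R) / (L + R)
# where L and R are summed suprathreshold activation in matched left and
# right regions of interest. Positive LI = left-lateralised, negative =
# right-lateralised. This is a generic sketch, not this study's analysis.

def laterality_index(left_activation: float, right_activation: float) -> float:
    """Return LI in [-1, 1]; positive values indicate left lateralisation."""
    total = left_activation + right_activation
    if total == 0:
        raise ValueError("no suprathreshold activation in either hemisphere")
    return (left_activation - right_activation) / total

# Example: a much stronger left-ROI response yields a clearly positive LI.
print(laterality_index(80.0, 20.0))  # 0.6
```

Thresholds for calling a response "lateralised" (e.g. |LI| > 0.2) vary between studies and are a separate analysis choice.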


Subjects
Auditory Perception , Deafness/physiopathology , Functional Laterality , Memory, Short-Term , Sign Language , Temporal Lobe/physiology , Adult , Brain Mapping , Case-Control Studies , Female , Humans , Magnetic Resonance Imaging , Male , Middle Aged , Semantics , Temporal Lobe/physiopathology , Visual Perception
3.
Brain ; 132(Pt 7): 1928-40, 2009 Jul.
Article in English | MEDLINE | ID: mdl-19467990

ABSTRACT

Hearing developmental dyslexics and profoundly deaf individuals both have difficulties processing the internal structure of words (phonological processing) and learning to read. In hearing non-impaired readers, the development of phonological representations depends on audition. In hearing dyslexics, many argue, auditory processes may be impaired. In congenitally profoundly deaf individuals, auditory speech processing is essentially absent. Two separate literatures have previously reported enhanced activation in the left inferior frontal gyrus in both deaf and dyslexic adults when contrasted with hearing non-dyslexics during reading or phonological tasks. Here, we used a rhyme judgement task to compare adults from these two special populations to a hearing non-dyslexic control group. All groups were matched on non-verbal intelligence quotient, reading age and rhyme performance. Picture stimuli were used since this requires participants to generate their own phonological representations, rather than have them partially provided via text. By testing well-matched groups of participants on the same task, we aimed to establish whether previous literatures reporting differences between individuals with and without phonological processing difficulties have identified the same regions of differential activation in these two distinct populations. The data indicate greater activation in the deaf and dyslexic groups than in the hearing non-dyslexic group across a large portion of the left inferior frontal gyrus. This includes the pars triangularis, extending superiorly into the middle frontal gyrus and posteriorly to include the pars opercularis, and the junction with the ventral precentral gyrus. Within the left inferior frontal gyrus, there was variability between the two groups with phonological processing difficulties. 
The superior posterior tip of the left pars opercularis, extending into the precentral gyrus, was activated to a greater extent by deaf than dyslexic participants, whereas the superior posterior portion of the pars triangularis extending into the ventral pars opercularis, was activated to a greater extent by dyslexic than deaf participants. Whether these regions play differing roles in compensating for poor phonological processing is not clear. However, we argue that our main finding of greater inferior frontal gyrus activation in both groups with phonological processing difficulties in contrast to controls suggests greater reliance on the articulatory component of speech during phonological processing when auditory processes are absent (deaf group) or impaired (dyslexic group). Thus, the brain appears to develop a similar solution to a processing problem that has different antecedents in these two populations.


Subjects
Deafness/physiopathology , Dyslexia/physiopathology , Frontal Lobe/physiopathology , Adolescent , Adult , Brain Mapping/methods , Case-Control Studies , Deafness/congenital , Deafness/psychology , Dyslexia/psychology , Female , Humans , Magnetic Resonance Imaging/methods , Male , Middle Aged , Phonetics , Photic Stimulation/methods , Reaction Time/physiology , Verbal Behavior/physiology , Young Adult
4.
Neuropsychologia ; 46(5): 1233-41, 2008 Apr.
Article in English | MEDLINE | ID: mdl-18249420

ABSTRACT

This fMRI study explored the functional neural organisation of seen speech in congenitally deaf native signers and hearing non-signers. Both groups showed extensive activation in perisylvian regions for speechreading words compared to viewing the model at rest. In contrast to earlier findings, activation in left middle and posterior portions of superior temporal cortex, including regions within the lateral sulcus and the superior and middle temporal gyri, was greater for deaf than hearing participants. This activation pattern survived covarying for speechreading skill, which was better in deaf than hearing participants. Furthermore, correlational analysis showed that regions of activation related to speechreading skill varied with the hearing status of the observers. Deaf participants showed a positive correlation between speechreading skill and activation in the middle/posterior superior temporal cortex. In hearing participants, however, more posterior and inferior temporal activation (including fusiform and lingual gyri) was positively correlated with speechreading skill. Together, these findings indicate that activation in the left superior temporal regions for silent speechreading can be modulated by both hearing status and speechreading skill.


Subjects
Deafness/physiopathology , Hearing/physiology , Lipreading , Nerve Net/physiopathology , Adolescent , Adult , Analysis of Variance , Auditory Cortex/physiopathology , Cerebral Cortex/physiology , Data Interpretation, Statistical , Echo-Planar Imaging , Female , Humans , Image Processing, Computer-Assisted , Magnetic Resonance Imaging , Male , Middle Aged , Photic Stimulation , Sign Language
5.
Brain Lang ; 150: 45-53, 2015 Nov.
Article in English | MEDLINE | ID: mdl-26335996

ABSTRACT

Here we adopt a novel strategy to investigate phonological assembly. Participants performed a visual lexical decision task in English in which the letters in words and letterstrings were delivered either sequentially (promoting phonological assembly) or simultaneously (not promoting phonological assembly). A region of interest analysis confirmed that regions previously associated with phonological assembly, in studies contrasting different word types (e.g. words versus pseudowords), were also identified using our novel task that controls for a number of confounding variables. Specifically, the left pars opercularis, the superior part of the ventral precentral gyrus and the supramarginal gyrus were all recruited more during sequential delivery than simultaneous delivery, even when various psycholinguistic characteristics of the stimuli were controlled. This suggests that sequential delivery of orthographic stimuli is a useful tool to explore how readers, with various levels of proficiency, use sublexical phonological processing during visual word recognition.


Subjects
Brain Mapping , Brain/physiology , Phonetics , Reading , Adolescent , Adult , Female , Frontal Lobe/physiology , Humans , Magnetic Resonance Imaging , Male , Parietal Lobe/physiology , Photic Stimulation , Psycholinguistics , Visual Perception/physiology , Young Adult
7.
Brain Lang ; 126(1): 1-7, 2013 Jul.
Article in English | MEDLINE | ID: mdl-23644583

ABSTRACT

It is possible to comprehend speech and discriminate languages by viewing a speaker's articulatory movements. Transcranial magnetic stimulation studies have shown that viewing speech enhances excitability in the articulatory motor cortex. Here, we investigated the specificity of this enhanced motor excitability in native and non-native speakers of English. Both groups were able to discriminate between speech movements related to a known (i.e., English) and unknown (i.e., Hebrew) language. The motor excitability was higher during observation of a known language than an unknown language or non-speech mouth movements, suggesting that motor resonance is enhanced specifically during observation of mouth movements that convey linguistic information. Surprisingly, however, the excitability was equally high during observation of a static face. Moreover, the motor excitability did not differ between native and non-native speakers. These findings suggest that the articulatory motor cortex processes several kinds of visual cues during speech communication.


Subjects
Evoked Potentials, Motor/physiology , Motor Cortex/physiology , Speech Perception/physiology , Visual Perception/physiology , Adult , Female , Humans , Language , Lipreading , Male , Transcranial Magnetic Stimulation , Young Adult
8.
Brain Lang ; 112(2): 129-34, 2010 Feb.
Article in English | MEDLINE | ID: mdl-20042233

ABSTRACT

Studies of spoken and signed language processing reliably show involvement of the posterior superior temporal cortex. This region is also reliably activated by observation of meaningless oral and manual actions. In this study we directly compared the extent to which activation in posterior superior temporal cortex is modulated by linguistic knowledge irrespective of differences in language form. We used a novel cross-linguistic approach in two groups of volunteers who differed in their language experience. Using fMRI, we compared deaf native signers of British Sign Language (BSL), who were also proficient speechreaders of English (i.e., two languages) with hearing people who could speechread English, but knew no BSL (i.e., one language). Both groups were presented with BSL signs and silently spoken English words, and were required to respond to a signed or spoken target. The interaction of group and condition revealed activation in the superior temporal cortex, bilaterally, focused in the posterior superior temporal gyri (pSTG, BA 42/22). In hearing people, these regions were activated more by speech than by sign, but in deaf respondents they showed similar levels of activation for both language forms - suggesting that posterior superior temporal regions are highly sensitive to language knowledge irrespective of the mode of delivery of the stimulus material.


Subjects
Comprehension/physiology , Deafness/physiopathology , Linguistics , Lipreading , Sign Language , Temporal Lobe/physiology , Brain Mapping , Humans
9.
J Deaf Stud Deaf Educ ; 13(1): 3-20, 2008.
Article in English | MEDLINE | ID: mdl-17602162

ABSTRACT

How are signed languages processed by the brain? This review briefly outlines some basic principles of brain structure and function and the methodological principles and techniques that have been used to investigate this question. We then summarize a number of different studies exploring brain activity associated with sign language processing especially as compared to speech processing. We focus on lateralization: is signed language lateralized to the left hemisphere (LH) of native signers, just as spoken language is lateralized to the LH of native speakers, or could sign processing involve the right hemisphere to a greater extent than speech processing? Experiments that have addressed this question are described, and some problems in obtaining a clear answer are outlined.


Subjects
Brain/physiology , Sign Language , Brain/anatomy & histology , Brain Injuries/physiopathology , Electroencephalography , Evoked Potentials/physiology , Functional Laterality/physiology , Humans , Life Change Events , Magnetic Resonance Imaging , Neuronal Plasticity/physiology
10.
Neuroimage ; 40(3): 1369-79, 2008 Apr 15.
Article in English | MEDLINE | ID: mdl-18282770

ABSTRACT

Just as words can rhyme, the signs of a signed language can share structural properties, such as location. Linguistic description at this level is termed phonology. We report that a left-lateralised fronto-parietal network is engaged during phonological similarity judgements made in both English (rhyme) and British Sign Language (BSL; location). Since these languages operate in different modalities, these data suggest that the neural network supporting phonological processing is, to some extent, supramodal. Activation within this network was however modulated by language (BSL/English), hearing status (deaf/hearing), and age of BSL acquisition (native/non-native). The influence of language and hearing status suggests an important role for the posterior portion of the left inferior frontal gyrus in speech-based phonological processing in deaf people. This, we suggest, is due to increased reliance on the articulatory component of speech when the auditory component is absent. With regard to age of first language acquisition, non-native signers activated the left inferior frontal gyrus more than native signers during the BSL task, and also during the task performed in English, which both groups acquired late. This is the first neuroimaging demonstration that age of first language acquisition has implications not only for the neural systems supporting the first language, but also for networks supporting languages learned subsequently.


Subjects
Deafness/psychology , Language , Learning/physiology , Sign Language , Adult , Aging/physiology , Aging/psychology , Data Interpretation, Statistical , Female , Hearing/physiology , Humans , Intelligence Tests , Magnetic Resonance Imaging , Male , Nerve Net/physiology , Neuropsychological Tests , Photic Stimulation , Reading
11.
J Cogn Neurosci ; 20(7): 1220-34, 2008 Jul.
Article in English | MEDLINE | ID: mdl-18284353

ABSTRACT

Spoken languages use one set of articulators, the vocal tract, whereas signed languages use multiple articulators, including both manual and facial actions. How sensitive are the cortical circuits for language processing to the particular articulators that are observed? This question can only be addressed with participants who use both speech and a signed language. In this study, we used functional magnetic resonance imaging to compare the processing of speechreading and sign processing in deaf native signers of British Sign Language (BSL) who were also proficient speechreaders. The following questions were addressed: To what extent do these different language types rely on a common brain network? To what extent do the patterns of activation differ? How are these networks affected by the articulators that languages use? Common peri-sylvian regions were activated both for speechreading English words and for BSL signs. Distinctive activation was also observed reflecting the language form. Speechreading elicited greater activation in the left mid-superior temporal cortex than BSL, whereas BSL processing generated greater activation at the temporo-parieto-occipital junction in both hemispheres. We probed this distinction further within BSL, where manual signs can be accompanied by different types of mouth action. BSL signs with speech-like mouth actions showed greater superior temporal activation, whereas signs made with non-speech-like mouth actions showed more activation in posterior and inferior temporal regions. Distinct regions within the temporal cortex are not only differentially sensitive to perception of the distinctive articulators for speech and for sign but also show sensitivity to the different articulators within the (signed) language.


Subjects
Cerebral Cortex/physiology , Hand/physiology , Lipreading , Mouth , Semantics , Sign Language , Adult , Analysis of Variance , Brain Mapping , Cerebral Cortex/blood supply , Female , Functional Laterality , Humans , Image Processing, Computer-Assisted/methods , Magnetic Resonance Imaging/methods , Male , Oxygen/blood , Predictive Value of Tests , Reaction Time/physiology
12.
Neuroimage ; 35(3): 1287-302, 2007 04 15.
Article in English | MEDLINE | ID: mdl-17363278

ABSTRACT

In fingerspelling, different hand configurations are used to represent the different letters of the alphabet. Signers use this method of representing written language to fill lexical gaps in a signed language. Using fMRI, we compared cortical networks supporting the perception of fingerspelled, signed, written, and pictorial stimuli in deaf native signers of British Sign Language (BSL). In order to examine the effects of linguistic knowledge, hearing participants who knew neither fingerspelling nor a signed language were also tested. All input forms activated a left fronto-temporal network, including portions of left inferior temporal and mid-fusiform gyri, in both groups. To examine the extent to which activation in this region was influenced by orthographic structure, two contrasts of orthographic and non-orthographic stimuli were made: one using static stimuli (text vs. pictures), the other using dynamic stimuli (fingerspelling vs. signed language). Greater activation in left and right inferior temporal and mid-fusiform gyri was found for pictures than text in both deaf and hearing groups. In the fingerspelling vs. signed language contrast, a significant interaction indicated locations within the left and right mid-fusiform gyri. This showed greater activation for fingerspelling than signed language in deaf but not hearing participants. These results are discussed in light of recent proposals that the mid-fusiform gyrus may act as an integration region, mediating between visual input and higher-order stimulus properties.


Subjects
Comprehension , Deafness/physiopathology , Occipital Lobe/physiopathology , Sign Language , Temporal Lobe/physiopathology , Visual Perception , Adolescent , Adult , Brain Mapping , Female , Humans , Male , Middle Aged