Results 1 - 4 of 4
1.
Brain Lang ; 225: 105058, 2022 Feb.
Article in English | MEDLINE | ID: mdl-34929531

ABSTRACT

Both visual articulatory gestures and orthography provide information about the phonological content of speech. This EEG study investigated the integration between speech and these two visual inputs. We compared skilled readers' brain responses elicited by a spoken word presented alone versus synchronously with a static image of a viseme or a grapheme corresponding to the spoken word's onset. While neither visual input induced audiovisual integration on the N1 acoustic component, both led to a supra-additive integration on the P2 component, with stronger integration between speech and graphemes at left-anterior electrodes. This pattern persisted in the P350 time window and generalized to all electrodes. The findings suggest a strong impact of spelling knowledge on phonetic processing and lexical access. They also indirectly indicate that the dynamic and predictive value present in natural lip movements, but not in static visemes, is particularly critical to the contribution of visual articulatory gestures to speech processing.
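For readers less familiar with the additive-model logic behind "supra-additive integration", the sketch below illustrates the comparison it implies: the audiovisual (AV) response is tested against the sum of the auditory-alone and visual-alone responses in a P2 window. The simulated data, window bounds, and paired t-test are illustrative assumptions, not the authors' analysis pipeline.

```python
# Minimal sketch of an additive-model test of "supra-additive integration":
# compare the audiovisual (AV) ERP against the sum of the unimodal responses
# (A + V) in a P2 window. Shapes, window bounds, and the paired t-test are
# assumptions for illustration only.
import numpy as np
from scipy import stats

sfreq = 500                                 # sampling rate in Hz (assumed)
times = np.arange(-0.1, 0.6, 1 / sfreq)     # epoch time axis in seconds

# Subject-averaged ERPs: shape (n_subjects, n_channels, n_times); simulated here
rng = np.random.default_rng(0)
erp_av = rng.normal(size=(20, 32, times.size))   # speech + visual input
erp_a  = rng.normal(size=(20, 32, times.size))   # speech alone
erp_v  = rng.normal(size=(20, 32, times.size))   # visual input alone

# Mean amplitude in an assumed P2 window (150-250 ms), averaged over channels
win = (times >= 0.15) & (times <= 0.25)
av_amp  = erp_av[:, :, win].mean(axis=(1, 2))
sum_amp = (erp_a + erp_v)[:, :, win].mean(axis=(1, 2))

# Supra-additivity: the AV response exceeds the sum of the unimodal responses
t, p = stats.ttest_rel(av_amp, sum_amp)
print(f"AV vs A+V in the P2 window: t = {t:.2f}, p = {p:.3f}")
```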


Subject(s)
Phonetics , Speech Perception , Acoustic Stimulation , Electroencephalography/methods , Humans , Speech/physiology , Speech Perception/physiology , Visual Perception/physiology
2.
Neuropsychologia ; 50(12): 2897-2906, 2012 Oct.
Article in English | MEDLINE | ID: mdl-22971812

ABSTRACT

Acquiring literacy establishes connections between the spoken and written systems and modifies the functioning of the spoken system. Because most evidence comes from online speech recognition tasks, it remains a matter of debate when and how these two systems interact in metaphonological tasks. The present event-related potentials study investigated the role and activation time course of phonological and orthographic representations in an auditory same/different phoneme judgment task in which the congruency between phoneme and grapheme was orthogonally manipulated. We found distinct time windows and topographies for the phonological and orthographic effects. The phonological effect emerged early at central and parietal electrode sites and faded away later, whereas the orthographic effect increased progressively, first appearing at central and parietal sites before generalizing to the frontal site. These effects clearly differ from those reported in speech recognition tasks and suggest that our cognitive system is flexible enough to adjust its functioning to meet task demands in an optimal way.
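The orthogonal manipulation and the separation of the phonological and orthographic effects can be made concrete with the minimal sketch below; the simulated data, electrode groupings, and difference-wave computation are assumptions for illustration, not the published analysis.

```python
# Sketch of the 2 x 2 orthogonal design (phoneme relation x phoneme-grapheme
# congruency) and of how the two effects can be isolated as difference waves.
# Simulated data; electrode groups and shapes are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
n_subj, n_chan, n_times = 24, 64, 300

# Subject-averaged ERPs for each cell of the crossing
erp = {(phon, orth): rng.normal(size=(n_subj, n_chan, n_times))
       for phon in ("same", "different")
       for orth in ("congruent", "incongruent")}

# Phonological effect: different minus same phoneme, collapsed over congruency
phon_effect = ((erp[("different", "congruent")] + erp[("different", "incongruent")])
               - (erp[("same", "congruent")] + erp[("same", "incongruent")])) / 2

# Orthographic effect: incongruent minus congruent spelling, collapsed over phoneme relation
orth_effect = ((erp[("same", "incongruent")] + erp[("different", "incongruent")])
               - (erp[("same", "congruent")] + erp[("different", "congruent")])) / 2

# Compare the two effects at assumed centro-parietal and frontal electrode groups
regions = {"centro-parietal": [20, 21, 22, 30, 31], "frontal": [1, 2, 3, 4, 5]}
for name, chans in regions.items():
    print(name,
          "phonological:", phon_effect[:, chans, :].mean().round(3),
          "orthographic:", orth_effect[:, chans, :].mean().round(3))
```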


Subject(s)
Cerebral Cortex/physiology , Evoked Potentials, Auditory/physiology , Evoked Potentials/physiology , Phonetics , Speech Perception/physiology , Acoustic Stimulation , Adolescent , Adult , Brain Mapping , Electroencephalography , Female , Frontal Lobe/physiology , Humans , Male , Parietal Lobe/physiology , Reaction Time , Time Factors
3.
J Neurosci ; 30(25): 8435-44, 2010 Jun 23.
Article in English | MEDLINE | ID: mdl-20573891

ABSTRACT

Behavioral studies have demonstrated that learning to read and write affects the processing of spoken language. The present study investigates the neural mechanism underlying the emergence of such orthographic effects during speech processing. Transcranial magnetic stimulation (TMS) was used to tease apart two competing hypotheses that consider this orthographic influence to be either a consequence of a change in the nature of the phonological representations during literacy acquisition or a consequence of online coactivation of the orthographic and phonological representations during speech processing. Participants performed an auditory lexical decision task in which the orthographic consistency of spoken words was manipulated and repetitive TMS was used to interfere with either phonological or orthographic processing by stimulating left supramarginal gyrus (SMG) or left ventral occipitotemporal cortex (vOTC), respectively. The advantage for consistently spelled words was removed only when the stimulation was delivered to SMG and not to vOTC, providing strong evidence that this effect arises at a phonological, rather than an orthographic, level. We propose a possible mechanistic explanation for the role of SMG in phonological processing and how this is affected by learning to read.
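A minimal sketch of the key contrast, assuming simulated reaction times: the orthographic consistency effect (inconsistent minus consistent RTs) is computed separately for each stimulation site, with the expectation that it vanishes under SMG stimulation but remains under vOTC stimulation. The numbers and the one-sample t-test are illustrative, not the original analysis.

```python
# Hedged sketch of the consistency-effect contrast per rTMS site.
# Simulated per-subject mean RTs; not the published analysis.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n_subj = 16

# Simulated mean RTs (ms) per subject, stimulation site, and consistency condition
rt = {
    ("SMG",  "consistent"):   rng.normal(650, 40, n_subj),
    ("SMG",  "inconsistent"): rng.normal(655, 40, n_subj),   # effect removed
    ("vOTC", "consistent"):   rng.normal(640, 40, n_subj),
    ("vOTC", "inconsistent"): rng.normal(670, 40, n_subj),   # effect intact
}

for site in ("SMG", "vOTC"):
    # Consistency effect: cost of inconsistent spelling relative to consistent spelling
    effect = rt[(site, "inconsistent")] - rt[(site, "consistent")]
    t, p = stats.ttest_1samp(effect, 0.0)
    print(f"{site}: consistency effect = {effect.mean():.1f} ms, "
          f"t({n_subj - 1}) = {t:.2f}, p = {p:.3f}")
```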


Subject(s)
Occipital Lobe/physiology , Reading , Speech Perception/physiology , Temporal Lobe/physiology , Acoustic Stimulation , Adolescent , Adult , Analysis of Variance , Brain Mapping , Decision Making/physiology , Female , Humans , Male , Reaction Time/physiology , Transcranial Magnetic Stimulation
4.
J Cogn Neurosci ; 21(1): 169-79, 2009 Jan.
Article in English | MEDLINE | ID: mdl-18476763

ABSTRACT

Literacy changes the way the brain processes spoken language. Most psycholinguists believe that orthographic effects on spoken language are either strategic or restricted to meta-phonological tasks. We used event-related brain potentials (ERPs) to investigate the locus and the time course of orthographic effects on spoken word recognition in a semantic task. Participants were asked to decide whether a given word belonged to a semantic category (body parts). On no-go trials, words were presented that were either orthographically consistent or inconsistent. Orthographic inconsistency (i.e., multiple spellings of the same phonology) could occur either in the first or the second syllable. The ERP data showed a clear orthographic consistency effect that preceded lexical access and semantic effects. Moreover, the onset of the orthographic consistency effect was time-locked to the arrival of the inconsistency in a spoken word, which suggests that orthography influences spoken language in a time-dependent manner. The present data join recent evidence from brain imaging showing orthographic activation in spoken language tasks. Our results extend those findings by showing that orthographic activation occurs early and affects spoken word recognition in a semantic task that does not require the explicit processing of orthographic or phonological structure.
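To make the time-locking claim concrete, here is a hedged sketch of one way an effect onset can be read off ERP waveforms: pointwise paired t-tests between consistent and inconsistent conditions, run separately for first- and second-syllable inconsistencies. The simulated data, significance threshold, and onset criterion are assumptions rather than the authors' method.

```python
# Sketch: estimating the onset of a consistency effect with pointwise paired
# t-tests, separately for inconsistencies in the first or second syllable.
# Simulated data; a real analysis would also control for multiple comparisons
# (e.g., cluster-based statistics).
import numpy as np
from scipy import stats

sfreq = 250                              # sampling rate in Hz (assumed)
times = np.arange(0.0, 0.8, 1 / sfreq)   # time axis in seconds
rng = np.random.default_rng(3)

def onset_of_effect(consistent, inconsistent, alpha=0.05):
    """Return the first time point at which the two waveforms differ (p < alpha)."""
    _, p = stats.ttest_rel(consistent, inconsistent, axis=0)
    sig = np.where(p < alpha)[0]
    return times[sig[0]] if sig.size else None

# Simulated subject-averaged waveforms at one electrode: (n_subjects, n_times).
# The effect is injected after the assumed arrival time of the inconsistency.
for position, arrival in (("first syllable", 0.15), ("second syllable", 0.35)):
    cons = rng.normal(size=(22, times.size))
    incons = rng.normal(size=(22, times.size)) + 1.5 * (times > arrival)
    print(position, "consistency effect onset (s):", onset_of_effect(cons, incons))
```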


Subject(s)
Comprehension/physiology , Concept Formation , Evoked Potentials/physiology , Reaction Time/physiology , Speech Perception/physiology , Acoustic Stimulation , Adult , Cerebral Cortex/physiology , Choice Behavior/physiology , Electroencephalography , Female , Humans , Male , Mental Processes/physiology , Psycholinguistics , Reference Values , Self Concept , Semantics , Young Adult