Results 1 - 20 of 3,917
1.
Brain Cogn ; 177: 106161, 2024 Jun.
Article in English | MEDLINE | ID: mdl-38696928

ABSTRACT

Narrative comprehension relies on basic sensory processing abilities, such as visual and auditory processing, with recent evidence that it also recruits executive functions (EF), which are likewise engaged during reading. EF has previously been described as a "supporter" that engages the auditory and visual modalities in different cognitive tasks, with evidence of lower efficiency in this process among those with reading difficulties in the absence of a visual stimulus (i.e., while listening to stories). The current study aims to fill the gap concerning the level of reliance on these neural circuits when visual aids (pictures) accompany story listening, in relation to reading skills. Functional MRI data were collected from 44 Hebrew-speaking children aged 8-12 years while they listened to stories with vs. without visual stimuli (i.e., pictures). Functional connectivity of networks supporting reading was defined in each condition and compared between the conditions against behavioral reading measures. Lower reading skills were related to greater functional connectivity between EF networks (default mode and memory networks), and between the auditory and memory networks, for stories with vs. without visual stimulation. A greater difference in functional connectivity between the conditions was related to lower reading scores. We conclude that lower reading skills in children may be related to a need for greater scaffolding, i.e., visual stimulation such as pictures depicting the narratives during story listening, which may guide future intervention approaches.


Subject(s)
Executive Function , Magnetic Resonance Imaging , Reading , Visual Perception , Humans , Child , Male , Female , Executive Function/physiology , Visual Perception/physiology , Auditory Perception/physiology , Comprehension/physiology , Photic Stimulation/methods , Nerve Net/physiology , Nerve Net/diagnostic imaging , Brain/physiology
2.
J Psycholinguist Res ; 53(3): 42, 2024 May 04.
Article in English | MEDLINE | ID: mdl-38703330

ABSTRACT

This study aims to expand our understanding of how oral reading fluency at the word, sentence, and passage levels relates to reading comprehension in Chinese-speaking secondary school students. In total, 80 participants (46 males and 34 females) aged 13 to 15 years took part in this study and were tested on oral reading fluency at the three levels, reading comprehension, and nonverbal IQ. Our results showed a clear cascade from word-level to sentence-level and then passage-level oral reading fluency, as well as both direct and indirect contributions of word-level oral reading fluency to reading comprehension. Only the indirect effect of word-level oral reading fluency on reading comprehension through passage-level oral reading fluency was significant. Our findings suggest that sentence-level oral reading fluency is a crucial component of reading comprehension in Chinese. Additionally, unique features that operate at the sentence level, such as syntactic awareness and word segmentation accuracy, should be recognized and integrated into instructional activities for reading.


Subject(s)
Comprehension , Reading , Humans , Comprehension/physiology , Male , Female , Adolescent , China , Language , Psycholinguistics , East Asian People
3.
PLoS One ; 19(5): e0301090, 2024.
Article in English | MEDLINE | ID: mdl-38709767

ABSTRACT

Understanding the nervous system is an important but perhaps ambitious goal, particularly for students in lower secondary education. It is important because of the system's direct role in both mental and physical health, and it is ambitious because instruction focuses on the human nervous system, which is extremely complex and subject to numerous misconceptions. Despite this complexity, science curricula, both nationally and internationally, emphasize an understanding of the system, not just knowledge of isolated facts. But what does it mean to understand this system, and what content knowledge is critical for understanding it? Unfortunately, the curricula are usually too general to answer these questions, so other sources of information are needed. Drawing on the science literature, the present study defines the system level of the nervous system and proposes three basic aspects necessary to understand it: 1) neural circuit architecture, 2) synaptic action, and 3) nerve signal origin. Against this background, the aim of the present study is to identify lower secondary school students' conceptions of these three aspects and to determine how they affect students' understanding of the system. To this end, the study used a questionnaire that allowed for a mixed-methods design, and the results show that many students have an immediate conception of the brain as the origin of nerve signals. In addition, many students hold the alternative conceptions that 1) synaptic action is exclusively excitatory and 2) neural circuits consist of neurons connected in a chain, one single neuron after another. These alternative conceptions prevent students from understanding the system. Implications for instruction are discussed in the context of conceptual learning theories, and teaching strategies are proposed. Since similar curricular goals and textbook content exist in several countries, the present results may be representative across nations.


Subject(s)
Students , Humans , Students/psychology , Adolescent , Male , Female , Nervous System , Schools , Comprehension/physiology , Curriculum
4.
Curr Biol ; 34(9): R348-R351, 2024 May 06.
Article in English | MEDLINE | ID: mdl-38714162

ABSTRACT

A recent study has used scalp-recorded electroencephalography to obtain evidence of semantic processing of human speech and objects by domesticated dogs. The results suggest that dogs do comprehend the meaning of familiar spoken words, in that a word can evoke the mental representation of the object to which it refers.


Subject(s)
Cognition , Semantics , Animals , Dogs/psychology , Cognition/physiology , Humans , Electroencephalography , Speech/physiology , Speech Perception/physiology , Comprehension/physiology
5.
Cereb Cortex ; 34(5)2024 May 02.
Article in English | MEDLINE | ID: mdl-38715408

ABSTRACT

Speech comprehension in noise depends on complex interactions between peripheral sensory and central cognitive systems. Despite having normal peripheral hearing, older adults show difficulties in speech comprehension. It remains unclear whether the brain's neural responses during speech perception can serve as an index of aging. The current study examined whether individual brain activation during speech perception in different listening environments could predict age. We applied functional near-infrared spectroscopy to 93 normal-hearing human adults (20 to 70 years old) during a sentence-listening task comprising a quiet condition and four noisy conditions at different signal-to-noise ratios (SNR = 10, 5, 0, -5 dB). A data-driven approach, region-based brain-age predictive modeling, was adopted. We observed a significant behavioral decline with age under the four noisy conditions, but not under the quiet condition. Brain activation in the SNR = 10 dB listening condition successfully predicted an individual's age. Moreover, we found that the bilateral visual sensory cortex, left dorsal speech pathway, left cerebellum, right temporoparietal junction, right homolog of Wernicke's area, and right middle temporal gyrus contributed most to prediction performance. These results demonstrate that activation of regions involved in the sensory-motor mapping of sound, especially in noisy conditions, can be a more sensitive measure for age prediction than external behavioral measures.


Subject(s)
Aging , Brain , Comprehension , Noise , Spectroscopy, Near-Infrared , Speech Perception , Humans , Adult , Speech Perception/physiology , Male , Female , Spectroscopy, Near-Infrared/methods , Middle Aged , Young Adult , Aged , Comprehension/physiology , Brain/physiology , Brain/diagnostic imaging , Aging/physiology , Brain Mapping/methods , Acoustic Stimulation/methods
6.
Laryngorhinootologie ; 103(4): 252-260, 2024 Apr.
Article in German | MEDLINE | ID: mdl-38565108

ABSTRACT

Language processing can be measured objectively using late components of the evoked brain potential. The most established component in this area of research is the N400, a negativity that peaks at about 400 ms after stimulus onset with a centro-parietal maximum. It reflects semantic processing. Its presence, as well as its temporal and quantitative expression, allows conclusions about the quality of processing. It is therefore suitable for measuring speech comprehension in special populations, such as cochlear implant (CI) users. The following is an overview of the use of the N400 component as a tool for studying language processing in CI users. We present studies with adult CI users, in which the N400 reflects the quality of speech comprehension with the new hearing device, and studies with children, in which the emergence of the N400 component reflects the acquisition of their very first vocabulary.


Subject(s)
Cochlear Implants , Speech Perception , Adult , Child , Female , Humans , Male , Comprehension/physiology , Electroencephalography , Evoked Potentials/physiology , Language , Language Development , Semantics , Speech Perception/physiology
7.
Neuroreport ; 35(9): 584-589, 2024 Jun 05.
Article in English | MEDLINE | ID: mdl-38687896

ABSTRACT

OBJECTIVE: This study examined the effect of context on the prediction of emotional words of varying valence. It investigated the neural mechanisms underlying differences in the processing of emotion words of different valences in predictable and unpredictable contexts. Additionally, it aimed to address conflicting results regarding the timing of predictive processing reported in previous studies. METHODS: Participants were instructed to read texts containing the specified emotion words carefully while event-related potentials elicited by those words were measured. To ensure careful reading, 33% of the texts were followed by comprehension questions, which participants answered based on the text content. RESULTS: The N400 amplitude elicited by an unpredictable context was greater than that elicited by a predictable context. Additionally, the N400 amplitude triggered by positive emotion words was larger than that triggered by negative emotion words. However, no significant difference in late positive component amplitude was observed for either contextual predictability or emotional word valence. CONCLUSION: The present study suggests that predictive processing takes place at an intermediate stage of speech processing, approximately 400 ms after stimulus onset. Furthermore, the presence of a predictive context enhances the processing of emotional information. Notably, brain activity is more pronounced during the processing of positive emotional stimuli than of negative emotional stimuli. Additionally, the facilitative effect of a predictable context diminishes in the later phase of Chinese speech comprehension.


Subject(s)
Electroencephalography , Emotions , Evoked Potentials , Reading , Humans , Emotions/physiology , Evoked Potentials/physiology , Female , Male , Young Adult , Brain/physiology , Adult , Comprehension/physiology
8.
Acta Psychol (Amst) ; 246: 104241, 2024 Jun.
Article in English | MEDLINE | ID: mdl-38613853

ABSTRACT

Previous research on real-time sentence processing in German has shown that listeners use the morphological marking of accusative case on a sentence-initial noun phrase to not only interpret the current argument as the object and patient, but also to predict a plausible agent. So far, less is known about the use of case marking to predict the semantic role of upcoming arguments after the subject/agent has been encountered. In the present study, we examined the use of case marking for argument interpretation in transitive as well as ditransitive structures. We aimed to control for multiple factors that could have influenced processing in previous studies, including the animacy of arguments, world knowledge, and the perceptibility of the case cue. Our results from eye- and mouse-tracking indicate that the exploitation of the first case cue that enables the interpretation of the unfolding sentence is influenced by (i) the strength of argument order expectation and (ii) the perceptual salience of the case cue. PsycINFO code: 2720 Linguistics & Language & Speech.


Subject(s)
Eye Movements , Humans , Adult , Eye Movements/physiology , Female , Male , Germany , Eye-Tracking Technology , Young Adult , Comprehension/physiology , Psycholinguistics , Cues , Semantics , Language
9.
Brain Lang ; 252: 105413, 2024 May.
Article in English | MEDLINE | ID: mdl-38608511

ABSTRACT

Sign languages (SLs) are expressed through different bodily actions, ranging from re-enactment of physical events (constructed action, CA) to sequences of lexical signs with internal structure (plain telling, PT). Despite the prevalence of CA in signed interactions and its significance for SL comprehension, its neural dynamics remain unexplored. We examined the processing of different types of CA (subtle, reduced, and overt) and PT in 35 adult deaf or hearing native signers. Electroencephalographic responses to signed sentences with incongruent targets were recorded. Attenuated N300 and early N400 responses were observed for CA in deaf but not in hearing signers. No differences were found among the CA types in either group of signers, suggesting a continuum from PT to overt CA. Deaf signers focused more on body movements; hearing signers focused more on faces. We conclude that CA is processed less effortlessly than PT, arguably because of its strong focus on bodily actions.


Subject(s)
Comprehension , Deafness , Electroencephalography , Sign Language , Humans , Comprehension/physiology , Adult , Male , Female , Deafness/physiopathology , Young Adult , Brain/physiology , Evoked Potentials/physiology
10.
J Exp Child Psychol ; 243: 105925, 2024 Jul.
Article in English | MEDLINE | ID: mdl-38608513

ABSTRACT

In the current study, we investigated the role of executive functions in explaining how word recognition and language comprehension jointly predict reading comprehension in multilingual and monolingual students (Grades 1 and 2). Specifically, mediation and moderation models were tested and compared to offer a more nuanced understanding of the role of executive functions in reading comprehension. The results provided support for the mediation model, in which executive functions indirectly contribute to reading comprehension via word recognition and language comprehension in both language groups. In addition, executive functions directly predicted reading comprehension (i.e., partial mediation). These findings suggest that executive functions serve as general cognitive processes that support word recognition, language comprehension, and reading comprehension (i.e., direct contribution) as well as facilitate connecting word recognition and language comprehension in support of reading comprehension (i.e., indirect contribution). These findings are consistent with prominent models of reading comprehension.


Subject(s)
Comprehension , Executive Function , Multilingualism , Reading , Humans , Comprehension/physiology , Executive Function/physiology , Female , Male , Child , Language
11.
PLoS One ; 19(4): e0298740, 2024.
Article in English | MEDLINE | ID: mdl-38669282

ABSTRACT

In this research, we employed functional magnetic resonance imaging (fMRI) to examine the neurological basis of understanding wh-questions in wh-in-situ languages such as Korean, where wh-elements maintain their original positions instead of moving explicitly within the sentence. Our hypothesis centered on the role of the salience and attention network in comprehending wh-questions in wh-in-situ languages: discerning wh-elements, distinguishing between interrogative types, and allocating cognitive resources to essential constituents over subordinate elements in order to capture the speaker's communicative intent. We explored subject and object wh-questions and scrambled wh-questions, contrasting them with yes/no questions in Korean. Increased activation was observed in the left anterior insula and bilateral frontal operculum, irrespective of the wh-position or scrambling of the wh-element. These results suggest an interaction between the salience and attention system and the syntactic linguistic system, particularly in the left anterior insula and bilateral frontal operculum, in comprehending wh-questions in wh-in-situ languages.


Subject(s)
Comprehension , Language , Magnetic Resonance Imaging , Humans , Female , Male , Comprehension/physiology , Adult , Young Adult , Brain Mapping , Frontal Lobe/physiology , Frontal Lobe/diagnostic imaging , Republic of Korea , Insular Cortex/physiology , Insular Cortex/diagnostic imaging
12.
Cereb Cortex ; 34(4)2024 Apr 01.
Article in English | MEDLINE | ID: mdl-38687241

ABSTRACT

Speech comprehension entails the neural mapping of the acoustic speech signal onto learned linguistic units. This acousto-linguistic transformation is bi-directional, whereby higher-level linguistic processes (e.g. semantics) modulate the acoustic analysis of individual linguistic units. Here, we investigated the cortical topography and linguistic modulation of the most fundamental linguistic unit, the phoneme. We presented natural speech and "phoneme quilts" (pseudo-randomly shuffled phonemes) in either a familiar (English) or unfamiliar (Korean) language to native English speakers while recording functional magnetic resonance imaging. This allowed us to dissociate the contributions of acoustic vs. linguistic processes to phoneme analysis. We show (i) that the acoustic analysis of phonemes is modulated by linguistic analysis and (ii) that this modulation incorporates both acoustic and phonetic information. These results suggest that the linguistic modulation of cortical sensitivity to phoneme classes minimizes prediction error during natural speech perception, thereby aiding speech comprehension in challenging listening situations.


Subject(s)
Brain Mapping , Magnetic Resonance Imaging , Phonetics , Speech Perception , Humans , Speech Perception/physiology , Female , Magnetic Resonance Imaging/methods , Male , Adult , Young Adult , Linguistics , Acoustic Stimulation/methods , Comprehension/physiology , Brain/physiology , Brain/diagnostic imaging
13.
Neuropsychologia ; 198: 108881, 2024 Jun 06.
Article in English | MEDLINE | ID: mdl-38579906

ABSTRACT

As emoji often appear naturally alongside text in utterances, they provide a way to study how prediction unfolds in multimodal sentences in direct comparison to unimodal sentences. In this experiment, participants (N = 40) read sentences in which the sentence-final noun appeared in either word form or emoji form, a between-subjects manipulation. The experiment featured both high constraint sentences and low constraint sentences to examine how the lexical processing of emoji interacts with prediction processes in sentence comprehension. Two well-established ERP components linked to lexical processing and prediction - the N400 and the Late Frontal Positivity - are investigated for sentence-final words and emoji to assess whether, to what extent, and in what linguistic contexts emoji are processed like words. Results indicate that the expected effects, namely an N400 effect to an implausible lexical item compared to a plausible one and an LFP effect to an unexpected lexical item compared to an expected one, emerged for both words and emoji. This paper discusses the similarities and differences between the stimulus types and constraint conditions, contextualized within theories of linguistic prediction, ERP components, and a multimodal lexicon.


Subject(s)
Comprehension , Electroencephalography , Evoked Potentials , Reading , Humans , Male , Female , Evoked Potentials/physiology , Young Adult , Comprehension/physiology , Adult , Brain/physiology , Semantics , Adolescent , Psycholinguistics , Emotions/physiology
14.
eNeuro ; 11(4)2024 Apr.
Article in English | MEDLINE | ID: mdl-38490743

ABSTRACT

Research into the role of brain oscillations in basic perceptual and cognitive functions has suggested that the alpha rhythm reflects functional inhibition while the beta rhythm reflects neural ensemble (re)activation. However, little is known regarding the generalization of these proposed fundamental operations to linguistic processes, such as speech comprehension and production. Here, we recorded magnetoencephalography in participants performing a novel rule-switching paradigm. Specifically, Dutch native speakers had to produce an alternative exemplar from the same category or a feature of a given target word embedded in spoken sentences (e.g., for the word "tuna", an exemplar from the same category, "seafood", would be "shrimp", and a feature would be "pink"). A cue indicated the task rule, exemplar or feature, either before (pre-cue) or after (retro-cue) listening to the sentence. Alpha power during the working memory delay was lower for retro-cue than for pre-cue in left-hemispheric language-related regions. Critically, alpha power negatively correlated with reaction times, suggestive of alpha facilitating task performance by regulating inhibition in regions linked to lexical retrieval. Furthermore, we observed a different spatiotemporal pattern of beta activity for exemplars versus features in right temporoparietal regions, in line with the proposed role of beta in recruiting neural networks for the encoding of distinct categories. Overall, our study provides evidence for the generalizability of the roles of alpha and beta oscillations from perceptual to more complex linguistic processes and offers a novel task to investigate links between rule-switching, working memory, and word production.


Subject(s)
Brain , Language , Humans , Brain/physiology , Magnetoencephalography , Comprehension/physiology , Linguistics , Alpha Rhythm/physiology
15.
J Exp Psychol Gen ; 153(5): 1407-1413, 2024 May.
Article in English | MEDLINE | ID: mdl-38451701

ABSTRACT

During conversations, people face a trade-off between establishing understanding and making interesting and unique contributions. How do people balance this when deciding which concepts to reference, and does it matter how well they know their conversation partner? In the present work, participants made stream-of-consciousness word associations either with a partner or alone-simplified versions of dialogue and monologue. Participants made semantically narrower and more predictable word associations with a stranger than alone (Study 1), suggesting that they constrain their associations to establish mutual understanding. Increasing closeness (Study 2) or having a prior relationship (Study 3) did not moderate this effect. Thus, even during a task that does not depend on establishing mutual understanding, people sacrifice being interesting for the sake of being understood. (PsycInfo Database Record (c) 2024 APA, all rights reserved).


Subject(s)
Friends , Semantics , Humans , Male , Female , Adult , Friends/psychology , Young Adult , Comprehension/physiology , Interpersonal Relations , Communication , Association
16.
Curr Biol ; 34(8): 1750-1754.e4, 2024 Apr 22.
Article in English | MEDLINE | ID: mdl-38521063

ABSTRACT

Using words to refer to objects in the environment is a core feature of the human language faculty. Referential understanding assumes the formation of mental representations of these words.1,2 Such understanding of object words has not yet been demonstrated as a general capacity in any non-human species,3 despite multiple behavior-based case reports.4,5,6,7,8,9,10 In human event-related potential (ERP) studies, object word knowledge is typically tested using the semantic violation paradigm, where words are presented either with their referent (match) or another object (mismatch).11,12 Such mismatch elicits an N400 effect, a well-established neural correlate of semantic processing.12,13 Reports of preverbal infant N400 evoked by semantic violations14 assert the use of this paradigm to probe mental representations of object words in nonverbal populations. Here, measuring dogs' (Canis familiaris) ERPs to objects primed with matching or mismatching object words, we found a mismatch effect at a frontal electrode, with a latency (206-606 ms) comparable to the human N400. A greater difference for words that dogs knew better, according to owner reports, further supported a semantic interpretation of this effect. Semantic expectations emerged irrespective of vocabulary size, demonstrating the prevalence of referential understanding in dogs. These results provide the first neural evidence for object word knowledge in a non-human animal. VIDEO ABSTRACT.


Subject(s)
Evoked Potentials , Semantics , Animals , Dogs/physiology , Male , Female , Evoked Potentials/physiology , Comprehension/physiology , Electroencephalography , Humans
17.
Res Dev Disabil ; 148: 104711, 2024 May.
Article in English | MEDLINE | ID: mdl-38520885

ABSTRACT

BACKGROUND: Studies on late talkers (LTs) have highlighted their heterogeneity and the relevance of describing different communicative profiles. AIMS: To examine lexical skills and gesture use in expressive (E-LTs) vs. receptive-expressive (R/E-LTs) LTs through a structured task. METHODS AND PROCEDURES: Forty-six 30-month-old screened LTs were classified as E-LTs (n = 35) or R/E-LTs (n = 11) according to their receptive skills. Lexical skills and gesture use were assessed with a Picture Naming Game by coding answer accuracy (correct, incorrect, no response), modality of expression (spoken, spoken-gestural, gestural), type of gestures (deictic, representational), and the semantic relationship of spoken-gestural answers (complementary, equivalent, supplementary). OUTCOMES AND RESULTS: R/E-LTs showed lower scores than E-LTs for noun and predicate comprehension, with fewer correct answers, and for production, with fewer correct and incorrect answers and more no responses. R/E-LTs also exhibited lower scores in spoken answers, representational gestures, and equivalent spoken-gestural answers for noun production, and in all spoken and gestural answers for predicate production. CONCLUSIONS AND IMPLICATIONS: Findings highlighted more impaired receptive and expressive lexical skills and lower gesture use in R/E-LTs compared to E-LTs, underlining the relevance of assessing both lexical and gestural skills through a structured task, alongside parental questionnaires and developmental scales, to describe LTs' communicative profiles.


Subject(s)
Gestures , Language Development Disorders , Humans , Comprehension/physiology , Parents , Language Tests , Vocabulary
18.
J Cogn Neurosci ; 36(6): 1141-1155, 2024 06 01.
Article in English | MEDLINE | ID: mdl-38437175

ABSTRACT

Disagreements persist regarding the neural basis of syntactic processing, which has been linked both to inferior frontal and posterior temporal regions of the brain. One focal point of the debate concerns the role of inferior frontal areas in receptive syntactic ability, which is mostly assessed using sentence comprehension involving complex syntactic structures, a task that is potentially confounded with working memory. Syntactic acceptability judgments may provide a better measure of receptive syntax by reducing the need for high working memory load and complex sentences and by enabling assessment of various types of syntactic violations. We therefore tested the perception of grammatical violations by people with poststroke aphasia (n = 25), along with matched controls (n = 16), using English sentences involving errors in word order, agreement, or subcategorization. Lesion data were also collected. Control participants performed near ceiling in accuracy, with higher discriminability of agreement and subcategorization violations than of word order violations; aphasia participants were less able to discriminate violations but, on average, paralleled control participants' discriminability across violation types. Lesion-symptom mapping showed a correlation between discriminability and posterior temporal regions, but not inferior frontal regions. We argue that these results diverge from models holding that frontal areas are amodal core regions in syntactic structure building and favor models that posit a core hierarchical system in posterior temporal regions.


Subject(s)
Aphasia , Brain Mapping , Judgment , Stroke , Humans , Male , Aphasia/physiopathology , Aphasia/etiology , Female , Stroke/complications , Stroke/physiopathology , Middle Aged , Aged , Judgment/physiology , Magnetic Resonance Imaging , Comprehension/physiology , Chronic Disease , Semantics , Speech Perception/physiology , Adult
19.
Cereb Cortex ; 34(3)2024 03 01.
Article in English | MEDLINE | ID: mdl-38501383

ABSTRACT

A key question in research on the neurobiology of language is to which extent the language production and comprehension systems share neural infrastructure, but this question has not been addressed in the context of conversation. We utilized a public fMRI dataset where 24 participants engaged in unscripted conversations with a confederate outside the scanner, via an audio-video link. We provide evidence indicating that the two systems share neural infrastructure in the left-lateralized perisylvian language network, but diverge regarding the level of activation in regions within the network. Activity in the left inferior frontal gyrus was stronger in production compared to comprehension, while comprehension showed stronger recruitment of the left anterior middle temporal gyrus and superior temporal sulcus, compared to production. Although our results are reminiscent of the classical Broca-Wernicke model, the anterior (rather than posterior) temporal activation is a notable difference from that model. This is one of the findings that may be a consequence of the conversational setting, another being that conversational production activated what we interpret as higher-level socio-pragmatic processes. In conclusion, we present evidence for partial overlap and functional asymmetry of the neural infrastructure of production and comprehension, in the above-mentioned frontal vs temporal regions during conversation.


Subject(s)
Comprehension , Magnetic Resonance Imaging , Humans , Comprehension/physiology , Magnetic Resonance Imaging/methods , Communication , Language , Prefrontal Cortex , Brain Mapping
20.
Neurosci Biobehav Rev ; 160: 105624, 2024 May.
Article in English | MEDLINE | ID: mdl-38492763

ABSTRACT

Recent event-related potential (ERP) studies in language comprehension converge in finding anticipatory negativities preceding words or word segments that can be pre-activated based on either sentence contexts or phonological cues. We review these findings from different paradigms in the light of evidence from other cognitive domains in which slow negative potentials have long been associated with anticipatory processes and discuss their potential underlying mechanisms. We propose that this family of anticipatory negativities captures common mechanisms associated with the pre-activation of linguistic information both within words and within sentences. Future studies could utilize these anticipatory negativities in combination with other, well-established ERPs, to simultaneously track prediction-related processes emerging at different time intervals (before and after the perception of pre-activated input) and with distinct time courses (shorter-lived and longer-lived cognitive operations).


Subject(s)
Comprehension , Language , Humans , Comprehension/physiology , Evoked Potentials/physiology , Linguistics , Cues , Electroencephalography , Semantics