1 - 20 of 3,922
1.
Proc Natl Acad Sci U S A ; 121(23): e2311425121, 2024 Jun 04.
Article En | MEDLINE | ID: mdl-38814865

Theories of language development - informed largely by studies of Western, middle-class infants - have highlighted the language that caregivers direct to children as a key driver of language learning. However, some have argued that language development unfolds similarly across environmental contexts, including those in which child-directed language is scarce. This raises the possibility that children are able to learn from other sources of language in their environments, particularly language directed to others around them. We explore this hypothesis with infants in an indigenous Tseltal-speaking community in Southern Mexico who are rarely spoken to, yet have the opportunity to overhear a great deal of other-directed language by virtue of being carried on their mothers' backs. Adapting a previously established gaze-tracking method for detecting early word knowledge to our field setting, we find that Tseltal infants exhibit implicit knowledge of common nouns (Exp. 1), analogous to their US peers who are frequently spoken to. Moreover, they exhibit comprehension of Tseltal honorific terms that are exclusively used to greet adults in the community (Exp. 2), representing language that could only have been learned through overhearing. In so doing, Tseltal infants demonstrate an ability to discriminate words with similar meanings and perceptually similar referents at an earlier age than has been shown among Western children. Together, these results suggest that for some infants, learning from overhearing may be an important path toward developing language.


Comprehension , Language Development , Humans , Infant , Female , Male , Comprehension/physiology , Mexico , Language , Vocabulary
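
A short note for readers unfamiliar with the gaze-tracking ("looking-while-listening") method mentioned above: a common analysis computes the proportion of gaze samples on the named target versus a distractor picture in a window after word onset, with above-chance target looking taken as evidence of word recognition. The sketch below simulates that computation; the sampling rate, window bounds, and data are illustrative assumptions, not the study's parameters.

```python
# Hedged sketch of a looking-while-listening analysis (illustrative values only).
import numpy as np

rng = np.random.default_rng(5)
sfreq = 60                      # hypothetical eye-tracker sampling rate (Hz)
n_trials, trial_dur = 24, 3.0   # hypothetical trial count and duration (s)
window = (0.3, 2.0)             # assumed analysis window after word onset (s)

# Simulated gaze codes per sample: 1 = target, 0 = distractor, -1 = off-screen
samples = rng.choice([1, 0, -1], p=[0.5, 0.35, 0.15],
                     size=(n_trials, int(trial_dur * sfreq)))

lo, hi = int(window[0] * sfreq), int(window[1] * sfreq)
win = samples[:, lo:hi]
on_screen = (win >= 0).sum(axis=1)
prop_target = (win == 1).sum(axis=1) / on_screen  # per-trial proportion of target looks

print(f"mean proportion of target looking = {prop_target.mean():.2f} (chance = 0.50)")
```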
2.
Brain Cogn ; 177: 106161, 2024 Jun.
Article En | MEDLINE | ID: mdl-38696928

Narrative comprehension relies on basic sensory processing abilities, such as visual and auditory processing, with recent evidence that it also draws on executive functions (EF), which are likewise engaged during reading. EF has previously been described as a "supporter" that engages the auditory and visual modalities in different cognitive tasks, and this process appears less efficient in individuals with reading difficulties in the absence of a visual stimulus (i.e., while listening to stories). The current study aims to fill the gap concerning how strongly these neural circuits are relied upon when visual aids (pictures) accompany story listening, in relation to reading skills. Functional MRI data were collected from 44 Hebrew-speaking children aged 8-12 years while listening to stories with vs. without visual stimuli (i.e., pictures). Functional connectivity of networks supporting reading was defined in each condition and compared between the conditions against behavioral reading measures. Lower reading skills were related to greater functional connectivity between EF networks (default mode and memory networks), and between the auditory and memory networks, for stories with vs. without visual stimulation. A greater difference in functional connectivity between the conditions was related to lower reading scores. We conclude that lower reading skills in children may be related to a need for greater scaffolding, i.e., visual stimulation such as pictures describing the narratives when listening to stories, which may guide future intervention approaches.


Executive Function , Magnetic Resonance Imaging , Reading , Visual Perception , Humans , Child , Male , Female , Executive Function/physiology , Visual Perception/physiology , Auditory Perception/physiology , Comprehension/physiology , Photic Stimulation/methods , Nerve Net/physiology , Nerve Net/diagnostic imaging , Brain/physiology
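
For readers outside neuroimaging, the sketch below illustrates the kind of condition-wise functional connectivity comparison described above: Pearson correlation between two network time courses in each condition, Fisher z-transformed so the with- vs. without-pictures difference can be related to reading scores. The network names, array shapes, and simulated data are assumptions, not the authors' pipeline.

```python
# Hedged sketch: network functional connectivity compared across conditions.
import numpy as np

def connectivity(ts_a, ts_b):
    """Pearson correlation between two network time courses."""
    return np.corrcoef(ts_a, ts_b)[0, 1]

rng = np.random.default_rng(0)
n_tr = 200  # hypothetical number of fMRI volumes per condition

# Simulated mean time courses for two networks under each condition
dmn_pic, mem_pic = rng.standard_normal((2, n_tr))      # stories WITH pictures
dmn_nopic, mem_nopic = rng.standard_normal((2, n_tr))  # stories WITHOUT pictures

r_pic = connectivity(dmn_pic, mem_pic)
r_nopic = connectivity(dmn_nopic, mem_nopic)

# Fisher z-transform so the condition difference can be correlated with reading scores
z_diff = np.arctanh(r_pic) - np.arctanh(r_nopic)
print(f"FC with pictures = {r_pic:.2f}, without = {r_nopic:.2f}, z-difference = {z_diff:.2f}")
```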
3.
PLoS One ; 19(5): e0301090, 2024.
Article En | MEDLINE | ID: mdl-38709767

Understanding the nervous system is an important but perhaps ambitious goal, particularly for students in lower secondary education. It is important because of the system's direct role in both mental and physical health, and it is ambitious because instruction focuses on the human nervous system, which is extremely complex and subject to numerous misconceptions. Despite this complexity, science curricula, both nationally and internationally, emphasize an understanding of the system, and not just knowledge of isolated facts. But what does it mean to understand this system, and what content knowledge is critical for understanding it? Unfortunately, the curricula are usually too general to answer these questions, so other sources of information are needed. Using the science literature, the present study defines the system level of the nervous system and proposes three basic aspects necessary to understand it: 1) neural circuit architecture, 2) synaptic action, and 3) nerve signal origin. With this background, the aim of the present study is to identify lower secondary school students' conceptions of these three aspects, and to determine how they impact students' understanding of the system. To reach this aim, the study used a questionnaire that allowed for a mixed-methods design, and the results show that many students have an immediate conception of the brain as the origin of nerve signals. In addition, many students hold the alternative conceptions that 1) synaptic action is exclusively excitatory, and that 2) neural circuits consist of neurons connected in a chain, one single neuron after another. These alternative conceptions prevent students from understanding the system. Implications for instruction are discussed in the context of conceptual learning theories, and teaching strategies are proposed. Since similar curricular goals and textbook content exist in several countries, the present results may be representative across nations.


Students , Humans , Students/psychology , Adolescent , Male , Female , Nervous System , Schools , Comprehension/physiology , Curriculum
4.
Cereb Cortex ; 34(5)2024 May 02.
Article En | MEDLINE | ID: mdl-38715408

Speech comprehension in noise depends on complex interactions between peripheral sensory and central cognitive systems. Despite having normal peripheral hearing, older adults show difficulties in speech comprehension. It remains unclear whether the brain's neural responses could serve as an indicator of aging. The current study examined whether individual brain activation during speech perception in different listening environments could predict age. We applied functional near-infrared spectroscopy to 93 normal-hearing human adults (20 to 70 years old) during a sentence listening task, which contained a quiet condition and 4 noisy conditions with different signal-to-noise ratios (SNR = 10, 5, 0, -5 dB). A data-driven approach, region-based brain-age predictive modeling, was adopted. We observed a significant behavioral decline with age under the 4 noisy conditions, but not under the quiet condition. Brain activation in the SNR = 10 dB listening condition successfully predicted an individual's age. Moreover, we found that the bilateral visual sensory cortex, left dorsal speech pathway, left cerebellum, right temporal-parietal junction area, right homolog of Wernicke's area, and right middle temporal gyrus contributed most to prediction performance. These results demonstrate that activation of regions involved in sensory-motor mapping of sound, especially in noisy conditions, can be a more sensitive measure for age prediction than external behavioral measures.


Aging , Brain , Comprehension , Noise , Spectroscopy, Near-Infrared , Speech Perception , Humans , Adult , Speech Perception/physiology , Male , Female , Spectroscopy, Near-Infrared/methods , Middle Aged , Young Adult , Aged , Comprehension/physiology , Brain/physiology , Brain/diagnostic imaging , Aging/physiology , Brain Mapping/methods , Acoustic Stimulation/methods
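
A minimal sketch of region-based brain-age prediction in the spirit of the study above: cross-validated ridge regression from per-region activation to chronological age, evaluated by mean absolute error and the predicted-actual correlation. The feature layout, model choice, and simulated data are assumptions; the authors' exact modeling approach may differ.

```python
# Hedged sketch: predicting age from per-region activation (simulated data).
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import KFold, cross_val_predict

rng = np.random.default_rng(1)
n_subj, n_regions = 93, 48  # 93 listeners; hypothetical number of fNIRS regions

X = rng.standard_normal((n_subj, n_regions))  # activation in the SNR = 10 dB condition
age = rng.uniform(20, 70, n_subj)             # chronological age, 20-70 years

pred = cross_val_predict(Ridge(alpha=1.0), X, age,
                         cv=KFold(5, shuffle=True, random_state=0))
mae = np.abs(pred - age).mean()
r = np.corrcoef(pred, age)[0, 1]
print(f"cross-validated MAE = {mae:.1f} years, predicted-actual r = {r:.2f}")
```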
5.
Sci Adv ; 10(21): eadn7744, 2024 May 24.
Article En | MEDLINE | ID: mdl-38781343

Current large language models (LLMs) rely on word prediction as their backbone pretraining task. Although word prediction is an important mechanism underlying language processing, human language comprehension occurs at multiple levels, involving the integration of words and sentences to achieve a full understanding of discourse. This study models language comprehension by using the next sentence prediction (NSP) task to investigate mechanisms of discourse-level comprehension. We show that NSP pretraining enhanced a model's alignment with brain data especially in the right hemisphere and in the multiple demand network, highlighting the contributions of nonclassical language regions to high-level language understanding. Our results also suggest that NSP can enable the model to better capture human comprehension performance and to better encode contextual information. Our study demonstrates that the inclusion of diverse learning objectives in a model leads to more human-like representations, and investigating the neurocognitive plausibility of pretraining tasks in LLMs can shed light on outstanding questions in language neuroscience.


Brain , Comprehension , Language , Humans , Comprehension/physiology , Brain/physiology
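
For orientation, the next sentence prediction (NSP) objective discussed above can be probed with the standard BERT NSP head in Hugging Face Transformers; a low "is-next" probability flags an incoherent continuation. This is a generic illustration of the pretraining task, not the authors' model or stimuli.

```python
# Hedged sketch: scoring sentence-pair coherence with BERT's NSP head.
import torch
from transformers import BertForNextSentencePrediction, BertTokenizer

tok = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForNextSentencePrediction.from_pretrained("bert-base-uncased")
model.eval()

premise = "The storm knocked out power across the city."
follow_up = "Crews worked overnight to restore electricity."  # coherent continuation

inputs = tok(premise, follow_up, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits  # index 0 = "is next", index 1 = "not next"

p_is_next = torch.softmax(logits, dim=-1)[0, 0].item()
print(f"P(sentence B follows sentence A) = {p_is_next:.3f}")
```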
6.
J Psycholinguist Res ; 53(3): 42, 2024 May 04.
Article En | MEDLINE | ID: mdl-38703330

This study aims to expand our understanding of how oral reading fluency at the word, sentence, and passage levels relates to reading comprehension in Chinese-speaking secondary school-aged students. In total, 80 participants (46 males and 34 females) ranging from 13 to 15 years old took part in this study and were tested on oral reading fluency tasks at the three levels, reading comprehension, and nonverbal IQ. Our results showed a clear pathway from word-level to sentence-level and then passage-level oral reading fluency, as well as both the direct and indirect importance of word-level oral reading fluency in reading comprehension. Only the indirect effect from word-level oral reading fluency to reading comprehension through passage-level oral reading fluency was significant. Our findings suggest that sentence-level oral reading fluency is the crucial component of reading comprehension in Chinese. Additionally, recognition of the potential value of features unique to the sentence level, such as syntactic awareness and word segmentation accuracy, should be integrated into instructional activities for reading.


Comprehension , Reading , Humans , Comprehension/physiology , Male , Female , Adolescent , China , Language , Psycholinguistics , East Asian People
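
A minimal sketch of the indirect-effect logic reported above (word-level fluency -> passage-level fluency -> reading comprehension): the a*b product of the two paths with a percentile bootstrap for its confidence interval. The simulated data and effect sizes are illustrative assumptions, not the study's dataset.

```python
# Hedged sketch: bootstrapped indirect effect a*b in a simple mediation model.
import numpy as np

rng = np.random.default_rng(2)
n = 80
word = rng.standard_normal(n)                        # word-level fluency
passage = 0.6 * word + 0.8 * rng.standard_normal(n)  # passage-level fluency
comp = 0.5 * passage + 0.8 * rng.standard_normal(n)  # reading comprehension

def coefs(X, y):
    """OLS coefficients with an intercept column prepended."""
    X1 = np.column_stack([np.ones(len(y)), X])
    return np.linalg.lstsq(X1, y, rcond=None)[0]

boot = []
for _ in range(5000):
    i = rng.integers(0, n, n)  # resample participants with replacement
    a = coefs(word[i][:, None], passage[i])[1]                     # path a: word -> passage
    b = coefs(np.column_stack([word[i], passage[i]]), comp[i])[2]  # path b: passage -> comp | word
    boot.append(a * b)

lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"indirect effect a*b, 95% bootstrap CI: [{lo:.2f}, {hi:.2f}]")
```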
7.
PLoS One ; 19(5): e0290807, 2024.
Article En | MEDLINE | ID: mdl-38776360

We report the first use of ERP measures to identify text engagement differences when reading digitally or in print. Depth of semantic encoding is key for reading comprehension, and we predicted that deeper reading of expository texts would facilitate stronger associations with subsequently presented related words, resulting in enhanced N400 responses to unrelated probe words and a graded attenuation of the N400 to related and moderately related words. In contrast, shallow reading would produce weaker associations between probe words and text passages, resulting in enhanced N400 responses to both moderately related and unrelated words, and an attenuated response to related words. Behavioral research has shown deeper semantic encoding of text from paper than from a screen. Hence, we predicted that the N400 would index deeper reading of text passages that were presented in print, and shallower reading of texts presented digitally. Middle-school students (n = 59) read passages in digital and print formats, and high-density EEG was recorded while participants completed single-word semantic judgment tasks after each passage. Following digital text presentation, the N400 response pattern to moderately related words indicated shallow reading, tracking with responses to words that were unrelated to the text. Following print reading, the N400 responses to moderately related words patterned instead with responses to related words, interpreted as an index of deeper reading. These findings provide evidence of differences in brain responses to texts presented in print and digital media, including deeper semantic encoding for print than digital texts.


Electroencephalography , Evoked Potentials , Reading , Semantics , Humans , Female , Male , Evoked Potentials/physiology , Adolescent , Child , Comprehension/physiology
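
Since several entries here rely on the N400, a brief sketch of its standard quantification: mean amplitude over centro-parietal channels in a 300-500 ms window after word onset, compared across relatedness conditions. The example simulates epochs with MNE-Python; the channel set, window, and effect sizes are assumptions, not the study's pipeline.

```python
# Hedged sketch: N400 mean amplitude from simulated EEG epochs.
import numpy as np
import mne

sfreq = 250.0
info = mne.create_info(["Cz", "CPz", "Pz"], sfreq, ch_types="eeg")
rng = np.random.default_rng(3)
times = np.arange(-0.2, 0.8, 1 / sfreq)  # epoch from -0.2 to 0.8 s

def make_epochs(n400_uv):
    """40 epochs of noise plus a negative deflection peaking ~400 ms."""
    noise = rng.standard_normal((40, 3, times.size)) * 2e-6
    bump = -n400_uv * 1e-6 * np.exp(-((times - 0.4) ** 2) / (2 * 0.05 ** 2))
    return mne.EpochsArray(noise + bump, info, tmin=-0.2, verbose=False)

ep_unrelated, ep_related = make_epochs(4.0), make_epochs(1.0)

def n400_amplitude(epochs):
    """Mean amplitude (microvolts) in the 300-500 ms window."""
    return epochs.average().crop(0.3, 0.5).data.mean() * 1e6

print(f"N400 unrelated: {n400_amplitude(ep_unrelated):.2f} uV, "
      f"related: {n400_amplitude(ep_related):.2f} uV")
```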
8.
Curr Biol ; 34(9): R348-R351, 2024 May 06.
Article En | MEDLINE | ID: mdl-38714162

A recent study has used scalp-recorded electroencephalography to obtain evidence of semantic processing of human speech and objects by domesticated dogs. The results suggest that dogs do comprehend the meaning of familiar spoken words, in that a word can evoke the mental representation of the object to which it refers.


Cognition , Semantics , Animals , Dogs/psychology , Cognition/physiology , Humans , Electroencephalography , Speech/physiology , Speech Perception/physiology , Comprehension/physiology
9.
Acta Psychol (Amst) ; 246: 104241, 2024 Jun.
Article En | MEDLINE | ID: mdl-38613853

Previous research on real-time sentence processing in German has shown that listeners use the morphological marking of accusative case on a sentence-initial noun phrase to not only interpret the current argument as the object and patient, but also to predict a plausible agent. So far, less is known about the use of case marking to predict the semantic role of upcoming arguments after the subject/agent has been encountered. In the present study, we examined the use of case marking for argument interpretation in transitive as well as ditransitive structures. We aimed to control for multiple factors that could have influenced processing in previous studies, including the animacy of arguments, world knowledge, and the perceptibility of the case cue. Our results from eye- and mouse-tracking indicate that the exploitation of the first case cue that enables the interpretation of the unfolding sentence is influenced by (i) the strength of argument order expectation and (ii) the perceptual salience of the case cue.


Eye Movements , Humans , Adult , Eye Movements/physiology , Female , Male , Germany , Eye-Tracking Technology , Young Adult , Comprehension/physiology , Psycholinguistics , Cues , Semantics , Language
10.
J Exp Child Psychol ; 243: 105925, 2024 Jul.
Article En | MEDLINE | ID: mdl-38608513

In the current study, we investigated the role of executive functions in explaining how word recognition and language comprehension jointly predict reading comprehension in multilingual and monolingual students (Grades 1 and 2). Specifically, mediation and moderation models were tested and compared to offer a more nuanced understanding of the role of executive functions in reading comprehension. The results provided support for the mediation model, in which executive functions indirectly contribute to reading comprehension via word recognition and language comprehension in both language groups. In addition, executive functions directly predicted reading comprehension (i.e., partial mediation). These findings suggest that executive functions serve as general cognitive processes that support word recognition, language comprehension, and reading comprehension (i.e., direct contribution) as well as facilitate connecting word recognition and language comprehension in support of reading comprehension (i.e., indirect contribution). These findings are consistent with prominent models of reading comprehension.


Comprehension , Executive Function , Multilingualism , Reading , Humans , Comprehension/physiology , Executive Function/physiology , Female , Male , Child , Language
11.
Laryngorhinootologie ; 103(4): 252-260, 2024 Apr.
Article De | MEDLINE | ID: mdl-38565108

Language processing can be measured objectively using late components of the evoked brain potential. The most established component in this area of research is the N400, a negativity that peaks at about 400 ms after stimulus onset with a centro-parietal maximum and reflects semantic processing. Its presence, as well as its temporal and quantitative expression, permits conclusions about the quality of processing. It is therefore suitable for measuring speech comprehension in special populations, such as cochlear implant (CI) users. The following is an overview of the use of the N400 component as a tool for studying language processes in CI users. We present studies with adult CI users, in which the N400 reflects the quality of speech comprehension with the new hearing device, and studies with children, in which the emergence of the N400 component reflects the acquisition of their very first vocabulary.


Cochlear Implants , Speech Perception , Adult , Child , Female , Humans , Male , Comprehension/physiology , Electroencephalography , Evoked Potentials/physiology , Language , Language Development , Semantics , Speech Perception/physiology
12.
PLoS One ; 19(4): e0298740, 2024.
Article En | MEDLINE | ID: mdl-38669282

In this research, we employed functional magnetic resonance imaging (fMRI) to examine the neurological basis for understanding wh-questions in wh-in-situ languages such as Korean, where wh-elements maintain their original positions instead of moving overtly within the sentence. Our hypothesis centered on the role of the salience and attention network in comprehending wh-questions in wh-in-situ languages: discerning wh-elements, demarcating interrogative types, and allocating cognitive resources to essential constituents vis-à-vis subordinate elements in order to capture the speaker's communicative intent. We explored subject and object wh-questions and scrambled wh-questions, contrasting them with yes/no questions in Korean. Increased activation was observed in the left anterior insula and bilateral frontal operculum, irrespective of wh-position or scrambling of the wh-element. These results suggest an interaction between the salience and attention system and the syntactic linguistic system, particularly in the left anterior insula and bilateral frontal operculum, in comprehending wh-questions in wh-in-situ languages.


Comprehension , Language , Magnetic Resonance Imaging , Humans , Female , Male , Comprehension/physiology , Adult , Young Adult , Brain Mapping , Frontal Lobe/physiology , Frontal Lobe/diagnostic imaging , Republic of Korea , Insular Cortex/physiology , Insular Cortex/diagnostic imaging
13.
Cereb Cortex ; 34(4)2024 Apr 01.
Article En | MEDLINE | ID: mdl-38687241

Speech comprehension entails the neural mapping of the acoustic speech signal onto learned linguistic units. This acousto-linguistic transformation is bi-directional, whereby higher-level linguistic processes (e.g. semantics) modulate the acoustic analysis of individual linguistic units. Here, we investigated the cortical topography and linguistic modulation of the most fundamental linguistic unit, the phoneme. We presented natural speech and "phoneme quilts" (pseudo-randomly shuffled phonemes) in either a familiar (English) or unfamiliar (Korean) language to native English speakers while recording functional magnetic resonance imaging. This allowed us to dissociate the contribution of acoustic vs. linguistic processes toward phoneme analysis. We show (i) that the acoustic analysis of phonemes is modulated by linguistic analysis and (ii) that this modulation requires the incorporation of both acoustic and phonetic information. These results suggest that the linguistic modulation of cortical sensitivity to phoneme classes minimizes prediction error during natural speech perception, thereby aiding speech comprehension in challenging listening situations.


Brain Mapping , Magnetic Resonance Imaging , Phonetics , Speech Perception , Humans , Speech Perception/physiology , Female , Magnetic Resonance Imaging/methods , Male , Adult , Young Adult , Linguistics , Acoustic Stimulation/methods , Comprehension/physiology , Brain/physiology , Brain/diagnostic imaging
14.
Cognition ; 248: 105793, 2024 Jul.
Article En | MEDLINE | ID: mdl-38636164

Speech comprehension is enhanced when preceded (or accompanied) by a congruent rhythmic prime reflecting the metrical sentence structure. Although these phenomena have been described for auditory and motor primes separately, their respective and synergistic contributions have not been addressed. In this experiment, participants performed a speech comprehension task on degraded speech signals that were preceded by a rhythmic prime that could be auditory, motor, or audiomotor. Both auditory and audiomotor rhythmic primes facilitated speech comprehension speed. While the presence of a purely motor prime (unpaced tapping) did not globally benefit speech comprehension, comprehension accuracy scaled with the regularity of motor tapping. In order to investigate inter-individual variability, participants also performed a Spontaneous Speech Synchronization test. The strength of the estimated perception-production coupling correlated positively with overall speech comprehension scores. These findings are discussed in the framework of the dynamic attending and active sensing theories.


Comprehension , Speech Perception , Humans , Speech Perception/physiology , Male , Female , Young Adult , Comprehension/physiology , Adult , Acoustic Stimulation , Psychomotor Performance/physiology , Auditory Perception/physiology , Speech/physiology
15.
Brain Lang ; 252: 105413, 2024 May.
Article En | MEDLINE | ID: mdl-38608511

Sign languages (SLs) are expressed through different bodily actions, ranging from re-enactment of physical events (constructed action, CA) to sequences of lexical signs with internal structure (plain telling, PT). Despite the prevalence of CA in signed interactions and its significance for SL comprehension, its neural dynamics remain unexplored. We examined the processing of different types of CA (subtle, reduced, and overt) and of PT in 35 adult deaf or hearing native signers. Electroencephalographic responses to signed sentences with incongruent targets were recorded. Attenuated N300 and early N400 were observed for CA in deaf but not in hearing signers. No differences were found between sentences with different CA types in either group of signers, suggesting a continuum from PT to overt CA. Deaf signers focused more on body movements; hearing signers on faces. We conclude that CA is processed with more effort than PT, arguably because of its strong focus on bodily actions.


Comprehension , Deafness , Electroencephalography , Sign Language , Humans , Comprehension/physiology , Adult , Male , Female , Deafness/physiopathology , Young Adult , Brain/physiology , Evoked Potentials/physiology
16.
Neuroreport ; 35(9): 584-589, 2024 Jun 05.
Article En | MEDLINE | ID: mdl-38687896

OBJECTIVE: This study examined the effect of context on the prediction of emotional words with varying valences. It investigated the neural mechanisms underlying differences in the processing of emotion words of different valences in both predictable and unpredictable contexts. Additionally, it aimed to address the conflicting results regarding processing time in predictive contexts reported in previous studies. METHODS: Participants were instructed to carefully read texts that included the specified emotion words, and event-related potentials elicited by the emotional words were measured. To ensure that participants read the texts carefully, 33% of the texts were followed by comprehension questions, which participants answered based on the text content. RESULTS: The study revealed that the N400 amplitude elicited by an unpredictable context was greater than that elicited by a predictable context. Additionally, the N400 amplitude triggered by positive emotion words was larger than that triggered by negative emotion words. However, no significant differences in late positive component amplitude were observed for either contextual predictability or emotional word valence. CONCLUSION: The present study suggests that predictive processing takes place at an intermediate stage of speech processing, approximately 400 ms after stimulus onset. Furthermore, the presence of a predictive context enhances the processing of emotional information. Notably, brain activity is more pronounced during the processing of positive emotional stimuli compared to negative emotional stimuli. Additionally, the facilitative effect of a predictable context diminishes in the advanced phase of Chinese speech comprehension.


Electroencephalography , Emotions , Evoked Potentials , Reading , Humans , Emotions/physiology , Evoked Potentials/physiology , Female , Male , Young Adult , Brain/physiology , Adult , Comprehension/physiology
17.
Neuropsychologia ; 198: 108881, 2024 Jun 06.
Article En | MEDLINE | ID: mdl-38579906

As emoji often appear naturally alongside text in utterances, they provide a way to study how prediction unfolds in multimodal sentences in direct comparison to unimodal sentences. In this experiment, participants (N = 40) read sentences in which the sentence-final noun appeared in either word form or emoji form, a between-subjects manipulation. The experiment featured both high constraint sentences and low constraint sentences to examine how the lexical processing of emoji interacts with prediction processes in sentence comprehension. Two well-established ERP components linked to lexical processing and prediction - the N400 and the Late Frontal Positivity - are investigated for sentence-final words and emoji to assess whether, to what extent, and in what linguistic contexts emoji are processed like words. Results indicate that the expected effects, namely an N400 effect to an implausible lexical item compared to a plausible one and an LFP effect to an unexpected lexical item compared to an expected one, emerged for both words and emoji. This paper discusses the similarities and differences between the stimulus types and constraint conditions, contextualized within theories of linguistic prediction, ERP components, and a multimodal lexicon.


Comprehension , Electroencephalography , Evoked Potentials , Reading , Humans , Male , Female , Evoked Potentials/physiology , Young Adult , Comprehension/physiology , Adult , Brain/physiology , Semantics , Adolescent , Psycholinguistics , Emotions/physiology
18.
Res Dev Disabil ; 148: 104711, 2024 May.
Article En | MEDLINE | ID: mdl-38520885

BACKGROUND: Studies on late talkers (LTs) have highlighted their heterogeneity and the relevance of describing different communicative profiles. AIMS: To examine lexical skills and gesture use in expressive (E-LTs) vs. receptive-expressive (R/E-LTs) LTs through a structured task. METHODS AND PROCEDURES: Forty-six screened 30-month-old LTs were classified as E-LTs (n = 35) or R/E-LTs (n = 11) according to their receptive skills. Lexical skills and gesture use were assessed with a Picture Naming Game by coding answer accuracy (correct, incorrect, no response), modality of expression (spoken, spoken-gestural, gestural), type of gestures (deictic, representational), and the semantic relationship of spoken-gestural answers (complementary, equivalent, supplementary). OUTCOMES AND RESULTS: R/E-LTs showed lower scores than E-LTs for noun and predicate comprehension, with fewer correct answers, and for production, with fewer correct and incorrect answers and more no responses. R/E-LTs also exhibited lower scores in spoken answers, representational gestures, and equivalent spoken-gestural answers for noun production, and in all spoken and gestural answers for predicate production. CONCLUSIONS AND IMPLICATIONS: The findings highlighted more impaired receptive and expressive lexical skills and lower gesture use in R/E-LTs compared to E-LTs, underlining the relevance of assessing both lexical and gestural skills through a structured task, in addition to parental questionnaires and developmental scales, to describe LTs' communicative profiles.


Gestures , Language Development Disorders , Humans , Comprehension/physiology , Parents , Language Tests , Vocabulary
19.
J Exp Psychol Gen ; 153(5): 1407-1413, 2024 May.
Article En | MEDLINE | ID: mdl-38451701

During conversations, people face a trade-off between establishing understanding and making interesting and unique contributions. How do people balance this when deciding which concepts to reference, and does it matter how well they know their conversation partner? In the present work, participants made stream-of-consciousness word associations either with a partner or alone - simplified versions of dialogue and monologue. Participants made semantically narrower and more predictable word associations with a stranger than alone (Study 1), suggesting that they constrain their associations to establish mutual understanding. Increasing closeness (Study 2) or having a prior relationship (Study 3) did not moderate this effect. Thus, even during a task that does not depend on establishing mutual understanding, people sacrifice being interesting for the sake of being understood.


Friends , Semantics , Humans , Male , Female , Adult , Friends/psychology , Young Adult , Comprehension/physiology , Interpersonal Relations , Communication , Association
20.
eNeuro ; 11(4)2024 Apr.
Article En | MEDLINE | ID: mdl-38490743

Research into the role of brain oscillations in basic perceptual and cognitive functions has suggested that the alpha rhythm reflects functional inhibition while the beta rhythm reflects neural ensemble (re)activation. However, little is known regarding how these proposed fundamental operations generalize to linguistic processes, such as speech comprehension and production. Here, we recorded magnetoencephalography in participants performing a novel rule-switching paradigm. Specifically, Dutch native speakers had to produce either an alternative exemplar from the same category or a feature of a given target word embedded in spoken sentences (e.g., for the target word "tuna", whose category is "seafood", an exemplar would be "shrimp" and a feature would be "pink"). A cue indicated the task rule (exemplar or feature) either before (pre-cue) or after (retro-cue) listening to the sentence. Alpha power during the working memory delay was lower for retro-cue than for pre-cue trials in left-hemisphere language-related regions. Critically, alpha power negatively correlated with reaction times, suggestive of alpha facilitating task performance by regulating inhibition in regions linked to lexical retrieval. Furthermore, we observed a different spatiotemporal pattern of beta activity for exemplars versus features in right temporoparietal regions, in line with the proposed role of beta in recruiting neural networks for the encoding of distinct categories. Overall, our study provides evidence for the generalizability of the role of alpha and beta oscillations from perceptual to more complex linguistic processes, and offers a novel task to investigate links between rule-switching, working memory, and word production.


Brain , Language , Humans , Brain/physiology , Magnetoencephalography , Comprehension/physiology , Linguistics , Alpha Rhythm/physiology
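
A minimal sketch of the alpha power-reaction time relationship reported above: single-trial alpha-band (8-12 Hz) power estimated with a band-pass filter and Hilbert envelope, then correlated with reaction times. The simulated signals, filter settings, and RT model are illustrative assumptions, not the authors' MEG analysis.

```python
# Hedged sketch: correlating single-trial alpha power with reaction times.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

rng = np.random.default_rng(4)
sfreq, n_trials, n_samp = 500, 100, 1000  # hypothetical 2 s delay per trial
b, a = butter(4, [8, 12], btype="bandpass", fs=sfreq)

alpha_power = np.empty(n_trials)
rts = np.empty(n_trials)
t = np.arange(n_samp) / sfreq
for k in range(n_trials):
    gain = rng.uniform(0.5, 2.0)                     # trial-varying alpha strength
    sig = gain * np.sin(2 * np.pi * 10 * t) + rng.standard_normal(n_samp)
    env = np.abs(hilbert(filtfilt(b, a, sig)))       # alpha amplitude envelope
    alpha_power[k] = (env ** 2).mean()
    rts[k] = 0.9 - 0.1 * gain + rng.normal(0, 0.05)  # assume higher alpha -> faster RT

r = np.corrcoef(alpha_power, rts)[0, 1]
print(f"alpha power vs. RT correlation: r = {r:.2f}")
```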
...