The different brain areas occupied for integrating information of hierarchical linguistic units: a study based on EEG and TMS.
Cereb Cortex; 33(8): 4740-4751, 2023 04 04.
Article in En | MEDLINE | ID: mdl-36178127
Human language units are hierarchical, and reading acquisition involves integrating multisensory information (typically from the auditory and visual modalities) to access meaning. However, it is unclear how the brain processes and integrates language information at different linguistic levels (words, phrases, and sentences) when it is presented simultaneously in the auditory and visual modalities. To address this issue, we presented participants with sequences of short Chinese sentences through auditory, visual, or combined audio-visual modalities while electroencephalographic responses were recorded. Using a frequency-tagging approach, we analyzed the neural representations of basic linguistic units (i.e., characters/monosyllabic words) and higher-level linguistic structures (i.e., phrases and sentences) separately for each of the three modalities. We found that audio-visual integration occurs at all linguistic levels and that the brain areas involved in the integration vary across levels. In particular, the integration of sentences activated the left prefrontal area. We therefore used continuous theta-burst stimulation to verify that the left prefrontal cortex plays a vital role in the audio-visual integration of sentence information. Our findings suggest an advantage of bimodal language comprehension at hierarchical stages of language-related information processing and provide evidence for a causal role of left prefrontal regions in processing audio-visual sentence information.
Keywords
Full text: 1
Databases: MEDLINE
Main subject: Brain Mapping / Comprehension
Limits: Humans
Language: En
Journal: Cereb Cortex
Journal subject: BRAIN
Year: 2023
Document type: Article
Affiliation country: China