The cortical representation of language timescales is shared between reading and listening.
Commun Biol; 7(1): 284, 2024 Mar 07.
Article in English | MEDLINE | ID: mdl-38454134
ABSTRACT
Language comprehension involves integrating low-level sensory inputs into a hierarchy of increasingly high-level features. Prior work has studied brain representations of different levels of the language hierarchy but has not determined whether these representations are shared between written and spoken language. To address this issue, we analyze fMRI BOLD data that were recorded while participants read and listened to the same narratives. Levels of the language hierarchy are operationalized as timescales, where each timescale refers to a set of spectral components of a language stimulus. Voxelwise encoding models are used to determine where different timescales are represented across the cerebral cortex, for each modality separately. These models reveal that timescale representations are organized similarly across the cortical surface in both modalities. Our results suggest that, after low-level sensory processing, language integration proceeds similarly regardless of stimulus modality.
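The abstract's core method is the voxelwise encoding model: stimulus features are band-limited into timescales, and a regularized regression is fit for each voxel to predict its BOLD response. The sketch below is a minimal illustration of that idea, not the paper's actual pipeline; the array sizes, timescale band edges, sampling rate, random stand-in features, and the choice of closed-form ridge regression are all assumptions introduced for the example.

```python
# Minimal sketch of a timescale-based voxelwise encoding model on synthetic data.
import numpy as np
from scipy.signal import butter, sosfiltfilt

rng = np.random.default_rng(0)

n_trs, n_features, n_voxels = 1000, 64, 500           # assumed sizes
stimulus = rng.standard_normal((n_trs, n_features))   # stand-in for stimulus embeddings
bold = rng.standard_normal((n_trs, n_voxels))         # stand-in for fMRI BOLD responses

def bandpass(x, low, high, fs=0.5):
    """Keep spectral components of the stimulus between `low` and `high` Hz.

    Each band operationalizes one 'timescale' of the language stimulus.
    fs = 0.5 Hz corresponds to a 2 s TR (an assumption for this sketch).
    """
    sos = butter(4, [low, high], btype="band", fs=fs, output="sos")
    return sosfiltfilt(sos, x, axis=0)

# Illustrative timescale bands in Hz; the paper's exact bands may differ.
bands = [(0.005, 0.02), (0.02, 0.08), (0.08, 0.24)]
timescale_features = np.hstack([bandpass(stimulus, lo, hi) for lo, hi in bands])

def ridge_fit(X, Y, alpha=10.0):
    """Closed-form ridge regression: W = (X'X + aI)^-1 X'Y, one model per voxel."""
    n = X.shape[1]
    return np.linalg.solve(X.T @ X + alpha * np.eye(n), X.T @ Y)

# Fit on the first half of the timecourse, evaluate on the held-out second half.
half = n_trs // 2
W = ridge_fit(timescale_features[:half], bold[:half])
pred = timescale_features[half:] @ W
actual = bold[half:]

# Voxelwise prediction accuracy: correlation of predicted and actual responses.
r = np.array([np.corrcoef(pred[:, v], actual[:, v])[0, 1] for v in range(n_voxels)])
print(f"mean voxelwise correlation: {r.mean():.3f}")  # near 0 for random data
```

Fitting all voxels jointly with one closed-form solve keeps the example short; a real analysis would typically cross-validate the regularization strength per voxel and fit the reading and listening data separately before comparing the resulting timescale maps.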
Full text: 1
Collection: 01-international
Database: MEDLINE
Main subject: Reading / Language
Limits: Humans
Language: English
Journal: Commun Biol
Year: 2024
Document type: Article
Affiliation country: United States
Publication country: United Kingdom