Auditory-Articulatory Neural Alignment between Listener and Speaker during Verbal Communication.
Liu, Lanfang; Zhang, Yuxuan; Zhou, Qi; Garrett, Douglas D; Lu, Chunming; Chen, Antao; Qiu, Jiang; Ding, Guosheng.
Affiliations
  • Liu L; State Key Laboratory of Cognitive Neuroscience and Learning, IDG/McGovern Institute for Brain Research, Beijing Normal University, Beijing 100875, People's Republic of China.
  • Zhang Y; Department of Psychology, Sun Yat-sen University, Guangzhou 510006, People's Republic of China.
  • Zhou Q; State Key Laboratory of Cognitive Neuroscience and Learning, IDG/McGovern Institute for Brain Research, Beijing Normal University, Beijing 100875, People's Republic of China.
  • Garrett DD; State Key Laboratory of Cognitive Neuroscience and Learning, IDG/McGovern Institute for Brain Research, Beijing Normal University, Beijing 100875, People's Republic of China.
  • Lu C; Max Planck UCL Centre for Computational Psychiatry and Ageing Research, Max Planck Institute for Human Development, Lentzeallee 94, Berlin 14195, Germany.
  • Chen A; State Key Laboratory of Cognitive Neuroscience and Learning, IDG/McGovern Institute for Brain Research, Beijing Normal University, Beijing 100875, People's Republic of China.
  • Qiu J; Key Laboratory of Cognition and Personality (SWU), Ministry of Education & Department of Psychology, Southwest University, Chongqing 400715, People's Republic of China.
  • Ding G; Key Laboratory of Cognition and Personality (SWU), Ministry of Education & Department of Psychology, Southwest University, Chongqing 400715, People's Republic of China.
Cereb Cortex; 30(3): 942-951, 2020 Mar 14.
Article in English | MEDLINE | ID: mdl-31318013
ABSTRACT
Whether auditory processing of speech relies on reference to the articulatory motor information of the speaker remains elusive. Here, we addressed this issue under a two-brain framework. Functional magnetic resonance imaging was applied to record the brain activities of speakers while they told real-life stories and, later, of listeners while they listened to the audio recordings of these stories. Based on between-brain seed-to-voxel correlation analyses, we revealed that neural dynamics in listeners' auditory temporal cortex are temporally coupled with the dynamics in the speaker's larynx/phonation area. Moreover, the coupling response in the listener's left auditory temporal cortex follows the hierarchical organization of speech processing, with response lags in A1+, STG/STS, and MTG increasing linearly. Further, listeners showing greater coupling responses understood the speech better. When comprehension failed, such interbrain auditory-articulatory coupling largely vanished. These findings suggest that a listener's auditory system and a speaker's articulatory system are inherently aligned during naturalistic verbal interaction, and that such alignment is associated with high-level information transfer from the speaker to the listener. Our study provides reliable evidence that reference to the speaker's articulatory motor information facilitates speech comprehension in naturalistic settings.
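The between-brain analysis described above correlates a seed time series from the speaker's brain with each voxel's time series in the listener's brain at a range of temporal lags, with the lag of peak correlation indexing how far the listener's response trails the speaker's. The following is a minimal, illustrative sketch of that idea on synthetic signals, not the authors' actual pipeline; the function name and the simple Pearson-correlation-at-shifted-lags approach are assumptions for demonstration.

```python
import numpy as np

def lagged_interbrain_coupling(speaker_ts, listener_ts, max_lag):
    """Pearson correlation between a speaker seed time series and a
    listener time series at listener lags 0..max_lag (in TRs).
    A positive lag means the listener's activity follows the speaker's."""
    rs = []
    for lag in range(max_lag + 1):
        s = speaker_ts[:-lag] if lag else speaker_ts  # trim speaker's tail
        l = listener_ts[lag:]                         # shift listener forward
        rs.append(np.corrcoef(s, l)[0, 1])
    return np.array(rs)

# Synthetic demo: listener signal is a noisy copy of the speaker's,
# delayed by 3 time points (TRs).
rng = np.random.default_rng(0)
speaker = rng.standard_normal(200)
listener = np.roll(speaker, 3) + 0.1 * rng.standard_normal(200)

r = lagged_interbrain_coupling(speaker, listener, max_lag=6)
print(int(np.argmax(r)))  # lag of peak coupling recovers the injected delay: 3
```

In a real seed-to-voxel analysis this correlation profile would be computed for every listener voxel against the speaker's seed region, and the resulting lag maps compared across auditory areas (A1+, STG/STS, MTG) to test for the linear increase in response lag reported in the abstract.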
Full text: 1 | Collections: 01-international | Database: MEDLINE | Main subjects: Speech / Speech Perception / Temporal Lobe | Limits: Adult / Aged / Female / Humans / Male / Middle aged | Language: English | Publication year: 2020 | Document type: Article