A Visual Cortical Network for Deriving Phonological Information from Intelligible Lip Movements.
Hauswald, Anne; Lithari, Chrysa; Collignon, Olivier; Leonardelli, Elisa; Weisz, Nathan.
Affiliation
  • Hauswald A; Centre for Cognitive Neurosciences, University of Salzburg, Salzburg 5020, Austria; CIMeC, Center for Mind/Brain Sciences, Università degli studi di Trento, Trento 38123, Italy. Electronic address: anne.hauswald@sbg.ac.at.
  • Lithari C; Centre for Cognitive Neurosciences, University of Salzburg, Salzburg 5020, Austria; CIMeC, Center for Mind/Brain Sciences, Università degli studi di Trento, Trento 38123, Italy.
  • Collignon O; CIMeC, Center for Mind/Brain Sciences, Università degli studi di Trento, Trento 38123, Italy; Institute of Research in Psychology & Institute of NeuroScience, Université catholique de Louvain, Louvain 1348, Belgium.
  • Leonardelli E; CIMeC, Center for Mind/Brain Sciences, Università degli studi di Trento, Trento 38123, Italy.
  • Weisz N; Centre for Cognitive Neurosciences, University of Salzburg, Salzburg 5020, Austria; CIMeC, Center for Mind/Brain Sciences, Università degli studi di Trento, Trento 38123, Italy. Electronic address: nathan.weisz@sbg.ac.at.
Curr Biol; 28(9): 1453-1459.e3, 2018 May 07.
Article in En | MEDLINE | ID: mdl-29681475
Successful lip-reading requires a mapping from visual to phonological information [1]. Recently, visual and motor cortices have been implicated in tracking lip movements (e.g., [2]). It remains unclear, however, whether visuo-phonological mapping already occurs at the level of the visual cortex, that is, whether this structure tracks the acoustic signal in a functionally relevant manner. To elucidate this, we investigated how the cortex tracks (i.e., entrains to) absent acoustic speech signals carried by silent lip movements. Crucially, we contrasted the entrainment to unheard forward (intelligible) and backward (unintelligible) acoustic speech. We observed that the visual cortex exhibited stronger entrainment to the unheard forward acoustic speech envelope compared to the unheard backward acoustic speech envelope. Supporting the notion of a visuo-phonological mapping process, this forward-backward difference in occipital entrainment was not present for the actually observed lip movements. Importantly, the respective occipital region received more top-down input, especially from left premotor, primary motor, and somatosensory regions and, to a lesser extent, also from posterior temporal cortex. Strikingly, across participants, the extent of top-down modulation of the visual cortex stemming from these regions partially correlated with the strength of entrainment to the absent forward acoustic speech envelope, but not to the present forward lip movements. Our findings demonstrate that a distributed cortical network, including key dorsal stream auditory regions [3-5], influences how the visual cortex shows sensitivity to the intelligibility of speech while tracking silent lip movements.
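The "tracking" or "entrainment" described in the abstract is commonly quantified as spectral coupling between a cortical signal and the amplitude envelope of the acoustic speech. The sketch below is not the authors' actual pipeline; the signal names, sampling rate, and the 1-7 Hz band are illustrative assumptions. It shows one standard way to compute such a coupling measure, using a Hilbert-transform envelope and magnitude-squared coherence.

import numpy as np
from scipy.signal import hilbert, coherence

fs = 200.0                       # analysis sampling rate in Hz (assumed)
t = np.arange(0, 60, 1 / fs)     # 60 s of toy data

# Toy acoustic signal and a toy "cortical" signal that partly follows its envelope
speech = np.random.randn(t.size)
envelope = np.abs(hilbert(speech))           # amplitude envelope via Hilbert transform
neural = 0.5 * envelope + np.random.randn(t.size)

# Frequency-resolved coherence; envelope-tracking studies typically focus on
# low frequencies (roughly 1-7 Hz), where syllabic and prosodic fluctuations live.
f, coh = coherence(envelope, neural, fs=fs, nperseg=int(4 * fs))
band = (f >= 1) & (f <= 7)
print("Mean 1-7 Hz coherence:", coh[band].mean())

In the study itself the "neural" signal would be source-localized MEG activity and the envelope would come from the (unheard) speech recordings, with forward and backward conditions compared; the snippet only illustrates the general form of the measure.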
Full text: 1 Collections: 01-international Database: MEDLINE Main subject: Speech / Speech Perception / Visual Cortex Limits: Adult / Female / Humans / Male Language: En Publication year: 2018 Document type: Article