Cortical tracking of formant modulations derived from silently presented lip movements and its decline with age.
Suess, Nina; Hauswald, Anne; Reisinger, Patrick; Rösch, Sebastian; Keitel, Anne; Weisz, Nathan.
  • Suess N; Department of Psychology, Centre for Cognitive Neuroscience, University of Salzburg, Salzburg 5020, Austria.
  • Hauswald A; Department of Psychology, Centre for Cognitive Neuroscience, University of Salzburg, Salzburg 5020, Austria.
  • Reisinger P; Department of Psychology, Centre for Cognitive Neuroscience, University of Salzburg, Salzburg 5020, Austria.
  • Rösch S; Department of Otorhinolaryngology, Head and Neck Surgery, Paracelsus Medical University Salzburg, University Hospital Salzburg, Salzburg 5020, Austria.
  • Keitel A; School of Social Sciences, University of Dundee, Dundee DD1 4HN, UK.
  • Weisz N; Department of Psychology, Centre for Cognitive Neuroscience, University of Salzburg, Salzburg 5020, Austria.
Cereb Cortex; 32(21): 4818-4833, 2022 Oct 20.
Article in English | MEDLINE | ID: mdl-35062025
ABSTRACT
The integration of visual and auditory cues is crucial for successful processing of speech, especially under adverse conditions. Recent reports have shown that when participants watch muted videos of speakers, the phonological information about the acoustic speech envelope, which is associated with but independent of the speakers' lip movements, is tracked by the visual cortex. However, the speech signal also carries richer acoustic details, for example, about the fundamental frequency and the resonant frequencies, whose visuo-phonological transformation could aid speech processing. Here, we investigated the neural basis of the visuo-phonological transformation processes of these more fine-grained acoustic details and assessed how they change as a function of age. We recorded whole-head magnetoencephalographic (MEG) data while the participants watched silent normal (i.e., natural) and reversed videos of a speaker and paid attention to their lip movements. We found that the visual cortex is able to track the unheard natural modulations of resonant frequencies (or formants) and the pitch (or fundamental frequency) linked to lip movements. Importantly, only the processing of natural unheard formants decreases significantly with age in the visual and also in the cingulate cortex. This is not the case for the processing of the unheard speech envelope, the fundamental frequency, or the purely visual information carried by lip movements. These results show that unheard spectral fine details (along with the unheard acoustic envelope) are transformed from a mere visual to a phonological representation. Aging especially affects the ability to derive spectral dynamics at formant frequencies. As listening in noisy environments should capitalize on the ability to track spectral fine details, our results provide a novel focus on compensatory processes in such challenging situations.

Full text: 1 Database: MEDLINE Main subject: Speech Perception Limits: Humans Language: En Year: 2022 Document type: Article