Neural processing of speech comprehension in noise predicts individual age using fNIRS-based brain-behavior models.
Cereb Cortex; 34(5), 2024 May 02.
Article in English | MEDLINE | ID: mdl-38715408
ABSTRACT
Speech comprehension in noise depends on complex interactions between peripheral sensory and central cognitive systems. Despite having normal peripheral hearing, older adults show difficulties in speech comprehension, and it remains unclear whether the brain's neural responses can indicate aging. The current study examined whether individual brain activation during speech perception in different listening environments could predict age. We applied functional near-infrared spectroscopy (fNIRS) to 93 normal-hearing human adults (20 to 70 years old) during a sentence-listening task that comprised a quiet condition and four noisy conditions at different signal-to-noise ratios (SNR = 10, 5, 0, -5 dB). A data-driven approach, region-based brain-age predictive modeling, was adopted. We observed a significant behavioral decline with age under the four noisy conditions, but not under the quiet condition. Brain activations in the SNR = 10 dB listening condition successfully predicted individual age. Moreover, we found that the bilateral visual sensory cortex, left dorsal speech pathway, left cerebellum, right temporal-parietal junction, right homolog of Wernicke's area, and right middle temporal gyrus contributed most to prediction performance. These results demonstrate that activations of regions involved in sensory-motor mapping of sound, especially in noisy conditions, can be more sensitive measures for age prediction than external behavioral measures.
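The abstract describes region-based brain-age predictive modeling: a regression model that maps per-region brain activations to chronological age, evaluated by how well predicted age tracks true age. The sketch below illustrates that general workflow with simulated data and closed-form ridge regression under leave-one-out cross-validation. All dimensions, weights, and noise levels are hypothetical; this is not the study's actual pipeline, channel montage, or model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: 93 participants with activation features from a set of
# fNIRS regions (dimensions and effect sizes are illustrative only).
n_subjects, n_regions = 93, 20
age = rng.uniform(20, 70, n_subjects)

# Simulate region activations that partly track age, plus measurement noise.
weights = rng.normal(0.0, 1.0, n_regions)
activation = np.outer((age - age.mean()) / age.std(), weights)
activation += rng.normal(0.0, 2.0, activation.shape)

def ridge_fit(X, y, alpha=1.0):
    """Closed-form ridge regression: solve (X^T X + alpha*I) w = X^T y."""
    Xc = np.c_[np.ones(len(X)), X]            # prepend intercept column
    I = np.eye(Xc.shape[1])
    I[0, 0] = 0.0                             # do not penalize the intercept
    return np.linalg.solve(Xc.T @ Xc + alpha * I, Xc.T @ y)

def ridge_predict(w, X):
    return np.c_[np.ones(len(X)), X] @ w

# Leave-one-out cross-validated age prediction.
preds = np.empty(n_subjects)
for i in range(n_subjects):
    mask = np.arange(n_subjects) != i
    w = ridge_fit(activation[mask], age[mask])
    preds[i] = ridge_predict(w, activation[i][None, :])[0]

# Prediction accuracy: correlation between predicted and true age.
r = np.corrcoef(preds, age)[0, 1]
print(f"predicted-vs-true age correlation: r = {r:.2f}")
```

With a real dataset, the activation matrix would hold measured fNIRS responses per region and condition (e.g., the SNR = 10 dB condition highlighted in the abstract), and region contributions could be read off the fitted weights.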
Full text: 1
Database: MEDLINE
Main subjects: Speech Perception / Brain / Aging / Near-Infrared Spectroscopy / Comprehension / Noise
Limits: Adult / Aged / Female / Humans / Male / Middle aged
Language: English
Journal: Cereb Cortex
Journal subject: Brain
Publication year: 2024
Document type: Article