EEG-based auditory attention decoding with audiovisual speech for hearing-impaired listeners.
Cereb Cortex; 33(22): 10972-10983, 2023 Nov 04.
Article | En | MEDLINE | ID: mdl-37750333
Auditory attention decoding (AAD) has been used to determine the attended speaker during an auditory selective attention task. However, the auditory factors modulating AAD remain unclear for hearing-impaired (HI) listeners. In this study, scalp electroencephalogram (EEG) was recorded with an auditory selective attention paradigm in which HI listeners were instructed to attend to one of two simultaneous speech streams, with or without congruent visual input (articulation movements), and at a high or low target-to-masker ratio (TMR). Behavioral hearing tests (i.e., audiogram, speech reception threshold, temporal modulation transfer function) were used to assess listeners' individual auditory abilities. The results showed that both visual input and increasing TMR significantly enhanced the cortical tracking of the attended speech and AAD accuracy. Further analysis revealed that the audiovisual (AV) gain in attended speech cortical tracking was significantly correlated with listeners' auditory amplitude modulation (AM) sensitivity, and that the TMR gain in attended speech cortical tracking was significantly correlated with listeners' hearing thresholds. Temporal response function analysis revealed that subjects with higher AM sensitivity showed greater AV gain over the right occipitotemporal and bilateral frontocentral scalp electrodes.
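The abstract does not specify the decoding algorithm, but a common approach to AAD is a backward (stimulus-reconstruction) model: a ridge-regression decoder maps time-lagged EEG to the speech envelope, and the attended speaker is taken to be the one whose envelope correlates more strongly with the reconstruction. The sketch below illustrates this generic pipeline on synthetic data; all function names, the lag count, and the regularization value are illustrative assumptions, not the paper's actual method or parameters.

```python
import numpy as np

def lagged(eeg, max_lag):
    """Design matrix of time-lagged EEG: (samples, channels * max_lag)."""
    n, ch = eeg.shape
    X = np.zeros((n, ch * max_lag))
    for lag in range(max_lag):
        X[lag:, lag * ch:(lag + 1) * ch] = eeg[:n - lag]
    return X

def train_decoder(eeg, envelope, max_lag=16, reg=1e2):
    """Ridge-regression backward model mapping EEG to a speech envelope."""
    X = lagged(eeg, max_lag)
    return np.linalg.solve(X.T @ X + reg * np.eye(X.shape[1]), X.T @ envelope)

def decode_attention(eeg, env_a, env_b, w, max_lag=16):
    """Reconstruct the envelope; report which speaker it correlates with more."""
    rec = lagged(eeg, max_lag) @ w
    r_a = np.corrcoef(rec, env_a)[0, 1]
    r_b = np.corrcoef(rec, env_b)[0, 1]
    return "A" if r_a > r_b else "B"

# Synthetic demo: EEG channels carry the attended envelope plus noise.
rng = np.random.default_rng(0)
n, ch = 2000, 8
env_a = rng.standard_normal(n)  # attended speech envelope (synthetic)
env_b = rng.standard_normal(n)  # ignored speech envelope (synthetic)
eeg = env_a[:, None] * rng.standard_normal(ch) + 0.5 * rng.standard_normal((n, ch))
w = train_decoder(eeg, env_a)
print(decode_attention(eeg, env_a, env_b, w))
```

In practice the decoder is trained and evaluated on separate trials, and AAD accuracy is the fraction of test segments for which the attended speaker is identified correctly; the study's reported AV and TMR gains would correspond to changes in that accuracy and in the reconstruction correlation.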
Full text: 1
Database: MEDLINE
Main subject: Speech Perception / Hearing Loss
Limit: Humans
Language: En
Journal: Cereb Cortex
Journal subject: Brain
Year: 2023
Document type: Article
Country of affiliation: China