Results 1 - 2 of 2
1.
Eur J Neurosci ; 51(5): 1234-1241, 2020 03.
Article in English | MEDLINE | ID: mdl-29205588

ABSTRACT

Previous research has shown that it is possible to predict which speaker is attended in a multispeaker scene by analyzing a listener's electroencephalography (EEG) activity. In this study, existing linear models that learn the mapping from neural activity to an attended speech envelope are replaced by a non-linear neural network (NN). The proposed architecture takes into account the temporal context of the estimated envelope and is evaluated using EEG data obtained from 20 normal-hearing listeners who focused on one speaker in a two-speaker setting. The network is optimized with respect to the frequency range and the temporal segmentation of the EEG input, as well as the cost function used to estimate the model parameters. To identify the salient cues involved in auditory attention, a relevance algorithm is applied that highlights the electrode signals most important for attention decoding. In contrast to linear approaches, the NN profits from a wider EEG frequency range (1-32 Hz) and achieves a performance seven times higher than the linear baseline. Relevant EEG activations at physiologically plausible locations were found approximately 170 ms after the speech stimulus. This was not observed when the model was trained on the unattended speaker. Our findings therefore indicate that non-linear NNs can provide insight into physiological processes by analyzing EEG activity.
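The decoding scheme summarized above — train a non-linear network to reconstruct the attended speech envelope from EEG, then compare the reconstruction against the candidate speakers — can be sketched as follows. This is a minimal illustration on synthetic data, not the architecture from the paper (which used temporal context, a tuned 1-32 Hz frequency range, and real EEG from 20 listeners); all dimensions, learning rates, and variable names here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in data: 64-channel "EEG" generated as a noisy
# projection of the attended envelope; the unattended envelope is
# independent of the EEG.
n_samples, n_ch = 2000, 64
attended = rng.standard_normal(n_samples)
unattended = rng.standard_normal(n_samples)
mixing = rng.standard_normal(n_ch)
eeg = np.outer(attended, mixing) + 0.5 * rng.standard_normal((n_samples, n_ch))

# One-hidden-layer non-linear network mapping EEG at each time step
# to an envelope estimate, trained by gradient descent on MSE.
h, lr = 16, 0.05
W1 = rng.standard_normal((n_ch, h)) * 0.1
b1 = np.zeros(h)
W2 = rng.standard_normal(h) * 0.1
b2 = 0.0

for epoch in range(300):
    z = np.tanh(eeg @ W1 + b1)          # hidden activations
    pred = z @ W2 + b2                  # estimated envelope
    err = pred - attended               # gradient of 0.5 * MSE
    gW2 = z.T @ err / n_samples
    gb2 = err.mean()
    dz = np.outer(err, W2) * (1 - z ** 2)
    gW1 = eeg.T @ dz / n_samples
    gb1 = dz.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

# Attention decision: which speaker's envelope does the
# reconstruction correlate with more strongly?
z = np.tanh(eeg @ W1 + b1)
pred = z @ W2 + b2
r_att = np.corrcoef(pred, attended)[0, 1]
r_unatt = np.corrcoef(pred, unattended)[0, 1]
decoded_attended = bool(r_att > r_unatt)
```

The final correlation comparison is the standard decision rule in envelope-reconstruction attention decoding; a real pipeline would compute it on held-out trials rather than the training data.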


Subject(s)
Speech Perception , Speech , Acoustic Stimulation , Electroencephalography , Machine Learning
2.
Int J Audiol ; 57(sup3): S81-S91, 2018 06.
Article in English | MEDLINE | ID: mdl-28395561

ABSTRACT

OBJECTIVE: To investigate the influence of an algorithm designed to enhance or magnify interaural difference cues on speech signals in noisy, spatially complex conditions using both technical and perceptual measurements. To also investigate the combination of interaural magnification (IM), monaural microphone directionality (DIR), and binaural coherence-based noise reduction (BC). DESIGN: Speech-in-noise stimuli were generated using virtual acoustics. A computational model of binaural hearing was used to analyse the spatial effects of IM. Predicted speech quality changes and signal-to-noise ratio (SNR) improvements were also considered. Additionally, a listening test was carried out to assess speech intelligibility and quality. STUDY SAMPLE: Listeners aged 65-79 years with and without sensorineural hearing loss (N = 10 each). RESULTS: IM increased the horizontal separation of concurrent directional sound sources without introducing any major artefacts. In situations with diffuse noise, however, the interaural difference cues were distorted. Preprocessing the binaural input signals with DIR reduced distortion. IM influenced neither speech intelligibility nor speech quality. CONCLUSIONS: The IM algorithm tested here failed to improve speech perception in noise, probably because of the dispersion and inconsistent magnification of interaural difference cues in complex environments.
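The core idea of interaural magnification — scaling up the interaural differences between the left- and right-ear signals — can be illustrated for the level-difference (ILD) cue alone. This is a hypothetical sketch, not the algorithm evaluated in the study (which operates on richer binaural representations, including time/phase cues per frequency band); the function name and parameters are invented for illustration.

```python
import numpy as np

def magnify_ild(left, right, factor=2.0, frame=256, eps=1e-12):
    """Frame-wise interaural level difference (ILD) magnification.

    Measures the ILD of each frame in dB and applies reciprocal gains
    to the two ears so that the output ILD equals factor * input ILD.
    """
    out_l = np.asarray(left, dtype=float).copy()
    out_r = np.asarray(right, dtype=float).copy()
    for start in range(0, len(out_l) - frame + 1, frame):
        sl = slice(start, start + frame)
        pl = np.mean(out_l[sl] ** 2) + eps      # left frame power
        pr = np.mean(out_r[sl] ** 2) + eps      # right frame power
        ild_db = 10 * np.log10(pl / pr)         # current ILD in dB
        extra_db = (factor - 1.0) * ild_db      # difference to add
        # Split the extra difference across both ears: amplifying one
        # ear by g and attenuating the other by g changes the power
        # ratio by g**4, i.e. by 40*log10(g) dB.
        g = 10 ** (extra_db / 40)
        out_l[sl] *= g
        out_r[sl] /= g
    return out_l, out_r
```

For example, a source whose input ILD is about 6 dB comes out with roughly a 12 dB ILD at `factor=2.0`, widening its perceived lateral position. The study's results suggest the weakness of this kind of scheme lies elsewhere: in diffuse noise the measured frame-wise cues fluctuate, so the magnification becomes inconsistent and the cues are distorted.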


Subject(s)
Acoustics , Correction of Hearing Impairment/instrumentation , Cues , Hearing Aids , Hearing Loss, Sensorineural/rehabilitation , Hearing , Models, Theoretical , Persons With Hearing Impairments/rehabilitation , Speech Perception , Acoustic Stimulation , Aged , Algorithms , Audiometry, Pure-Tone , Audiometry, Speech , Computer Simulation , Equipment Design , Female , Hearing Loss, Sensorineural/diagnosis , Hearing Loss, Sensorineural/physiopathology , Hearing Loss, Sensorineural/psychology , Humans , Male , Noise/adverse effects , Perceptual Masking , Persons With Hearing Impairments/psychology , Psychoacoustics , Signal Processing, Computer-Assisted , Speech Intelligibility