Results 1 - 2 of 2
1.
J Acoust Soc Am; 138(1): 279-83, 2015 Jul.
Article in English | MEDLINE | ID: mdl-26233027

ABSTRACT

Long-term loudness perception of a sound has been presumed to depend on the spatial distribution of activated auditory nerve fibers as well as their temporal firing pattern. The relative contributions of those two factors were investigated by measuring loudness adaptation to sinusoidally amplitude-modulated 12-kHz tones. The tones had a total duration of 180 s and were either unmodulated or 100%-modulated at one of three frequencies (4, 20, or 100 Hz), and additionally varied in modulation depth from 0% to 100% at the 4-Hz frequency only. Every 30 s, normal-hearing subjects estimated the loudness of one of the stimuli played at 15 dB above threshold in random order. Without any amplitude modulation, the loudness of the unmodulated tone after 180 s was only 20% of the loudness at the onset of the stimulus. Amplitude modulation systematically reduced the amount of loudness adaptation, with the 100%-modulated stimuli, regardless of modulation frequency, maintaining on average 55%-80% of the loudness at onset after 180 s. Because the present low-frequency amplitude modulation produced minimal changes in long-term spectral cues affecting the spatial distribution of excitation produced by a 12-kHz pure tone, the present result indicates that neural synchronization is critical to maintaining loudness perception over time.
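The stimulus described above (a 12-kHz carrier, sinusoidally amplitude-modulated at 4, 20, or 100 Hz with depths from 0% to 100%) follows the standard SAM-tone construction, (1 + m·sin(2πf_m·t))·sin(2πf_c·t). A minimal sketch of that construction, with parameter names chosen for illustration (the paper does not specify a sampling rate; 48 kHz is assumed here):

```python
import numpy as np

def sam_tone(fc=12000.0, fm=4.0, depth=1.0, dur=180.0, fs=48000):
    """Sinusoidally amplitude-modulated tone.

    fc    : carrier frequency in Hz (12 kHz in the study)
    fm    : modulation frequency in Hz (4, 20, or 100 Hz)
    depth : modulation depth m, 0.0 (unmodulated) to 1.0 (100%)
    dur   : duration in seconds (180 s in the study)
    fs    : sampling rate in Hz (assumed; not given in the abstract)
    """
    t = np.arange(int(dur * fs)) / fs
    envelope = 1.0 + depth * np.sin(2 * np.pi * fm * t)
    tone = envelope * np.sin(2 * np.pi * fc * t)
    return tone / np.max(np.abs(tone))  # normalize to +/-1 full scale
```

With `depth=0.0` the envelope is constant and the function returns the unmodulated reference tone used in the adaptation comparison.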


Subjects
Adaptation, Physiological , Auditory Pathways/physiology , Loudness Perception/physiology , Acoustic Stimulation , Adult , Auditory Threshold/physiology , Female , Habituation, Psychophysiologic , Humans , Male , Nerve Fibers/physiology , Pitch Perception/physiology , Psychoacoustics , Sound
2.
Atten Percept Psychophys; 78(2): 583-601, 2016 Feb.
Article in English | MEDLINE | ID: mdl-26669309

ABSTRACT

Recent influential models of audiovisual speech perception suggest that visual speech aids perception by generating predictions about the identity of upcoming speech sounds. These models place stock in the assumption that visual speech leads auditory speech in time. However, it is unclear whether and to what extent temporally-leading visual speech information contributes to perception. Previous studies exploring audiovisual-speech timing have relied upon psychophysical procedures that require artificial manipulation of cross-modal alignment or stimulus duration. We introduce a classification procedure that tracks perceptually relevant visual speech information in time without requiring such manipulations. Participants were shown videos of a McGurk syllable (auditory /apa/ + visual /aka/ = perceptual /ata/) and asked to perform phoneme identification (/apa/ yes-no). The mouth region of the visual stimulus was overlaid with a dynamic transparency mask that obscured visual speech in some frames but not others randomly across trials. Variability in participants' responses (~35% identification of /apa/ compared to ~5% in the absence of the masker) served as the basis for classification analysis. The outcome was a high-resolution spatiotemporal map of perceptually relevant visual features. We produced these maps for McGurk stimuli at different audiovisual temporal offsets (natural timing, 50-ms visual lead, and 100-ms visual lead). Briefly, temporally-leading (~130 ms) visual information did influence auditory perception. Moreover, several visual features influenced perception of a single speech sound, with the relative influence of each feature depending on both its temporal relation to the auditory signal and its informational content.
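The classification analysis described above follows the general reverse-correlation logic: random masks determine which video frames are visible on each trial, and the classification image is the difference between the mean mask on /apa/ trials and on non-/apa/ trials. A minimal sketch of that logic with a simulated observer; the trial count, frame count, response rates, and the "informative" frame window are all illustrative assumptions, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
n_trials, n_frames = 2000, 45   # hypothetical trial count and video length

# Random binary transparency masks: 1 = frame visible, 0 = frame obscured
masks = rng.integers(0, 2, size=(n_trials, n_frames))

# Simulated observer: /apa/ responses (i.e., the McGurk fusion failing)
# become more likely when an assumed informative frame window is
# mostly obscured, mimicking the ~35% vs ~5% rates in the abstract
informative = slice(10, 15)
p_apa = np.where(masks[:, informative].mean(axis=1) < 0.5, 0.35, 0.05)
responses = rng.random(n_trials) < p_apa  # True = reported /apa/

# Classification image: frames with strongly negative values are those
# whose visibility suppressed /apa/ reports, i.e., perceptually
# relevant visual speech information
ci = masks[responses].mean(axis=0) - masks[~responses].mean(axis=0)
```

Repeating this at each audiovisual offset (natural, 50-ms lead, 100-ms lead) and over spatial regions as well as frames yields the spatiotemporal maps the study reports.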


Subjects
Acoustic Stimulation/methods , Auditory Perception/physiology , Photic Stimulation/methods , Speech Perception/physiology , Statistics as Topic/methods , Visual Perception/physiology , Adult , Female , Humans , Male , Phonetics , Psychophysics , Speech/physiology , Time Factors