1.
Atten Percept Psychophys; 78(2): 583-601, 2016 Feb.
Article in English | MEDLINE | ID: mdl-26669309

ABSTRACT

Recent influential models of audiovisual speech perception suggest that visual speech aids perception by generating predictions about the identity of upcoming speech sounds. These models rest on the assumption that visual speech leads auditory speech in time. However, it is unclear whether, and to what extent, temporally leading visual speech information contributes to perception. Previous studies of audiovisual speech timing have relied on psychophysical procedures that require artificial manipulation of cross-modal alignment or stimulus duration. We introduce a classification procedure that tracks perceptually relevant visual speech information in time without requiring such manipulations. Participants were shown videos of a McGurk syllable (auditory /apa/ + visual /aka/ = perceptual /ata/) and asked to perform phoneme identification (/apa/ yes-no). The mouth region of the visual stimulus was overlaid with a dynamic transparency mask that obscured visual speech in some frames but not others, with the obscured frames varying randomly across trials. Variability in participants' responses (~35% identification of /apa/, compared to ~5% in the absence of the masker) served as the basis for classification analysis. The outcome was a high-resolution spatiotemporal map of perceptually relevant visual features. We produced these maps for McGurk stimuli at different audiovisual temporal offsets (natural timing, 50-ms visual lead, and 100-ms visual lead). Temporally leading (~130 ms) visual information did influence auditory perception. Moreover, several visual features influenced perception of a single speech sound, with the relative influence of each feature depending on both its temporal relation to the auditory signal and its informational content.
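
A minimal sketch of the reverse-correlation logic behind such a classification procedure (not the authors' code): it collapses the mask to one visibility value per video frame, uses simulated responses in place of behavioral data, and treats the trial count, frame count, frame rate, and response probability as illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: 300 trials, 45 video frames per stimulus.
n_trials, n_frames = 300, 45

# Per-frame transparency of the dynamic mask on each trial
# (1 = mouth fully visible, 0 = fully obscured), drawn randomly per trial.
visibility = rng.random((n_trials, n_frames))

# Yes/no phoneme-identification responses (/apa/ = True). In the real
# experiment these come from the participant; here they are placeholders
# matching the ~35% /apa/ rate reported above.
responses = rng.random(n_trials) < 0.35

# Classification image: mean mask visibility on /apa/ trials minus mean
# visibility on non-/apa/ trials, frame by frame. Frames whose visibility
# systematically shifted responses receive large (absolute) weights.
class_image = (visibility[responses].mean(axis=0)
               - visibility[~responses].mean(axis=0))

# Express frame indices as time, to ask whether perceptually relevant
# frames lead the acoustic signal (frame rate is an assumption).
frame_rate = 29.97
t_ms = np.arange(n_frames) / frame_rate * 1000
for t, w in zip(t_ms, class_image):
    print(f"{t:7.1f} ms  weight {w:+.3f}")
```

In the study itself the mask varied across space as well as time, so the same logic, applied per pixel and per frame, yields the high-resolution spatiotemporal maps described above.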


Subject(s)
Acoustic Stimulation/methods, Auditory Perception/physiology, Photic Stimulation/methods, Speech Perception/physiology, Statistics as Topic/methods, Visual Perception/physiology, Adult, Female, Humans, Male, Phonetics, Psychophysics, Speech/physiology, Time Factors
2.
J Acoust Soc Am; 138(1): 279-83, 2015 Jul.
Article in English | MEDLINE | ID: mdl-26233027

ABSTRACT

Long-term loudness perception of a sound has been presumed to depend on both the spatial distribution of activated auditory nerve fibers and their temporal firing pattern. The relative contributions of these two factors were investigated by measuring loudness adaptation to sinusoidally amplitude-modulated 12-kHz tones. The tones had a total duration of 180 s and were either unmodulated or 100%-modulated at one of three frequencies (4, 20, or 100 Hz); modulation depth was additionally varied from 0% to 100% at the 4-Hz frequency only. Every 30 s, normal-hearing subjects estimated the loudness of one of the stimuli, which were presented in random order at 15 dB above threshold. Without amplitude modulation, the loudness of the tone after 180 s was only 20% of its loudness at stimulus onset. Amplitude modulation systematically reduced the amount of loudness adaptation: the 100%-modulated stimuli, regardless of modulation frequency, maintained on average 55%-80% of their onset loudness after 180 s. Because the low-frequency amplitude modulation used here produced minimal changes in long-term spectral cues, and hence in the spatial distribution of excitation produced by a 12-kHz pure tone, the present result indicates that neural synchronization is critical to maintaining loudness perception over time.
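
The stimulus itself is straightforward to specify: a sinusoidally amplitude-modulated tone is the product of a carrier and a raised sinusoidal envelope, s(t) = (1 + m sin(2π f_m t)) sin(2π f_c t), where m is the modulation depth. A minimal sketch in Python (the sampling rate, normalization, and function name are assumptions, not details from the paper):

```python
import numpy as np

def sam_tone(fc=12_000.0, fm=4.0, depth=1.0, dur=180.0, fs=48_000):
    """Sinusoidally amplitude-modulated tone:
    s(t) = (1 + depth * sin(2*pi*fm*t)) * sin(2*pi*fc*t).

    depth = 0 gives the unmodulated tone; depth = 1 gives 100% modulation.
    fc, fm, depth, and dur echo the stimuli described above; fs is assumed.
    """
    t = np.arange(int(dur * fs)) / fs
    envelope = 1.0 + depth * np.sin(2 * np.pi * fm * t)
    carrier = np.sin(2 * np.pi * fc * t)
    x = envelope * carrier
    return x / np.max(np.abs(x))  # normalize; presentation level set at playback

# The four main conditions: unmodulated, and 100% modulation at 4, 20, 100 Hz.
conditions = [(0.0, 4.0), (1.0, 4.0), (1.0, 20.0), (1.0, 100.0)]
stimuli = [sam_tone(depth=m, fm=fm) for m, fm in conditions]
```

With f_m at or below 100 Hz, the spectral sidebands lie within ±100 Hz of the 12-kHz carrier, which is why the modulation leaves long-term spectral cues, and hence the spatial excitation pattern, essentially unchanged.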


Subject(s)
Adaptation, Physiological, Auditory Pathways/physiology, Loudness Perception/physiology, Acoustic Stimulation, Adult, Auditory Threshold/physiology, Female, Habituation, Psychophysiologic, Humans, Male, Nerve Fibers/physiology, Pitch Perception/physiology, Psychoacoustics, Sound