Cereb Cortex; 30(2): 597-606, 2020 Mar 21.
Article in English | MEDLINE | ID: mdl-31216008

ABSTRACT

Sounds (e.g., barking) help us to visually identify objects (e.g., a dog) that are distant or ambiguous. While neuroimaging studies have revealed neuroanatomical sites of audiovisual interactions, little is known about the time course by which sounds facilitate visual object processing. Here we used magnetoencephalography to reveal the time course of the facilitatory influence of natural sounds (e.g., barking) on visual object processing and compared this to the facilitatory influence of spoken words (e.g., "dog"). Participants viewed images of blurred objects preceded by a task-irrelevant natural sound, a spoken word, or uninformative noise. A classifier was trained to discriminate multivariate sensor patterns evoked by animate and inanimate intact objects with no sounds, presented in a separate experiment, and tested on sensor patterns evoked by the blurred objects in the three auditory conditions. Results revealed that both sounds and words, relative to uninformative noise, significantly facilitated visual object category decoding between 300 and 500 ms after visual onset. We found no evidence for earlier facilitation by sounds than by words. These findings provide evidence for a semantic route of facilitation by both natural sounds and spoken words, whereby the auditory input first activates semantic object representations, which then modulate the visual processing of objects.
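The analysis described above is a time-resolved cross-decoding scheme: a classifier is trained on MEG sensor patterns from a localizer (intact objects, no sound) and tested, time point by time point, on patterns from the blurred-object trials in each auditory condition. The sketch below illustrates the general idea with synthetic data; the array shapes, variable names, and the choice of a standardized linear SVM are illustrative assumptions, not details taken from the paper.

```python
# Hedged sketch of time-resolved cross-decoding of object animacy from MEG
# sensor patterns. Synthetic data stand in for the real recordings.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)

# Assumed shapes: trials x sensors x time points (e.g., 272 sensors, samples
# covering -100..500 ms). X_train/y_train mimic the localizer (intact objects,
# no sound); X_test/y_test mimic blurred objects in one auditory condition.
n_train, n_test, n_sensors, n_times = 200, 120, 272, 61
X_train = rng.standard_normal((n_train, n_sensors, n_times))
y_train = rng.integers(0, 2, n_train)          # 0 = inanimate, 1 = animate
X_test = rng.standard_normal((n_test, n_sensors, n_times))
y_test = rng.integers(0, 2, n_test)

# Fit a classifier at each time point on the localizer patterns and test it
# at the same time point on the blurred-object patterns.
accuracy = np.empty(n_times)
for t in range(n_times):
    clf = make_pipeline(StandardScaler(), LinearSVC())
    clf.fit(X_train[:, :, t], y_train)
    accuracy[t] = clf.score(X_test[:, :, t], y_test)

# Comparing such accuracy time courses across the sound, word, and noise
# conditions (with appropriate statistics) yields the facilitation effect
# reported between 300 and 500 ms after visual onset.
print(f"peak decoding accuracy: {accuracy.max():.2f}")
```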


Subjects
Auditory Perception/physiology; Cues; Pattern Recognition, Visual/physiology; Semantics; Speech Perception/physiology; Acoustic Stimulation; Adult; Female; Humans; Magnetoencephalography; Male; Photic Stimulation; Young Adult