Results 1 - 2 of 2
1.
J Neurosci; 30(41): 13552-7, 2010 Oct 13.
Article in English | MEDLINE | ID: mdl-20943896

ABSTRACT

Our ability to recognize the emotions of others is a crucial feature of human social cognition. Functional neuroimaging studies indicate that activity in sensorimotor cortices is evoked during the perception of emotion. In the visual domain, right somatosensory cortex activity has been shown to be critical for facial emotion recognition. However, the importance of sensorimotor representations in modalities outside of vision remains unknown. Here we use continuous theta-burst transcranial magnetic stimulation (cTBS) to investigate whether neural activity in the right postcentral gyrus (rPoG) and right lateral premotor cortex (rPM) is involved in nonverbal auditory emotion recognition. Three groups of participants completed same-different tasks on auditory stimuli, discriminating between the emotion expressed and the speakers' identities, before and following cTBS targeted at rPoG, rPM, or the vertex (control site). A task-selective deficit in auditory emotion discrimination was observed. Stimulation to rPoG and rPM resulted in a disruption of participants' abilities to discriminate emotion, but not identity, from vocal signals. These findings suggest that sensorimotor activity may be a modality-independent mechanism which aids emotion discrimination.


Subjects
Auditory Perception/physiology, Cerebral Cortex/physiology, Discrimination, Psychological/physiology, Emotions/physiology, Recognition, Psychology/physiology, Acoustic Stimulation, Adult, Analysis of Variance, Brain Mapping, Female, Functional Laterality/physiology, Humans, Image Processing, Computer-Assisted, Male, Time Factors, Transcranial Magnetic Stimulation/methods
2.
J Cogn Neurosci; 22(3): 474-81, 2010 Mar.
Article in English | MEDLINE | ID: mdl-19302002

ABSTRACT

The rapid detection of affective signals from conspecifics is crucial for the survival of humans and other animals; if those around you are scared, there is reason for you to be alert and to prepare for impending danger. Previous research has shown that the human brain detects emotional faces within 150 msec of exposure, indicating a rapid differentiation of visual social signals based on emotional content. Here we use event-related brain potential (ERP) measures to show for the first time that this mechanism extends to the auditory domain, using human nonverbal vocalizations, such as screams. An early fronto-central positivity to fearful vocalizations compared with spectrally rotated and thus acoustically matched versions of the same sounds started 150 msec after stimulus onset. This effect was also observed for other vocalized emotions (achievement and disgust), but not for affectively neutral vocalizations, and was linked to the perceived arousal of an emotion category. That the timing, polarity, and scalp distribution of this new ERP correlate are similar to ERP markers of emotional face processing suggests that common supramodal brain mechanisms may be involved in the rapid detection of affectively relevant visual and auditory signals.


Subjects
Cerebral Cortex/physiology, Emotions/physiology, Evoked Potentials/physiology, Nonverbal Communication/physiology, Adult, Auditory Perception, Brain Mapping, Female, Humans, Male, Nonverbal Communication/psychology, Reaction Time