Results 1 - 6 of 6
1.
Sci Rep; 12(1): 997, 2022 Jan 19.
Article in English | MEDLINE | ID: mdl-35046506

ABSTRACT

Mental imagery is an important tool in the cognitive control of emotion. The present study tests the prediction that visual imagery can generate and regulate differential fear conditioning via the activation and prioritization of stimulus representations in early visual cortices. We combined differential fear conditioning with manipulations of viewing and imagining basic visual stimuli in humans. We discovered that mental imagery of a fear-conditioned stimulus compared to imagery of a safe conditioned stimulus generated a significantly greater conditioned response as measured by self-reported fear, the skin conductance response, and right anterior insula activity (experiment 1). Moreover, mental imagery effectively down- and up-regulated the fear-conditioned responses (experiment 2). Multivariate classification using the functional magnetic resonance imaging data from retinotopically defined early visual regions revealed significant decoding of the imagined stimuli in V2 and V3 (experiment 1) but significantly reduced decoding in these regions during imagery-based regulation (experiment 2). Together, the present findings indicate that mental imagery can generate and regulate a differential fear-conditioned response via mechanisms of the depictive theory of imagery and the biased-competition theory of attention. These findings also highlight the potential importance of mental imagery in the manifestation and treatment of psychological illnesses.


Subjects
Classical Conditioning, Fear/psychology, Imagination, Adult, Electric Stimulation, Female, Galvanic Skin Response, Humans, Magnetic Resonance Imaging/methods, Male
2.
Neurosci Conscious; 2020(1): niaa014, 2020.
Article in English | MEDLINE | ID: mdl-32793393

ABSTRACT

It has been established that lip reading improves the perception of auditory speech. But does seeing objects themselves help us better hear the sounds they make? Here we report a series of psychophysical experiments in humans showing that the visual enhancement of auditory sensitivity is not confined to speech. We further show that the crossmodal enhancement was associated with the conscious visualization of the stimulus: we can better hear the sounds an object makes when we are conscious of seeing that object. Our work extends an intriguing crossmodal effect, previously circumscribed to speech, to a wider domain of real-world objects, and suggests that consciousness contributes to this effect.

3.
Neuroimage; 174: 1-10, 2018 Jul 1.
Article in English | MEDLINE | ID: mdl-29501874

ABSTRACT

Effective social functioning relies in part on the ability to identify emotions from auditory stimuli and respond appropriately. Previous studies have uncovered brain regions engaged by the affective information conveyed by sound. But some of the acoustical properties of sounds that express certain emotions vary remarkably with the instrument used to produce them, for example the human voice or a violin. Do these brain regions respond in the same way to different emotions regardless of the sound source? To address this question, we had participants (N = 38, 20 females) listen to brief audio excerpts produced by the violin, clarinet, and human voice, each conveying one of three target emotions-happiness, sadness, and fear-while brain activity was measured with fMRI. We used multivoxel pattern analysis to test whether emotion-specific neural responses to the voice could predict emotion-specific neural responses to musical instruments and vice versa. A whole-brain searchlight analysis revealed that patterns of activity within the primary and secondary auditory cortex, posterior insula, and parietal operculum were predictive of the affective content of sound both within and across instruments. Furthermore, classification accuracy within the anterior insula was correlated with behavioral measures of empathy. The findings suggest that these brain regions carry emotion-specific patterns that generalize across sounds with different acoustical properties. Also, individuals with greater empathic ability have more distinct neural patterns related to perceiving emotions. These results extend previous knowledge regarding how the human brain extracts emotional meaning from auditory stimuli and enables us to understand and connect with others effectively.


Subjects
Auditory Cortex/physiology, Auditory Perception/physiology, Emotions/physiology, Acoustic Stimulation, Adolescent, Adult, Affect/physiology, Brain/physiology, Brain Mapping, Female, Humans, Magnetic Resonance Imaging, Male, Young Adult
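The cross-instrument decoding logic described in the abstract above can be sketched in a few lines: train a classifier on emotion-labeled activity patterns from one sound source (e.g. the voice) and test it on patterns from another (e.g. the instruments). This is a minimal illustration on synthetic data, not the authors' actual pipeline; the `simulate` helper and all numbers here are invented for the example.

```python
# Minimal sketch of crossmodal MVPA: emotion decoding that generalizes
# across sound sources. Synthetic data stand in for per-trial fMRI
# voxel patterns.
import numpy as np
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)
n_trials, n_voxels = 60, 100
emotions = np.repeat([0, 1, 2], n_trials // 3)  # happiness, sadness, fear

# Emotion-specific patterns shared across sources, plus source noise.
shared = rng.normal(size=(3, n_voxels))

def simulate(noise=1.0):
    """Simulate one sound source's trial-by-voxel pattern matrix."""
    return shared[emotions] + noise * rng.normal(size=(n_trials, n_voxels))

voice_patterns = simulate()
instrument_patterns = simulate()

# Crossmodal decoding: train on voice trials, test on instrument trials.
clf = LinearSVC().fit(voice_patterns, emotions)
acc = clf.score(instrument_patterns, emotions)
print(f"cross-source decoding accuracy: {acc:.2f} (chance = 0.33)")
```

In the actual study this train-on-one-source, test-on-the-other scheme was run within a whole-brain searchlight; above-chance transfer is what identifies regions carrying source-invariant emotion codes.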
4.
J Neurosci; 32(47): 16629-36, 2012 Nov 21.
Article in English | MEDLINE | ID: mdl-23175818

ABSTRACT

People can identify objects in the environment with remarkable accuracy, regardless of the sensory modality they use to perceive them. This suggests that information from different sensory channels converges somewhere in the brain to form modality-invariant representations, i.e., representations that reflect an object independently of the modality through which it has been apprehended. In this functional magnetic resonance imaging study of human subjects, we first identified brain areas that responded to both visual and auditory stimuli and then used crossmodal multivariate pattern analysis to evaluate the neural representations in these regions for content specificity (i.e., do different objects evoke different representations?) and modality invariance (i.e., do the sight and the sound of the same object evoke a similar representation?). While several areas became activated in response to both auditory and visual stimulation, only the neural patterns recorded in a region around the posterior part of the superior temporal sulcus displayed both content specificity and modality invariance. This region thus appears to play an important role in our ability to recognize objects in our surroundings through multiple sensory channels and to process them at a supramodal (i.e., conceptual) level.


Subjects
Auditory Perception/physiology, Cerebral Cortex/physiology, Parietal Lobe/physiology, Temporal Lobe/physiology, Visual Perception/physiology, Acoustic Stimulation, Analysis of Variance, Brain Mapping, Female, Humans, Image Processing, Computer-Assisted, Imagination/physiology, Magnetic Resonance Imaging, Male, Pattern Recognition, Automated, Photic Stimulation
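The two criteria tested in the abstract above, content specificity and modality invariance, translate directly into two decoding checks: objects must be discriminable within a modality, and a classifier trained on one modality must transfer to the other. The sketch below runs both checks on synthetic data; it is illustrative only, with invented dimensions and noise levels, not the study's analysis code.

```python
# Sketch of the two crossmodal MVPA tests: content specificity
# (within-modality decoding) and modality invariance (train on visual
# patterns, test on auditory patterns of the same objects).
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n_per, n_voxels = 30, 80
objects = np.repeat([0, 1], n_per)       # two object identities
proto = rng.normal(size=(2, n_voxels))   # modality-invariant object codes

visual = proto[objects] + rng.normal(size=(2 * n_per, n_voxels))
auditory = proto[objects] + rng.normal(size=(2 * n_per, n_voxels))

# Content specificity: cross-validated decoding within one modality.
within = cross_val_score(LinearSVC(), visual, objects, cv=5).mean()

# Modality invariance: train on visual trials, test on auditory trials.
cross = LinearSVC().fit(visual, objects).score(auditory, objects)

print(f"within-modality accuracy: {within:.2f}, crossmodal: {cross:.2f}")
```

A region like the posterior superior temporal sulcus, on this logic, is one where both scores exceed chance; a purely unimodal region would pass the first check but fail the second.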
5.
Soc Cogn Affect Neurosci; 3(3): 218-23, 2008 Sep.
Article in English | MEDLINE | ID: mdl-19015113

ABSTRACT

There is evidence that the right hemisphere is involved in processing self-related stimuli. Previous brain imaging research has found a network of right-lateralized brain regions that preferentially respond to seeing one's own face rather than a familiar other. Given that the self is an abstract multimodal concept, we tested whether these brain regions would also discriminate the sound of one's own voice compared to a friend's voice. Participants were shown photographs of their own face and a friend's face, and also listened to recordings of their own voice and a friend's voice during fMRI scanning. Consistent with previous studies, seeing one's own face activated regions in the inferior frontal gyrus (IFG), inferior parietal lobe, and inferior occipital cortex in the right hemisphere. In addition, listening to one's own voice also showed increased activity in the right IFG. These data suggest that the right IFG is concerned with processing self-related stimuli across multiple sensory modalities and that it may contribute to an abstract self-representation.


Subjects
Concept Formation, Frontal Lobe/physiology, Functional Laterality/physiology, Recognition, Psychology/physiology, Self Concept, Voice, Acoustic Stimulation, Adult, Cerebral Cortex/physiology, Female, Humans, Magnetic Resonance Imaging, Male, Reference Values, Speech Perception/physiology, Visual Perception/physiology, Young Adult
6.
Cogn Process; 8(2): 103-13, 2007 Jun.
Article in English | MEDLINE | ID: mdl-17503101

ABSTRACT

We used functional magnetic resonance imaging (fMRI) to investigate the neural systems responding to the sight and to the sound of an action. Subjects saw a video of paper tearing in silence (V), heard the sound of paper tearing (A), and saw and heard the action simultaneously (A + V). Compared to a non-action control stimulus, we found that hearing action sounds (A) activated the anterior inferior frontal gyrus and middle frontal gyrus in addition to primary auditory cortex. The anterior inferior frontal gyrus, which is known to be activated by environmental sounds, also seems to be involved in recognizing actions by sound. Consistent with previous research, seeing an action video (V) compared with seeing a non-action video activated the premotor cortex, intraparietal cortex, and the pars opercularis of the inferior frontal gyrus. An A + V facilitation effect was found in the ventral premotor cortex on the border of areas 44, 6, 3a, and 3b for the action stimuli but not for the control stimuli. This region may be involved in integrating multimodal information about actions. These data provide evidence that the ventral premotor cortex may provide an action representation that abstracts across both agency (self and other) and sensory modality (hearing and seeing). This function may be an important precursor of language functions.


Subjects
Brain Mapping, Functional Laterality/physiology, Mental Processes/physiology, Motor Cortex/physiology, Psychomotor Performance/physiology, Acoustic Stimulation/methods, Adult, Auditory Perception/physiology, Female, Humans, Image Processing, Computer-Assisted/methods, Magnetic Resonance Imaging, Male, Motor Cortex/blood supply, Oxygen/blood, Photic Stimulation/methods