Results 1 - 4 of 4
1.
Neuroimage; 203: 116199, 2019 Dec.
Article in English | MEDLINE | ID: mdl-31536804

ABSTRACT

Valence is a dimension of emotion and can be positive, negative, or neutral. Valence can be expressed through the visual and auditory modalities, and within each modality it can be conveyed by different types of stimuli (face, body, voice, or music). This study focused on modality-general representations of valence, that is, valence information shared not only across the visual and auditory modalities but also across different stimulus types within each modality. Functional magnetic resonance imaging (fMRI) data were collected while subjects made affective judgments on silent videos (face and body) and audio clips (voice and music). A searchlight analysis located four areas potentially sensitive to modality-general valence representations: the bilateral postcentral gyrus, the left middle temporal gyrus (MTG), and the right middle frontal gyrus (MFG). Cross-modal classification based on multivoxel pattern analysis (MVPA) was then performed as a validation analysis. It showed that only the left postcentral gyrus could distinguish the three valences (positive versus negative versus neutral) across all stimulus types (face, body, voice, and music); classification also succeeded in the left MTG across the face and body stimulus types. A univariate analysis further revealed valence-specific activation differences across stimulus types in the MTG. Our study shows that the left postcentral gyrus carries information about valence representations, and it extends research on valence representation by demonstrating modality-general representations of valence not only across the visual and auditory modalities but also across different stimulus types within each modality.
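The cross-modal classification logic described above can be illustrated with a minimal sketch: a classifier is trained on multivoxel patterns evoked by one stimulus type and tested on patterns evoked by another, so that above-chance accuracy implies a shared (modality-general) valence code. The data below are synthetic, and the use of scikit-learn and all variable names (`face`, `voice`, etc.) are illustrative assumptions, not the authors' pipeline.

```python
# Sketch of cross-modal MVPA classification on synthetic "voxel" patterns.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_trials, n_voxels = 60, 50
labels = np.repeat([0, 1, 2], n_trials // 3)  # positive / negative / neutral

# Simulate a valence signal shared across modalities, plus independent noise
signal = rng.normal(size=(3, n_voxels))
face = signal[labels] + 0.5 * rng.normal(size=(n_trials, n_voxels))
voice = signal[labels] + 0.5 * rng.normal(size=(n_trials, n_voxels))

# Train on face-evoked patterns, test on voice-evoked patterns
clf = LogisticRegression(max_iter=1000).fit(face, labels)
acc = clf.score(voice, labels)  # cross-modal generalization accuracy
print(round(acc, 2))
```

In a real searchlight analysis this train/test step would be repeated for the voxels inside each searchlight sphere, and the resulting accuracy map tested against chance (here 1/3).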


Subjects
Auditory Perception/physiology, Brain/physiology, Emotions/physiology, Visual Perception/physiology, Acoustic Stimulation, Adult, Brain Mapping, Female, Humans, Image Processing, Computer-Assisted, Magnetic Resonance Imaging, Male, Photic Stimulation, Young Adult
2.
Neuroscience; 372: 87-96, 2018 Feb 21.
Article in English | MEDLINE | ID: mdl-29294340

ABSTRACT

'Significant' objects contribute greatly to scene recognition. The lateral occipital complex (LOC), parahippocampal place area (PPA), and retrosplenial cortex (RSC) play crucial roles in the cognitive processing of objects and scenes, but the mechanism associating objects with scenes remains unclear. In this study, four categories of scene images and four types of significant objects were used as stimuli. Representational similarity analysis (RSA) of functional magnetic resonance imaging (fMRI) data showed that the correlation coefficients between the activity patterns for objects and scenes were significantly positive in the LOC and PPA. Compared with out-of-scene objects, the correlations for within-scene objects were significantly stronger in the PPA and in two subregions of the LOC: the lateral occipital area (LO) and the posterior fusiform area (pF). Further correlation analyses showed that scene-object correlations differed between indoor and outdoor scenes in the LO, pF, and PPA. Semantic associations were represented in the LO and pF, while the PPA was involved in both semantic associations and spatial characteristics, being sensitive to the openness of scenes. These trends were not observed in the RSC, suggesting that it is not recruited to process semantic associations between scenes and objects. Our findings advance the understanding of the neural mechanism of scene recognition.
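The core RSA comparison above, correlating an object's activity pattern with a scene's and contrasting within-scene against out-of-scene objects, can be sketched as follows. All patterns are synthetic and the mixing weights are arbitrary assumptions chosen only to make the within-scene object share variance with its scene.

```python
# Illustrative scene-object pattern correlation on synthetic data.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(1)
n_voxels = 100
scene_pattern = rng.normal(size=n_voxels)

# A within-scene object is simulated to share part of the scene's
# representation; an out-of-scene object is independent of it.
within_obj = 0.7 * scene_pattern + 0.3 * rng.normal(size=n_voxels)
outside_obj = rng.normal(size=n_voxels)

r_within, _ = pearsonr(scene_pattern, within_obj)
r_outside, _ = pearsonr(scene_pattern, outside_obj)
print(round(r_within, 2), round(r_outside, 2))
```

A region behaving like the PPA or LO in the study would show the `r_within > r_outside` pattern when these correlations are computed from real fMRI activity patterns and compared statistically across subjects.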


Subjects
Association, Brain/physiology, Pattern Recognition, Visual/physiology, Semantics, Brain/diagnostic imaging, Brain Mapping, Female, Humans, Magnetic Resonance Imaging, Male, Multivariate Analysis, Neuropsychological Tests, Photic Stimulation, Regression Analysis, Young Adult
3.
Front Hum Neurosci; 12: 419, 2018.
Article in English | MEDLINE | ID: mdl-30405375

ABSTRACT

Emotions can be perceived from the face, the body, and the whole person, but previous studies of abstract emotion representations have focused only on facial and bodily emotions. It remains unclear whether, in specific brain regions, emotions can be represented at an abstract level independent of all three sensory cues. In this study, we used representational similarity analysis (RSA) to test the hypothesis that emotion category is independent of the three stimulus types and can be decoded from the activity patterns elicited by different emotions. Functional magnetic resonance imaging (fMRI) data were collected while participants classified emotions (angry, fearful, and happy) expressed in videos of faces, bodies, and whole persons. An abstract emotion model was defined to estimate the neural representational structure in a whole-brain RSA; the model assumed that neural patterns are significantly correlated between within-emotion conditions, regardless of stimulus type, but uncorrelated between between-emotion conditions. A neural representational dissimilarity matrix (RDM) for each voxel was then compared to the abstract emotion model to examine whether specific clusters carried an abstract representation of emotions that generalized across stimulus types. Significantly positive correlations between the neural RDMs and the model indicated that the representational space of specific clusters captured the abstract representation of emotions. The whole-brain RSA revealed an emotion-specific but stimulus-category-independent neural representation in the left postcentral gyrus, left inferior parietal lobe (IPL), and right superior temporal sulcus (STS).
Further cluster-based MVPA with cross-modal classification revealed that only the left postcentral gyrus could distinguish the three emotions for two stimulus-type pairs (face-body and body-whole person), and could distinguish happy from angry/fearful (interpretable as positive versus negative) for all three stimulus-type pairs. Our study suggests that abstract representations of the three emotions (angry, fearful, and happy) extend from face and body stimuli to whole-person stimuli, supporting abstract representations of emotions in the left postcentral gyrus.
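The abstract emotion model described above can be sketched as a model RDM over the 9 conditions (3 emotions × 3 stimulus types): same-emotion condition pairs get low dissimilarity regardless of stimulus type, different-emotion pairs get high dissimilarity, and this model RDM is rank-correlated with a neural RDM. The neural RDM here is synthetic; condition ordering and the noise level are arbitrary assumptions.

```python
# Sketch of comparing a neural RDM against an abstract emotion model RDM.
import numpy as np
from scipy.stats import spearmanr

# 9 conditions: 3 emotions (angry, fearful, happy) x 3 stimulus types
# (face, body, whole person), ordered emotion-major.
emotions = np.repeat([0, 1, 2], 3)
model_rdm = (emotions[:, None] != emotions[None, :]).astype(float)

# Synthetic neural RDM: the model structure plus symmetric noise
rng = np.random.default_rng(2)
noise = rng.normal(scale=0.3, size=model_rdm.shape)
neural_rdm = model_rdm + (noise + noise.T) / 2
np.fill_diagonal(neural_rdm, 0)

# Rank-correlate the off-diagonal (upper-triangle) cells of the two RDMs
iu = np.triu_indices(9, k=1)
rho, _ = spearmanr(model_rdm[iu], neural_rdm[iu])
print(round(rho, 2))
```

In the whole-brain analysis this correlation would be computed per searchlight or cluster, and clusters with significantly positive `rho` across subjects would be taken as carrying a stimulus-type-independent emotion representation.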

4.
Front Hum Neurosci; 11: 653, 2017.
Article in English | MEDLINE | ID: mdl-29375348

ABSTRACT

The human brain can rapidly and effortlessly perceive a person's emotional state by integrating isolated emotional faces and bodies into a whole. Behavioral studies have suggested that the brain encodes whole persons holistically rather than in a part-based manner, and neuroimaging studies have shown that body-selective areas prefer whole persons to the sum of their parts; these areas play a crucial role in representing the relationships between emotions expressed by different body parts. However, it remains unclear in which regions the perception of whole persons is represented as a combination of faces and bodies, and to what extent that combination is influenced by the whole person's emotion. In the present study, functional magnetic resonance imaging data were collected while participants performed an emotion-discrimination task. Multivoxel pattern analysis was conducted to examine how whole-person-evoked responses related to face- and body-evoked responses in several specific brain areas. We found that in the extrastriate body area (EBA), whole-person patterns were most closely correlated with weighted sums of face and body patterns, with different weights for happy expressions but equal weights for angry and fearful ones. These results were unique to the EBA. Our findings tentatively support the idea that whole-person patterns are represented in a part-based manner in the EBA and are modulated by emotion. These data further our understanding of the neural mechanism underlying the perception of emotional persons.
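The weighted-sum analysis above can be sketched as a two-parameter least-squares fit: find weights on the face and body patterns whose sum best matches the whole-person pattern, then compare that fit against an equal-weights combination. The data, the specific weights (0.8/0.2), and the noise level are synthetic assumptions used only to illustrate the fitting step.

```python
# Sketch of fitting whole-person patterns as weighted sums of part patterns.
import numpy as np

rng = np.random.default_rng(3)
n_voxels = 200
face = rng.normal(size=n_voxels)
body = rng.normal(size=n_voxels)
# Simulated whole-person pattern that weights the face more heavily,
# as might occur for a happy expression in the study's account
whole = 0.8 * face + 0.2 * body + 0.05 * rng.normal(size=n_voxels)

# Least-squares fit of the two part weights
X = np.column_stack([face, body])
w, *_ = np.linalg.lstsq(X, whole, rcond=None)

# Compare fitted-weight and equal-weight reconstructions of the whole
r_fitted = np.corrcoef(X @ w, whole)[0, 1]
r_equal = np.corrcoef(0.5 * (face + body), whole)[0, 1]
print(w.round(2))
```

An EBA-like result corresponds to the fitted weights being unequal for happy expressions (so `r_fitted` beats the equal-weights reconstruction) but close to equal for angry and fearful ones.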
