Results 1 - 2 of 2
1.
Cogn Affect Behav Neurosci; 21(1): 231-241, 2021 Feb.
Article in English | MEDLINE | ID: mdl-33474716

ABSTRACT

Individuals with a predisposition to empathize engage with sad music in a compelling way, experiencing overall more pleasurable emotions. However, the neural mechanisms underlying these music-related experiences in empathic individuals are unknown. The present study tested whether dispositional empathy modulates neural responses to sad compared with happy music. Twenty-four participants underwent fMRI while listening to 4-min blocks of music evoking sadness or happiness. Using voxel-wise regression, we found a positive correlation between trait empathy (with scores assessed by the Interpersonal Reactivity Index) and eigenvector centrality values in the ventromedial prefrontal cortex (vmPFC), including the medial orbitofrontal cortex (mOFC). We then performed a functional connectivity (FC) analysis to detect network nodes showing stronger FC with the vmPFC/mOFC during the presentation of sad versus happy music. By doing so, we identified a "music-empathy" network (vmPFC/mOFC, dorsomedial prefrontal cortex, primary visual cortex, bilateral claustrum and putamen, and cerebellum) that is spontaneously recruited while listening to sad music and includes brain regions that support the coding of compassion, mentalizing, and visual mental imagery. Importantly, our findings extend the current understanding of empathic behaviors to the musical domain and pinpoint sad music as an effective stimulus to be employed in social neuroscience research.
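The abstract above describes two analysis steps: computing voxel-wise eigenvector centrality of the functional connectome and regressing it across subjects on trait empathy (IRI) scores. The snippet below is a minimal sketch of that idea on synthetic data, not the authors' pipeline; all array names, sizes, and values are illustrative assumptions.

```python
# Sketch: eigenvector centrality mapping + across-subject regression on empathy.
# Synthetic data only; in the study this would run on preprocessed fMRI signals.
import numpy as np

rng = np.random.default_rng(0)
n_subjects, n_nodes, n_timepoints = 24, 200, 240

# Hypothetical per-subject node time series (study: voxel/parcel BOLD signals).
timeseries = rng.standard_normal((n_subjects, n_nodes, n_timepoints))
# Hypothetical trait-empathy scores (study: Interpersonal Reactivity Index).
iri_scores = rng.uniform(40, 100, size=n_subjects)

def eigenvector_centrality(ts):
    """Centrality of each node in its non-negative correlation graph."""
    corr = np.abs(np.corrcoef(ts))      # keep weights non-negative
    np.fill_diagonal(corr, 0.0)
    vals, vecs = np.linalg.eigh(corr)
    ec = np.abs(vecs[:, -1])            # eigenvector of the largest eigenvalue
    return ec / ec.sum()

# Subjects x nodes matrix of centrality values.
centrality = np.array([eigenvector_centrality(ts) for ts in timeseries])

# Node-wise regression of centrality on empathy (study: voxel-wise with IRI).
X = np.column_stack([np.ones(n_subjects), iri_scores])
beta, *_ = np.linalg.lstsq(X, centrality, rcond=None)
empathy_slope = beta[1]                 # one slope per node; positive slopes
print(empathy_slope.shape)              # mirror the reported vmPFC/mOFC effect
```

A seed-based functional connectivity contrast (sad vs. happy blocks) with the peak node as seed would then identify the network nodes described in the abstract.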


Subject(s)
Music, Brain/diagnostic imaging, Empathy, Happiness, Humans, Sadness
2.
Soc Cogn Affect Neurosci; 9(11): 1770-8, 2014 Nov.
Article in English | MEDLINE | ID: mdl-24298171

ABSTRACT

While watching movies, the brain integrates the visual information and the musical soundtrack into a coherent percept. Multisensory integration can lead to emotion elicitation on which soundtrack valences may have a modulatory impact. Here, dynamic kissing scenes from romantic comedies were presented to 22 participants (13 females) during functional magnetic resonance imaging scanning. The kissing scenes were either accompanied by happy music, sad music or no music. Evidence from cross-modal studies motivated a predefined three-region network for multisensory integration of emotion, consisting of fusiform gyrus (FG), amygdala (AMY) and anterior superior temporal gyrus (aSTG). The interactions in this network were investigated using dynamic causal models of effective connectivity. This revealed bilinear modulations by happy and sad music with suppression effects on the connectivity from FG and AMY to aSTG. Non-linear dynamic causal modeling showed a suppressive gating effect of aSTG on fusiform-amygdalar connectivity. In conclusion, fusiform to amygdala coupling strength is modulated via feedback through aSTG as region for multisensory integration of emotional material. This mechanism was emotion-specific and more pronounced for sad music. Therefore, soundtrack valences may modulate emotion elicitation in movies by differentially changing preprocessed visual information to the amygdala.
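The effective-connectivity analysis above rests on the dynamic causal modeling state equation, where an experimental input (here, the soundtrack) modulates specific connections. The following is a minimal sketch of a bilinear DCM simulation for a hypothetical three-node network (FG, AMY, aSTG), not the authors' SPM code; all parameter values are arbitrary illustrations of the suppression effect described.

```python
# Sketch: bilinear DCM neuronal state equation, dx/dt = (A + u*B) x + C u,
# for a toy FG -> AMY -> aSTG network with soundtrack-modulated connections.
import numpy as np

labels = ["FG", "AMY", "aSTG"]

# Fixed (endogenous) connectivity A: column = source, row = target.
A = np.array([[-0.5, 0.0, 0.0],
              [0.4, -0.5, 0.0],
              [0.3, 0.3, -0.5]])

# Bilinear modulation by the sad soundtrack: suppresses FG->aSTG and AMY->aSTG.
B_sad = np.zeros((3, 3))
B_sad[2, 0] = -0.2
B_sad[2, 1] = -0.2

# Driving input C: the visual (kissing-scene) stimulus drives FG.
C = np.array([0.8, 0.0, 0.0])

def simulate(T=30.0, dt=0.01):
    steps = int(T / dt)
    x = np.zeros(3)                    # neuronal states of FG, AMY, aSTG
    trace = np.zeros((steps, 3))
    for k in range(steps):
        t = k * dt
        u_scene = 1.0 if 5.0 <= t <= 25.0 else 0.0   # stimulus on/off
        u_sad = u_scene                               # sad music during scene
        dx = (A + u_sad * B_sad) @ x + C * u_scene    # bilinear state equation
        x = x + dt * dx                               # Euler integration
        trace[k] = x
    return trace

activity = simulate()
print({lab: round(float(activity[:, i].max()), 3) for i, lab in enumerate(labels)})
```

The nonlinear variant reported in the paper additionally lets one region's state (aSTG) gate a connection (FG to AMY), which would add a term of the form x_aSTG * D to the effective coupling matrix.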


Subject(s)
Amygdala/physiology, Auditory Perception/physiology, Emotions/physiology, Music, Neural Pathways/physiology, Temporal Lobe/physiology, Acoustic Stimulation, Adult, Amygdala/blood supply, Female, Humans, Image Processing, Computer-Assisted, Magnetic Resonance Imaging, Male, Middle Aged, Models, Neurological, Neural Pathways/blood supply, Oxygen/blood, Photic Stimulation, Reaction Time, Temporal Lobe/blood supply, Young Adult