Results 1 - 3 of 3
1.
Proc Natl Acad Sci U S A; 121(10): e2316306121, 2024 Mar 05.
Article in English | MEDLINE | ID: mdl-38408255

ABSTRACT

Music is powerful in conveying emotions and triggering affective brain mechanisms. Affective brain responses in previous studies were, however, rather inconsistent, potentially because of the non-adaptive nature of the recorded music used so far. Live music, in contrast, can be dynamic and adaptive and is often modulated in response to audience feedback to maximize emotional responses in listeners. Here, we introduce a setup for studying emotional responses to live music in a closed-loop neurofeedback design. This setup linked live performances by musicians to neural processing in listeners, with listeners' amygdala activity displayed to the musicians in real time. Brain activity was measured using functional MRI, and amygdala activity in particular was quantified in real time to provide the neurofeedback signal. Live pleasant and unpleasant piano music performed in response to amygdala neurofeedback from listeners was acoustically very different from comparable recorded music and elicited significantly higher and more consistent amygdala activity. Higher activity was also found in a broader neural network for emotion processing during live compared with recorded music. This included a predominance of aversive coding in the ventral striatum while listening to unpleasant music, and involvement of the thalamic pulvinar nucleus, presumably for regulating attentional and cortical flow mechanisms. Live music also stimulated a dense functional neural network with the amygdala as a central node influencing other brain systems. Finally, only live music showed a strong and positive coupling between features of the musical performance and brain activity in listeners, pointing to real-time and dynamic entrainment processes.
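The closed-loop design described in this abstract can be pictured as a simple per-volume cycle: extract the listener's amygdala ROI signal from each incoming functional volume, convert it to a feedback value, and forward it to the performing musicians. The sketch below is only an illustration of that cycle under assumed parameters (TR, baseline length) and with simulated volumes standing in for real-time fMRI data; it is not the authors' implementation.

```python
# Illustrative sketch (not the study's code) of a closed-loop neurofeedback cycle:
# per volume, average the BOLD signal inside an amygdala ROI mask, convert it to
# percent signal change against a running baseline, and forward that value to the
# performer's display. Volume acquisition and the display update are simulated.

import numpy as np

TR_SECONDS = 2.0          # assumed repetition time
BASELINE_VOLUMES = 10     # assumed number of volumes used to build the baseline

def roi_mean(volume: np.ndarray, mask: np.ndarray) -> float:
    """Mean BOLD signal inside the ROI for one volume."""
    return float(volume[mask].mean())

def neurofeedback_stream(volumes, mask):
    """Yield a percent-signal-change feedback value for each incoming volume."""
    baseline_samples = []
    for volume in volumes:
        value = roi_mean(volume, mask)
        if len(baseline_samples) < BASELINE_VOLUMES:
            baseline_samples.append(value)   # still collecting baseline volumes
            yield 0.0
            continue
        baseline = float(np.mean(baseline_samples))
        yield 100.0 * (value - baseline) / baseline

# Simulated run: random 3-D volumes and a toy ROI mask stand in for real-time fMRI data.
rng = np.random.default_rng(0)
mask = np.zeros((4, 4, 4), dtype=bool)
mask[1:3, 1:3, 1:3] = True
volumes = (1000 + rng.normal(0, 5, size=(4, 4, 4)) for _ in range(20))

for t, feedback in enumerate(neurofeedback_stream(volumes, mask)):
    print(f"TR {t:2d}: feedback to performer = {feedback:+.2f}%")  # stand-in for a display update
```

In a real setup the print call would drive the musicians' display, and the ROI mask would come from the individual listener's localizer, but the loop structure is the same.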


Subjects
Music, Music/psychology, Brain/physiology, Emotions/physiology, Amygdala/physiology, Affect, Magnetic Resonance Imaging, Auditory Perception/physiology
2.
Prog Neurobiol; 214: 102278, 2022 Jul.
Article in English | MEDLINE | ID: mdl-35513165

ABSTRACT

Affect signaling in human communication involves cortico-limbic brain systems for decoding affective information, such as that expressed in vocal intonations during affective speech. Both the affecto-acoustic speech profile of speakers and the cortico-limbic affect recognition network of listeners were previously identified using non-social and non-adaptive research protocols. These protocols, however, neglected the inherently socio-dyadic nature of affective communication, thus underestimating the real-time adaptive dynamics of affective speech that maximize listeners' neural effects and affect recognition. To approximate this socio-adaptive and neural context of affective communication, we used an innovative real-time neuroimaging setup that linked speakers' live affective speech production with listeners' limbic brain signals, which served as a proxy for affect recognition. We show that affective speech communication is acoustically more distinctive, adaptive, and individualized in a live adaptive setting and capitalizes more efficiently on neural affect decoding mechanisms in limbic and associated networks than non-adaptive affective speech communication. Only live affective speech produced in adaptation to listeners' limbic signals was closely linked to their emotion recognition, as quantified by correlations between speakers' acoustics and listeners' emotional ratings. Furthermore, while live and adaptive aggressive speaking directly modulated limbic activity in listeners, joyful speaking modulated limbic activity in connection with the ventral striatum, which is involved, among other functions, in the processing of pleasure. Thus, evolved neural mechanisms for affect decoding seem largely optimized for interactive and individually adaptive communicative contexts.
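The acoustics-to-rating coupling mentioned in this abstract can, in its simplest form, be expressed as a Pearson correlation across trials between one speaker acoustic feature and the listener's emotion ratings. The toy sketch below is an assumption-laden illustration (invented feature, invented ratings), not the study's analysis pipeline.

```python
# Illustrative sketch: correlate a speaker's trial-wise acoustic feature (here a
# hypothetical mean F0 per trial) with the listener's trial-wise emotion ratings.
# Data values and feature choice are invented for demonstration only.

import numpy as np

def pearson_r(x: np.ndarray, y: np.ndarray) -> float:
    """Pearson correlation between two equally long 1-D series."""
    x = (x - x.mean()) / x.std()
    y = (y - y.mean()) / y.std()
    return float(np.mean(x * y))

# Toy data standing in for per-trial acoustics and ratings.
rng = np.random.default_rng(1)
mean_f0_per_trial = rng.normal(220, 30, size=40)                      # hypothetical F0 values (Hz)
ratings_per_trial = 0.02 * mean_f0_per_trial + rng.normal(0, 1, 40)   # hypothetical intensity ratings

print(f"acoustic-rating coupling: r = {pearson_r(mean_f0_per_trial, ratings_per_trial):.2f}")
```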


Subjects
Speech, Voice, Aggression, Communication, Emotions, Humans
3.
Commun Biol; 4(1): 801, 2021 Jun 25.
Article in English | MEDLINE | ID: mdl-34172824

ABSTRACT

The temporal voice areas (TVAs) in bilateral auditory cortex (AC) appear specialized for voice processing. Previous research assumed a uniform functional profile for the TVAs, which are broadly distributed along the bilateral AC. Alternatively, the TVAs might comprise separate AC nodes controlling differential neural functions for voice and speech decoding, organized as local micro-circuits. To investigate these micro-circuits, we modeled the directional connectivity between TVA nodes during voice processing in humans while recording brain activity with neuroimaging. The results show several bilateral AC nodes for general voice decoding (speech and non-speech voices) and for speech decoding in particular. Furthermore, non-hierarchical and differential bilateral AC networks manifest distinct excitatory and inhibitory pathways for voice and speech processing. Finally, while voice and speech processing seem to have distinctive but integrated neural circuits in the left AC, the right AC reveals disintegrated neural circuits for the two sound types. Altogether, we demonstrate functional heterogeneity in the TVAs for voice decoding based on local micro-circuits.
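Directional connectivity between cortical nodes, as modeled in this abstract, is often summarized by a coupling matrix whose signed off-diagonal entries are read as excitatory or inhibitory directed influences. The sketch below illustrates that general idea with a minimal linear node model; node labels, coupling values, and the input are invented, and this is a generic stand-in rather than the paper's connectivity model.

```python
# Illustrative sketch, not the paper's model: a minimal linear network of TVA-like
# nodes, dx/dt = A @ x + C @ u, where positive off-diagonal entries of A are read
# as excitatory directed connections and negative entries as inhibitory ones.

import numpy as np

nodes = ["left_TVA_anterior", "left_TVA_posterior", "right_TVA"]  # hypothetical node labels

A = np.array([            # directed coupling matrix (row = target node, column = source node)
    [-1.0,  0.4, -0.2],
    [ 0.3, -1.0,  0.0],
    [ 0.1,  0.0, -1.0],
])
C = np.array([[0.5], [0.0], [0.2]])   # how strongly the voice stimulus drives each node

def simulate(A, C, u, dt=0.1, steps=100):
    """Euler integration of the linear node dynamics under a constant input u."""
    x = np.zeros(A.shape[0])
    for _ in range(steps):
        x = x + dt * (A @ x + C @ np.array([u]))
    return x

activity = simulate(A, C, u=1.0)
for name, level in zip(nodes, activity):
    print(f"{name}: steady-state activity {level:+.2f}")
```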


Subjects
Auditory Cortex/physiology, Auditory Perception/physiology, Nerve Net, Speech Perception/physiology, Adolescent, Adult, Female, Humans, Male, Voice, Young Adult