Results 1 - 16 of 16
1.
Cogn Affect Behav Neurosci; 18(6): 1269-1282, 2018 Dec.
Article in English | MEDLINE | ID: mdl-30264337

ABSTRACT

Emotional situations are typically better remembered than neutral ones, but the psychological conditions and brain mechanisms underlying this effect remain debated. Stimulus valence and affective arousal have been proposed to explain the privileged status of emotional stimuli in memory. However, neither valence nor arousal alone is sufficient to explain this memory facilitation. Several studies showed that negative and positive details are better remembered than neutral details, yet others showed that neutral information encoded together with arousal gained no memory advantage over neutral information encoded without arousal. We therefore suggest that the fundamental affective dimension responsible for memory facilitation is goal relevance. To test this hypothesis at the behavioral and neural levels, we conducted a functional magnetic resonance imaging study using neutral faces embedded in goal-relevant or goal-irrelevant daily life situations. Behaviorally, we found that neutral faces encountered in goal-relevant situations were better remembered than those encountered in goal-irrelevant situations. To explain this effect, we studied the neural activations involved in goal-relevant processing at encoding and in subsequent recognition of the neutral faces. At encoding, activation of emotional brain regions (anterior cingulate, ventral striatum, ventral tegmental area, and substantia nigra) was greater for processing of goal-relevant than goal-irrelevant situations. At recognition, despite the presentation of neutral faces, activation of regions involved in social processing (superior temporal sulcus) during successful identity recognition was greater for faces previously encountered in goal-relevant than in goal-irrelevant situations.


Subject(s)
Arousal/physiology , Brain/diagnostic imaging , Face , Goals , Memory/physiology , Adult , Brain/physiology , Emotions/physiology , Female , Humans , Magnetic Resonance Imaging , Male , Young Adult
2.
Neuroimage; 100: 608-18, 2014 Oct 15.
Article in English | MEDLINE | ID: mdl-24936680

ABSTRACT

Efficient perceptual identification of emotionally-relevant stimuli requires optimized neural coding. Because sleep contributes to neural plasticity mechanisms, we asked whether the perceptual representation of emotionally-relevant stimuli within sensory cortices is modified after a period of sleep. We show combined effects of sleep and aversive conditioning on subsequent discrimination of face identity information, with parallel plasticity in the amygdala and visual cortex. After one night of sleep (but neither immediately nor after an equal waking interval), a fear-conditioned face was better detected when morphed with another identity. This behavioral change was accompanied by increased selectivity of the amygdala and face-responsive fusiform regions. Overnight neural changes can thus sharpen the representation of threat-related stimuli in cortical sensory areas, in order to improve detection in impoverished or ambiguous situations. These findings reveal an important role of sleep in shaping cortical selectivity to emotionally-relevant cues and thus promoting adaptive responses to new dangers.


Subject(s)
Amygdala/physiology , Cerebral Cortex/physiology , Conditioning, Classical/physiology , Face , Fear/physiology , Neuronal Plasticity/physiology , Sleep/physiology , Adult , Discrimination, Psychological/physiology , Female , Functional Neuroimaging , Humans , Magnetic Resonance Imaging , Male , Random Allocation , Visual Cortex , Young Adult
3.
Cereb Cortex; 22(5): 1107-17, 2012 May.
Article in English | MEDLINE | ID: mdl-21750247

ABSTRACT

To better define the brain network underlying the decoding of emotional prosody, we recorded high-resolution brain scans during implicit and explicit decoding tasks of angry and neutral prosody. Several subregions in the right superior temporal gyrus (STG) and in the bilateral inferior frontal gyrus (IFG) were sensitive to emotional prosody. Implicit processing of emotional prosody engaged regions of the posterior superior temporal gyrus (pSTG) and bilateral IFG subregions, whereas explicit processing relied more on the mid STG, left IFG, amygdala, and subgenual anterior cingulate cortex. Furthermore, whereas some bilateral pSTG regions and the amygdala showed general sensitivity to prosody-specific acoustical features during implicit processing, activity in inferior frontal brain regions was insensitive to these features. Together, the data suggest a differentiated network of STG, IFG, and subcortical brain regions that varies with the level of processing and shows higher specificity during explicit decoding of emotional prosody.


Subject(s)
Auditory Perception/physiology , Brain Mapping , Brain/physiology , Neural Pathways/physiology , Adult , Brain/anatomy & histology , Cues , Emotions/physiology , Female , Humans , Image Interpretation, Computer-Assisted , Magnetic Resonance Imaging , Male , Neural Pathways/anatomy & histology , Young Adult
4.
Cereb Cortex Commun; 4(4): tgad019, 2023.
Article in English | MEDLINE | ID: mdl-38025828

ABSTRACT

Introduction: The ability to process verbal language seems unique to humans and relies not only on semantics but also on other forms of communication, such as affective vocalizations, that we share with other primate species, particularly great apes (Hominidae). Methods: To better understand these processes at the behavioral and brain level, we asked human participants to categorize vocalizations of four primate species, including humans, great apes (chimpanzee and bonobo), and monkeys (rhesus macaque), during MRI acquisition. Results: Classification was above chance level for all species except bonobo vocalizations. Imaging analyses were computed using a participant-specific, trial-by-trial fitted probability of categorization in a model-based style of data analysis. Model-based analyses revealed the implication of the bilateral orbitofrontal cortex and the inferior frontal gyrus pars triangularis (IFGtri), respectively correlating and anti-correlating with the fitted probability of accurate species classification. Further conjunction analyses revealed enhanced activity in a sub-area of the left IFGtri specifically for the accurate classification of chimpanzee calls compared with human voices. Discussion: Our data, which are controlled for acoustic variability between species, therefore reveal distinct frontal mechanisms that shed light on how the human brain evolved to process vocal signals.
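The model-based analysis described above uses trial-by-trial fitted probabilities as a parametric regressor. A minimal sketch of that general idea is shown below; this is not the authors' actual pipeline, and the fitted probabilities are simulated placeholders (in a real analysis they would come from a behavioral model fit per participant).

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated trial-by-trial fitted probabilities of accurate species
# classification (placeholders for values from a behavioral model fit).
n_trials = 40
fitted_prob = rng.uniform(0.2, 0.95, size=n_trials)

# Mean-center the probabilities so the parametric modulator is
# decorrelated from the main trial-onset regressor of the GLM.
modulator = fitted_prob - fitted_prob.mean()

# Hypothetical trial onsets (seconds) and a stick-function regressor
# sampled on the TR grid, weighted by the modulator.
tr, n_scans = 2.0, 200
onsets = np.arange(n_trials) * 9.0 + 4.0
regressor = np.zeros(n_scans)
for onset, weight in zip(onsets, modulator):
    regressor[int(round(onset / tr))] += weight
```

The weighted stick function would then be convolved with a hemodynamic response function before entering the design matrix.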

5.
Cereb Cortex Commun; 4(1): tgad002, 2023.
Article in English | MEDLINE | ID: mdl-36726795

ABSTRACT

Vocal emotion recognition, a key determinant in analyzing a speaker's emotional state, is known to be impaired following cerebellar dysfunction. Nevertheless, its possible functional integration in the large-scale brain network subtending emotional prosody recognition has yet to be explored. We administered an emotional prosody recognition task to patients with right- versus left-hemispheric cerebellar lesions and to a group of matched controls. We explored the lesional correlates of vocal emotion recognition through a network-based analysis combining a neuropsychological approach to lesion mapping with normative brain connectome data. Results revealed impaired recognition among patients for neutral and negative prosody, with poorer sadness recognition in patients with right cerebellar lesions. Network-based lesion-symptom mapping revealed that sadness recognition performance was linked to a network connecting the cerebellum with left frontal, temporal, and parietal cortices. Moreover, when focusing solely on the subgroup of patients with right cerebellar damage, sadness recognition performance was associated with a more restricted network connecting the cerebellum to the left parietal lobe. As the left hemisphere is known to be crucial for the processing of short segmental information, these results suggest that a corticocerebellar network operates on a fine temporal scale during vocal emotion decoding.

6.
Front Psychol; 13: 1061930, 2022.
Article in English | MEDLINE | ID: mdl-36571062

ABSTRACT

Introduction: Emotional prosody is defined as suprasegmental and segmental changes in the human voice and related acoustic parameters that can inform the listener about the emotional state of the speaker. While the processing of emotional prosody is well represented in the literature, the role of embodied cognition in emotional voice perception has received very little study. This study aimed to investigate the influence of induced bodily vibrations, delivered through a vibrator placed close to the vocal cords, on the perception of emotional vocalizations. The main hypothesis was that induced body vibrations would constitute a potential interoceptive feedback that can influence the auditory perception of emotions. These effects were also expected to be greater for more ambiguous stimuli. Methods: Participants were presented with emotional vocalizations expressing joy or anger, ranging from low-intensity vocalizations, considered ambiguous, to high-intensity, non-ambiguous ones. Vibrations were induced simultaneously in half of the trials and expressed joy or anger congruently with the voice stimuli. Participants evaluated each voice stimulus on four visual analog scales (joy and anger, with surprise and sadness as control scales). Results: A significant effect of the vibrations was observed on all three behavioral indexes (discrimination, confusion, and accuracy), with vibrations confusing rather than facilitating vocal emotion processing. Conclusion: Overall, this study sheds new light on a poorly documented topic, namely the potential use of vocal cord vibrations as interoceptive feedback allowing humans to modulate voice production and perception during social interactions.

7.
Sci Rep; 11(1): 10645, 2021 May 20.
Article in English | MEDLINE | ID: mdl-34017050

ABSTRACT

Until recently, research on the brain networks underlying the decoding and processing of emotional voice prosody focused on modulations in the primary and secondary auditory, ventral frontal, and prefrontal cortices, and the amygdala. Growing interest in a specific role for the basal ganglia and cerebellum has recently brought these regions into the spotlight. In the present study, we aimed to characterize the role of these subcortical brain regions in vocal emotion processing, at the level of both brain activation and functional and effective connectivity, using high-resolution functional magnetic resonance imaging. Variance explained by low-level acoustic parameters (fundamental frequency, voice energy) was also modelled. Whole-brain data revealed the expected contributions of the temporal and frontal cortices, basal ganglia, and cerebellum to vocal emotion processing, while functional connectivity analyses highlighted correlations between the basal ganglia and cerebellum, especially for angry voices. Seed-to-seed and seed-to-voxel effective connectivity revealed direct connections within the basal ganglia, especially between the putamen and external globus pallidus, and between the subthalamic nucleus and the cerebellum. Our results speak in favour of crucial contributions of the basal ganglia, especially the putamen, external globus pallidus, and subthalamic nucleus, and of several cerebellar lobules and nuclei, to an efficient decoding of and response to vocal emotions.


Subject(s)
Basal Ganglia/diagnostic imaging , Cerebellum/diagnostic imaging , Emotions/physiology , Magnetic Resonance Imaging , Voice/physiology , Acoustic Stimulation , Acoustics , Adult , Female , Humans , Male , Nerve Net/physiology
8.
Front Neurosci; 14: 570, 2020.
Article in English | MEDLINE | ID: mdl-32581695

ABSTRACT

Functional near-infrared spectroscopy (fNIRS) is a neuroimaging tool that has recently been used in a variety of cognitive paradigms. Yet it remains unclear whether fNIRS is suitable for studying complex cognitive processes such as categorization or discrimination. Previously, functional imaging has suggested a role for both inferior frontal cortices in the attentive decoding and cognitive evaluation of emotional cues in human vocalizations. Here, we extended paradigms used in functional magnetic resonance imaging (fMRI) to investigate the suitability of fNIRS for studying frontal lateralization of human emotional vocalization processing during explicit and implicit categorization and discrimination, using mini-blocks and event-related stimuli. Participants heard speech-like but semantically meaningless pseudowords spoken in various tones and evaluated them based on their emotional or linguistic content. Behaviorally, participants were faster to discriminate than to categorize, and processed the linguistic content of stimuli faster than the emotional content. Interactions between condition (emotion/word), task (discrimination/categorization), and emotional content (anger, fear, neutral) influenced accuracy and reaction time. At the brain level, we found a modulation of oxy-Hb changes in the IFG depending on condition, task, emotion, and hemisphere (right or left), highlighting the involvement of the right hemisphere in processing fear stimuli and of both hemispheres in processing anger stimuli. Our results show that fNIRS is suitable for studying vocal emotion evaluation, fostering its application to complex cognitive paradigms.

9.
Soc Cogn Affect Neurosci; 14(1): 73-80, 2019 Jan 4.
Article in English | MEDLINE | ID: mdl-30418635

ABSTRACT

Salient vocalizations, especially aggressive voices, are believed to attract attention due to an automatic threat detection system. However, studies assessing the temporal dynamics of auditory spatial attention to aggressive voices are missing. Using event-related potential markers of auditory spatial attention (N2ac and LPCpc), we show that attentional processing of threatening vocal signals is enhanced at two different stages of auditory processing. As early as 200 ms post-stimulus onset, attentional orienting/engagement is enhanced for threatening as compared to happy vocal signals. Subsequently, as early as 400 ms post-stimulus onset, the reorienting of auditory attention to the center of the screen (or disengagement from the target) is enhanced. This latter effect is consistent with the need to optimize perception by balancing the intake of stimulation from left and right auditory space. Our results extend the scope of theories from the visual to the auditory modality by showing that threatening stimuli also bias early spatial attention in the auditory modality. Attentional enhancement was only present in female and not in male participants.
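The N2ac marker mentioned above is conventionally measured as a contralateral-minus-ipsilateral difference wave at anterior electrodes relative to the side of the lateralized target. The sketch below illustrates that general computation on simulated data; it is not the authors' pipeline, and electrode names, windows, and sampling values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Epoch time axis: -100 ms to 600 ms around stimulus onset at 500 Hz.
sfreq = 500
times = np.arange(-0.1, 0.6, 1 / sfreq)

# Simulated single-trial voltages (trials x time) at a left and a right
# anterior electrode, for targets presented in the left auditory space.
n_trials = 50
left_elec = rng.normal(0.0, 1.0, (n_trials, times.size))
right_elec = rng.normal(0.0, 1.0, (n_trials, times.size))

# For left-side targets, the right electrode is contralateral.
contra = right_elec.mean(axis=0)
ipsi = left_elec.mean(axis=0)
n2ac = contra - ipsi  # contralateral-minus-ipsilateral difference wave

# Mean amplitude in an assumed 200-300 ms post-stimulus window,
# roughly matching the early attentional effect reported above.
window = (times >= 0.2) & (times <= 0.3)
n2ac_amp = n2ac[window].mean()
```

In a real analysis the contra/ipsi assignment is recomputed per trial according to target side before averaging.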


Subject(s)
Aggression/psychology , Attention/physiology , Space Perception/physiology , Voice , Acoustic Stimulation , Adolescent , Adult , Auditory Perception/physiology , Evoked Potentials/physiology , Female , Humans , Male , Sex Characteristics , Young Adult
10.
Sci Rep; 8(1): 6887, 2018 May 2.
Article in English | MEDLINE | ID: mdl-29720691

ABSTRACT

Different parts of our brain code the perceptual features and actions related to an object, causing a binding problem: the brain has to integrate information related to an event without interference from the features and actions involved in other concurrently processed events. Using a paradigm similar to that of Hommel, who revealed perception-action bindings, we previously showed that emotion could bind with motor actions when relevant to the task and, under specific conditions, when irrelevant. By adapting our protocol to a functional magnetic resonance imaging paradigm, we investigated in the present study the neural bases of emotion-action binding with task-relevant angry faces. Our results showed that emotion bound with motor responses. This integration was reflected in increased activity in distributed brain areas involved in (i) memory, including the hippocampi; (ii) motor actions, with the precentral gyri; and (iii) emotion processing, with the insula. Interestingly, increased activations in the cingulate gyri and putamen highlighted their potential key role in emotion-action binding, given their involvement in emotion processing, motor actions, and memory. The present study confirms our previous results and points out for the first time the functional brain activity related to the emotion-action association.


Subject(s)
Anger , Brain Mapping , Facial Expression , Facial Recognition , Psychomotor Performance , Adult , Female , Gyrus Cinguli/physiology , Hippocampus/physiology , Humans , Magnetic Resonance Imaging , Male , Putamen/physiology
11.
Brain Lang; 167: 61-71, 2017 Apr.
Article in English | MEDLINE | ID: mdl-28173964

ABSTRACT

Sleep is involved in the mechanisms underlying memory consolidation and brain plasticity. Consolidation refers to a process through which labile memories are reorganized into more stable ones. An intriguing but often neglected question concerns how pre-existing knowledge is modified when new information enters memory, and whether sleep can influence this process. We investigated how nonword learning may modify the neural representations of closely related existing words. We also tested whether sleep contributes to any such effect by comparing a group of participants who slept during the night following a first encoding session to a sleep-deprived group. Thirty participants were first intensively trained to write nonwords on Day 1 (remote nonwords) and Day 4 (recent nonwords), following which they underwent functional MRI. This session consisted of a lexical decision task including words orthographically close to the trained nonwords, followed by an incidental memory task on the nonwords. Participants who slept detected real words related to remote nonwords faster than those related to recent nonwords, and showed better explicit memory for the remote nonwords. Although the full interaction comparing both groups on these effects was not significant, participants in the sleep-deprivation group did not display such differences between the remote and recent conditions. Imaging results revealed that the functional interplay between the hippocampus and frontal regions critically mediated these behavioral effects. This study demonstrates that sleep may not only strengthen memory for recently learned items but also promote a constant reorganization of existing networks of word representations, allowing facilitated access to orthographically close words.


Subject(s)
Hippocampus/physiology , Learning/physiology , Pattern Recognition, Visual/physiology , Semantics , Sleep Deprivation/physiopathology , Sleep/physiology , Adolescent , Adult , Brain Mapping/methods , Female , Frontal Lobe/diagnostic imaging , Frontal Lobe/physiology , Hippocampus/diagnostic imaging , Humans , Magnetic Resonance Imaging/methods , Male , Young Adult
12.
Sci Rep; 7(1): 16274, 2017 Nov 24.
Article in English | MEDLINE | ID: mdl-29176612

ABSTRACT

Perceptual decision-making on emotions involves gathering sensory information about the affective state of another person and forming a decision on the likelihood of a particular state. These perceptual decisions can vary in complexity as determined by different contexts. We used functional magnetic resonance imaging and a region-of-interest approach to investigate the brain activation and functional connectivity behind two forms of perceptual decision-making. More complex, unbiased decisions on affective voices recruited an extended bilateral network consisting of the posterior inferior frontal cortex, the orbitofrontal cortex, the amygdala, and voice-sensitive areas in the auditory cortex. Less complex, biased decisions on affective voices distinctly recruited the right mid inferior frontal cortex, pointing to a functional distinction in this region according to decisional requirements. Furthermore, task-induced neural connectivity revealed stronger connections between these frontal, auditory, and limbic regions during unbiased relative to biased decision-making on affective voices. Together, the data show that different types of perceptual decision-making on auditory emotions have distinct patterns of activation and functional coupling that follow the decisional strategies and cognitive mechanisms involved in these perceptual decisions.


Subject(s)
Emotions/physiology , Voice/physiology , Adult , Brain Mapping , Decision Making/physiology , Female , Humans , Magnetic Resonance Imaging , Male , Young Adult
13.
Sci Rep; 7(1): 16176, 2017 Nov 23.
Article in English | MEDLINE | ID: mdl-29170463

ABSTRACT

The present study investigated the extent to which luxury vs. non-luxury brand labels (i.e., extrinsic cues) randomly assigned to items, together with preferences for these items, impact choice, and how this impact may be moderated by materialistic tendencies (i.e., individual characteristics). The main objective was to investigate the neural correlates of the abovementioned effects using functional magnetic resonance imaging. Behavioural results showed that the more materialistic people are, the more they choose and like items labelled with luxury brands. Neuroimaging results revealed the implication of a neural network including the dorsolateral and ventromedial prefrontal cortex and the orbitofrontal cortex that was modulated by the brand label and by the participants' preference. Most importantly, items with randomly assigned luxury brand labels were preferentially chosen by participants and triggered an enhanced signal in the caudate nucleus, an effect that increased linearly with materialistic tendencies. Our results highlight the impact of brand-item association, although random in our study, and of materialism on preference, relying on subparts of the brain valuation system for the integration of extrinsic cues, preferences, and individual characteristics.

14.
Soc Cogn Affect Neurosci; 11(5): 793-802, 2016 May.
Article in English | MEDLINE | ID: mdl-26746180

ABSTRACT

The accurate estimation of the proximity of threat is important for biological survival and to assess relevant events of everyday life. We addressed the question of whether proximal as compared with distal vocal threat would lead to a perceptual advantage for the perceiver. Accordingly, we sought to highlight the neural mechanisms underlying the perception of proximal vs distal threatening vocal signals by the use of functional magnetic resonance imaging. Although we found that the inferior parietal and superior temporal cortex of human listeners generally decoded the spatial proximity of auditory vocalizations, activity in the right voice-sensitive auditory cortex was specifically enhanced for proximal aggressive relative to distal aggressive voices as compared with neutral voices. Our results shed new light on the processing of imminent danger signaled by proximal vocal threat and show the crucial involvement of the right mid voice-sensitive auditory cortex in such processing.


Subject(s)
Auditory Cortex/physiology , Auditory Perception/physiology , Brain Mapping/methods , Fear/physiology , Space Perception/physiology , Voice , Adult , Female , Humans , Magnetic Resonance Imaging , Male , Sound Localization/physiology , Young Adult
15.
Front Neurosci; 10: 216, 2016.
Article in English | MEDLINE | ID: mdl-27242420

ABSTRACT

Emotional stimuli have been shown to modulate attentional orienting through signals sent by subcortical brain regions that modulate visual perception at early stages of processing. Fewer studies, however, have investigated a similar effect of emotional stimuli on attentional orienting in the auditory domain, together with the brain regions underlying such attentional modulation, which is the general aim of the present study. We therefore used an original auditory dot-probe paradigm involving simultaneously presented neutral and angry non-speech vocal utterances lateralized to either the left or the right auditory space, immediately followed by a short, lateralized single sine-wave tone presented in the same space as the preceding angry voice (valid trial) or in the opposite space (invalid trial). Behavioral results showed the expected facilitation effect for target detection on valid trials, while functional data showed greater activation in the middle and posterior superior temporal sulci (STS) and in the medial frontal cortex for valid vs. invalid trials. Using reaction-time facilitation [the absolute value of the Z-score of valid-(invalid+neutral)] as a group covariate additionally revealed enhanced activity in the amygdalae, auditory thalamus, and visual cortex. Taken together, our results suggest the involvement of a large, distributed network of regions, among which the STS, thalamus, and amygdala are crucial for decoding angry prosody, as well as for orienting and maintaining attention within an auditory space previously primed by a vocal emotional event.

16.
Soc Cogn Affect Neurosci; 11(2): 349-56, 2016 Feb.
Article in English | MEDLINE | ID: mdl-26400857

ABSTRACT

Our understanding of the role played by the subthalamic nucleus (STN) in human emotion has recently advanced with STN deep brain stimulation, a neurosurgical treatment for Parkinson's disease and obsessive-compulsive disorder. However, the potential presence of several confounds related to pathological models raises the question of how much they affect the relevance of observations regarding the physiological function of the STN itself. This underscores the crucial importance of obtaining evidence from healthy participants. In this study, we tested the structural and functional connectivity between the STN and other brain regions related to vocal emotion in a healthy population by combining diffusion tensor imaging and psychophysiological interaction analysis from a high-resolution functional magnetic resonance imaging study. As expected, we showed that the STN is functionally connected to the structures involved in emotional prosody decoding, notably the orbitofrontal cortex, inferior frontal gyrus, auditory cortex, pallidum and amygdala. These functional results were corroborated by probabilistic fiber tracking, which revealed that the left STN is structurally connected to the amygdala and the orbitofrontal cortex. These results confirm, in healthy participants, the role played by the STN in human emotion and its structural and functional connectivity with the brain network involved in vocal emotions.


Subject(s)
Emotions/physiology , Subthalamic Nucleus/physiology , Verbal Behavior/physiology , Acoustic Stimulation , Adult , Diffusion Tensor Imaging , Female , Humans , Male , Young Adult