Results 1 - 6 of 6
1.
Emotion ; 24(1): 39-51, 2024 Feb.
Article in English | MEDLINE | ID: mdl-37166829

ABSTRACT

Emotional attention describes the prioritized processing of emotional information, helping humans quickly detect biologically salient stimuli and initiate appropriate reactions. Humans can also voluntarily attend to specific stimulus features that are target-relevant. Electrophysiological studies have shown specific temporal interactions between voluntary and emotional attention, whereas no such studies exist for natural sounds (e.g., explosions, running water, applause). In two experiments (N = 40 each), we examined event-related potentials (ERPs) toward target-relevant or target-irrelevant negative, neutral, or positive sounds. Target relevance was induced by instructing participants to respond block-wise to either negative, neutral, or positive sounds. Emotional sounds elicited increased fronto-central N1 and P2 amplitudes and a larger late positive potential (LPP), with more sustained effects for negative sounds. Target relevance increased amplitudes during an early LPP interval (400-900 ms) but did not interact with the valence of the sounds. These results show early and late ERP modulations for natural sounds that do not interact with the target relevance of the sound valence, in contrast to findings from the visual domain. Thus, the findings indicate little temporal overlap between emotional processes and target-relevance effects in the auditory domain. (PsycInfo Database Record (c) 2024 APA, all rights reserved).
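The amplitude effects reported here (e.g., in the 400-900 ms LPP interval) refer to a standard ERP measure: the mean voltage per trial within a latency window. As a minimal illustration of that measure — the sampling rate, window bounds, and simulated data below are assumptions for the sketch, not values from the study — it can be computed from epoched single-channel data as:

```python
import numpy as np

def mean_window_amplitude(epochs, times, t_start, t_end):
    """Mean amplitude per trial within a latency window.

    epochs: (n_trials, n_samples) single-channel ERP data in microvolts
    times:  (n_samples,) sample times in seconds
    """
    mask = (times >= t_start) & (times < t_end)
    return epochs[:, mask].mean(axis=1)

# Toy epoched data: 100 Hz sampling from -0.2 to 1.2 s
times = np.arange(-0.2, 1.2, 0.01)
rng = np.random.default_rng(1)
epochs = rng.normal(0.0, 1.0, (20, times.size))
# Simulate a positive deflection in the 400-900 ms window
epochs[:, (times >= 0.4) & (times < 0.9)] += 3.0

lpp = mean_window_amplitude(epochs, times, 0.4, 0.9)
```

Condition differences such as those described in the abstract would then be tested on these per-trial (or per-participant) window means.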


Subject(s)
Electroencephalography , Evoked Potentials , Humans , Evoked Potentials/physiology , Emotions/physiology , Attention/physiology , Sound
2.
Sci Rep ; 13(1): 16860, 2023 10 06.
Article in English | MEDLINE | ID: mdl-37803129

ABSTRACT

Negative emotional content is prioritized across different stages of information processing, as reflected by different components of the event-related potential (ERP). In this preregistered study (N = 40), we investigated how varying the attentional focus allows us to dissociate the involvement of specific ERP components in the processing of negative and neutral words. Participants had to discriminate the orientation of lines overlaid onto the words, the word type (adjective/noun), or the emotional content (negative/neutral). Thus, attention was directed either away from the words (distraction task), to non-emotional word features, or to the emotional relevance of the words. Regardless of the task, there were no significant differences between negative and neutral words for the P1, N1, or P2 components. In contrast, interactions between emotion and task were observed for the early posterior negativity (EPN) and late positive potential (LPP). EPN differences were absent during the distraction task but present in the other two tasks. LPP emotion differences were found only when attention was directed to the emotional content of the words. Our study adds to the evidence that early ERP components do not reliably separate negative and neutral words. However, the results show that mid-latency and late stages of emotion processing can be dissociated by different attention tasks. The EPN represents a stage of attentional enhancement of negative words given sufficient attentional resources, whereas differential activations during the LPP stage are associated with more elaborative processing of the emotional meaning of words.


Subject(s)
Electroencephalography , Word Processing , Humans , Emotions , Evoked Potentials , Attention
3.
Cortex ; 160: 9-23, 2023 03.
Article in English | MEDLINE | ID: mdl-36680924

ABSTRACT

Fearful facial expressions are prioritized across different information-processing stages, as evident in early, intermediate, and late components of event-related brain potentials (ERPs). Recent studies showed that, in contrast to early N170 modulations, mid-latency (Early Posterior Negativity, EPN) and late (Late Positive Potential, LPP) emotional modulations depend on the attended perceptual feature. Nevertheless, several studies reported significant differences between emotional and neutral faces for the EPN or LPP components during distraction tasks. One cause of these conflicting findings might be that when faces are presented for sufficiently long durations, participants attend to task-irrelevant features of the faces. In this registered report, we tested whether the presentation duration of faces is the critical factor behind the emotional modulations reported during perceptual distraction tasks. To this end, 48 participants were required to discriminate the orientation of lines overlaid onto fearful or neutral faces while face presentation duration varied (100, 300, 1,000, or 2,000 msec). Although participants did not need to pay attention to the faces, we observed main effects of emotion for the EPN and LPP, but no interaction between emotion and presentation duration. Of note, unregistered exploratory tests per presentation duration showed no significant EPN and LPP emotion differences at the short durations (100 and 300 msec) but significant differences at the longer durations. While presentation duration alone thus seems not to be the critical factor for EPN and LPP emotion effects, future studies are needed to investigate the role of threshold effects and of the applied analytic designs in explaining the conflicting findings in the literature.


Subject(s)
Electroencephalography , Fear , Humans , Emotions , Evoked Potentials , Brain , Facial Expression
4.
Sci Rep ; 12(1): 3312, 2022 02 28.
Article in English | MEDLINE | ID: mdl-35228604

ABSTRACT

Encoding often occurs in social contexts, yet research has rarely addressed their role in verbal memory. In three experiments, we investigated the behavioral and neural effects of encoding context on memory for positive, negative, and neutral adjectives, contrasting a social-feedback group (N = 24) with an explicit verbal-learning group (N = 24) and a levels-of-processing group (N = 24). Participants in the social-feedback group were not aware of a recognition session one week later, yet their memory was better than that of the explicit-learning and levels-of-processing groups. However, they also exhibited the strongest response bias, particularly for positive words. Brain event-related potentials (ERPs) revealed the largest early posterior negativities (EPN) and late positivities (LPP) in the social-feedback group. Only in the subsequent slow wave did the explicit-learning group show higher amplitudes than the other two groups, suggesting reliance on strategic rather than automatic processes. Still, context-driven incidental encoding outweighed explicit instructions, indicating a decisive role of social factors in memory.


Subject(s)
Evoked Potentials , Recognition, Psychology , Bias , Electroencephalography , Evoked Potentials/physiology , Feedback , Humans , Recognition, Psychology/physiology , Social Environment
5.
Front Psychol ; 10: 94, 2019.
Article in English | MEDLINE | ID: mdl-30774611

ABSTRACT

Recent findings suggest that communicative context affects the timing and magnitude of emotion effects in word processing. In particular, social attributions seem to be one important source of plasticity in the processing of affectively charged language. Here, we investigated the timing and magnitude of ERP responses to positive, neutral, and negative trait adjectives during the anticipation of putative socio-evaluative feedback from different senders (human and computer) varying in predictability. In the first experiment, participants could not anticipate during word presentation whether a human or a randomly acting computer sender was about to give feedback. Here, a main effect of emotion was observed only on the late positive potential (LPP), with larger amplitudes for positive compared to neutral adjectives. In the second experiment, the same stimuli and set-up were used, but presentation was block-wise, making the sender identity fixed and fully predictable. Feedback was supposedly given by an expert (a psychotherapist), a layperson (an unknown human), and again by a randomly acting computer. Main effects of emotion started with an increased P1 for negative adjectives, followed by effects on the N1 and early posterior negativity (EPN), both showing the largest amplitudes for positive words, and on the LPP, where positive and negative words elicited larger amplitudes than neutral words. An interaction revealed that the emotional LPP modulations occurred only for a human sender. Finally, regardless of content, anticipating human feedback led to larger P1 and P3 components, with the largest amplitudes for the putative expert. These findings demonstrate the malleability of emotional language processing by social context. When clear predictions can be made, our brains rapidly differentiate between emotional and neutral information, as well as between different senders: attributed human presence affects emotional language processing already during feedback anticipation, in line with a selective gating of attentional resources via anticipatory attributions of social significance. By contrast, when crucial social context information is missing, emotion effects occur much later. These findings demonstrate the context dependence of emotion effects in word processing and are particularly relevant now that virtual communication with unknown senders, whose identity is inferred rather than perceived, has become reality for millions of people.

6.
PLoS One ; 13(9): e0204338, 2018.
Article in English | MEDLINE | ID: mdl-30235321

ABSTRACT

Cognitive processes, such as the generation of language, can be mapped onto the brain using fMRI. These maps can in turn be used to decode the respective processes from brain activation patterns. Given individual variation in brain anatomy and organization, analyses at the level of the single person are important to improve our understanding of how cognitive processes correspond to patterns of brain activity. They also help advance clinical applications of fMRI, because in the clinical setting making diagnoses for single cases is imperative. In the present study, we used mental imagery tasks to investigate language production, motor functions, visuo-spatial memory, face processing, and resting-state activity in a single person. The analysis methods were based on similarity metrics, including correlations between training and test data, as well as correlations with maps from the NeuroSynth meta-analysis. The goal was to make accurate predictions regarding the cognitive domain (e.g., language) and the specific content (e.g., animal names) of single 30-second blocks. Four teams used the dataset, each blinded to the true labels of the test data. The results showed that the similarity metrics yielded the highest accuracy when predicting the cognitive domain of a block. Overall, 23 of the 25 test blocks were correctly predicted by three of the four teams. Excluding the unspecific rest condition, up to 10 out of 20 blocks could be successfully decoded with respect to their specific content. The study shows how the information contained in a single fMRI session, and in each of its single blocks, allows inferences to be drawn about the cognitive processes an individual engaged in. Simple methods such as correlations between blocks of fMRI data can serve as highly reliable approaches for cognitive decoding.
We discuss the implications of our results in the context of clinical fMRI applications, with a focus on how decoding can support functional localization.
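The template-correlation approach described in this abstract can be sketched in a few lines: correlate a test block's activation pattern with each condition's mean training pattern and pick the best-matching condition. This is a minimal illustration, not the study's actual pipeline — the toy data, condition labels, and voxel count below are assumptions.

```python
import numpy as np

def decode_block(test_pattern, training_patterns):
    """Predict a test block's condition by Pearson correlation with
    each condition's mean training pattern (highest r wins)."""
    scores = {}
    for condition, patterns in training_patterns.items():
        template = np.mean(patterns, axis=0)           # mean training pattern
        scores[condition] = np.corrcoef(test_pattern, template)[0, 1]
    return max(scores, key=scores.get), scores

# Toy data: 100 "voxels", two conditions with distinct templates
rng = np.random.default_rng(0)
lang_template = rng.normal(0, 1, 100)
motor_template = rng.normal(0, 1, 100)
training = {
    "language": [lang_template + rng.normal(0, 0.5, 100) for _ in range(5)],
    "motor": [motor_template + rng.normal(0, 0.5, 100) for _ in range(5)],
}
test_block = lang_template + rng.normal(0, 0.5, 100)  # held-out "language" block

label, scores = decode_block(test_block, training)
```

The same scheme extends to correlating test patterns against external templates, such as the NeuroSynth meta-analysis maps the study mentions, rather than against a subject's own training blocks.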


Subject(s)
Brain/physiology , Cognition/physiology , Magnetic Resonance Imaging/methods , Adult , Comprehension/physiology , Humans , Male , Rest/physiology , Spatial Memory/physiology