1.
Front Public Health ; 12: 1402801, 2024.
Article in English | MEDLINE | ID: mdl-38765486

ABSTRACT

Background: Negative emotions are a significant factor affecting college students' mental health, and suicidal behavior driven by negative emotions is increasing year by year. Existing studies suggest that physical exercise is essential to alleviating negative feelings, yet the intrinsic mechanisms by which it affects negative emotions have not been fully revealed. Objective: This study investigates the relationship between physical exercise and negative emotions among college students, incorporating sleep quality and self-rated health (SRH) as mediators to analyze the pathway by which physical exercise affects students' negative emotions. Methods: A cross-sectional study design was utilized, with data collected via online questionnaires. The scales included the Physical Activity Rating Scale-3 (PARS-3), the Depression Anxiety Stress Scales-21 (DASS-21), the Pittsburgh Sleep Quality Index (PSQI), and the 12-Item Short Form Health Survey (SF-12); 30,475 valid questionnaires were collected, for a validity rate of 91%. Chain mediation tests and bootstrap methods were applied for effect analysis. Results: The proportions of university students engaging in low, medium, and high levels of physical exercise were 77.6%, 13.1%, and 9.3%, respectively. The proportions of students experiencing "very severe" levels of stress, anxiety, and depression were 4.5%, 10.9%, and 3.6%, respectively. Physical exercise was significantly positively correlated with self-rated health (r = 0.194, p < 0.01), significantly negatively correlated with sleep quality (r = -0.035, p < 0.01), and significantly negatively correlated with stress, anxiety, and depression (r = -0.03, r = -0.058, and r = -0.055, respectively; all p < 0.01). Sleep quality was significantly negatively correlated with self-rated health (r = -0.242, p < 0.01).
Mediation effect testing indicated that sleep quality and self-rated health partially mediated the relationship between physical exercise and negative emotions, with total effect, total direct effect, and total indirect effect values of -1.702, -0.426, and -1.277, respectively. Conclusion: College students primarily engage in low-intensity physical activity. Sleep quality and self-rated health mediate the impact of physical exercise on students' negative emotions. A certain level of physical activity can directly affect students' emotional states and indirectly influence their negative emotions via sleep and self-rated health. Regular engagement in physical activities primarily positively impacts emotional states by enhancing mood stability and overall emotional resilience.
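The chain-mediation result above rests on the percentile bootstrap: resample participants with replacement, re-estimate the a-path (exercise to mediator) and b-path (mediator to emotions, controlling exercise) each time, and take percentiles of the product a*b. The following is a minimal single-mediator sketch on synthetic data, not the authors' analysis; the effect sizes, noise levels, and variable names are invented, and the real study estimated a chain of two mediators jointly.

```python
import random
import statistics

def slope(x, y):
    """OLS slope of y ~ x (simple regression)."""
    mx, my = statistics.fmean(x), statistics.fmean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = sum((a - mx) ** 2 for a in x)
    return num / den

def ols2(x, m, y):
    """Coefficients of y ~ x + m, solved via centered 2x2 normal equations."""
    mx, mm, my = statistics.fmean(x), statistics.fmean(m), statistics.fmean(y)
    xc = [a - mx for a in x]
    mc = [a - mm for a in m]
    yc = [a - my for a in y]
    sxx = sum(a * a for a in xc)
    smm = sum(a * a for a in mc)
    sxm = sum(a * b for a, b in zip(xc, mc))
    sxy = sum(a * b for a, b in zip(xc, yc))
    smy = sum(a * b for a, b in zip(mc, yc))
    det = sxx * smm - sxm * sxm
    direct = (smm * sxy - sxm * smy) / det   # effect of x controlling for m
    b_path = (sxx * smy - sxm * sxy) / det   # effect of m controlling for x
    return direct, b_path

def bootstrap_indirect_ci(x, m, y, n_boot=2000, seed=42):
    """95% percentile-bootstrap CI for the indirect effect a*b."""
    rng = random.Random(seed)
    n = len(x)
    estimates = []
    for _ in range(n_boot):
        idx = [rng.randrange(n) for _ in range(n)]
        xs = [x[i] for i in idx]
        ms = [m[i] for i in idx]
        ys = [y[i] for i in idx]
        a = slope(xs, ms)            # a-path: exercise -> mediator
        _, b = ols2(xs, ms, ys)      # b-path: mediator -> emotions | exercise
        estimates.append(a * b)
    estimates.sort()
    return estimates[int(0.025 * n_boot)], estimates[int(0.975 * n_boot)]

# Synthetic data with a built-in negative indirect effect (a*b = -0.72)
rng = random.Random(0)
x = [rng.gauss(0, 1) for _ in range(300)]                       # exercise
m = [-0.8 * xi + rng.gauss(0, 0.3) for xi in x]                 # mediator
y = [0.9 * mi - 0.4 * xi + rng.gauss(0, 0.3) for mi, xi in zip(m, x)]
lo, hi = bootstrap_indirect_ci(x, m, y)
```

A confidence interval that excludes zero (as here, by construction) is the usual bootstrap criterion for a significant indirect effect.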


Subject(s)
Emotions , Exercise , Sleep Quality , Students , Humans , Male , Students/psychology , Female , Exercise/psychology , Cross-Sectional Studies , Universities , Surveys and Questionnaires , Young Adult , Emotions/physiology , Adult , Adolescent , Depression/psychology , Health Status , Mental Health
2.
PLoS Biol ; 22(5): e3002195, 2024 May.
Article in English | MEDLINE | ID: mdl-38754078

ABSTRACT

People tend to intervene in others' injustices by either punishing the transgressor or helping the victim. Injustice events often occur under stressful circumstances, yet how acute stress affects a third party's intervention remains an open question. Here, we show a stress-induced shift in third parties' willingness to help rather than punish, acting on the emotional-salience, central-executive, and theory-of-mind networks. Acute stress decreased third parties' willingness to punish the violator and the severity of the punishment, and increased their willingness to help the victim. Computational modeling revealed a shift in the preferred route to justice restoration, from punishing the offender toward helping the victim, under stress. This finding is consistent with the increased dorsolateral prefrontal engagement observed with higher amygdala activity and greater connectivity with the ventromedial prefrontal cortex in the stress group. Brain connectivity within the theory-of-mind network predicted stress-induced justice restoration through punishment. Our findings suggest a neurocomputational mechanism by which acute stress reshapes third parties' decisions, reallocating neural resources in emotional, executive, and mentalizing networks to inhibit punishment bias and decrease punishment severity.


Subject(s)
Punishment , Stress, Psychological , Humans , Punishment/psychology , Male , Stress, Psychological/physiopathology , Stress, Psychological/psychology , Female , Adult , Young Adult , Prefrontal Cortex/physiology , Prefrontal Cortex/physiopathology , Emotions/physiology , Social Justice , Brain/physiology , Magnetic Resonance Imaging
3.
PLoS One ; 19(5): e0301682, 2024.
Article in English | MEDLINE | ID: mdl-38768143

ABSTRACT

AIMS: Alcohol cravings are considered a major factor in relapse among individuals with alcohol use disorder (AUD). This study aims to investigate the frequency and triggers of cravings in the daily lives of people with alcohol-related issues. Large amounts of data are analyzed with Artificial Intelligence (AI) methods to identify possible groupings and patterns. METHODS: For the analysis, posts from the online forum "stopdrinking" on the Reddit platform between April 2017 and April 2022 were used as the dataset. The posts were filtered for craving content and processed using the word2vec method to map them into a multi-dimensional vector space. Statistical analyses were conducted to calculate the nature and frequency of craving contexts and triggers (location, time, social environment, and emotions) using word similarity scores. Additionally, the themes of the craving-related posts were semantically grouped using a Latent Dirichlet Allocation (LDA) topic model. The accuracy of the results was evaluated using two manually created test datasets. RESULTS: Approximately 16% of the forum posts discuss cravings. The number of craving-related posts decreases exponentially with the number of days since the author's last alcoholic drink. The topic model confirms that the majority of posts involve individual factors and triggers of cravings. The context analysis aligns with previous findings on craving triggers related to the social environment, locations, and emotions. Strong semantic similarity to craving was found for the emotions boredom and stress and for the location "airport". The results for each method were successfully validated on test datasets. CONCLUSIONS: This exploratory approach is the first to analyze alcohol cravings in the daily lives of over 24,000 individuals, providing a foundation for further AI-based craving analyses. The analysis confirms commonly known craving triggers and discovers new important craving contexts.
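The "word similarity scores" used to match craving contexts are, in word2vec-style models, cosine similarities between embedding vectors. A minimal stdlib sketch follows; the four-dimensional vectors are invented for illustration (real word2vec embeddings have hundreds of dimensions learned from the forum corpus):

```python
import math

def cosine_similarity(u, v):
    """Cosine of the angle between two embedding vectors (1.0 = identical direction)."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Toy "embeddings" (hypothetical values, not trained vectors)
embeddings = {
    "craving": [0.9, 0.2, 0.1, 0.4],
    "boredom": [0.8, 0.3, 0.2, 0.5],   # a trigger the study found close to craving
    "weather": [0.1, 0.9, 0.7, 0.0],   # an unrelated word, far from craving
}
sim_boredom = cosine_similarity(embeddings["craving"], embeddings["boredom"])
sim_weather = cosine_similarity(embeddings["craving"], embeddings["weather"])
```

Ranking candidate trigger words by their cosine similarity to craving-related terms is what surfaces contexts like "boredom" or "airport" from the corpus.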


Subject(s)
Behavior, Addictive , Craving , Natural Language Processing , Humans , Craving/physiology , Behavior, Addictive/psychology , Alcoholism/psychology , Emotions/physiology , Artificial Intelligence , Social Media
4.
Article in English | MEDLINE | ID: mdl-38722724

ABSTRACT

The olfactory system enables humans to smell different odors, which are closely related to emotions. The high temporal resolution and non-invasiveness of Electroencephalogram (EEG) make it suitable to objectively study human preferences for odors. Effectively learning the temporal dynamics and spatial information from EEG is crucial for detecting odor-induced emotional valence. In this paper, we propose a deep learning architecture called Temporal Attention with Spatial Autoencoder Network (TASA) for predicting odor-induced emotions using EEG. TASA consists of a filter-bank layer, a spatial encoder, a time segmentation layer, a Long Short-Term Memory (LSTM) module, a multi-head self-attention (MSA) layer, and a fully connected layer. We improve upon the previous work by utilizing a two-phase learning framework, using the autoencoder module to learn the spatial information among electrodes by reconstructing the given input with a latent representation in the spatial dimension, which aims to minimize information loss compared to spatial filtering with CNN. The second improvement is inspired by the continuous nature of the olfactory process; we propose to use LSTM-MSA in TASA to capture its temporal dynamics by learning the intercorrelation among the time segments of the EEG. TASA is evaluated on an existing olfactory EEG dataset and compared with several existing deep learning architectures to demonstrate its effectiveness in predicting olfactory-triggered emotional responses. Interpretability analyses with DeepLIFT also suggest that TASA learns spatial-spectral features that are relevant to olfactory-induced emotion recognition.
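The multi-head self-attention layer in TASA scores every pair of time segments by a scaled dot product, so each segment's output is a weighted mix of all segments. Below is a single-head, stdlib-only sketch of that core operation; it omits the learned query/key/value projections and multi-head split of the actual architecture, and the toy "segment representations" are invented:

```python
import math

def softmax(scores):
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def self_attention(tokens):
    """Scaled dot-product self-attention over a sequence of vectors.

    Minimal sketch: queries, keys, and values all equal the inputs
    (no learned projections, single head).
    """
    d = len(tokens[0])
    output = []
    for q in tokens:
        scores = [sum(a * b for a, b in zip(q, k)) / math.sqrt(d) for k in tokens]
        weights = softmax(scores)  # how much each segment attends to the others
        output.append([sum(w * v[j] for w, v in zip(weights, tokens))
                       for j in range(d)])
    return output

# Three time-segment "representations" of one EEG trial (toy values)
segments = [[1.0, 0.0], [1.0, 0.0], [0.0, 1.0]]
attended = self_attention(segments)
```

The intercorrelation among time segments that the paper describes is exactly what the attention weights encode: similar segments attend strongly to each other.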


Subject(s)
Algorithms , Attention , Deep Learning , Electroencephalography , Emotions , Neural Networks, Computer , Odorants , Humans , Electroencephalography/methods , Emotions/physiology , Attention/physiology , Male , Adult , Female , Smell/physiology , Memory, Short-Term/physiology , Young Adult
5.
Hum Brain Mapp ; 45(7): e26703, 2024 May.
Article in English | MEDLINE | ID: mdl-38716714

ABSTRACT

The default mode network (DMN) lies towards the heteromodal end of the principal gradient of intrinsic connectivity, maximally separated from the sensory-motor cortex. It supports memory-based cognition, including the capacity to retrieve conceptual and evaluative information from sensory inputs, and to generate meaningful states internally; however, the functional organisation of DMN that can support these distinct modes of retrieval remains unclear. We used fMRI to examine whether activation within subsystems of DMN differed as a function of retrieval demands, or the type of association to be retrieved, or both. In a picture association task, participants retrieved semantic associations that were either contextual or emotional in nature. Participants were asked to avoid generating episodic associations. In the generate phase, these associations were retrieved from a novel picture, while in the switch phase, participants retrieved a new association for the same image. Semantic context and emotion trials were associated with dissociable DMN subnetworks, indicating that a key dimension of DMN organisation relates to the type of association being accessed. The frontotemporal and medial temporal DMN showed a preference for emotional and semantic contextual associations, respectively. Relative to the generate phase, the switch phase recruited clusters closer to the heteromodal apex of the principal gradient-a cortical hierarchy separating unimodal and heteromodal regions. There were no differences in this effect between association types. Instead, memory switching was associated with a distinct subnetwork associated with controlled internal cognition. These findings delineate distinct patterns of DMN recruitment for different kinds of associations yet common responses across tasks that reflect retrieval demands.


Subject(s)
Default Mode Network , Emotions , Magnetic Resonance Imaging , Mental Recall , Semantics , Humans , Male , Female , Adult , Young Adult , Emotions/physiology , Default Mode Network/physiology , Default Mode Network/diagnostic imaging , Mental Recall/physiology , Cerebral Cortex/physiology , Cerebral Cortex/diagnostic imaging , Nerve Net/physiology , Nerve Net/diagnostic imaging , Brain Mapping , Pattern Recognition, Visual/physiology
6.
PLoS One ; 19(5): e0301085, 2024.
Article in English | MEDLINE | ID: mdl-38718018

ABSTRACT

Psychopathy is a severe personality disorder marked by a wide range of emotional deficits, including a lack of empathy, emotion dysregulation, and alexithymia. Previous research has largely examined these emotional impairments in isolation, ignoring their influence on each other. Thus, we examined the concurrent interrelationship between emotional impairments in psychopathy, with a particular focus on the mediating role of alexithymia. Using path analyses with cross-sectional data from a community sample (N = 315) and a forensic sample (N = 50), our results yielded a statistically significant mediating effect of alexithymia on the relationship between psychopathy and empathy (community and forensic) and between psychopathy and emotion dysregulation (community). Moreover, replacing psychopathy with its three dimensions (i.e., meanness, disinhibition, and boldness) in the community sample revealed that boldness may function as an adaptive trait, with lower levels of alexithymia counteracting deficits in empathy and emotion dysregulation. Overall, our findings indicate that psychopathic individuals' limited understanding of their own emotions contributes to their lack of empathy and emotion dysregulation. This underscores the potential benefits of improving emotional awareness in the treatment of individuals with psychopathy.


Subject(s)
Affective Symptoms , Antisocial Personality Disorder , Empathy , Humans , Affective Symptoms/psychology , Affective Symptoms/physiopathology , Empathy/physiology , Male , Adult , Female , Antisocial Personality Disorder/psychology , Cross-Sectional Studies , Middle Aged , Emotions/physiology , Emotional Regulation/physiology , Young Adult
7.
PLoS One ; 19(5): e0303144, 2024.
Article in English | MEDLINE | ID: mdl-38718035

ABSTRACT

Charitable fundraising increasingly relies on online crowdfunding platforms, whose project images use emotional appeals to promote helping behavior. Negative emotions are commonly used to motivate helping because the image of a happy child may not motivate donors to give as willingly; however, some research has found that happy images can be more beneficial. These contradictory results leave open how the emotional valence of project imagery, and how fundraisers frame project images, work most effectively. We therefore compared and analyzed activation differences in the prefrontal cortex, which governs human emotions, as a function of donation decisions, using functional near-infrared spectroscopy (fNIRS), a neuroimaging technique. We advance existing theory on charitable behavior by demonstrating that little correlation exists between donation intentions and brain activity for negative versus positive project images, consistent with survey results on donation intentions by victim image. We also discovered quantitative differences in brain hemodynamic signals between donors and nondonors, which can predict donors' mental functioning using functional connectivity, that is, the statistical dependence between the time series of electrophysiological activity and oxygenated hemoglobin levels in the prefrontal cortex. These findings are critical for developing future marketing strategies for online charitable crowdfunding platforms, especially project images.
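Functional connectivity of the kind described, statistical dependence between two channel time series, is most often quantified as a Pearson correlation. A stdlib sketch follows; the two "channel" series are invented toy values, not fNIRS data:

```python
import math
import statistics

def pearson(x, y):
    """Pearson correlation between two equal-length time series."""
    mx, my = statistics.fmean(x), statistics.fmean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = math.sqrt(sum((a - mx) ** 2 for a in x) *
                    sum((b - my) ** 2 for b in y))
    return num / den

# Toy oxygenated-hemoglobin time series from two prefrontal channels
channel_a = [0.1, 0.3, 0.2, 0.5, 0.4, 0.6]
channel_b = [0.2, 0.4, 0.3, 0.6, 0.5, 0.7]  # tracks channel_a closely
connectivity = pearson(channel_a, channel_b)
```

Computing this for every channel pair yields the connectivity matrix from which donor/nondonor differences can be read.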


Subject(s)
Emotions , Fund Raising , Spectroscopy, Near-Infrared , Humans , Emotions/physiology , Spectroscopy, Near-Infrared/methods , Fund Raising/methods , Female , Male , Adult , Charities , Prefrontal Cortex/physiology , Prefrontal Cortex/diagnostic imaging , Intention , Young Adult , Brain Mapping/methods , Crowdsourcing , Brain/physiology , Brain/diagnostic imaging
8.
Sci Rep ; 14(1): 10607, 2024 05 08.
Article in English | MEDLINE | ID: mdl-38719866

ABSTRACT

Guilt is a negative emotion elicited by realizing one has caused actual or perceived harm to another person. One of guilt's primary functions is to signal that one is aware of the harm that was caused and regrets it, an indication that the harm will not be repeated. Verbal expressions of guilt are often deemed insufficient by observers when not accompanied by nonverbal signals such as facial expression, gesture, posture, or gaze. Some research has investigated isolated nonverbal expressions in guilt, however none to date has explored multiple nonverbal channels simultaneously. This study explored facial expression, gesture, posture, and gaze during the real-time experience of guilt when response demands are minimal. Healthy adults completed a novel task involving watching videos designed to elicit guilt, as well as comparison emotions. During the video task, participants were continuously recorded to capture nonverbal behaviour, which was then analyzed via automated facial expression software. We found that while feeling guilt, individuals engaged less in several nonverbal behaviours than they did while experiencing the comparison emotions. This may reflect the highly social aspect of guilt, suggesting that an audience is required to prompt a guilt display, or may suggest that guilt does not have clear nonverbal correlates.


Subject(s)
Facial Expression , Guilt , Humans , Male , Female , Adult , Young Adult , Nonverbal Communication/psychology , Emotions/physiology , Gestures
9.
Sci Rep ; 14(1): 10371, 2024 05 06.
Article in English | MEDLINE | ID: mdl-38710806

ABSTRACT

Emotion is a human sense that can influence an individual's quality of life in both positive and negative ways. The ability to distinguish different types of emotion can help researchers estimate a patient's current state or the probability of future disease. Recognizing emotions from images is unreliable, however, because people can conceal their feelings by modifying their facial expressions. This has led researchers to consider electroencephalography (EEG) signals for more accurate emotion detection. However, the complexity of EEG recordings and of their analysis with conventional machine learning algorithms has produced inconsistent emotion recognition. Hybrid deep learning models and related techniques have therefore become common, owing to their ability to analyze complicated data and achieve higher performance by integrating diverse features of the constituent models; researchers also prioritize models with fewer parameters that still achieve high average accuracy. This study improves the Convolutional Fuzzy Neural Network (CFNN) for emotion recognition from EEG signals to achieve a reliable detection system. First, pre-processing and feature extraction phases are implemented to obtain noiseless and informative data. Then, the CFNN with a modified architecture is trained to classify emotions. Several parametric and comparative experiments were performed. The proposed model achieved reliable performance, with average accuracies of 98.21% for valence (pleasantness) and 98.08% for arousal (intensity), and outperformed state-of-the-art methods.
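Fuzzy layers in networks like the CFNN typically pass each input feature through membership functions, commonly Gaussian, that express how strongly the value belongs to fuzzy sets such as "low" or "high" activation. The sketch below illustrates that fuzzification step only; the abstract does not specify the CFNN's membership functions, so the Gaussian form, the set centers, and the widths here are assumptions for illustration:

```python
import math

def gaussian_membership(x, center, sigma):
    """Degree (0..1) to which a feature value belongs to one fuzzy set."""
    return math.exp(-((x - center) ** 2) / (2 * sigma ** 2))

def fuzzify(feature, fuzzy_sets):
    """Evaluate one feature value against every fuzzy set (center, width)."""
    return [gaussian_membership(feature, c, s) for c, s in fuzzy_sets]

# Hypothetical fuzzy sets for a normalized EEG band-power feature:
# "low", "medium", "high" activation
fuzzy_sets = [(0.0, 0.25), (0.5, 0.25), (1.0, 0.25)]
memberships = fuzzify(0.5, fuzzy_sets)
```

A value at a set's center gets membership 1.0 and falls off smoothly toward neighboring sets, which is what lets the network reason over noisy EEG features.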


Subject(s)
Electroencephalography , Emotions , Fuzzy Logic , Neural Networks, Computer , Humans , Electroencephalography/methods , Emotions/physiology , Male , Female , Adult , Algorithms , Young Adult , Signal Processing, Computer-Assisted , Deep Learning , Facial Expression
10.
PLoS One ; 19(5): e0302782, 2024.
Article in English | MEDLINE | ID: mdl-38713700

ABSTRACT

Parents with a history of childhood maltreatment may be more likely to respond inadequately to their child's emotional cues, such as crying or screaming, due to previous exposure to prolonged stress. While studies have investigated parents' physiological reactions to their children's vocal expressions of emotions, less attention has been given to their responses when perceiving children's facial expressions of emotions. The present study aimed to determine if viewing facial expressions of emotions in children induces cardiovascular changes in mothers (hypo- or hyper-arousal) and whether these differ as a function of childhood maltreatment. A total of 104 mothers took part in this study. Their experiences of childhood maltreatment were measured using the Childhood Trauma Questionnaire (CTQ). Participants' electrocardiogram signals were recorded during a task in which they viewed a landscape video (baseline) and images of children's faces expressing different intensities of emotion. Heart rate variability (HRV) was extracted from the recordings as an indicator of parasympathetic reactivity. Participants presented two profiles: one group of mothers had a decreased HRV when presented with images of children's facial expressions of emotions, while the other group's HRV increased. However, HRV change was not significantly different between the two groups. The interaction between HRV groups and the severity of maltreatment experienced was marginal. Results suggested that experiences of childhood emotional abuse were more common in mothers whose HRV increased during the task. Therefore, more severe childhood experiences of emotional abuse could be associated with mothers' cardiovascular hyperreactivity. Maladaptive cardiovascular responses could have a ripple effect, influencing how mothers react to their children's facial expressions of emotions. That reaction could affect the quality of their interaction with their child. 
Providing interventions that help parents regulate their physiological and behavioral responses to stress might be helpful, especially if they have experienced childhood maltreatment.
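Heart rate variability of the sort extracted from the mothers' ECG is commonly summarized by time-domain indices such as RMSSD, the root mean square of successive differences between RR intervals, which tracks parasympathetic (vagal) activity. A stdlib sketch with invented RR intervals (the study's actual HRV index and preprocessing are not specified in the abstract):

```python
import math

def rmssd(rr_intervals_ms):
    """Root mean square of successive RR-interval differences (ms),
    a common time-domain HRV index of parasympathetic activity."""
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

# Toy RR intervals (ms) derived from ECG R-peaks
baseline = [800, 810, 795, 820, 805]        # landscape video
task = [800, 802, 801, 803, 802]            # emotion images: less variation
hrv_change = rmssd(task) - rmssd(baseline)  # negative -> HRV decreased
```

Comparing the index between baseline and task, as in `hrv_change`, is how the two reactivity profiles (HRV decrease vs. increase) would be identified.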


Subject(s)
Emotions , Facial Expression , Heart Rate , Mothers , Humans , Female , Adult , Heart Rate/physiology , Child , Emotions/physiology , Mothers/psychology , Emotional Abuse/psychology , Male , Electrocardiography , Child Abuse/psychology , Mother-Child Relations/psychology , Surveys and Questionnaires
11.
PLoS One ; 19(5): e0300984, 2024.
Article in English | MEDLINE | ID: mdl-38709789

ABSTRACT

Mentalizing describes the ability to imagine the mental states underlying behavior. It allows one to identify, reflect on, and make sense of one's own emotional state, as well as to communicate one's emotions to oneself and others. Existing self-report measures do not capture the process of mentalizing emotions in oneself and others; therefore, the Mentalizing Emotions Questionnaire (MEQ; current version in German) was developed. In Study 1 (N = 510), we explored the factor structure of the MEQ with an exploratory factor analysis, which identified one principal factor (R2 = .65), mentalizing emotions, and three subfactors: self, communicating, and other. In Study 2 (N = 509), we tested and confirmed the factor structure of the 16-item MEQ in a confirmatory factor analysis (CFI = .959, RMSEA = .078, SRMR = .04) and evaluated its psychometric properties, which showed excellent internal consistency (α = .92-.95) and good validity. The MEQ is a valid and reliable instrument that assesses the ability to mentalize emotions and provides incremental validity over related constructs such as empathy, going beyond other mentalization questionnaires.
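The internal-consistency figure reported above (Cronbach's alpha) compares the sum of the item variances with the variance of the total score. A stdlib sketch on invented item responses (not MEQ data):

```python
import statistics

def cronbach_alpha(item_scores):
    """Cronbach's alpha. item_scores is a list of item columns;
    each inner list holds one item's scores, respondents in the same order."""
    k = len(item_scores)
    sum_item_var = sum(statistics.pvariance(item) for item in item_scores)
    totals = [sum(scores) for scores in zip(*item_scores)]
    total_var = statistics.pvariance(totals)
    return (k / (k - 1)) * (1 - sum_item_var / total_var)

# Hypothetical responses to three questionnaire items (5 respondents)
items = [
    [3, 4, 2, 5, 4],
    [3, 5, 2, 4, 4],
    [2, 4, 3, 5, 5],
]
alpha = cronbach_alpha(items)
```

Items that covary strongly push the total-score variance above the sum of item variances, driving alpha toward 1.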


Subject(s)
Emotions , Mentalization , Psychometrics , Self Report , Humans , Male , Female , Emotions/physiology , Adult , Surveys and Questionnaires , Mentalization/physiology , Psychometrics/methods , Young Adult , Middle Aged , Factor Analysis, Statistical , Adolescent , Theory of Mind , Empathy/physiology , Reproducibility of Results
12.
Curr Biol ; 34(9): R340-R343, 2024 May 06.
Article in English | MEDLINE | ID: mdl-38714159

ABSTRACT

The posterior cerebellum is emerging as a key structure for social cognition. A new study causally demonstrates its early involvement during emotion perception and functional connectivity with the posterior superior temporal sulcus, a cortical hub of the social brain.


Subject(s)
Cerebellum , Social Perception , Humans , Cerebellum/physiology , Emotions/physiology , Social Cognition , Temporal Lobe/physiology
13.
J Neural Eng ; 21(3)2024 May 15.
Article in English | MEDLINE | ID: mdl-38701773

ABSTRACT

Objective. Electroencephalogram (EEG) analysis has always been an important tool in neural engineering, and the recognition and classification of human emotions are among its important tasks. EEG data, obtained from electrodes placed on the scalp, represent a valuable resource for brain activity analysis and emotion recognition. Feature extraction methods have shown promising results, but recent trends have shifted toward end-to-end methods based on deep learning. However, these approaches often overlook channel representations, and their complex structures pose certain challenges to model fitting. Approach. To address these challenges, this paper proposes a hybrid approach named FetchEEG that combines feature extraction and temporal-channel joint attention. Leveraging the advantages of both traditional feature extraction and deep learning, FetchEEG adopts a multi-head self-attention mechanism to extract representations across different time moments and channels simultaneously. The joint representations are then concatenated and classified using fully-connected layers for emotion recognition. The performance of FetchEEG is verified by comparison experiments on a self-developed dataset and two public datasets. Main results. In both subject-dependent and subject-independent experiments, FetchEEG demonstrates better performance and stronger generalization ability than state-of-the-art methods on all datasets. Moreover, the performance of FetchEEG is analyzed for different sliding-window sizes and overlap rates in the feature extraction module, and the sensitivity of emotion recognition is investigated for three- and five-frequency-band scenarios. Significance. FetchEEG is a novel hybrid method for EEG-based emotion classification that combines EEG feature extraction with Transformer neural networks.
It has achieved state-of-the-art performance on both self-developed datasets and multiple public datasets, with significantly higher training efficiency compared to end-to-end methods, demonstrating its effectiveness and feasibility.
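The sliding-window sizes and overlap rates analyzed in the feature extraction module amount to a simple segmentation rule: the step between window starts is the window length scaled by one minus the overlap fraction. A minimal sketch (the actual window lengths and sampling rates are the paper's, not shown in the abstract):

```python
def segment(signal, window, overlap):
    """Split a 1-D signal into fixed-size windows with fractional overlap."""
    step = max(1, int(window * (1 - overlap)))
    return [signal[i:i + window]
            for i in range(0, len(signal) - window + 1, step)]

# 10 samples, window of 4, 50% overlap -> windows start at 0, 2, 4, 6
windows = segment(list(range(10)), window=4, overlap=0.5)
```

Each resulting window becomes one "time segment" whose features feed the temporal attention layer.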


Subject(s)
Electroencephalography , Emotions , Humans , Electroencephalography/methods , Emotions/physiology , Deep Learning , Attention/physiology , Neural Networks, Computer , Male , Female , Adult
14.
Nat Commun ; 15(1): 4294, 2024 May 20.
Article in English | MEDLINE | ID: mdl-38769359

ABSTRACT

The ability to respond to emotional events in a context-sensitive and goal-oriented manner is essential for adaptive functioning. In models of behavioral and emotion regulation, the lateral prefrontal cortex (LPFC) is postulated to maintain goal-relevant representations that promote cognitive control, an idea rarely tested with causal inference. Here, we altered mid-LPFC function in healthy individuals using a putatively inhibitory brain stimulation protocol (continuous theta burst; cTBS), followed by fMRI scanning. Participants performed the Affective Go/No-Go task, which requires goal-oriented action during affective processing. We targeted mid-LPFC (vs. a Control site) based on the individualized location of action-goal representations observed during the task. cTBS to mid-LPFC reduced action-goal representations in mid-LPFC and impaired goal-oriented action, particularly during processing of negative emotional cues. During negative-cue processing, cTBS to mid-LPFC reduced functional coupling between mid-LPFC and nodes of the default mode network, including frontopolar cortex-a region thought to modulate LPFC control signals according to internal states. Collectively, these results indicate that mid-LPFC goal-relevant representations play a causal role in governing context-sensitive cognitive control during emotional processing.


Subject(s)
Emotions , Goals , Magnetic Resonance Imaging , Prefrontal Cortex , Transcranial Magnetic Stimulation , Humans , Prefrontal Cortex/physiology , Prefrontal Cortex/diagnostic imaging , Male , Female , Emotions/physiology , Adult , Transcranial Magnetic Stimulation/methods , Young Adult , Brain Mapping , Cognition/physiology , Cues
15.
Sci Rep ; 14(1): 11571, 2024 May 21.
Article in English | MEDLINE | ID: mdl-38773125

ABSTRACT

This study examines how the primary emotions anger, happiness, sadness, and fear are expressed through drawings. Moving beyond the well-researched color-emotion link, it explores under-examined aspects such as spatial concepts and drawing styles. Employing Python and OpenCV for objective analysis, we converted subjective perceptions into measurable data across 728 digital images from 182 university students. The most frequently chosen prominent colors were red for anger (73.11%), yellow for happiness (17.8%), blue for sadness (51.1%), and black for fear (40.7%). Happiness led with the highest saturation (68.52%) and brightness (75.44%) percentages, while fear recorded the lowest in both categories (47.33% saturation, 48.78% brightness). Fear, however, topped the color-fill percentage (35.49%), with happiness the lowest (25.14%). Tangible imagery prevailed (71.43-83.52%), with abstract styles peaking in representations of fear (28.57%). Facial expressions were a common element (41.76-49.45%). The study achieved 81.3% predictive accuracy for anger, higher than the 71.3% overall average. Future research can build on these results by improving technological methods to quantify more aspects of drawing content. Investigating a more comprehensive array of emotions and examining factors influencing emotional drawing styles will further our understanding of visual-emotional communication.
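Measuring a drawing's prominent color, saturation, and brightness typically starts by converting pixels from RGB to HSV and binning hues into coarse color labels. The sketch below uses the stdlib `colorsys` module rather than OpenCV, and the hue boundaries and black threshold are assumptions for illustration, not the study's calibration:

```python
import colorsys

def classify_color(r, g, b):
    """Map an RGB pixel (0-255 channels) to a coarse emotion-color label."""
    h, s, v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
    if v < 0.2:                       # very dark pixels count as black
        return "black"
    hue_deg = h * 360
    if hue_deg < 30 or hue_deg >= 330:
        return "red"
    if 30 <= hue_deg < 90:
        return "yellow"
    if 180 <= hue_deg < 270:
        return "blue"
    return "other"
```

Tallying these labels over all pixels of an image yields the prominent-color percentages reported above, and the HSV `s` and `v` channels give the saturation and brightness measures directly.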


Subject(s)
Emotions , Facial Expression , Humans , Emotions/physiology , Male , Female , Young Adult , Happiness , Anger/physiology , Adult , Fear/psychology , Sadness
16.
Sci Rep ; 14(1): 11590, 2024 May 21.
Article in English | MEDLINE | ID: mdl-38773178

ABSTRACT

Human interaction is immersed in laughter; though genuine and posed laughter are acoustically distinct, they are both crucial socio-emotional signals. In this novel study, autistic and non-autistic adults explicitly rated the affective properties of genuine and posed laughter. Additionally, we explored whether their self-reported everyday experiences with laughter differ. Both groups could differentiate between these two types of laughter. However, autistic adults rated posed laughter as more authentic and emotionally arousing than non-autistic adults, perceiving it to be similar to genuine laughter. Autistic adults reported laughing less, deriving less enjoyment from laughter, and experiencing difficulty in understanding the social meaning of other people's laughter compared to non-autistic people. Despite these differences, autistic adults reported using laughter socially as often as non-autistic adults, leveraging it to mediate social contexts. Our findings suggest that autistic adults show subtle differences in their perception of laughter, which may be associated with their struggles in comprehending the social meaning of laughter, as well as their diminished frequency and enjoyment of laughter in everyday scenarios. By combining experimental evidence with first-person experiences, this study suggests that autistic adults likely employ different strategies to understand laughter in everyday contexts, potentially leaving them socially vulnerable in communication.


Subject(s)
Autistic Disorder , Laughter , Humans , Laughter/psychology , Male , Adult , Female , Autistic Disorder/psychology , Autistic Disorder/physiopathology , Young Adult , Emotions/physiology , Middle Aged
17.
Sci Rep ; 14(1): 11397, 2024 05 18.
Article in English | MEDLINE | ID: mdl-38762655

ABSTRACT

Social decision-making is known to be influenced by predictive emotions or the perceived reciprocity of partners. However, the connection between emotion, decision-making, and contextual reciprocity remains less understood. Moreover, arguments suggest that emotional experiences within a social context can be better conceptualised as prosocial rather than basic emotions, necessitating the inclusion of two social dimensions: focus, the degree of an emotion's relevance to oneself or others, and dominance, the degree to which one feels in control of an emotion. For better representation, these dimensions should be considered alongside the interoceptive dimensions of valence and arousal. In an ultimatum game involving fair, moderate, and unfair offers, this online study measured the emotions of 476 participants using a multidimensional affective rating scale. Using unsupervised classification algorithms, we identified individual differences in decisions and emotional experiences. Certain individuals exhibited consistent levels of acceptance behaviours and emotions, while reciprocal individuals' acceptance behaviours and emotions followed external reward value structures. Furthermore, individuals with distinct emotional responses to partners exhibited unique economic responses to their emotions, with only the reciprocal group exhibiting sensitivity to dominance prediction errors. The study illustrates a context-specific model capable of subtyping populations engaged in social interaction and exhibiting heterogeneous mental states.


Subject(s)
Decision Making , Emotions , Humans , Male , Female , Emotions/physiology , Adult , Young Adult , Individuality , Games, Experimental , Social Behavior , Adolescent , Interpersonal Relations
18.
Sci Rep ; 14(1): 10491, 2024 05 07.
Article in English | MEDLINE | ID: mdl-38714729

ABSTRACT

Dogs (Canis lupus familiaris) are the domestically bred descendants of wolves (Canis lupus). However, selective breeding has profoundly altered the facial morphologies of dogs compared to their wolf ancestors. We demonstrate that these morphological differences limit the ability of dogs to successfully produce the same affective facial expressions as wolves. We decoded facial movements of captive wolves during social interactions involving nine separate affective states. We used linear discriminant analyses to predict affective states based on combinations of facial movements. The resulting confusion matrix demonstrates that specific combinations of facial movements predict nine distinct affective states in wolves; this is the first assessment of this many affective facial expressions in wolves. However, comparative analyses with kennelled rescue dogs revealed a reduced ability to predict affective states. Critically, predictive power for specific affective states was very low, with confusion occurring between negative and positive states, such as Friendly and Fear. We show that the varying facial morphologies of dogs (specifically non-wolf-like morphologies) limit their ability to produce the same range of affective facial expressions as wolves. Confusion among positive and negative states could be detrimental to human-dog interactions, although our analyses also suggest dogs likely use vocalisations to compensate for limitations in facial communication.
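The key comparison in this abstract is a confusion matrix of predicted versus observed affective states. As a hedged illustration (the state labels and counts below are hypothetical, not the study's data), such a matrix and its diagonal accuracy can be tallied as:

```python
from collections import Counter

def confusion_matrix(true_states, predicted_states, states):
    """Rows: true affective state; columns: predicted state (counts)."""
    counts = Counter(zip(true_states, predicted_states))
    return [[counts[(t, p)] for p in states] for t in states]

# Toy labels echoing the paper's point that, in dogs, positive and
# negative states such as "Friendly" and "Fear" can be confused.
states = ["Friendly", "Fear", "Anger"]
true_ = ["Friendly", "Friendly", "Fear", "Fear", "Anger", "Anger"]
pred_ = ["Friendly", "Fear", "Friendly", "Fear", "Anger", "Anger"]

cm = confusion_matrix(true_, pred_, states)
# Proportion of correct predictions = diagonal sum / total observations.
accuracy = sum(cm[i][i] for i in range(len(states))) / len(true_)
```

Off-diagonal mass between the Friendly row and the Fear column is exactly the positive/negative confusion the abstract highlights.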


Subject(s)
Domestication , Emotions , Facial Expression , Wolves , Animals , Wolves/physiology , Dogs , Emotions/physiology , Male , Female , Behavior, Animal/physiology , Humans
19.
BMC Psychol ; 12(1): 279, 2024 May 17.
Article in English | MEDLINE | ID: mdl-38755731

ABSTRACT

OBJECTIVE: Somatic symptom disorder (SSD) is characterized by one or more distressing or disabling somatic symptoms accompanied by an excessive amount of time, energy and emotion devoted to the symptoms. These manifestations of SSD have been linked to alterations in the perception and appraisal of bodily signals. We hypothesized that SSD patients would exhibit changes in interoceptive accuracy (IA), particularly when emotional processing is involved. METHODS: Twenty-three patients with SSD and 20 healthy controls were recruited. IA was assessed using the heartbeat perception task. The task was performed in the absence of stimuli as well as in the presence of emotional interference, i.e., photographs of faces with an emotional expression. IA was examined for correlations with measures related to somatic symptoms, including resting-state heart rate variability (HRV). RESULTS: There was no significant difference in the absolute values of IA between patients with SSD and healthy controls, regardless of the condition. However, the difference in IA between the no-interference and neutral-facial-interference conditions was greater in patients with SSD than in healthy controls (p = 0.039). The IA of patients with SSD also showed a significant correlation with low-frequency HRV (p = 0.004) and high-frequency HRV (p = 0.007). CONCLUSION: SSD patients showed greater changes in IA under neutral facial interference. These results suggest that bodily awareness is more affected by emotionally ambiguous stimuli in SSD patients than in healthy controls.
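The abstract does not spell out its IA scoring, but heartbeat perception (counting) tasks are conventionally scored with the Schandry formula: IA is the mean over counting intervals of 1 - |recorded - counted| / recorded. A minimal sketch of that conventional score, with hypothetical trial counts:

```python
def interoceptive_accuracy(recorded, counted):
    """Conventional heartbeat-counting accuracy (Schandry-style):
    mean over intervals of 1 - |recorded - counted| / recorded.
    A score of 1.0 means every interval was counted exactly;
    large over- or under-counts pull the score toward (or below) 0."""
    scores = [1 - abs(r - c) / r for r, c in zip(recorded, counted)]
    return sum(scores) / len(scores)

# Hypothetical data: ECG-recorded heartbeats vs. silently counted beats
# over three counting intervals (not this study's data).
ia = interoceptive_accuracy(recorded=[40, 55, 70], counted=[36, 50, 63])
```

The study's condition-difference measure would then simply subtract the IA computed under neutral facial interference from the IA computed with no stimuli.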


Subject(s)
Emotions , Heart Rate , Interoception , Humans , Female , Male , Interoception/physiology , Adult , Heart Rate/physiology , Emotions/physiology , Middle Aged , Medically Unexplained Symptoms , Somatoform Disorders/psychology , Somatoform Disorders/physiopathology , Facial Expression
20.
PLoS One ; 19(5): e0303755, 2024.
Article in English | MEDLINE | ID: mdl-38758747

ABSTRACT

Recent eye tracking studies have linked gaze reinstatement-when eye movements from encoding are reinstated during retrieval-with memory performance. In this study, we investigated whether gaze reinstatement is influenced by the affective salience of information stored in memory, using an adaptation of the emotion-induced memory trade-off paradigm. Participants learned word-scene pairs, where scenes were composed of negative or neutral objects located on the left or right side of neutral backgrounds. This allowed us to measure gaze reinstatement during scene memory tests based on whether people looked at the side of the screen where the object had been located. Across two experiments, we behaviorally replicated the emotion-induced memory trade-off effect, in that negative object memory was better than neutral object memory at the expense of background memory. Furthermore, we found evidence that gaze reinstatement was related to recognition memory for the object and background scene components. This effect was generally comparable for negative and neutral memories, although the effects of valence varied somewhat between the two experiments. Together, these findings suggest that gaze reinstatement occurs independently of the processes contributing to the emotion-induced memory trade-off effect.
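The side-based gaze reinstatement measure described in this abstract can be operationalised as the proportion of retrieval fixations landing on the screen side where the object appeared at encoding. A hedged sketch (function and field names are hypothetical, not the authors' pipeline):

```python
def gaze_reinstatement(fixation_sides, encoded_side):
    """Proportion of retrieval-phase fixations on the side of the screen
    ('left' or 'right') where the object was located at encoding.
    Values near 1.0 indicate strong reinstatement; ~0.5 indicates chance
    when fixations are evenly split between sides."""
    hits = sum(1 for side in fixation_sides if side == encoded_side)
    return hits / len(fixation_sides)

# Hypothetical trial: object encoded on the left; 8 retrieval fixations.
score = gaze_reinstatement(
    ["left", "left", "right", "left", "left", "right", "left", "left"],
    encoded_side="left",
)
```

Relating such per-trial scores to object and background recognition outcomes, separately for negative and neutral scenes, mirrors the analysis logic the abstract reports.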


Subject(s)
Emotions , Eye Movements , Eye-Tracking Technology , Memory , Humans , Emotions/physiology , Female , Male , Young Adult , Adult , Memory/physiology , Eye Movements/physiology , Fixation, Ocular/physiology , Adolescent , Recognition, Psychology/physiology , Photic Stimulation