Results 1 - 20 of 34
1.
Sensors (Basel) ; 20(8)2020 Apr 17.
Article in English | MEDLINE | ID: mdl-32316626

ABSTRACT

In this paper, we present a multimodal dataset for affective computing research acquired in a human-computer interaction (HCI) setting. An experimental mobile and interactive scenario was designed and implemented based on a gamified generic paradigm for the induction of dialog-based, HCI-relevant emotional and cognitive load states. It consists of six experimental sequences inducing Interest, Overload, Normal, Easy, Underload, and Frustration. Each sequence is followed by subjective feedback to validate the induction, a respiration baseline to level off the physiological reactions, and a summary of results. Furthermore, prior to the experiment, three questionnaires related to emotion regulation (ERQ), emotional control (TEIQue-SF), and personality traits (TIPI) were collected from each subject to evaluate the stability of the induction paradigm. Based on this HCI scenario, the University of Ulm Multimodal Affective Corpus (uulmMAC), consisting of two homogeneous samples of 60 participants and 100 recording sessions, was generated. We recorded 16 sensor modalities, including 4 × video, 3 × audio, and 7 × biophysiological, depth, and pose streams. In addition, further labels and annotations were collected. After recording, all data were post-processed and checked for technical and signal quality, resulting in the final uulmMAC dataset of 57 subjects and 95 recording sessions. The evaluation of the reported subjective feedback shows significant differences between the sequences, consistent with the induced states, and the analysis of the questionnaires shows stable results. In summary, our uulmMAC database is a valuable contribution to the fields of affective computing and multimodal data analysis: acquired in a mobile interactive scenario close to real HCI, it comprises a large number of subjects and allows transtemporal investigations. Validated via subjective feedback and checked for quality issues, it can be used for affective computing and machine learning applications.


Subject(s)
Pattern Recognition, Visual/physiology , User-Computer Interface , Emotions/physiology , Humans , Machine Learning
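
Editor's note: to make the structure of such a multimodal corpus more concrete, the following is a minimal, hypothetical sketch of how one recording session (several video, audio, and biophysiological streams plus sequence labels) might be represented in code. The class, field names, and file paths are illustrative assumptions, not the actual layout of the uulmMAC dataset.

```python
from dataclasses import dataclass, field
from typing import Dict, List

# Hypothetical container for one session of a multimodal affective corpus.
# Modality keys and paths are placeholders, not the dataset's real file layout.
@dataclass
class RecordingSession:
    subject_id: str
    sequence_labels: List[str]                              # induced states per sequence
    video: Dict[str, str] = field(default_factory=dict)     # camera name -> file path
    audio: Dict[str, str] = field(default_factory=dict)     # microphone -> file path
    physio: Dict[str, str] = field(default_factory=dict)    # channel -> file path
    annotations: Dict[str, str] = field(default_factory=dict)

session = RecordingSession(
    subject_id="S001",
    sequence_labels=["Interest", "Overload", "Normal", "Easy", "Underload", "Frustration"],
    video={"frontal_cam": "S001/video/frontal.avi"},
    audio={"headset": "S001/audio/headset.wav"},
    physio={"ecg": "S001/physio/ecg.csv"},
)
print(session.subject_id, session.sequence_labels)
```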
2.
Ergonomics ; 57(3): 374-86, 2014.
Article in English | MEDLINE | ID: mdl-23924061

ABSTRACT

Cognitive-technical intelligence is envisioned to be constantly available and capable of adapting to the user's emotions. However, the question is: which specific emotions should intelligent systems reliably recognise? In this study, we therefore attempted to identify similarities and differences in emotions between human-human interaction (HHI) and human-machine interaction (HMI). We focused on which emotions are retrospectively reported for experienced HMI scenarios compared with HHI. The sample consisted of N = 145 participants, who were divided into two groups. Positive and negative scenario descriptions of HMI and HHI were given by the first and second groups, respectively. Subsequently, the participants evaluated their respective scenarios with the help of 94 adjectives relating to emotions. The correlations between the occurrences of emotions in HMI versus HHI were very high. The results do not support the claim that only a few emotions are relevant in HMI.


Subject(s)
Emotions , Interpersonal Relations , Man-Machine Systems , Artificial Intelligence , Factor Analysis, Statistical , Humans , Young Adult
3.
Psychother Psychosom Med Psychol ; 63(8): 327-33, 2013 Aug.
Article in German | MEDLINE | ID: mdl-23468367

ABSTRACT

The dissection course (here abbreviated as PK) is still an obligatory part of medical education in Germany. In this study, we investigated the experiences and burdens of medical students in gross anatomy, especially "distancing from the human body". The study was carried out three times with the self-developed questionnaire BF-PK: before, during, and after the PK. In total, 371 students participated in the PK; 297 students took part in the first measurement. Beforehand, 25-30% of the medical students reported anxiety and emotional inhibition; during the course, only 7-10% did. The coping strategy "distancing from the human body" was prominent. Anxiety, emotional inhibition, and disgust persisted for 5-10% of the participants. The gross anatomy course thus causes emotional stress for a considerable number of medical students. Support services should be offered to those students who are unable to overcome this stress on their own.


Subject(s)
Adaptation, Psychological , Anatomy/education , Psychological Distance , Students, Medical/psychology , Adult , Anxiety Disorders/diagnosis , Anxiety Disorders/psychology , Attitude to Death , Curriculum , Emotions , Female , Germany , Humans , Inhibition, Psychological , Longitudinal Studies , Male , Psychometrics/statistics & numerical data , Reproducibility of Results , Surveys and Questionnaires , Young Adult
4.
Psychosom Med ; 74(1): 107-13, 2012 Jan.
Article in English | MEDLINE | ID: mdl-22210238

ABSTRACT

OBJECTIVE: Attention and assessment biases are part of the body image disturbances shown by patients with anorexia nervosa (AN). The aim of this article was to study these biases using eye movement analyses. METHODS: As stimuli, the study used 24 standardized pictures showing young women and a standardized picture of the respective study participant. With an eye movement tracker, we determined which body areas the study participants looked at. The study participants were also asked to rate the attractiveness of the stimuli. Data from 35 patients with AN and 32 healthy controls were included. RESULTS: Patients with AN judged their own body areas as less attractive than the controls did on a rating scale from 1 to 5 (e.g., breasts: mean [standard deviation] = 0.9 [1.0] versus 2.2 [0.8], p < .001). They were also more critical in their assessment of the bodies of others (e.g., attractiveness of people with ideal weight: 2.1 [0.9] versus 2.8 [0.5], p < .001). They spent less time looking at their own breasts (1.8 [0.9] versus 2.2 [1.0] seconds, p = .09) but significantly more time looking at their thighs (1.1 [0.6] versus 0.8 [0.4] seconds, p = .05). CONCLUSIONS: The results confirm the assumption of cognitive biases. The differences, however, are often small and vary greatly.


Subject(s)
Anorexia Nervosa/psychology , Attention/physiology , Body Image , Self Concept , Adult , Analysis of Variance , Beauty , Body Mass Index , Body Weight , Case-Control Studies , Eye Movement Measurements/statistics & numerical data , Eye Movements/physiology , Female , Fixation, Ocular/physiology , Humans , Photic Stimulation/methods , Social Perception , Time Factors , Young Adult
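
Editor's note: the dwell-time measure underlying the eye-tracking analysis above can be illustrated with a short sketch. Given gaze samples labelled with an area of interest (AOI), the viewing time per AOI is the number of samples times the sample duration. The sampling rate, AOI names, and data below are illustrative assumptions, not the study's recording setup.

```python
import numpy as np

# Sum viewing time per area of interest from a stream of AOI-labelled gaze samples.
def dwell_time_per_aoi(aoi_per_sample: list, sampling_rate_hz: float = 60.0) -> dict:
    seconds_per_sample = 1.0 / sampling_rate_hz
    dwell = {}
    for aoi in aoi_per_sample:
        dwell[aoi] = dwell.get(aoi, 0.0) + seconds_per_sample
    return dwell

# Synthetic 10-second gaze stream with made-up AOI proportions.
rng = np.random.default_rng(5)
samples = rng.choice(["breasts", "thighs", "face", "other"], size=600, p=[0.2, 0.15, 0.4, 0.25])
print(dwell_time_per_aoi(list(samples)))
```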
6.
Psychiatry Res ; 183(2): 105-13, 2010 Aug 30.
Article in English | MEDLINE | ID: mdl-20630713

ABSTRACT

The response-focused emotion regulation style "expressive suppression" has been associated with symptoms of lower psychological well-being and increased functional magnetic resonance imaging (fMRI) activation of the sublenticular extended amygdala (SLEA) in patients with major depression. Extending prior studies on active emotion regulation, we were interested in the effects of habitual emotion regulation on neurobiology. Thirty subjects without symptoms of clinical depression, with either relatively high or low suppression scores as assessed with the Emotion Regulation Questionnaire, participated in the study. During fMRI, they were instructed to expect and then perceive emotionally unpleasant, pleasant, or neutral stimuli selected from the International Affective Picture System, each announced by a congruent cue. In the subjects with high suppression scores, decreased activation of the orbital medial prefrontal cortex (oMFC) when expecting negative pictures and increased activation of the SLEA upon presentation of neutral stimuli were found. Subclinical depression ratings in the healthy subjects, independently of suppression scores, were positively correlated with brain activation in the SLEA when expecting negative pictures. SLEA hyperactivity may represent an emotional responsivity that involves less successful habitual emotion regulation and a tendency toward depressed mood in healthy subjects, as shown in patients with major depression. Decreased anticipatory oMFC activation may parallel a lack of antecedent emotion regulation in subjects with high suppression scores, representing another neurobiological predictor of lower mental well-being.


Subject(s)
Brain/blood supply , Depression/physiopathology , Depressive Disorder, Major/diagnosis , Emotions , Habituation, Psychophysiologic/physiology , Adult , Brain/pathology , Brain Mapping , Female , Humans , Image Processing, Computer-Assisted/methods , Magnetic Resonance Imaging/methods , Male , Oxygen/blood , Photic Stimulation/methods , Predictive Value of Tests , Psychiatric Status Rating Scales , Regression Analysis , Surveys and Questionnaires , Time Factors , Young Adult
7.
Psychother Psychosom Med Psychol ; 60(5): 169-74, 2010 May.
Article in German | MEDLINE | ID: mdl-19672811

ABSTRACT

This study assessed the correlations between alexithymia and the recognition and regulation of emotions in a sample of healthy subjects. The first focus was on the relation between self-rated alexithymia (TAS-20) and objectively measured emotion recognition ability from faces and from scenic descriptions of social interactions. Furthermore, expressive suppression as a means of emotion regulation was related to alexithymia. Using the new factorial structure of the German version of the TAS-20, we were able to show differential effects: objectively assessed emotion recognition correlated negatively with externally oriented thinking and positively with the importance attached to emotional introspection, but not with the core features of alexithymia, namely difficulties identifying and describing emotions. Expressive suppression, on the other hand, correlated mainly with this central feature of alexithymia. This overlap of constructs suggests including complementary tests in the assessment of alexithymia.


Subject(s)
Affective Symptoms/psychology , Emotions , Internal-External Control , Adolescent , Adult , Facial Expression , Female , Humans , Inhibition, Psychological , Interpersonal Relations , Male , Pattern Recognition, Visual , Statistics as Topic , Surveys and Questionnaires , Young Adult
8.
Psychiatr Danub ; 22(3): 465-70, 2010 Sep.
Article in English | MEDLINE | ID: mdl-20856194

ABSTRACT

Extreme psychological and physical traumas cause dramatic symptom patterns that are insufficiently described by the psychiatric diagnostic criteria of post-traumatic stress disorder (PTSD). Additionally, owing to the neurobiological proximity and the similarity of the processing mechanisms of physical and psychological pain stimulation and of extremely negative emotions, patients often suffer from persistent pain even after the somatic healing process is completed. Epidemiological studies confirm the joint occurrence of pain and PTSD. The close relationship and the etiological and behavioral similarities of the two disorders have led to the development of shared vulnerability and mutual maintenance models. The particular suffering of patients with PTSD due to chronic pain necessitates pain-therapeutic interventions. Conversely, in chronic pain patients, the etiological role of severe traumas should be considered.


Subject(s)
Adaptation, Psychological , Pain/psychology , Stress Disorders, Post-Traumatic/psychology , Wounds and Injuries/psychology , Arousal , Avoidance Learning , Chronic Disease , Comorbidity , Dissociative Disorders/psychology , Dissociative Disorders/therapy , Grief , Humans , Life Change Events , Mental Recall , Pain Management , Social Support , Stress Disorders, Post-Traumatic/therapy
9.
Depress Anxiety ; 26(1): E26-33, 2009.
Article in English | MEDLINE | ID: mdl-19016461

ABSTRACT

OBJECTIVE: The primary aim of this study was to investigate facial emotion recognition in patients with somatoform disorders (SFD). Also of interest was the extent to which concurrent alexithymia contributed to any changes in emotion recognition accuracy. METHODS: Twenty patients with SFD and twenty healthy controls, matched for age, sex, and education, were assessed with the Facially Expressed Emotion Labelling Test of facial emotion recognition and the 26-item Toronto Alexithymia Scale (TAS-26). RESULTS: Patients with SFD exhibited elevated alexithymia symptoms relative to healthy controls. Patients with SFD also recognized significantly fewer emotional expressions than did the healthy controls. However, the group difference in emotion recognition accuracy became nonsignificant once the influence of alexithymia was controlled for statistically. CONCLUSIONS: This suggests that the deficit in facial emotion recognition observed in the patients with SFD was most likely a consequence of concurrent alexithymia. The impaired facial emotion recognition observed in the patients with SFD could plausibly have a negative influence on these individuals' social functioning.


Subject(s)
Affective Symptoms/psychology , Emotions , Facial Expression , Pattern Recognition, Visual , Recognition, Psychology , Somatoform Disorders/psychology , Adult , Affective Symptoms/diagnosis , Comorbidity , Female , Humans , Male , Middle Aged , Personality Inventory , Reference Values
10.
J Vis Exp ; (146)2019 04 05.
Article in English | MEDLINE | ID: mdl-31009005

ABSTRACT

The assessment of pain relies mostly on methods that require a person to communicate. However, for people with cognitive and verbal impairments, existing methods are not sufficient, as they lack reliability and validity. To approach this problem, recent research has focused on objective pain assessment based on response parameters derived from physiological, video, and audio signals. To develop reliable automated pain recognition systems, efforts have been made to create multimodal databases in order to analyze pain and detect valid pain patterns. While the results are promising, they focus only on discriminating pain, or pain intensities, versus no pain. To advance this line of research, studies should also consider the quality and duration of pain, as these provide additional valuable information for more advanced pain management. To complement existing databases and the analysis of pain with regard to quality and duration, this paper proposes a psychophysiological experiment to elicit, measure, and collect valid pain reactions. Participants are subjected to painful stimuli that differ in intensity (low, medium, and high), duration (5 s / 1 min), and modality (heat / electric pain) while audio, video (e.g., facial expressions, body gestures, facial skin temperature), and physiological signals (e.g., electrocardiogram [ECG], skin conductance level [SCL], facial electromyography [EMG], and EMG of the trapezius muscle) are recorded. The study consists of a calibration phase to determine a subject's individual pain range (from low to intolerable pain) and a stimulation phase in which pain stimuli, depending on the calibrated range, are applied. The obtained data may allow refining, improving, and evaluating automated recognition systems in terms of objective pain assessment. For further development of such systems and to investigate pain reactions in more detail, additional pain modalities such as pressure, chemical, or cold pain should be included in future studies. The recorded data of this study will be released as the "X-ITE Pain Database".


Subject(s)
Electric Stimulation/adverse effects , Hot Temperature/adverse effects , Pain Measurement/methods , Pain/psychology , Adult , Electromyography , Facial Expression , Female , Humans , Male , Pain/etiology , Pain/physiopathology , Reproducibility of Results
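
Editor's note: the stimulation phase described above crosses intensity, duration, and modality conditions. The following is a minimal sketch, under stated assumptions, of how such a randomized trial schedule could be generated; the repetition count, seed, and ordering scheme are placeholders and not the study's actual design, and the calibrated per-subject stimulus levels are assumed to be supplied separately.

```python
import itertools
import random

# Build a shuffled list of (modality, intensity, duration) trials for one subject.
def build_schedule(repetitions: int = 3, seed: int = 42):
    intensities = ["low", "medium", "high"]
    durations_s = [5, 60]                       # phasic (5 s) vs. tonic (1 min) stimuli
    modalities = ["heat", "electric"]
    trials = list(itertools.product(modalities, intensities, durations_s)) * repetitions
    random.Random(seed).shuffle(trials)
    return trials

for modality, intensity, duration in build_schedule(repetitions=1):
    print(f"{modality:8s} {intensity:6s} {duration:3d} s")
```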
11.
Depress Anxiety ; 25(11): E133-41, 2008.
Article in English | MEDLINE | ID: mdl-18033726

ABSTRACT

The primary aim of this study was to investigate facial emotion recognition (FER) in patients with somatoform disorders (SFD). Also of interest was the extent to which concurrent alexithymia contributed to any changes in emotion recognition accuracy. Twenty patients with SFD and 20 healthy controls, matched for age, sex, and education, were assessed with the Facially Expressed Emotion Labelling Test of FER and the 26-item Toronto Alexithymia Scale. Patients with SFD exhibited elevated alexithymia symptoms relative to healthy controls. Patients with SFD also recognized significantly fewer emotional expressions than did the healthy controls. However, the group difference in emotion recognition accuracy became nonsignificant once the influence of alexithymia was controlled for statistically. This suggests that the deficit in FER observed in the patients with SFD was most likely a consequence of concurrent alexithymia. It should be noted that neither depression nor anxiety was significantly related to emotion recognition accuracy, suggesting that these variables did not contribute to the emotion recognition deficit. The impaired FER observed in the patients with SFD could plausibly have a negative influence on these individuals' social functioning.


Subject(s)
Affect , Facial Expression , Recognition, Psychology , Somatoform Disorders/epidemiology , Somatoform Disorders/psychology , Adult , Affective Symptoms/diagnosis , Affective Symptoms/epidemiology , Affective Symptoms/psychology , Female , Humans , Male , Somatoform Disorders/diagnosis , Surveys and Questionnaires
12.
PLoS One ; 13(2): e0192767, 2018.
Article in English | MEDLINE | ID: mdl-29444153

ABSTRACT

Pain assessment can benefit from observation of pain behaviors, such as guarding or facial expression, and observational pain scales are widely used in clinical practice with nonverbal patients. However, little is known about head movements and postures in the context of pain. We therefore analyzed videos from three publicly available datasets. The BioVid dataset was recorded with healthy participants subjected to painful heat stimuli. In the BP4D dataset, healthy participants performed a cold-pressor test and several other tasks (meant to elicit emotion). The UNBC dataset videos show shoulder pain patients during range-of-motion tests of their affected and unaffected limbs. In all videos, participants were sitting in an upright position. We studied the head movements and postures that occurred during the painful and control trials by measuring head orientation from video over time, and then analyzed posture and movement summary statistics as well as occurrence frequencies of typical postures and movements. We found significant differences between pain and control trials using analyses of variance and binomial tests. In BioVid and BP4D, pain was accompanied by head movements and postures that tend to be oriented downwards or towards the pain site. We also found differences in movement range and speed in all three datasets. The results suggest that head movements and postures should be considered for pain assessment and research. As additional pain indicators, they might improve pain management whenever behavior is assessed, especially in nonverbal individuals such as infants or patients with dementia. However, more research is first needed to identify specific head movements and postures in pain patients.


Subject(s)
Head/physiopathology , Movement , Posture , Shoulder Pain/physiopathology , Databases, Factual , Humans , Range of Motion, Articular
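
Editor's note: the kind of per-trial summary statistics and group comparison described above can be sketched briefly. The example below summarizes a single head-orientation channel (pitch, in degrees) per trial and compares pain versus control trials with a t-test; the synthetic data, sampling rate, and sign convention are illustrative assumptions, not the study's actual pipeline.

```python
import numpy as np
from scipy import stats

# Summarize one trial's head pitch time series: mean posture, movement range, mean speed.
def summarize_trial(pitch_deg: np.ndarray, fps: float = 25.0) -> dict:
    velocity = np.diff(pitch_deg) * fps                # deg/s via finite differences
    return {
        "mean_pitch": float(np.mean(pitch_deg)),       # overall posture (negative = downwards here)
        "range_pitch": float(np.ptp(pitch_deg)),       # movement range within the trial
        "mean_speed": float(np.mean(np.abs(velocity))),
    }

# Synthetic trials: pain trials biased towards a downward head posture.
rng = np.random.default_rng(0)
pain_trials = [summarize_trial(rng.normal(-8, 4, 250)) for _ in range(30)]
ctrl_trials = [summarize_trial(rng.normal(0, 4, 250)) for _ in range(30)]

t, p = stats.ttest_ind([tr["mean_pitch"] for tr in pain_trials],
                       [tr["mean_pitch"] for tr in ctrl_trials])
print(f"mean pitch, pain vs control: t = {t:.2f}, p = {p:.4f}")
```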
13.
Front Psychiatry ; 9: 9, 2018.
Article in English | MEDLINE | ID: mdl-29445345

ABSTRACT

BACKGROUND: Social interactive functions such as facial emotion recognition and smell identification have been shown to differ between women and men. However, little is known about how these differences are mirrored in patients with schizophrenia and how these abilities interact with each other and with other clinical variables in patients versus healthy controls. METHODS: Standardized instruments were used to assess facial emotion recognition [Facially Expressed Emotion Labelling (FEEL)] and smell identification [University of Pennsylvania Smell Identification Test (UPSIT)] in 51 patients with schizophrenia spectrum disorders and 79 healthy controls; furthermore, working memory functions and clinical variables were assessed. RESULTS: In both the univariate and the multivariate analyses, illness showed a significant influence on UPSIT and FEEL. The inclusion of age and working memory in the MANOVA resulted in a differential effect, with sex and working memory remaining as significant factors. Duration of illness was correlated with both emotion recognition and smell identification in men only, whereas general psychopathology and negative symptoms were associated with emotion recognition only in women. CONCLUSION: Being affected by a schizophrenia spectrum disorder impairs the ability to correctly recognize facial affect and identify odors. Converging evidence suggests a link between the investigated basic and social cognitive abilities in patients with schizophrenia spectrum disorders, with a strong contribution of working memory and differential effects of modulators in women versus men.

14.
GMS J Med Educ ; 33(1): Doc4, 2016.
Article in English | MEDLINE | ID: mdl-26958652

ABSTRACT

The FAMULATUR PLUS is an innovative approach to teaching physical examination skills. The concept is aimed at medical students during the clinical part of their studies and comprises a clinical traineeship ("Famulatur" in German) extended to include various courses ("PLUS"). The courses are divided into clinical examination courses and problem-based learning (PBL) seminars. The concept's special feature is the full integration of these courses into a 30-day hospital traineeship. The aim is to facilitate the transfer of knowledge from the courses into daily practice. Each week of the FAMULATUR PLUS is structured in line with the courses and focuses on a particular part of the body (e.g., the abdomen). A physical examination course under the supervision of a physician is offered at the beginning of the week. Here, medical students learn the relevant examination techniques by practicing on each other (partner exercises). Subsequently, the techniques taught are applied independently during everyday work on the ward, corrected by the supervisor if necessary, and thereby reinforced. The final PBL seminar takes place towards the end of the week. Possible differential diagnoses are developed on the basis of a clinical case study. The goal is to check these by taking a fictitious medical history and performing a physical examination, as well as to arrive at a preliminary diagnosis. Finally, during the PBL seminar, medical students are shown how physical examination techniques can be efficiently applied in the diagnosis of common cardinal symptoms (e.g., abdominal pain). The initial implementation of the FAMULATUR PLUS demonstrated the practical feasibility of the concept. In addition, the accompanying evaluation showed that the participants of the pilot project improved with regard to their practical physical examination skills.


Subject(s)
Clinical Clerkship/methods , Clinical Competence , Internal Medicine/education , Physical Examination/methods , Abdomen, Acute/diagnosis , Abdomen, Acute/etiology , Abdominal Pain/diagnosis , Abdominal Pain/etiology , Adult , Curriculum , Diagnosis, Differential , Female , Germany , Humans , Male , Patient Simulation , Pilot Projects , Problem-Based Learning , Young Adult
15.
PLoS One ; 11(3): e0150584, 2016.
Article in English | MEDLINE | ID: mdl-26939129

ABSTRACT

Affective computing aims at the detection of users' mental states, in particular emotions and dispositions, during human-computer interactions. Detection can be achieved by measuring multimodal signals, namely speech, facial expressions, and/or psychobiology. Over the past years, one major approach has been to identify the best features for each signal using different classification methods. Although this is of high priority, other subject-specific variables should not be neglected. In our study, we analyzed the effect of gender, age, personality, and gender roles on the extracted psychobiological features (derived from skin conductance level, facial electromyography, and heart rate variability) as well as their influence on the classification results. In an experimental human-computer interaction, five different affective states were induced with picture material from the International Affective Picture System and Ulm pictures. A total of 127 subjects participated in the study. Among all potentially influencing variables (gender has been reported to be influential), age was the only variable that correlated significantly with psychobiological responses. In summary, the conducted classification processes resulted in classification accuracy differences of 20% according to age and gender, especially when comparing the neutral condition with the four other affective states. We suggest taking age and gender specifically into account in future studies in affective computing, as this may lead to an improvement of emotion recognition accuracy.


Subject(s)
Behavior/physiology , Emotions/physiology , User-Computer Interface , Aged , Electromyography , Gender Identity , Humans , Personality/physiology , Skin Physiological Phenomena
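
Editor's note: the subgroup effect reported above (classification accuracy differing by age and gender) can be illustrated with a short, hedged sketch: train and cross-validate the same classifier separately per subgroup and compare accuracies. The synthetic features stand in for the psychobiological features named in the abstract; the group sizes, effect sizes, and SVM settings are assumptions, not the study's configuration.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(1)

# Cross-validated accuracy for one subgroup, given a subgroup-specific effect size.
def accuracy_for_group(n_subjects: int, n_features: int = 12, separation: float = 1.0) -> float:
    X = rng.normal(0, 1, (n_subjects * 2, n_features))
    y = np.repeat([0, 1], n_subjects)          # e.g. neutral condition vs. affective state
    X[y == 1, 0] += separation                 # inject the group's (assumed) effect size
    clf = SVC(kernel="rbf")
    return cross_val_score(clf, X, y, cv=5).mean()

print("younger group accuracy:", round(accuracy_for_group(40, separation=1.5), 3))
print("older group accuracy:  ", round(accuracy_for_group(40, separation=0.8), 3))
```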
16.
PLoS One ; 11(1): e0146691, 2016.
Article in English | MEDLINE | ID: mdl-26761427

ABSTRACT

BACKGROUND: Research suggests that interaction between humans and digital environments constitutes a form of companionship in addition to technical convenience. To this effect, humans have attempted to design computer systems able to demonstrably empathize with the human affective experience. Facial electromyography (EMG) is one technique enabling machines to access human affective states. Numerous studies have investigated the effects of emotional valence on facial EMG activity captured over the corrugator supercilii (frowning muscle) and zygomaticus major (smiling muscle). Arousal, however, has received much less research attention. In the present study, we sought to identify intensive valence and arousal affective states via facial EMG activity. METHODS: Ten blocks of affective pictures were separated into five categories: neutral valence/low arousal (0VLA), positive valence/high arousal (PVHA), negative valence/high arousal (NVHA), positive valence/low arousal (PVLA), and negative valence/low arousal (NVLA), and the ability of each to elicit the corresponding valence and arousal affective states was investigated at length. One hundred and thirteen participants were exposed to these stimuli while facial EMG was recorded. A set of 16 features based on the amplitude, frequency, predictability, and variability of the signals was defined and classified using a support vector machine (SVM). RESULTS: We observed highly accurate classification rates based on the combined corrugator and zygomaticus EMG, ranging from 75.69% to 100.00% for the baseline and the five affective states (0VLA, PVHA, PVLA, NVHA, and NVLA) across all individuals. There were significant differences in classification accuracy between senior and young adults, but no significant difference between female and male participants. CONCLUSION: Our research provides robust evidence for the recognition of intensive valence and arousal affective states in young and senior adults. These findings contribute to the successful future application of facial EMG for identifying user affective states in human-machine interaction (HMI) or companion robotic systems (CRS).


Subject(s)
Arousal/physiology , Electromyography/methods , Emotions/physiology , Face/physiology , Adult , Aged , Female , Humans , Male , Middle Aged , Photic Stimulation , Signal Processing, Computer-Assisted , Young Adult
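
Editor's note: the pipeline described above (features from corrugator and zygomaticus EMG, classified with an SVM) can be sketched in a few lines. This is a minimal illustration on synthetic signals: the three features per channel and the two-class setup are simplifications of the study's 16 features and five categories, and the signal model is an assumption.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

# Simple amplitude/variability features for one EMG window.
def emg_features(window: np.ndarray) -> list:
    rectified = np.abs(window)
    return [rectified.mean(), rectified.std(), np.ptp(window)]

# Synthetic corrugator and zygomaticus windows for two conditions (0 vs. 1),
# with the second condition given higher EMG activity by construction.
rng = np.random.default_rng(7)
X, y = [], []
for label, gain in [(0, 1.0), (1, 1.6)]:
    for _ in range(60):
        corrugator = rng.normal(0, gain, 512)
        zygomaticus = rng.normal(0, gain, 512)
        X.append(emg_features(corrugator) + emg_features(zygomaticus))
        y.append(label)

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
print("CV accuracy:", cross_val_score(clf, np.array(X), np.array(y), cv=5).mean())
```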
17.
PLoS One ; 10(10): e0140330, 2015.
Article in English | MEDLINE | ID: mdl-26474183

ABSTRACT

BACKGROUND: The methods of pain diagnosis used clinically do not allow for objective and robust measurement, and physicians must rely on the patient's report of the pain sensation. Verbal scales, visual analog scales (VAS), and numeric rating scales (NRS) count among the most common tools, but they are restricted to patients with normal mental abilities. There are also instruments for pain assessment in people with verbal and/or cognitive impairments, and instruments for pain assessment in people who are sedated and mechanically ventilated. However, all these diagnostic methods either have limited reliability and validity or are very time-consuming. In contrast, biopotentials can be automatically analyzed with machine learning algorithms to provide a surrogate measure of pain intensity. METHODS: In this context, we created a database of biopotentials to advance an automated pain recognition system, determine its theoretical testing quality, and optimize its performance. Eighty-five participants were subjected to painful heat stimuli (baseline, pain threshold, two intermediate thresholds, and pain tolerance threshold) under controlled conditions, and the signals of electromyography, skin conductance level, and electrocardiography were collected. A total of 159 features were extracted from the mathematical groupings of amplitude, frequency, stationarity, entropy, linearity, variability, and similarity. RESULTS: We achieved classification rates of 90.94% for baseline vs. pain tolerance threshold and 79.29% for baseline vs. pain threshold. The most frequently selected pain features stemmed from the amplitude and similarity groups and were derived from facial electromyography. CONCLUSION: Machine learning measurement of pain in patients could provide valuable information for the clinical team and thus support treatment assessment.


Subject(s)
Electromyography/methods , Pain Measurement/methods , Pain , Signal Processing, Computer-Assisted , Support Vector Machine , Adolescent , Adult , Aged , Female , Humans , Male , Middle Aged , Pain/diagnosis , Pain/physiopathology
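
Editor's note: to make the feature groupings named above more tangible, here is a small sketch that extracts one generic feature from each of four of those groupings (amplitude, variability, entropy, frequency) for a single biopotential window. These are standard signal features chosen for illustration, not the study's exact 159-feature set, and the sampling rate and synthetic input are assumptions.

```python
import numpy as np

# Extract a handful of generic features from one biopotential window (e.g. EMG or SCL).
def window_features(x: np.ndarray, fs: float = 512.0) -> dict:
    x = x - x.mean()                                    # remove DC offset
    spectrum = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    p = spectrum / spectrum.sum()                       # normalized power distribution
    return {
        "amplitude_rms": float(np.sqrt(np.mean(x ** 2))),                        # amplitude group
        "variability_iqr": float(np.subtract(*np.percentile(x, [75, 25]))),      # variability group
        "spectral_entropy": float(-(p * np.log2(p + 1e-12)).sum()),              # entropy group
        "peak_frequency": float(freqs[np.argmax(spectrum)]),                     # frequency group
    }

rng = np.random.default_rng(3)
print(window_features(rng.normal(0, 1, 2560)))
```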
18.
Front Psychol ; 6: 262, 2015.
Article in English | MEDLINE | ID: mdl-25852589

ABSTRACT

The perceived duration of emotional face stimuli strongly depends on the expressed emotion. However, emotional faces also differ with regard to a number of other features, such as gaze, face direction, or sex. Usually, these features have been controlled by using only pictures of female models with straight gaze and face direction. Doi and Shinohara (2009) reported that an overestimation of angry faces could only be found when the model's gaze was oriented toward the observer. We aimed to replicate this effect for face direction. Moreover, we explored the effect of face direction on the perceived duration of sad faces. Controlling for the sex of the face model and of the participant, female and male participants rated the duration of neutral, angry, and sad face stimuli of both sexes, photographed from different perspectives, in a bisection task. In line with current findings, we report a significant overestimation of angry compared to neutral face stimuli that was modulated by face direction. Moreover, the perceived duration of sad face stimuli did not differ from that of neutral faces and was not influenced by face direction. Furthermore, we found that faces of the opposite sex appeared to last longer than those of the same sex. This outcome is discussed with regard to stimulus parameters such as the induced arousal, social relevance, and an evolutionary context.

19.
J Craniomaxillofac Surg ; 42(7): 1271-6, 2014 Oct.
Article in English | MEDLINE | ID: mdl-24754915

ABSTRACT

INTRODUCTION: Cleft lip and palate (CLP) represent the most common congenital malformations of the midfacial region. Although these patients show differences in their facial appearance, we hypothesized that CLP-affected individuals do not show an alteration in their emotion regulation abilities compared to unaffected individuals, because of the strong, inherent biological basis of facial emotion and expression, which receives little influence from external factors. MATERIAL AND METHODS: The present study evaluated various aspects of emotion regulation in 25 adults with CLP and an equally sized control group of unaffected volunteers. The study was divided into three parts. First, we investigated emotion regulation strategies: each participant was asked to complete the Emotion Regulation Questionnaire (ERQ) and the Ambivalence over Emotional Expressiveness Questionnaire G18 (AEQ-G18). Second, we examined the recognition of facially expressed basic emotions (FEEL test). Third, we evaluated the expression of an emotion induced by an odor sample. RESULTS: Habitual emotion regulation, as measured by the ERQ and AEQ-G18, did not differ between CLP and control subjects on any of the subscales. Recognition of facially expressed basic emotions was also the same in both groups, and facial emotion encoding did not differ between the groups. CONCLUSIONS: In summary, the findings suggest that individuals with an orofacial cleft show undisturbed emotion regulation and recognition. This may be explained by the strong biological basis of facial emotion recognition and regulation as well as by the healthy emotional resilience and social functioning of CLP patients.


Subject(s)
Cleft Lip/psychology , Cleft Palate/psychology , Emotions , Facial Expression , Adolescent , Adult , Anger , Emotional Adjustment , Facial Muscles/physiopathology , Fear , Female , Happiness , Humans , Male , Middle Aged , Neuropsychological Tests , Odorants , Resilience, Psychological , Young Adult
20.
Behav Brain Res ; 271: 129-39, 2014 Sep 01.
Article in English | MEDLINE | ID: mdl-24928767

ABSTRACT

Alexithymia is a personality trait that involves difficulties identifying emotions and describing feelings. It is hypothesized that this extends to facial emotion recognition, but little is known about possible neural correlates of this assumed deficit. We therefore tested thirty-seven healthy subjects with either a relatively high or a low degree of alexithymia (HDA versus LDA), who performed a reliable and standardized test of facial emotion recognition (FEEL, Facially Expressed Emotion Labeling) during functional MRI. LDA subjects had significantly better emotion recognition scores and showed relatively more activity in several brain areas associated with alexithymia and emotional awareness (anterior cingulate cortex) and in the extended system of facial perception concerned with aspects of social communication and emotion (amygdala, insula, striatum). Additionally, LDA subjects had more activity in the visual area of social perception (posterior part of the superior temporal sulcus) and in the inferior frontal cortex. HDA subjects, on the other hand, exhibited greater activity in the superior parietal lobule. With differences in behaviour and brain responses between two groups of otherwise healthy subjects, our results indirectly support recent conceptualizations and epidemiological data suggesting that alexithymia is a dimensional personality trait apparent in clinically healthy subjects, rather than a categorical diagnosis only applicable to clinical populations.


Subject(s)
Affective Symptoms/psychology , Brain/physiopathology , Emotions , Facial Expression , Magnetic Resonance Imaging , Pattern Recognition, Visual , Adolescent , Adult , Affective Symptoms/physiopathology , Female , Humans , Male , Psychological Tests , Social Perception , Young Adult