2.
Soc Cogn Affect Neurosci ; 19(1)2024 Feb 15.
Article in English | MEDLINE | ID: mdl-38334739

ABSTRACT

The role of facial feedback in facial emotion recognition remains controversial, partly due to limitations of the existing methods to manipulate the activation of facial muscles, such as voluntary posing of facial expressions or holding a pen in the mouth. These procedures are indeed limited in their control over which muscles are (de)activated when and to what degree. To overcome these limitations and investigate in a more controlled way if facial emotion recognition is modulated by one's facial muscle activity, we used computer-controlled facial neuromuscular electrical stimulation (fNMES). In a pre-registered EEG experiment, ambiguous facial expressions were categorised as happy or sad by 47 participants. In half of the trials, weak smiling was induced through fNMES delivered to the bilateral Zygomaticus Major muscle for 500 ms. The likelihood of categorising ambiguous facial expressions as happy was significantly increased with fNMES, as shown with frequentist and Bayesian linear mixed models. Further, fNMES resulted in a reduction of P1, N170 and LPP amplitudes. These findings suggest that fNMES-induced facial feedback can bias facial emotion recognition and modulate the neural correlates of face processing. We conclude that fNMES has potential as a tool for studying the effects of facial feedback.
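
As a rough sketch of the kind of model reported (not the authors' analysis code), a mixed-effects logistic regression of trial-level "happy" responses on stimulation condition, with a random intercept per participant, could look as follows; the file and column names (trials.csv, resp_happy, fnmes, participant) are illustrative assumptions.

```python
import pandas as pd
from statsmodels.genmod.bayes_mixed_glm import BinomialBayesMixedGLM

# Hypothetical trial-level data: one row per trial, binary resp_happy,
# fnmes coded 0 (no stimulation) / 1 (fNMES), and a participant identifier.
trials = pd.read_csv("trials.csv")

# Fixed effect of stimulation condition; random intercept per participant.
model = BinomialBayesMixedGLM.from_formula(
    "resp_happy ~ fnmes",
    vc_formulas={"participant": "0 + C(participant)"},
    data=trials,
)
result = model.fit_vb()   # variational Bayes approximation to the posterior
print(result.summary())   # a positive fnmes coefficient = more "happy" categorisations
```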


Subject(s)
Facial Recognition , Happiness , Humans , Emotions/physiology , Facial Recognition/physiology , Facial Muscles/physiology , Facial Expression , Bayes Theorem , Electroencephalography , Electric Stimulation
3.
Behav Res Methods ; 2023 Oct 20.
Article in English | MEDLINE | ID: mdl-37864116

ABSTRACT

Facial neuromuscular electrical stimulation (fNMES), which allows for the non-invasive and physiologically sound activation of facial muscles, has great potential for investigating fundamental questions in psychology and neuroscience, such as the role of proprioceptive facial feedback in emotion induction and emotion recognition, and may serve clinical applications, such as alleviating symptoms of depression. However, despite illustrious origins in the 19th-century work of Duchenne de Boulogne, the practical application of fNMES remains largely unknown to today's researchers in psychology. In addition, published studies vary dramatically in the stimulation parameters used, such as stimulation frequency, amplitude, duration, and electrode size, and in the way these parameters are reported. Because fNMES parameters impact the comfort and safety of volunteers, as well as the physiological (and psychological) effects of the stimulation, it is of paramount importance to establish recommendations for good practice and to ensure studies can be better compared and integrated. Here, we provide an introduction to fNMES, systematically review the existing literature focusing on the stimulation parameters used, and offer recommendations on how to deliver fNMES safely and reliably, and on how to report fNMES parameters to allow better cross-study comparison. In addition, we provide a free webpage for visualising fNMES parameters and verifying their safety based on current density. As an example of a potential application, we focus on the use of fNMES for the investigation of the facial feedback hypothesis.
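
The safety check mentioned above reduces to a simple calculation: average current density is the delivered current divided by the electrode's contact area. A toy sketch, assuming milliampere and square-centimetre units (the authors' webpage, not this snippet, is the intended tool):

```python
def current_density_ma_per_cm2(current_ma: float, electrode_area_cm2: float) -> float:
    """Average current density under one electrode, in mA per cm^2."""
    return current_ma / electrode_area_cm2

# Example: 5 mA delivered through a 2 cm x 2 cm electrode -> 1.25 mA/cm^2.
# Compare the result against whatever limit the chosen safety guideline specifies.
print(current_density_ma_per_cm2(current_ma=5.0, electrode_area_cm2=4.0))
```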

4.
Emotion ; 23(2): 569-588, 2023 Mar.
Article in English | MEDLINE | ID: mdl-35298222

ABSTRACT

Appraisals can be influenced by cultural beliefs and stereotypes. In line with this, past research has shown that judgments about the emotional expression of a face are influenced by the face's sex and, vice versa, that judgments about the sex of a person depend somewhat on the person's facial expression. For example, participants associate male faces with anger, and female faces with happiness or sadness. However, the strength and the bidirectionality of these effects remain debated. Moreover, the interplay of a stimulus's emotion and sex remains mostly unknown in the auditory domain. To investigate these questions, we created a novel stimulus set of 121 avatar faces and 121 human voices (available at https://bit.ly/2JkXrpy) with matched, fine-scale changes along the emotional (happy to angry) and sexual (male to female) dimensions. In a first experiment (N = 76), we found clear evidence for the mutual influence of facial emotion and sex cues on ratings, and moreover for larger implicit (task-irrelevant) effects of a stimulus's emotion than of its sex. These findings were replicated and extended in two preregistered studies: one laboratory categorization study using the same face stimuli (N = 108; https://osf.io/ve9an), and one online study with vocalizations (N = 72; https://osf.io/vhc9g). Overall, the results show that the associations of maleness with anger and of femaleness with happiness exist across sensory modalities, and suggest that emotions expressed in the face and voice cannot be entirely disregarded, even when attention is mainly focused on determining a stimulus's sex. We discuss the relevance of these findings for cognitive and neural models of face and voice processing.


Subject(s)
Emotions , Judgment , Male , Humans , Female , Happiness , Anger , Sadness , Facial Expression
5.
PLoS One ; 16(7): e0254927, 2021.
Article in English | MEDLINE | ID: mdl-34324534

ABSTRACT

The Islamic headscarf has been at the center of heated debates in European society, yet little is known about its influence on day-to-day interactions. The aim of this randomized field experiment (n = 840) is to explore how the generally negative views that surround the hijab in Europe manifest in the behavior that people direct toward hijab-wearing women in everyday situations. Using a helping scenario and videotapes of the resulting interactions, we measured whether passengers offered assistance, as well as various details of behavior that indicate interpersonal involvement. We predicted that less help would be offered in interactions with the covered confederate, that women's level of nonverbal involvement would increase but men's would decrease, and that responses would be strongest in Paris, intermediate in Brussels, and weakest in Vienna. We analyzed the data using Generalized Linear Models estimated with Bayesian inference. While the headscarf does not produce conclusive differences in "overt" helping, it does affect "subtle" cues of interpersonal involvement. In response to the hijab, women across sites increase, but men in Paris decrease, the level of involvement that they show with their nonverbal behavior.
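
As an illustration of the analysis approach, and not the study's actual code, a Bayesian logistic GLM for the binary helping outcome could be specified as below; the variable names (hijab, helped) and the simulated data are assumptions for demonstration only.

```python
import numpy as np
import pymc as pm

rng = np.random.default_rng(0)
hijab = rng.integers(0, 2, size=200)    # 0 = confederate uncovered, 1 = wearing headscarf
helped = rng.integers(0, 2, size=200)   # placeholder binary outcome: help offered or not

with pm.Model():
    intercept = pm.Normal("intercept", mu=0.0, sigma=2.5)
    beta_hijab = pm.Normal("beta_hijab", mu=0.0, sigma=2.5)
    p = pm.math.invlogit(intercept + beta_hijab * hijab)   # probability of helping
    pm.Bernoulli("helped_obs", p=p, observed=helped)
    idata = pm.sample(1000, tune=1000)   # posterior over the headscarf effect on helping
```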


Subject(s)
Clothing , Islam , Adult , Bayes Theorem , Female , Humans , Male
6.
Politics Life Sci ; 34(1): 73-92, 2015.
Article in English | MEDLINE | ID: mdl-26399947

ABSTRACT

The smiles and affiliative expressions of presidential candidates are important for political success, allowing contenders to nonverbally connect with potential supporters and bond with followers. Smiles, however, are not unitary displays; they are multifaceted in composition and signaling intent due to variations in performance. With this in mind, we examine the composition and perception of smiling behavior by Republican presidential candidates during the 2012 preprimary period. In this paper we review the literature concerning different smile types and the muscular movements that compose them from a biobehavioral perspective. We then analyze smiles expressed by Republican presidential candidates early in the 2012 primary season by coding facial muscle activity at the microlevel using the Facial Action Coding System (FACS) to produce an inventory of politically relevant smile types. To validate the subtle differences observed between smile types, we show viewers a series of short video clips to differentiate the displays on the basis of their perceived reassurance, or social signaling. The discussion considers the implications of our findings in relation to political evaluation and communication efficacy.


Subject(s)
Facial Recognition , Nonverbal Communication , Politics , Smiling , Social Perception , Humans
7.
Front Psychol ; 6: 961, 2015.
Article in English | MEDLINE | ID: mdl-26217280

ABSTRACT

Human multimodal communication can be said to serve two main purposes: information transfer and social influence. In this paper, I argue that different components of multimodal signals play different roles in the processes of information transfer and social influence. Although the symbolic components of communication (e.g., verbal and denotative signals) are well suited to transferring conceptual information, emotional components (e.g., non-verbal signals that are difficult to manipulate voluntarily) likely serve a function that is closer to social influence. I suggest that emotion should be considered a property of communicative signals, rather than an entity that is transferred as content by non-verbal signals. In this view, the effect of emotional processes on communication serves to change the quality of social signals, making them more efficient at producing responses in perceivers, whereas symbolic components increase the signals' efficiency at interacting with the cognitive processes dedicated to the assessment of relevance. The interaction between symbolic and emotional components will be discussed in relation to the need for perceivers to evaluate the reliability of multimodal signals.

8.
Emotion ; 15(6): 798-811, 2015 Dec.
Article in English | MEDLINE | ID: mdl-26098733

ABSTRACT

We investigated the role of facial behavior in emotional communication, using both categorical and dimensional approaches. We used a corpus of enacted emotional expressions (GEMEP) in which professional actors are instructed, with the help of scenarios, to communicate a variety of emotional experiences. The results of Study 1 replicated earlier findings showing that only a minority of facial action units are associated with specific emotional categories. Likewise, facial behavior did not show a specific association with particular emotional dimensions. Study 2 showed that facial behavior plays a significant role both in the detection of emotions and in the judgment of their dimensional aspects, such as valence, arousal, dominance, and unpredictability. In addition, a mediation model revealed that the association between facial behavior and recognition of the signaler's emotional intentions is mediated by perceived emotional dimensions. We conclude that, from a production perspective, facial action units convey neither specific emotions nor specific emotional dimensions, but are associated with several emotions and several dimensions. From the perceiver's perspective, facial behavior facilitated both dimensional and categorical judgments, and the former mediated the effect of facial behavior on recognition accuracy. The classification of emotional expressions into discrete categories may, therefore, rely on the perception of more general dimensions such as valence and arousal and, presumably, the underlying appraisals that are inferred from facial movements.
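
The mediation result can be illustrated with a simple product-of-coefficients sketch: the indirect effect is the path from facial behavior to perceived dimensions (a) multiplied by the path from perceived dimensions to recognition (b), estimated while controlling for facial behavior. The snippet below uses simulated data and hypothetical variable roles; it is not the authors' model.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
x = rng.normal(size=300)                      # facial behavior (e.g., amount of AU activity)
m = 0.5 * x + rng.normal(size=300)            # perceived dimension (e.g., valence or arousal)
y = 0.4 * m + 0.1 * x + rng.normal(size=300)  # proxy for recognition accuracy

a = sm.OLS(m, sm.add_constant(x)).fit().params[1]                        # X -> M path
b = sm.OLS(y, sm.add_constant(np.column_stack([m, x]))).fit().params[1]  # M -> Y path, controlling for X
print("indirect (mediated) effect a*b:", a * b)
```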


Subject(s)
Communication , Emotions , Face/physiology , Facial Expression , Affect , Arousal , Female , Humans , Intention , Judgment , Male , Movement , Young Adult
9.
Emotion ; 12(4): 701-715, 2012 Aug.
Article in English | MEDLINE | ID: mdl-22642350

ABSTRACT

We tested Ekman's (2003) suggestion that movements of a small number of reliable facial muscles are particularly trustworthy cues to experienced emotion because they tend to be difficult to produce voluntarily. On the basis of theoretical predictions, we identified two subsets of facial action units (AUs): reliable AUs and versatile AUs. A survey on the controllability of facial AUs confirmed that reliable AUs indeed seem more difficult to control than versatile AUs, although the distinction between the two sets of AUs should be understood as a difference in degree of controllability rather than a discrete categorization. Professional actors enacted a series of emotional states using method acting techniques, and their facial expressions were rated by independent judges. The effect of the two subsets of AUs (reliable AUs and versatile AUs) on identification of the emotion conveyed, its perceived authenticity, and perceived intensity was investigated. Activation of the reliable AUs had a stronger effect than that of versatile AUs on the identification, perceived authenticity, and perceived intensity of the emotion expressed. We found little evidence, however, for specific links between individual AUs and particular emotion categories. We conclude that reliable AUs may indeed convey trustworthy information about emotional processes but that most of these AUs are likely to be shared by several emotions rather than providing information about specific emotions. This study also suggests that the issue of reliable facial muscles may generalize beyond the Duchenne smile.


Subject(s)
Expressed Emotion , Facial Expression , Facial Muscles , Female , Humans , Male , Recognition, Psychology , Smiling , Young Adult
10.
Cogn Process ; 13 Suppl 2: 397-414, 2012 Oct.
Article in English | MEDLINE | ID: mdl-22328016

ABSTRACT

The emerging field of social signal processing can benefit from a theoretical framework to guide future research activities. The present article aims to draw attention to two areas of research that have devoted considerable effort to the understanding of social behaviour: ethology and social psychology. With a long tradition in the study of animal signals, ethology and evolutionary biology have developed theoretical concepts to account for the functional significance of signalling. For example, consideration of the divergent selective pressures responsible for the evolution of signalling and social cognition has emphasized the importance of two classes of indicators: informative cues and communicative signals. Social psychology, on the other hand, investigates emotional expression and interpersonal relationships, with a focus on the mechanisms underlying the production and interpretation of social signals and cues. Based on the theoretical considerations developed in these two fields, we propose a model that integrates the processing of perceivable individual features (social signals and cues) with contextual information, and we suggest that the output of computer-based processing systems should be derived in terms of functional significance rather than in terms of absolute conceptual meaning.


Subject(s)
Interpersonal Relations , Signal Processing, Computer-Assisted , Social Behavior , Cognition , Communication , Cues , Ethology , Humans , Nonverbal Communication , Social Perception
11.
Politics Life Sci ; 28(1): 48-74, 2009 Mar.
Article in English | MEDLINE | ID: mdl-19803798

ABSTRACT

Research investigating the influence and character of nonverbal leader displays has been carried out in a systematic fashion since the early 1980s, yielding growing insight into how viewers respond to the televised facial display behavior of politicians. This article reviews the major streams of research in this area by considering the key ethological frameworks for understanding dominance relationships between leaders and followers and the role nonverbal communication plays in politics and social organization. The analysis focuses on key categories of facial display behavior by examining an extended selection of published experimental studies, considering the influence of nonverbal leader behavior on observers, the nature of the stimuli shown to research participants, the range of measures employed, and the make-up of the participant pools. We conclude with suggestions for future research.


Subject(s)
Facial Expression , Leadership , Politics , Television , Emotions , Humans , Nonverbal Communication
12.
Folia Primatol (Basel) ; 79(5): 269-80, 2008.
Article in English | MEDLINE | ID: mdl-18424898

ABSTRACT

The power asymmetry hypothesis claims that individuals should have distinct signals of appeasement/affiliation and play when status difference is high, whereas these signals should overlap in egalitarian interactions. Naturalistic observations were conducted on humans interacting in groups that differed in terms of age composition (and presumably social status). Three affiliative behaviours were recorded by focal sampling: spontaneous smiles, deliberate smiles and laughter. Interestingly, young men showed significantly higher proportions of deliberate smiles in comparison to laughter when interacting with people of a different age class than when interacting in same-age groups. The pattern of affiliative behaviours in women remained unaffected by the age composition of groups. This partly supports the power asymmetry hypothesis and suggests that in men, deliberate smiles could play a role in the regulation of hierarchical relationships.


Subject(s)
Laughter/physiology , Smiling/physiology , Social Dominance , Adult , Age Distribution , Aged , Female , Humans , Male , Middle Aged