Results 1 - 20 of 54
1.
Sci Rep ; 13(1): 17022, 2023 10 09.
Article in English | MEDLINE | ID: mdl-37813928

ABSTRACT

Decoding others' facial expressions is critical for social functioning. To clarify the neural correlates of expression perception depending on where we look on the face, data from three gaze-contingent ERP experiments were combined and analyzed using robust mass-univariate statistics. Regardless of task, fixation location impacted face processing from 50 to 350 ms, maximally around 120 ms, reflecting retinotopic mapping around the C2 and P1 components. Fixation location also strongly impacted the N170-P2 interval, while only weak effects were seen at the face-sensitive N170 peak. These results question the widespread assumption that faces are processed holistically into an indecomposable perceptual whole around the N170. Rather, face processing is a complex and view-dependent process that continues well beyond the N170. Expression and fixation location interacted weakly during the P1-N170 interval, supporting a role for the mouth and left eye in fearful and happy expression decoding. Expression effects were weakest at the N170 peak but strongest around P2, especially for fear, reflecting task-independent affective processing. The results suggest the N170 reflects a transition between processes rather than the maximum of a holistic face processing stage. Focus on this peak should be replaced by data-driven analyses of the whole epoch using robust statistics to fully unravel the early visual processing of faces and their affective content.
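The mass-univariate approach described here tests every electrode and time point rather than a single peak, then corrects for the resulting multiple comparisons. A minimal sketch of the general idea on simulated within-subject difference data, using a sign-flip max-statistic permutation for family-wise error control (the published analyses used dedicated toolboxes and robust estimators; all shapes and values below are assumptions):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_sub, n_el, n_t = 20, 64, 176
# Hypothetical per-subject condition differences (e.g., eye minus mouth
# fixation), shape: subjects x electrodes x time points.
diff = rng.normal(size=(n_sub, n_el, n_t))

# Observed t statistic at every electrode/time point
t_obs = stats.ttest_1samp(diff, 0.0, axis=0).statistic

# Max-statistic permutation: flip the sign of each subject's difference
# at random and record the largest |t| over the whole electrode/time grid.
n_perm = 1000
max_t = np.empty(n_perm)
for i in range(n_perm):
    signs = rng.choice([-1.0, 1.0], size=(n_sub, 1, 1))
    t_perm = stats.ttest_1samp(diff * signs, 0.0, axis=0).statistic
    max_t[i] = np.abs(t_perm).max()

crit = np.quantile(max_t, 0.95)     # family-wise corrected threshold
sig_points = np.abs(t_obs) > crit   # electrode x time significance map
```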


Subject(s)
Facial Recognition , Facial Expression , Scalp , Emotions , Evoked Potentials , Mouth , Electroencephalography , Photic Stimulation
2.
Dev Neuropsychol ; 46(8): 598-615, 2021 11.
Article in English | MEDLINE | ID: mdl-34696639

ABSTRACT

We examined behavioral and electrophysiological indices of self-referential and valence processing during a Self-Referential Encoding Task in 9- to 12-year-old children, followed by surprise memory tasks for self- and other-referential trait adjectives. Participants endorsed more positive than negative self-referential information but equally endorsed positive and negative information about the other character. Children demonstrated enhanced parietal LPP amplitudes in response to self- compared to other-referential trait adjectives. Positive and negative information was differentially remembered depending on the order of the referent cues presented, suggesting that social information undergoes differential consolidation processes depending on the referent and the order of presentation.


Subject(s)
Memory Consolidation , Child , Humans , Language
3.
Brain Topogr ; 34(6): 813-833, 2021 11.
Article in English | MEDLINE | ID: mdl-34596796

ABSTRACT

Facial expression processing is a critical component of social cognition yet, whether it is influenced by task demands at the neural level remains controversial. Past ERP studies have found mixed results with classic statistical analyses, known to increase both Type I and Type II errors, which Mass Univariate statistics (MUS) control better. However, MUS open-access toolboxes can use different fundamental statistics, which may lead to inconsistent results. Here, we compared the output of two MUS toolboxes, LIMO and FMUT, on the same data recorded during the processing of angry and happy facial expressions investigated under three tasks in a within-subjects design. Both toolboxes revealed main effects of emotion during the N170 timing and main effects of task during later time points typically associated with the LPP component. Neither toolbox yielded an interaction between the two factors at the group level, nor at the individual level in LIMO, confirming that the neural processing of these two face expressions is largely independent from task demands. Behavioural data revealed main effects of task on reaction time and accuracy, but no influence of expression or an interaction between the two. Expression processing and task demands are discussed in the context of the consistencies and discrepancies between the two toolboxes and existing literature.
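Despite their different fundamental statistics, both toolboxes implement the same logical step: a two-factor repeated-measures test computed independently at every electrode and time point, followed by multiple-comparison correction. A minimal sketch of that point-wise step on simulated data, using MNE-Python's f_mway_rm as a stand-in (LIMO and FMUT are MATLAB toolboxes; the data shapes and factor labels below are assumptions):

```python
import numpy as np
from mne.stats import f_mway_rm

rng = np.random.default_rng(1)
n_sub = 24
factor_levels = [2, 3]            # emotion (2 levels) x task (3 levels)
n_cond = int(np.prod(factor_levels))
n_el, n_t = 64, 176               # electrodes x time points

# Conditions must be ordered with the last factor cycling fastest:
# A1B1, A1B2, A1B3, A2B1, A2B2, A2B3
data = rng.normal(size=(n_sub, n_cond, n_el * n_t))

# F values for the emotion-by-task interaction at every point
f_vals, p_vals = f_mway_rm(data, factor_levels, effects='A:B')
interaction_map = np.asarray(f_vals).reshape(n_el, n_t)
```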


Subject(s)
Evoked Potentials , Facial Expression , Electroencephalography , Emotions , Happiness , Humans
4.
J Vis ; 21(11): 5, 2021 10 05.
Article in English | MEDLINE | ID: mdl-34623398

ABSTRACT

Amblyopia is a developmental disorder of vision associated with higher-order visual attention deficits. We explored whether amblyopia affects the orienting of covert spatial attention by measuring the magnitude of the gaze cueing effect from emotional faces. Gaze and emotion cues are key components of social attention. Participants with normal vision (n = 30), anisometropic (n = 7) or strabismic/mixed (n = 5) amblyopia performed a cued peripheral target detection task under monocular and binocular viewing conditions. The cue consisted of a centrally presented face with left or right gaze (50% validity to target location) and a fearful, happy, or neutral expression. The magnitude of spatial cueing was computed as the reaction time difference between congruent and incongruent trials for each expression. Fearful facial expressions oriented spatial attention significantly more than happy or neutral expressions. The magnitude of the gaze cueing effect in our cohort of mild-to-moderate amblyopia was comparable to that in normal vision and was not correlated with the severity of amblyopia. There were no statistical group or amblyopia subtype differences for reaction time in any viewing condition. These results place constraints on the range of attentional mechanisms affected by amblyopia and possibly suggest normal covert processing of emotional face stimuli in mild and moderate amblyopia.
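The cueing magnitude defined here is a simple difference of condition mean reaction times. A toy illustration of that computation on a made-up trial table (column names and values are hypothetical):

```python
import pandas as pd

# Toy trial table; columns and values are illustrative only
trials = pd.DataFrame({
    "subject":    [1, 1, 1, 1, 2, 2, 2, 2],
    "expression": ["fearful", "fearful", "happy", "happy"] * 2,
    "congruent":  [True, False, True, False] * 2,
    "rt":         [312, 348, 305, 322, 298, 331, 301, 318],  # ms
})

mean_rt = (trials
           .groupby(["subject", "expression", "congruent"])["rt"]
           .mean()
           .unstack("congruent"))

# Cueing magnitude: incongruent minus congruent mean RT,
# per subject and expression
cueing = mean_rt[False] - mean_rt[True]
print(cueing)
```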


Subject(s)
Amblyopia , Cues , Emotions , Facial Expression , Fixation, Ocular , Humans , Reaction Time
5.
Cortex ; 143: 205-222, 2021 10.
Article in English | MEDLINE | ID: mdl-34455372

ABSTRACT

Looking at someone's eyes is thought to be important for affective theory of mind (aTOM), our ability to infer their emotional state. However, it is unknown whether an individual's gaze direction influences our aTOM judgements and what the time course of this influence might be. We presented participants with sentences describing individuals in positive, negative or neutral scenarios, followed by direct or averted gaze neutral face pictures of those individuals. Participants made aTOM judgements about each person's mental state, including their affective valence and arousal, and we investigated whether the face gaze direction impacted those judgements. Participants rated that gazers were feeling more positive when they displayed direct gaze as opposed to averted gaze, and that they were feeling more aroused during negative contexts when gaze was averted as opposed to direct. Event-related potentials associated with face perception and affective processing were examined using mass-univariate analyses to track the time-course of this eye-gaze and affective processing interaction at a neural level. Both positive and negative trials were differentiated from neutral trials at many stages of processing. This included the early N200 and EPN components, believed to reflect automatic emotion areas activation and attentional selection respectively. This also included the later P300 and LPP components, thought to reflect elaborative cognitive appraisal of emotional content. Critically, sentence valence and gaze direction interacted over these later components, which may reflect the incorporation of eye-gaze in the cognitive evaluation of another's emotional state. The results suggest that gaze perception directly impacts aTOM processes, and that altered eye-gaze processing in clinical populations may contribute to associated aTOM impairments.


Subject(s)
Theory of Mind , Emotions , Evoked Potentials , Fixation, Ocular , Humans , Visual Perception
6.
Front Psychol ; 12: 618606, 2021.
Article in English | MEDLINE | ID: mdl-33790836

ABSTRACT

The gaze cueing effect is characterized by faster attentional orienting to a gazed-at than a non-gazed-at target. This effect is often enhanced when the gazing face bears an emotional expression, though this finding is modulated by a number of factors. Here, we tested whether the type of task performed might be one such modulating factor. Target localization and target discrimination tasks are the two most commonly used gaze cueing tasks, and they arguably differ in cognitive resources, which could impact how emotional expression and gaze cues are integrated to orient attention. In a within-subjects design, participants performed both target localization and discrimination gaze cueing tasks with neutral, happy, and fearful faces. The gaze cueing effect for neutral faces was greatly reduced in the discrimination task relative to the localization task, and the emotional enhancement of the gaze cueing effect was only present in the localization task and only when this task was performed first. These results suggest that cognitive resources are needed for gaze cueing and for the integration of emotional expressions and gaze cues. We propose that a shift toward local processing may be the mechanism by which the discrimination task interferes with the emotional modulation of gaze cueing. The results support the idea that gaze cueing can be greatly modulated by top-down influences and cognitive resources and thus taps into endogenous attention. Results are discussed within the context of the recently proposed EyeTune model of social attention.

7.
Brain Res ; 1765: 147505, 2021 08 15.
Article in English | MEDLINE | ID: mdl-33915164

ABSTRACT

Most ERP studies on facial expressions of emotion have yielded inconsistent results regarding the time course of emotion effects and their possible modulation by task demands. Most studies have used classical statistical methods with a high likelihood of Type I and Type II errors, risks that Mass Univariate statistics reduce. FMUT and LIMO are currently the only two available toolboxes for Mass Univariate analysis of ERP data, and they use different fundamental statistics. Yet, no direct comparison of their output has been performed on the same dataset. Given the current push toward robust statistics to increase the replicability of results, here we compared the output of these toolboxes on data previously analyzed using classic approaches (Itier & Neath-Tavares, 2017). The early (0-352 ms) processing of fearful, happy, and neutral faces was investigated under three tasks in a within-subject design that also controlled gaze fixation location. Both toolboxes revealed main effects of emotion and task but neither yielded an interaction between the two, confirming that the early processing of fearful and happy expressions is largely independent of task demands. Both toolboxes found virtually no difference between neutral and happy expressions, while fearful (compared to neutral and happy) expressions modulated the N170 and EPN but elicited maximum effects after the N170 peak, around 190 ms. Similarities and differences in the spatial and temporal extent of these effects are discussed in comparison to the published classical analysis and the rest of the ERP literature.


Subject(s)
Analysis of Variance , Facial Recognition/physiology , Data Interpretation, Statistical , Electroencephalography/methods , Emotions , Evoked Potentials/physiology , Facial Expression , Fear , Female , Happiness , Humans , Male , Models, Statistical , Photic Stimulation/methods , Reaction Time/physiology , Young Adult
8.
Psychon Bull Rev ; 28(1): 283-291, 2021 Feb.
Article in English | MEDLINE | ID: mdl-32959191

ABSTRACT

Self-relevant stimuli (i.e. meaningful/important to the observer and related to the self) are typically remembered better than other-relevant stimuli. However, whether a self-relevance memory benefit can be conferred to a novel neutral face remains unknown. Recent studies have shown that emotional responses to neutral faces can be altered by using a preceding sentence as context that varies in terms of self-relevance (self/other-relevant) and valence (positive/negative; e.g. "S/he thinks your comment is dumb/smart"). We adapted this paradigm to investigate whether the context conferred by the preceding sentence also impacts the memorability of the subsequently presented face. Participants saw faces primed with contextual sentences and rated how aroused, and how positive or negative, the faces made them feel. Later incidental recognition accuracy for the faces was greater when they had been preceded by self-relevant compared to other-relevant sentences. Faces preceded by self-relevant contexts were also rated as more arousing. There was no impact of sentence valence on arousal ratings or on recognition memory for faces. Sentence self-relevance and valence interacted to affect participants' ratings of how positive or negative the faces made them feel during encoding, but did not interact to impact later recognition. Our results indicate that initial social encounters can have a lasting effect on one's memory of another person, producing an enhanced memory trace of that individual. We propose that the effect is driven by an arousal-based mechanism, elicited by faces perceived to be self-relevant.


Subject(s)
Emotions/physiology , Facial Recognition/physiology , Mental Recall/physiology , Recognition, Psychology/physiology , Social Interaction , Social Perception , Adult , Ego , Female , Humans , Male , Semantics , Young Adult
9.
Neuroimage ; 226: 117605, 2021 02 01.
Article in English | MEDLINE | ID: mdl-33271267

ABSTRACT

Looking at the eyes informs us about the thoughts and emotions of those around us, and impacts our own emotional state. However, it is unknown how perceiving direct and averted gaze impacts our ability to share the gazer's positive and negative emotions, abilities referred to as positive and negative affective empathy. We presented 44 participants with contextual sentences describing positive, negative and neutral events happening to other people (e.g. "Her newborn was saved/killed/fed yesterday afternoon."). These were designed to elicit positive, negative, or little to no empathy, and were followed by direct or averted gaze images of the individuals described. Participants rated their affective empathy for the individual and their own emotional valence on each trial. Event-related potentials time-locked to face onset and associated with empathy and emotional processing were recorded to investigate whether they were modulated by gaze direction. Relative to averted gaze, direct gaze was associated with increased positive valence in the positive and neutral conditions and with increased positive empathy ratings. A similar pattern was found at the neural level, using robust mass-univariate statistics. The N100, thought to reflect an automatic activation of emotion areas, was modulated by gaze in the affective empathy conditions, with opposite effect directions in the positive and negative conditions. The P200, an ERP component sensitive to positive stimuli, was modulated by gaze direction only in the positive empathy condition. Positive and negative trials were processed similarly at the early N200 processing stage, but later diverged, with only negative trials modulating the EPN, P300 and LPP components. These results suggest that positive and negative affective empathy are associated with distinct time-courses, and that perceived gaze direction uniquely modulates positive empathy, highlighting the importance of studying empathy with face stimuli.


Subject(s)
Brain/physiology , Emotions/physiology , Empathy/physiology , Evoked Potentials/physiology , Fixation, Ocular , Adolescent , Electroencephalography/methods , Facial Expression , Female , Humans , Male , Visual Perception/physiology
10.
Brain Cogn ; 142: 105569, 2020 07.
Article in English | MEDLINE | ID: mdl-32388193

ABSTRACT

Healthy adults typically display enhanced processing for self- (relative to other-) relevant and positive (relative to negative) information. However, it is unclear whether these two biases interact to form a self-positivity bias, whereby self-positive information receives prioritized processing. It is also unclear how a blocked versus mixed referent design impacts reference and valence processing. We addressed these questions using behavioral and electrophysiological indices across two studies using a Self-Referential Encoding Task, followed by surprise recall and recognition tasks. Early (P1) and late (LPP) event-related potentials were time-locked to a series of trait adjectives, encoded relative to oneself or a fictional character, with referent presented in a blocked (Exp. 1) or mixed (Exp. 2) trial design. Regardless of study design, participants recalled and recognized more self- than other-relevant adjectives, and recognized more positive than negative adjectives. Additionally, participants demonstrated larger LPP amplitudes for self-relevant and positive adjectives. The LPP self-relevance effect emerged earlier and persisted longer in the blocked (400-800 ms) versus mixed design (600-800 ms). The LPP valence effect was not apparent in the blocked design, but appeared late in the mixed design (600-1200 ms). Critically, the interaction between self-relevance and valence appeared only behaviorally in the mixed design, suggesting that overall self-relevance and valence independently impact neural socio-cognitive processing.
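Component effects like the LPP amplitudes reported here are typically quantified as the mean voltage in a fixed post-stimulus window. A small sketch of that windowing step, assuming a hypothetical epochs array and sampling rate (both invented for illustration):

```python
import numpy as np

SFREQ = 500.0   # sampling rate in Hz (assumed)
TMIN = -0.2     # epoch start relative to word onset, in seconds (assumed)

# Hypothetical epochs: trials x channels x samples
epochs = np.random.default_rng(2).normal(size=(120, 32, 700))

def window_mean(epochs, t_start, t_end):
    """Mean amplitude between t_start and t_end (seconds post-onset)."""
    i0 = int(round((t_start - TMIN) * SFREQ))
    i1 = int(round((t_end - TMIN) * SFREQ))
    return epochs[..., i0:i1].mean(axis=-1)   # -> trials x channels

lpp_early = window_mean(epochs, 0.400, 0.800)  # 400-800 ms window
lpp_late = window_mean(epochs, 0.600, 0.800)   # 600-800 ms window
```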


Subject(s)
Self Concept , Evoked Potentials , Humans , Language , Mental Recall , Recognition, Psychology
11.
Brain Res ; 1722: 146343, 2019 11 01.
Article in English | MEDLINE | ID: mdl-31336099

ABSTRACT

The LIFTED model of early face perception postulates that the face-sensitive N170 event-related potential may reflect underlying neural inhibition mechanisms which serve to regulate holistic and featural processing. It remains unclear, however, what specific factors impact these neural inhibition processes. Here, N170 peak responses were recorded whilst adults maintained fixation on a single eye using a gaze-contingent paradigm, and the presence/absence of a face outline, as well as the number and type of parafoveal features within the outline, were manipulated. N170 amplitudes and latencies were reduced when a single eye was fixated within a face outline compared to fixation on the same eye in isolation, demonstrating that the simple presence of a face outline is sufficient to elicit a shift towards a more face-like neural response. A monotonic decrease in the N170 amplitude and latency was observed with increasing numbers of parafoveal features, and the type of feature(s) present in parafovea further modulated this early face response. These results support the idea of neural inhibition exerted by parafoveal features onto the foveated feature as a function of the number, and possibly the nature, of parafoveal features. Specifically, the results suggest the use of a feature saliency framework (eyes > mouth > nose) at the neural level, such that the parafoveal eye may play a role in down-regulating the response to the other eye (in fovea) more so than the nose or the mouth. These results confirm the importance of parafoveal features and the face outline in the neural inhibition mechanism, and provide further support for a feature saliency mechanism guiding early face perception.
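A gaze-contingent paradigm like the one described here presents the stimulus only once the participant's gaze is stably on the designated feature. A schematic of such an enforcement loop; `tracker.sample()` and all parameters are placeholders, since eye-tracker APIs vary and the original setup is not specified in the abstract:

```python
import math
import time

FIX_X, FIX_Y = 512, 384   # screen coordinates of the to-be-fixated feature
TOLERANCE_PX = 40         # radius of the fixation window
REQUIRED_S = 0.2          # stable fixation required before stimulus onset

def within_window(gx, gy):
    return math.hypot(gx - FIX_X, gy - FIX_Y) <= TOLERANCE_PX

def wait_for_fixation(tracker):
    """Block until gaze has stayed inside the window for REQUIRED_S."""
    entered = None
    while True:
        gx, gy = tracker.sample()            # placeholder eye-tracker call
        if within_window(gx, gy):
            if entered is None:
                entered = time.monotonic()
            elif time.monotonic() - entered >= REQUIRED_S:
                return
        else:
            entered = None                   # gaze left the window: restart
```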


Subject(s)
Brain/physiology , Evoked Potentials, Visual , Facial Recognition/physiology , Neural Inhibition/physiology , Adult , Electroencephalography , Eye , Eye Movement Measurements , Face , Female , Fixation, Ocular , Humans , Male , Photic Stimulation , Young Adult
12.
Heliyon ; 5(4): e01583, 2019 Apr.
Article in English | MEDLINE | ID: mdl-31183437

ABSTRACT

Our attention is spontaneously oriented in the direction where others are looking. This attention shift manifests as faster responses to peripheral targets when a central face gazes toward them rather than away from them, and the effect is even more pronounced when the face expresses an emotion. This so-called gaze-cuing effect, and its enhancement by emotion, is thought to reflect covert attention orienting. However, eye movements are typically not monitored in gaze-cuing paradigms, yet free-viewing and saccadic reaction time research suggests individuals commonly and quickly look at gazed-at locations. Furthermore, in dynamic gaze-cuing studies, emotional faces differ from neutral faces in their affective content but also in their apparent facial motion, both of which could affect participants' eye movements. We investigated the contribution of overt orienting to the gaze-cuing effect by monitoring eye movements during emotional and neutral gaze-cuing trials. We found that eye movements were infrequent, and when they occurred, they were directed toward the target, not toward the gazed-at location. Removing trials with eye movements did not substantially affect gaze-cuing, confirming it reflects a covert attention process. However, participants were more likely to move their eyes during neutral trials, which lacked perceived face movement, than during emotion trials or neutral movement trials. Including these eye-movement-contaminated trials in our analysis impaired our ability to detect variations of gaze-cuing with emotion. In contrast, removing trials with eye movements, or including a neutral movement control such as a neutral tongue protrusion, revealed more subtle emotional modulation of gaze-cuing.
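Monitoring eye movements as described here requires flagging trials in which gaze left fixation. One common approach, sketched below with made-up parameters, is a simple sample-to-sample displacement threshold on the gaze trace:

```python
import numpy as np

def has_eye_movement(gaze_xy, step_thresh=1.5):
    """gaze_xy: (n_samples, 2) pixel gaze trace for one trial.
    Flags the trial if the sample-to-sample displacement ever exceeds
    step_thresh pixels (an illustrative threshold)."""
    step = np.linalg.norm(np.diff(gaze_xy, axis=0), axis=1)
    return bool(np.any(step > step_thresh))

# Keep only trials without detectable eye movements, e.g.:
# clean_trials = [t for t in trials if not has_eye_movement(t.gaze)]
```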

13.
Front Neurosci ; 13: 517, 2019.
Article in English | MEDLINE | ID: mdl-31178686

ABSTRACT

The perception of eye-gaze is thought to be a key component of our everyday social interactions. While the neural correlates of direct and averted gaze processing have been investigated, there is little consensus about how these gaze directions may be processed differently as a function of the task being performed. In a within-subject design, we examined how perception of direct and averted gaze affected performance on tasks requiring participants to use directly available facial cues to infer the individuals' emotional state (emotion discrimination), direction of attention (attention discrimination) and gender (gender discrimination). Neural activity was recorded throughout the three tasks using EEG, and ERPs time-locked to face onset were analyzed. Participants were most accurate at discriminating emotions with direct gaze faces, but most accurate at discriminating attention with averted gaze faces, while gender discrimination was not affected by gaze direction. At the neural level, direct and averted gaze elicited different patterns of activation depending on the task over frontal sites, from approximately 220-290 ms. More positive amplitudes were seen for direct than averted gaze in the emotion discrimination task. In contrast, more positive amplitudes were seen for averted gaze than for direct gaze in the gender discrimination task. These findings are among the first direct evidence that perceived gaze direction modulates neural activity differently depending on task demands, and that at the behavioral level, specific gaze directions functionally overlap with emotion and attention discrimination, precursors to more elaborated theory of mind processes.

14.
Brain Sci ; 9(5)2019 May 17.
Article in English | MEDLINE | ID: mdl-31109022

ABSTRACT

Faces showing expressions of happiness or anger were presented together with sentences that described happiness-inducing or anger-inducing situations. Two main variables were manipulated: (i) congruency between contexts and expressions (congruent/incongruent) and (ii) the task assigned to the participant, discriminating the emotion shown by the target face (emotion task) or judging whether the expression shown by the face was congruent or not with the context (congruency task). Behavioral and electrophysiological (event-related potential, ERP) results showed that processing facial expressions was jointly influenced by congruency and task demands. ERP results revealed task effects at frontal sites, with larger positive amplitudes between 250 and 450 ms in the congruency task, reflecting the higher cognitive effort required by this task. Effects of congruency appeared at latencies and locations corresponding to the early posterior negativity (EPN) and late positive potential (LPP) components that have previously been found to be sensitive to emotion and affective congruency. The magnitude and spatial distribution of the congruency effects varied depending on the task and the target expression. These results are discussed in terms of the modulatory role of context on facial expression processing and the different mechanisms underlying the processing of expressions of positive and negative emotions.

15.
Cogn Emot ; 33(4): 768-800, 2019 06.
Article in English | MEDLINE | ID: mdl-29983094

ABSTRACT

Gaze-cuing refers to the spontaneous orienting of attention towards a gazed-at location, characterised by shorter response times to gazed-at than non-gazed-at targets. Previous research suggests that processing of these gaze cues interacts with the processing of facial expression cues to enhance gaze-cuing. However, whether only negative emotions (which signal potential threat or uncertainty) can enhance gaze-cuing is still debated, and whether this emotional modulation varies as a function of individual differences remains largely unclear. Combining data from seven experiments, we investigated the emotional modulation of gaze-cuing in the general population as a function of participant sex, and self-reported subclinical trait anxiety, depression, and autistic traits. We found that (i) emotional enhancement of gaze-cuing can occur for both positive and negative expressions, (ii) the higher the score on the Attention to Detail subscale of the Autism Spectrum Quotient, the smaller the emotional enhancement of gaze-cuing, especially for happy expressions, and (iii) emotional modulation of gaze-cuing does not vary as a function of participant anxiety, depression or sex, although women display an overall larger gaze-cuing effect than men.
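The individual-differences result in (ii) boils down to correlating a per-participant modulation index with a questionnaire score. A toy version of that computation with simulated values (all names and numbers are hypothetical):

```python
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(3)
n = 80
attention_to_detail = rng.integers(0, 10, size=n)   # AQ subscale score
cueing_happy = rng.normal(20, 8, size=n)            # cueing effect, ms, happy faces
cueing_neutral = rng.normal(15, 8, size=n)          # cueing effect, ms, neutral faces

# Emotional enhancement of gaze-cuing: emotion minus neutral cueing effect
enhancement = cueing_happy - cueing_neutral
r, p = pearsonr(attention_to_detail, enhancement)
print(f"r = {r:.3f}, p = {p:.4f}")
```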


Subject(s)
Attention/physiology , Cues , Emotions/physiology , Facial Expression , Fixation, Ocular/physiology , Individuality , Adolescent , Adult , Female , Humans , Male , Reaction Time/physiology , Young Adult
16.
Cortex ; 109: 35-49, 2018 12.
Article in English | MEDLINE | ID: mdl-30286305

ABSTRACT

The N170 event-related potential component is an early marker of face perception that is particularly sensitive to isolated eye regions and to eye fixations within a face. Here, this eye sensitivity was tested further by measuring the N170 to isolated facial features and to the same features fixated within a face, using a gaze-contingent procedure. The neural response to single isolated eyes and eye regions (two eyes) was also compared. Pixel intensity and contrast were controlled at the global (image) and local (featural) levels. Consistent with previous findings, larger N170 amplitudes were elicited when the left or right eye was fixated within a face, compared to the mouth or nose, demonstrating that the N170 eye sensitivity reflects higher-order perceptual processes and not merely low-level perceptual effects. The N170 was also largest and most delayed for isolated features, compared to equivalent fixations within a face. Specifically, mouth fixation yielded the largest amplitude difference, and nose fixation yielded the largest latency difference between these two contexts, suggesting the N170 may reflect a complex interplay between holistic and featural processes. Critically, eye regions elicited consistently larger and shorter N170 responses compared to single eyes, with enhanced responses for contralateral eye content, irrespective of eye or nasion fixation. These results confirm the importance of the eyes in early face perception, and provide novel evidence of an increased sensitivity to the presence of two symmetric eyes compared to only one eye, consistent with a neural eye region detector rather than an eye detector per se.
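Controlling pixel intensity and contrast at the global and local levels, as described here, can be approximated by forcing whole images (or masked feature regions) to a common mean and standard deviation. A sketch with illustrative target values, assuming 8-bit grayscale images:

```python
import numpy as np

def match_luminance_contrast(img, target_mean=128.0, target_sd=40.0):
    """Set an image's mean (luminance) and SD (contrast) to target values."""
    img = img.astype(float)
    z = (img - img.mean()) / img.std()      # zero mean, unit contrast
    return np.clip(z * target_sd + target_mean, 0, 255)

def match_within_mask(img, mask, **kw):
    """'Local' control: normalize only within a feature mask (boolean array)."""
    out = img.astype(float).copy()
    out[mask] = match_luminance_contrast(img[mask], **kw)
    return out
```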


Subject(s)
Evoked Potentials, Visual/physiology , Facial Recognition/physiology , Neural Inhibition/physiology , Visual Cortex/physiology , Visual Perception/physiology , Electroencephalography , Female , Functional Laterality/physiology , Humans , Male , Orientation/physiology , Photic Stimulation , Young Adult
17.
Brain Topogr ; 31(6): 972-984, 2018 11.
Article in English | MEDLINE | ID: mdl-29987641

ABSTRACT

The N170 ERP component is a central neural marker of early face perception usually thought to reflect holistic processing. However, it is also highly sensitive to eyes presented in isolation and to fixation on the eyes within a full face. The lateral inhibition face template and eye detector (LIFTED) model (Nemrodov et al. in NeuroImage 97:81-94, 2014) integrates these views by proposing a neural inhibition mechanism that perceptually glues features into a whole, in parallel to the activity of an eye detector that accounts for the eye sensitivity. The LIFTED model was derived from a large number of results obtained with intact and eyeless faces presented upright and inverted. The present study provided a control condition to the original design by replacing eyeless with mouthless faces, hereby enabling testing of specific predictions derived from the model. Using the same gaze-contingent approach, we replicated the N170 eye sensitivity regardless of face orientation. Furthermore, when eyes were fixated in upright faces, the N170 was larger for mouthless compared to intact faces, while inverted mouthless faces elicited smaller amplitude than intact inverted faces when fixation was on the mouth and nose. The results are largely in line with the LIFTED model, in particular with the idea of an inhibition mechanism involved in holistic processing of upright faces and the lack of such inhibition in processing inverted faces. Some modifications to the original model are also proposed based on these results.


Subject(s)
Evoked Potentials/physiology , Eye , Facial Recognition/physiology , Neural Inhibition , Adolescent , Electroencephalography , Female , Humans , Male , Orientation, Spatial , Photic Stimulation , Young Adult
18.
Biol Psychol ; 135: 47-64, 2018 05.
Article in English | MEDLINE | ID: mdl-29524467

ABSTRACT

Most face processing research has investigated how we perceive faces presented by themselves, but we view faces everyday within a rich social context. Recent ERP research has demonstrated that context cues, including self-relevance and valence, impact electrocortical and emotional responses to neutral faces. However, the time-course of these effects is still unclear, and it is unknown whether these effects interact with the face gaze direction, a cue that inherently contains self-referential information and triggers emotional responses. We primed direct and averted gaze neutral faces (gaze manipulation) with contextual sentences that contained positive or negative opinions (valence manipulation) about the participants or someone else (self-relevance manipulation). In each trial, participants rated how positive or negative, and how affectively aroused, the face made them feel. Eye-tracking ensured sentence reading and face fixation while ERPs were recorded to face presentations. Faces put into self-relevant contexts were more arousing than those in other-relevant contexts, and elicited ERP differences from 150 to 750 ms post-face, encompassing EPN and LPP components. Self-relevance interacted with valence at both the behavioural and ERP level starting 150 ms post-face. Finally, faces put into positive, self-referential contexts elicited different N170 ERP amplitudes depending on gaze direction. Behaviourally, direct gaze elicited more positive valence ratings than averted gaze during positive, self-referential contexts. Thus, self-relevance and valence contextual cues impact visual perception of neutral faces and interact with gaze direction during the earliest stages of face processing. The results highlight the importance of studying face processing within contexts mimicking the complexities of real world interactions.


Subject(s)
Affect/physiology , Facial Expression , Fixation, Ocular/physiology , Interpersonal Relations , Judgment/physiology , Visual Perception/physiology , Adult , Electroencephalography , Evoked Potentials/physiology , Female , Humans , Male , Time Factors , Young Adult
19.
Brain Res ; 1663: 38-50, 2017 05 15.
Article in English | MEDLINE | ID: mdl-28315309

ABSTRACT

Task demands shape how we process environmental stimuli, but their impact on the early neural processing of facial expressions remains unclear. In a within-subject design, ERPs were recorded to the same fearful, happy and neutral facial expressions presented during a gender discrimination task, an explicit emotion discrimination task and an oddball detection task, the most studied tasks in the field. Using an eye tracker, fixation on the nose of the face was enforced using a gaze-contingent presentation. Task demands modulated amplitudes from 200 to 350 ms at occipito-temporal sites spanning the EPN component. Amplitudes were more negative for fearful than neutral expressions starting at the N170, from 150 to 350 ms, with a temporo-occipital distribution, whereas no clear effect of happy expressions was seen. Task and emotion effects never interacted in any time window or for any of the ERP components analyzed (P1, N170, EPN). Thus, whether emotion is explicitly discriminated or irrelevant for the task at hand, neural correlates of fearful and happy facial expressions seem immune to these task demands during the first 350 ms of visual processing.


Subject(s)
Emotions/physiology , Facial Recognition/physiology , Adult , Electroencephalography/methods , Evoked Potentials/physiology , Face , Facial Expression , Fear/psychology , Female , Happiness , Humans , Male , Reaction Time , Young Adult
20.
Perception ; 46(8): 941-955, 2017 Aug.
Article in English | MEDLINE | ID: mdl-28056652

ABSTRACT

Previous research has shown that gaze direction can only be accurately discriminated within parafoveal limits (∼5° eccentricity) along the horizontal visual field. Beyond this eccentricity, head orientation seems to influence gaze discrimination more than iris cues. The present study examined gaze discrimination performance in the upper visual field (UVF) and lower visual field (LVF), and whether head orientation affects gaze judgments beyond parafoveal vision. Direct and averted gaze faces, in frontal and deviated head orientations, were presented for 150 ms along the vertical meridian while participants maintained central fixation during gaze discrimination judgments. Gaze discrimination was above chance level at all but one eccentricity for the two gaze-head congruent conditions. In contrast, for the incongruent conditions, gaze was discriminated above chance only from -1.5° to +3°, with an asymmetry between the UVF and LVF. Beyond foveal vision, response rates were biased toward head orientation rather than iris eccentricity, occurring in the LVF for both head orientations, and in the UVF for frontal head views. These findings suggest that covert processing of gaze direction involves the integration of eyes and head cues, with congruency of these two social cues driving response differences between the LVF and the UVF.
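The above-chance comparisons reported here can be implemented as a binomial test against the 50% guessing rate at each eccentricity. A minimal sketch with invented trial counts:

```python
from scipy.stats import binomtest

n_trials = 120
# Hypothetical correct-response counts per eccentricity (degrees,
# negative = lower visual field)
correct_by_ecc = {-3.0: 84, -1.5: 90, 1.5: 95, 3.0: 88, 6.0: 66}

for ecc, k in correct_by_ecc.items():
    res = binomtest(k, n_trials, p=0.5, alternative='greater')
    print(f"{ecc:+.1f} deg: {k}/{n_trials} correct, p = {res.pvalue:.4f}")
```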


Subject(s)
Facial Recognition/physiology , Fixation, Ocular , Social Perception , Space Perception/physiology , Visual Fields/physiology , Adolescent , Adult , Eye , Female , Head , Humans , Male , Young Adult