ABSTRACT
Distinguishing the direction of another person's eye gaze is extremely important in everyday social interaction, as it provides critical information about people's attention and, therefore, their intentions. The temporal dynamics of gaze processing have been investigated using event-related potentials (ERPs) recorded with electroencephalography (EEG). However, the moment at which our brain distinguishes gaze direction (GD), irrespective of other facial cues, remains unclear. To address this question, the present study investigated the time course of gaze direction processing using an ERP decoding approach based on the combination of a support vector machine and error-correcting output codes. We recorded EEG in healthy young adults: 32 performed a GD detection task and 34 a face orientation task. Both tasks presented 3D realistic faces with five different head and gaze orientations each: 30° or 15° to the left or right, and 0°. While the classical ERP analyses did not show clear GD effects, ERP decoding analyses revealed that discrimination of GD, irrespective of head orientation, started at 140 ms in the GD task and at 120 ms in the face orientation task. GD decoding accuracy was higher in the GD task than in the face orientation task and was highest for direct gaze in both tasks. These findings suggest that the decoding of brain patterns is modulated by task relevance, which changes both the latency and the accuracy of GD decoding.
Subject(s)
Electroencephalography, Evoked Potentials, Facial Recognition, Ocular Fixation, Humans, Male, Female, Young Adult, Adult, Evoked Potentials/physiology, Ocular Fixation/physiology, Facial Recognition/physiology, Support Vector Machine, Attention/physiology, Brain/physiology, Social Perception

ABSTRACT
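The SVM-plus-error-correcting-output-codes decoding approach named in the abstract above can be sketched as a time-resolved classification analysis. The following is a minimal illustration on simulated data, assuming scikit-learn's `OutputCodeClassifier` as the ECOC wrapper; the study's actual preprocessing, features, and parameters are not specified here.

```python
# Hypothetical sketch of time-resolved ERP decoding with a linear SVM wrapped
# in error-correcting output codes (ECOC). All data and settings are simulated
# assumptions, not the study's pipeline.
import numpy as np
from sklearn.multiclass import OutputCodeClassifier
from sklearn.svm import LinearSVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_trials, n_channels, n_times = 100, 32, 20
gaze_classes = 5  # e.g. -30, -15, 0, +15, +30 degrees

X = rng.standard_normal((n_trials, n_channels, n_times))
y = rng.integers(0, gaze_classes, n_trials)
# Inject a weak class-dependent signal on one channel from sample 10 onward
X[np.arange(n_trials), 0, 10:] += y[:, None] * 1.0

clf = OutputCodeClassifier(LinearSVC(), code_size=2, random_state=0)

# Decode the class label separately at each time point (channels as features)
accuracy = np.array([
    cross_val_score(clf, X[:, :, t], y, cv=5).mean()
    for t in range(n_times)
])
chance = 1.0 / gaze_classes  # 0.2 for five classes
```

A decoding "onset" such as the 120-140 ms latencies reported above would then be estimated as the first time point at which accuracy reliably exceeds chance.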
The affective experience generated when users play computer games can influence their attitude and preference towards the game. Existing evaluation methods rely mainly on subjective scales and physiological signals, but both have notable limitations: subjective scales lack objectivity, and physiological signals are complicated to acquire and interpret. In this paper, we 1) propose a novel method to assess user affective experience when playing single-player games based on pleasure-arousal-dominance (PAD) emotions, facial expressions, and gaze directions, and 2) build an artificial intelligence model to identify user preference. Fifty-four subjects participated in a basketball experiment with three difficulty levels. Their expressions, gaze directions, and subjective PAD emotions were collected and analysed. Experimental results showed that the expression intensities of anger, sadness, and neutrality, the yaw angle of gaze direction, and PAD emotions varied significantly across difficulty levels. In addition, the proposed model achieved better performance than other machine-learning algorithms on the collected dataset.
This paper considers the limitations of existing methods for assessing user affective experience when playing computer games. It demonstrates a novel approach using subjective emotion and objective facial cues to identify user affective experience and user preference for the game.
ABSTRACT
The cone of gaze is a looker's range of gaze directions that is accepted as direct by an observer. The present research asks how the condition of mild strabismus, that is, when the two eyes point in slightly different directions, influences the cone of gaze. Normally, both eyes are rotated in a coordinated manner such that both are directed to the same fixation point. With strabismus, there are two fixation points and, therefore, two directions into which the two eyes point. This raises the question of the direction and the shape (i.e., width) of the gaze cone. Two experiments are conducted with simulated mild strabismus. Three conditions are tested: the two strabismic conditions of esotropia and exotropia, and one orthotropic (nonstrabismic) condition. Results show that the direction of the gaze cone is roughly the average of the directions of the two eyes. Furthermore, the width of the gaze cone is not affected by simulated strabismus and is thus the same for the strabismic and the orthotropic conditions. The results imply a model in which the direction of gaze is first perceived on the basis of both eyes, and the gaze cone is then derived from this combined gaze direction.
Subject(s)
Eye, Strabismus, Humans, Perception

ABSTRACT
Attachment theory suggests that interindividual differences in attachment security versus insecurity (anxiety and avoidance) contribute to the ways in which people perceive social emotional signals, particularly from the human face. Among different facial features, eye gaze conveys crucial information for social interaction, with a straight gaze triggering different cognitive and emotional processes as compared to an averted gaze. It remains unknown, however, how interindividual differences in attachment associate with early face encoding in the context of a straight versus averted gaze. Using electroencephalography (EEG) and recording event-related potentials (ERPs), specifically the N170 component, the present study (N = 50 healthy adults) measured how the characteristics of attachment anxiety and avoidance relate to the encoding of faces with respect to gaze direction and head orientation. Our findings reveal a significant relationship between gaze direction (irrespective of head orientation) and attachment anxiety on the interhemispheric (i.e. right) asymmetry of the N170 and thus provide evidence for an association between attachment anxiety and eye gaze processing during early visual face encoding.
Subject(s)
Anxiety, Object Attachment, Adult, Humans, Evoked Potentials, Electroencephalography/methods, Emotions

ABSTRACT
Eye contact is crucial for the formation and maintenance of social relationships, and plays a key role in facilitating a strong parent-child bond. However, the precise neural and affective mechanisms through which eye contact impacts parent-child relationships remain elusive. We introduce a task to assess parents' neural and affective responses to prolonged direct and averted gaze coming from their own child, an unfamiliar child, and an unfamiliar adult. While in the scanner, 79 parents (n = 44 mothers and n = 35 fathers) were presented with prolonged (16-38 s) videos of their own child, an unfamiliar child, an unfamiliar adult, and themselves (i.e., targets), facing the camera with a direct or an averted gaze. We measured BOLD-responses, tracked parents' eye movements during the videos, and asked them to report on their mood and feelings of connectedness with the targets after each video. Parents reported improved mood and increased feelings of connectedness after prolonged exposure to direct versus averted gaze, and these effects were amplified for unfamiliar targets compared to their own child, owing to the already high affect and connectedness ratings after videos of their own child. Neuroimaging results showed that the sight of one's own child was associated with increased activity in the middle occipital gyrus, fusiform gyrus, and inferior frontal gyrus relative to seeing an unfamiliar child or adult. While we found no robust evidence of specific neural correlates of eye contact (i.e., the contrast direct > averted gaze), an exploratory parametric analysis showed that dorsomedial prefrontal cortex (dmPFC) activity increased linearly with the duration of eye contact (collapsed across all "other" targets). Eye contact-related dmPFC activity correlated positively with increases in feelings of connectedness, suggesting that this region may drive feelings of connectedness during prolonged eye contact with others.
These results underline the importance of prolonged eye contact for affiliative processes and provide first insights into its neural correlates. This may pave the way for new research in individuals or pairs in whom affiliative processes are disrupted.
Subject(s)
Brain Mapping, Eye Movements, Adolescent, Adult, Emotions/physiology, Facial Expression, Female, Ocular Fixation, Humans, Temporal Lobe

ABSTRACT
Non-verbal cues tone our communication. Previous studies found that non-verbal factors, such as spatial distance and gaze direction, significantly impact interpersonal communication. However, little is known about the underlying multi-brain neural correlates, or whether such factors affect high-level creative group communication. Here, we provided a new, scalable, neuro-based approach to explore the effects of non-verbal factors on different communication tasks, and revealed the underlying multi-brain neural correlates using an fNIRS-based hyperscanning technique. Across two experiments, we found that closer spatial distance and a more direct gaze angle could promote collaborative behaviors, improve both creative and non-creative communication outcomes, and enhance inter-brain neural synchronization. Moreover, compared to the non-creative communication task, participants' inter-brain network was more intertwined when performing the creative communication task. These findings suggest that close spatial distance and direct gaze serve as positive social cues, bringing interacting brains into alignment and optimizing inter-brain information transfer, thus improving communication outcomes.
Subject(s)
Brain Mapping, Interpersonal Relations, Brain, Brain Mapping/methods, Communication, Humans, Neural Networks (Computer)

ABSTRACT
Emotional mimicry plays an important role in social interaction and is influenced by social context, especially eye gaze direction. However, the neural mechanism underlying the effect of eye gaze direction on emotional mimicry is unclear. Here, we explored how eye gaze direction influenced emotional mimicry with a combination of electromyography (EMG) and electroencephalography (EEG) techniques, which may provide a more comprehensive measure. To do this, we recorded facial EMG and scalp EEG signals simultaneously while participants observed emotional faces (happy vs. angry) with direct or averted gaze. Then, we split the EEG trials into two mimicry intensity categories (high mimicry intensity, HMI vs. low mimicry intensity, LMI) according to EMG activity. The ERP difference between HMI and LMI EEG trials revealed four ERP components (P50, P150, N200 and P300), and the effect of eye gaze direction on emotional mimicry was prominent on P300 at P7 and P8. Moreover, we also observed differences in the effect of eye gaze direction on mimicry of happy faces and angry faces, which were found on P300 at P7, as well as P150 at P7 and N200 at P7 and Pz. In short, the present study isolated the neural signals of emotional mimicry with a new multimodal method, and provided empirical neural evidence that eye gaze direction affected emotional mimicry.
Subject(s)
Electroencephalography/methods, Electromyography/methods, Emotions/physiology, Ocular Fixation/physiology, Imitative Behavior/physiology, Brain/physiology, Facial Expression, Female, Humans, Male, Social Interaction, Young Adult

ABSTRACT
Understanding others' intentions requires both the identification of social cues (e.g., emotional facial expressions, gaze direction) and the attribution of a mental state to another. The neural substrates of these processes have often been studied separately, and results are heterogeneous, in part attributable to the variety of paradigms used. The aim of the present study was to explore the neural regions underlying these sociocognitive processes, using a novel naturalistic task in which participants engage with human protagonists featured in videos. A total of 51 right-handed volunteers underwent functional magnetic resonance imaging while performing the Dynamic Inference Task (DIT), manipulating the degree of inference (high vs. low), the presence of emotion (emotional vs. nonemotional), and gaze direction (direct vs. averted). High nonemotional inference elicited neural activation in temporal regions encompassing the right posterior superior temporal sulcus. The presence (vs. absence) of emotion in the high-inference condition elicited a bilateral pattern of activation in internal temporal areas around the amygdala and orbitofrontal structures, as well as activation in the right dorsomedial part of the superior frontal gyrus and the left precuneus. On account of its dynamic, naturalistic approach, the DIT seems a suitable task for exploring social interactions and the way we interact with others, both in nonclinical and clinical populations.
Subject(s)
Brain/physiology, Mentalization/physiology, Social Cognition, Adult, Brain Mapping, Cues (Psychology), Emotions/physiology, Empathy, Facial Expression, Female, Humans, Magnetic Resonance Imaging, Male, Middle Aged, Social Perception, Theory of Mind/physiology, Young Adult

ABSTRACT
Looking at the eyes informs us about the thoughts and emotions of those around us, and impacts our own emotional state. However, it is unknown how perceiving direct and averted gaze impacts our ability to share the gazer's positive and negative emotions, abilities referred to as positive and negative affective empathy. We presented 44 participants with contextual sentences describing positive, negative, and neutral events happening to other people (e.g. "Her newborn was saved/killed/fed yesterday afternoon."). These were designed to elicit positive, negative, or little to no empathy, and were followed by direct or averted gaze images of the individuals described. Participants rated their affective empathy for the individual and their own emotional valence on each trial. Event-related potentials time-locked to face-onset and associated with empathy and emotional processing were recorded to investigate whether they were modulated by gaze direction. Relative to averted gaze, direct gaze was associated with increased positive valence in the positive and neutral conditions and with increased positive empathy ratings. A similar pattern was found at the neural level, using robust mass-univariate statistics. The N100, thought to reflect an automatic activation of emotion areas, was modulated by gaze in the affective empathy conditions, with opposite effect directions in positive and negative conditions. The P200, an ERP component sensitive to positive stimuli, was modulated by gaze direction only in the positive empathy condition. Positive and negative trials were processed similarly at the early N200 processing stage, but later diverged, with only negative trials modulating the EPN, P300, and LPP components. These results suggest that positive and negative affective empathy are associated with distinct time-courses, and that perceived gaze direction uniquely modulates positive empathy, highlighting the importance of studying empathy with face stimuli.
Subject(s)
Brain/physiology, Emotions/physiology, Empathy/physiology, Evoked Potentials/physiology, Ocular Fixation, Adolescent, Electroencephalography/methods, Facial Expression, Female, Humans, Male, Visual Perception/physiology

ABSTRACT
Most visually guided animals shift their gaze using body movements, eye movements, or both to gather information selectively from their environments. Psychological studies of eye movements have advanced our understanding of perceptual and cognitive processes that mediate visual attention in humans and other vertebrates. However, much less is known about how these processes operate in other organisms, particularly invertebrates. We here make the case that studies of invertebrate cognition can benefit by adding precise measures of gaze direction. To accomplish this, we briefly review the human visual attention literature and outline four research themes and several experimental paradigms that could be extended to invertebrates. We briefly review selected studies where the measurement of gaze direction in invertebrates has provided new insights, and we suggest future areas of exploration.
Subject(s)
Cognition/physiology, Invertebrates/physiology, Animals

ABSTRACT
Animals must selectively attend to relevant stimuli and avoid being distracted by unimportant stimuli. Jumping spiders (Salticidae) do this by coordinating eyes with different capabilities. Objects are examined by a pair of high-acuity principal eyes, whose narrow field of view is compensated for by retinal movements. The principal eyes overlap in field of view with motion-sensitive anterior-lateral eyes (ALEs), which direct their gaze to new stimuli. Using a salticid-specific eyetracker, we monitored the gaze direction of the principal eyes as they examined a primary stimulus. We then presented a distractor stimulus visible only to the ALEs and observed whether the principal eyes reflexively shifted their gaze to it or whether this response was flexible. Whether spiders redirected their gaze to the distractor depended on properties of both the primary and distractor stimuli. This flexibility suggests that higher-order processing occurs in the management of the attention of the principal eyes.
Subject(s)
Motion Perception, Spiders, Animals, Attention, Movement, Retina

ABSTRACT
Socially anxious people show a malfunction of the attentional system. However, it is uncertain whether this malfunction is a domain-specific process restricted to social stimuli or a domain-general process that extends to non-social stimuli. We therefore investigated the effects of social anxiety on the domain specificity of the attentional process using a spatial Stroop paradigm. We conducted two identical experiments with a total of 153 university students, both men and women (61 students in Experiment 1 and 92 students in Experiment 2), in which levels of social anxiety were assessed using specific instruments. The results showed that social anxiety scores were negatively correlated with the reversed spatial Stroop effect for social stimuli, but not for non-social stimuli (Experiment 1). The findings of the first experiment were successfully replicated in Experiment 2. Our results suggest that the malfunction of the attentional system in socially anxious individuals is a domain-specific process tied to socially threatening stimuli.
Subject(s)
Attention, Facial Expression, Anxiety, Anxiety Disorders, Fear, Female, Humans, Male

ABSTRACT
In the early 19th century, William H. Wollaston impressed the Royal Society of London with engravings of portraits. He manipulated facial features, such as the nose, and thereby dramatically changed the perceived gaze direction, although the eye region with iris and eye socket had remained unaltered. This Wollaston illusion has been replicated numerous times but never with the original stimuli. We took the eyes (pupil and iris) from Wollaston's most prominent engraving and measured their perceived gaze direction in an analog fashion. We then systematically added facial features (eye socket, eyebrows, nose, skull, and hair). These features had the power to divert perceived gaze direction by up to 20°, which confirms Wollaston's phenomenal observation. The effect can be thought of as an attractor effect, that is, cues that indicate a slight change in head orientation have the power to divert perceived gaze direction.
Subject(s)
Cues (Psychology), Facial Recognition/physiology, Ocular Fixation/physiology, Illusions/physiology, Space Perception/physiology, Adolescent, Adult, Female, Humans, Male, Portraits as Topic, Young Adult

ABSTRACT
Earlier research suggested that gaze direction has an impact on cognitive processing. It is likely that horizontal gaze direction increases activation in specific areas of the contralateral cerebral hemisphere. Consistent with the lateralization of memory functions, we previously showed that shifting gaze to the left improves visuo-spatial short-term memory. In the current study, we investigated the effect of unilateral gaze on verbal processing. We expected better performance with gaze directed to the right, because language is lateralized to the left hemisphere. An advantage of gaze directed upward was also expected, because local processing and object recognition are facilitated in the upper visual field. Observers directed their gaze at one of the corners of the computer screen while they performed lexical decision, grammatical gender, and semantic discrimination tasks. Contrary to expectations, we did not observe performance differences between gaze directed to the left or right, which is in line with the mixed findings on horizontal asymmetries in verbal tasks. However, RTs were shorter when observers looked at words in the upper compared to the lower part of the screen, suggesting that looking upwards enhances verbal processing.
Subject(s)
Ocular Fixation/physiology, Functional Laterality/physiology, Short-Term Memory/physiology, Verbal Behavior/physiology, Adolescent, Adult, Female, Humans, Male, Mental Recall/physiology, Neuropsychological Tests, Reaction Time/physiology, Visual Fields/physiology, Young Adult

ABSTRACT
In this paper, we analyze the relationship between head and chest movements and gaze direction in both walking and non-walking conditions. In a different approach from existing studies, we aim to analyze behavior when humans intentionally gaze at a certain target from two perspectives: (1) the relationship between gaze and body movements and (2) the effects of walking on body motion. We performed three experiments: fixed target scenes (Experiment 1), moving target scenes (Experiment 2) and more realistic gazing scenes (Experiment 3). The experimental results showed a linear relationship between the head and chest directions and gaze directions regardless of walking, non-walking situations, or target movements, and stronger gaze-head correlations than gaze-chest correlations. Further, we found effects of walking that constrained rotational body movements, and that body parts with larger moments were easily affected by walking. These results suggest that the findings of existing studies in non-walking situations may be applicable to walking situations directly or with simple modifications.
Subject(s)
Ocular Fixation/physiology, Head Movements/physiology, Psychomotor Performance/physiology, Thorax/physiology, Visual Perception/physiology, Walking/physiology, Adult, Humans, Motion Perception/physiology, Young Adult

ABSTRACT
In this research, we investigated whether appraisals of faces follow distinct rules of information integration under arousing versus non-arousing conditions. Support for this prediction was found in four experiments in which participants observed angry (and fearful) faces that were presented with a direct versus an averted gaze (Experiments 1a, b), on a red versus a grey background (Experiment 2), and after performing a motor exercise versus no exercise (Experiment 3). Under arousing conditions, participants' appraisals of faces reflected summation (i.e. extremely negative encounters were strengthened by moderately negative encounters) whereas, under non-arousing conditions, appraisals did not reflect summation (i.e. extremely negative encounters were weakened by moderately negative encounters) and could instead be accounted for by three alternative rules of information integration based on averaging, mere exposure, or the number of strong stimuli.
Subject(s)
Anger/physiology, Arousal/physiology, Facial Expression, Fear/psychology, Ocular Fixation/physiology, Photic Stimulation/methods, Adult, Exercise/psychology, Fear/physiology, Female, Humans, Male, Students/psychology, Young Adult

ABSTRACT
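The contrast between the summation and averaging integration rules described in the abstract above can be made concrete with a toy numerical example. The ratings below are hypothetical illustrations, not values from the study.

```python
# Toy illustration of two information-integration rules for combining an
# extremely negative and a moderately negative face encounter.
# Ratings are hypothetical (more negative = stronger negative appraisal).
extreme, moderate = -8.0, -3.0

# Summation (observed under arousing conditions): adding a moderately
# negative encounter makes the combined impression MORE negative.
summation = extreme + moderate          # -11.0

# Averaging (one non-arousing alternative): adding a moderately negative
# encounter makes the combined impression LESS negative than the extreme alone.
averaging = (extreme + moderate) / 2.0  # -5.5
```

This captures the abstract's key asymmetry: under summation the moderate encounter strengthens the overall negative appraisal, whereas under averaging it weakens it.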
The main challenge in decoding neural representations lies in linking neural activity to representational content or abstract concepts. The transformation from a neural-based to a low-dimensional representation may hold the key to encoding perceptual processes in the human brain. In this study, we developed a novel model by which to represent two changeable features of faces: face viewpoint and gaze direction. These features are embedded in spatiotemporal brain activity derived from magnetoencephalographic data. Our decoding results demonstrate that face viewpoint and gaze direction can be represented by manifold structures constructed from brain responses in the bilateral occipital face area and right superior temporal sulcus, respectively. Our results also show that the superposition of brain activity in the manifold space reveals the viewpoints of faces as well as directions of gazes as perceived by the subject. The proposed manifold representation model provides a novel opportunity to gain further insight into the processing of information in the human brain.
Subject(s)
Attention/physiology, Brain Mapping, Brain/physiology, Face, Magnetoencephalography, Visual Pattern Recognition/physiology, Adult, Brain/diagnostic imaging, Female, Functional Laterality, Humans, Computer-Assisted Image Processing, Magnetic Resonance Imaging, Male, Photic Stimulation, Principal Component Analysis, Reaction Time/physiology, Young Adult

ABSTRACT
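The manifold idea in the abstract above — recovering a low-dimensional structure, such as face viewpoint, from high-dimensional sensor patterns — can be sketched on simulated data. This assumes scikit-learn's `Isomap` as the embedding method and an invented signal model; it is not the authors' actual MEG pipeline.

```python
# Hypothetical sketch: sensor patterns that vary smoothly with face viewpoint
# are embedded into one dimension, which should recover the viewpoint ordering
# (up to sign). All data and parameters are illustrative assumptions.
import numpy as np
from sklearn.manifold import Isomap

rng = np.random.default_rng(1)
n_trials, n_sensors = 200, 50
viewpoints = rng.uniform(-30.0, 30.0, n_trials)  # viewing angle in degrees

# Simulated responses: a smooth curve in sensor space plus weak sensor noise
basis = rng.standard_normal((2, n_sensors))
angles = np.deg2rad(viewpoints)
X = np.c_[np.sin(angles), np.cos(angles)] @ basis
X += 0.1 * rng.standard_normal(X.shape)

# One-dimensional manifold embedding of the trial-by-sensor patterns
embedding = Isomap(n_neighbors=10, n_components=1).fit_transform(X)
corr = np.corrcoef(embedding[:, 0], viewpoints)[0, 1]
```

If the embedding captures the viewpoint manifold, `corr` should be close to +1 or -1 (the sign of an embedding axis is arbitrary).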
Identifying the spatial location of touch on the skin surface is a fundamental function of our somatosensory system. Despite the fact that stimulation of even single mechanoreceptive afferent fibres is sufficient to produce clearly localised percepts, tactile localisation can also be modulated by higher-level processes such as body posture. This suggests that tactile events are coded using multiple representations with different coordinate systems. Recent reports provide evidence for systematic biases in tactile localisation tasks, which are thought to result from a supramodal representation of the skin surface. While the influence of non-informative vision of the body and gaze direction on tactile discrimination tasks has been extensively studied, their effects on tactile localisation tasks remain largely unexplored. To address this question, participants performed a tactile localisation task on their left hand under different visual conditions by means of a mirror box: in the mirror condition, a single stimulus was delivered to the participants' left hand while the reflection of the right hand was seen through the mirror; in the object condition, participants looked at a box through the mirror; and in the right-hand condition, participants looked directly at their right hand. Participants reported the location of the tactile stimuli using a silhouette of a hand. Results showed a shift in the localisation of the touches towards the tips of the fingers (distal bias) and the thumb (radial bias) across conditions. Critically, distal biases were reduced when participants looked towards the mirror compared to when they looked at their right hand, suggesting that gaze direction reduces the typical proximo-distal biases in tactile localisation. Moreover, vision of the hand modulates the internal configuration of point locations by elongating it along the radio-ulnar axis.
Subject(s)
Fingers/physiology, Ocular Fixation/physiology, Space Perception/physiology, Touch Perception/physiology, Visual Perception/physiology, Adult, Female, Humans, Male, Young Adult

ABSTRACT
While there is an extensive literature on the tendency to mimic emotional expressions in adults, it is unclear how this skill emerges and develops over time. Specifically, it is unclear whether infants mimic discrete emotion-related facial actions, whether their facial displays are moderated by contextual cues and whether infants' emotional mimicry is constrained by developmental changes in the ability to discriminate emotions. We therefore investigate these questions using Baby-FACS to code infants' facial displays and eye-movement tracking to examine infants' looking times at facial expressions. Three-, 7-, and 12-month-old participants were exposed to dynamic facial expressions (joy, anger, fear, disgust, sadness) of a virtual model which either looked at the infant or had an averted gaze. Infants did not match emotion-specific facial actions shown by the model, but they produced valence-congruent facial responses to the distinct expressions. Furthermore, only the 7- and 12-month-olds displayed negative responses to the model's negative expressions and they looked more at areas of the face recruiting facial actions involved in specific expressions. Our results suggest that valence-congruent expressions emerge in infancy during a period where the decoding of facial expressions becomes increasingly sensitive to the social signal value of emotions.
Subject(s)
Child Development/physiology, Emotions/physiology, Eye Movements/physiology, Facial Expression, Cues (Psychology), Female, Humans, Infant, Male, Photic Stimulation

ABSTRACT
The effect of eye contact on self-awareness was investigated with implicit measures based on the use of first-person singular pronouns in sentences. The measures were proposed to tap into self-referential processing, that is, information processing associated with self-awareness. In addition, participants filled in a questionnaire measuring explicit self-awareness. In Experiment 1, the stimulus was a video clip showing another person and, in Experiment 2, the stimulus was a live person. In both experiments, participants were divided into two groups and presented with the stimulus person either making eye contact or gazing downward, depending on the group assignment. During the task, the gaze stimulus was presented before each trial of the pronoun-selection task. Eye contact was found to increase the use of first-person pronouns, but only when participants were facing a real person, not when they were looking at a video of a person. No difference in self-reported self-awareness was found between the two gaze direction groups in either experiment. The results indicate that eye contact elicits self-referential processing, but the effect may be stronger, or possibly limited to, live interaction.