ABSTRACT
During infancy, intersensory facilitation declines gradually as unisensory perception develops. However, this trade-off has mainly been investigated using audiovisual stimulation. Here, fifty 4- to 12-month-old infants (26 females, predominantly White) were tested in 2017-2020 to determine whether the facilitating effect of their mother's body odor on neural face categorization, as previously observed at 4 months, decreases with age. In a baseline odor context, the results revealed a face-selective electroencephalographic response that increases and changes qualitatively between 4 and 12 months, marking improved face categorization. At the same time, the benefit of adding the maternal odor fades with age (R2 = .31), indicating an inverse relation with the amplitude of the visual response and generalizing previous evidence from the audiovisual domain to olfactory-visual interactions.
ABSTRACT
Understanding how the young infant brain starts to categorize the flurry of ambiguous sensory inputs coming from its complex environment is of primary scientific interest. Here, we test the hypothesis that senses other than vision play a key role in initiating complex visual categorization in 20 4-month-old infants exposed either to a baseline odor or to their mother's odor while their electroencephalogram (EEG) is recorded. Various natural images of objects are presented at a 6-Hz rate (six images/second), with face-like object configurations of the same object categories (i.e., eliciting face pareidolia in adults) interleaved every sixth stimulus (i.e., at 1 Hz). In the baseline odor context, a weak neural categorization response to face-like stimuli appears at 1 Hz in the EEG frequency spectrum over bilateral occipitotemporal regions. Critically, this face-like-selective response is magnified and becomes right-lateralized in the presence of the maternal body odor. This reveals that nonvisual cues systematically associated with human faces in the infant's experience shape the interpretation of face-like configurations as faces in the right hemisphere, which is dominant for face categorization. At the individual level, this intersensory influence is particularly effective when there is no trace of face-like categorization in the baseline odor context. These observations provide evidence for the early tuning of face-(like)-selective activity by multisensory inputs in the developing brain, suggesting that perceptual development integrates information across the senses for efficient category acquisition, with early-maturing systems such as olfaction driving the acquisition of categories in later-developing systems such as vision.
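The frequency-tagging logic of this paradigm — a base presentation rate with an oddball category every Nth stimulus, so the category-selective response is tagged at base rate divided by N — can be sketched as follows. This is an illustrative reconstruction, not the authors' code, and the function name is hypothetical:

```python
# Illustrative sketch of the fast periodic visual stimulation (FPVS) design
# described above: images at a 6-Hz base rate, with a face-like oddball every
# 6th stimulus, tagging the categorization response at 6 / 6 = 1 Hz.

def fpvs_schedule(base_rate_hz, oddball_every_n, duration_s):
    """Label each stimulus and return the frequency tagging the oddball."""
    n_stimuli = int(base_rate_hz * duration_s)
    labels = ["oddball" if (i + 1) % oddball_every_n == 0 else "base"
              for i in range(n_stimuli)]
    return labels, base_rate_hz / oddball_every_n

labels, f_odd = fpvs_schedule(base_rate_hz=6, oddball_every_n=6, duration_s=20)
# f_odd is 1.0 Hz; every 6th label in the 120-stimulus sequence is "oddball"
```

The same arithmetic yields the 1.33-Hz tag used in the related studies below (12-Hz base rate with an oddball every 9th stimulus).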
Subject(s)
Brain/physiology, Facial Recognition/physiology, Odorants, Vision, Ocular/physiology, Brain/diagnostic imaging, Brain Mapping, Electroencephalography, Female, Humans, Infant, Male, Photic Stimulation
ABSTRACT
Visual categorization is the brain's ability to respond rapidly and automatically to a certain category of inputs. Whether category-selective neural responses are purely visual or can be influenced by other sensory modalities remains unclear. Here, we test whether odors modulate visual categorization, expecting that odors facilitate the neural categorization of congruent visual objects, especially when the visual category is ambiguous. The scalp electroencephalogram (EEG) was recorded while natural images depicting various objects were displayed in rapid 12-Hz streams (i.e., 12 images/second) and variable exemplars of a target category (either human faces, cars, or facelike objects in dedicated sequences) were interleaved every 9th stimulus to tag category-selective responses at 12/9 = 1.33 Hz in the EEG frequency spectrum. During visual stimulation, participants (N = 26) were implicitly exposed to odor contexts (body, gasoline, or baseline odors) and performed an orthogonal cross-detection task. We identify clear category-selective responses to every category over the occipito-temporal cortex, with the largest response for human faces and the smallest for facelike objects. Critically, body odor boosts the response to the ambiguous facelike objects (i.e., perceived either as nonface objects or as faces) over the right hemisphere, especially in participants reporting their presence post-stimulation. By contrast, odors do not significantly modulate the other category-selective responses, nor the general visual response recorded at 12 Hz, revealing a specific influence on the categorization of congruent ambiguous stimuli. Overall, these findings support the view that the brain actively uses cues from the different senses to readily categorize visual inputs, and that olfaction, long considered poorly functional in humans, is well placed to disambiguate visual information.
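In this frequency-tagging approach, the category-selective response is read out directly from the EEG amplitude spectrum at 12/9 ≈ 1.33 Hz. A minimal sketch on synthetic data (sampling rate and epoch length are assumptions for illustration, not the authors' analysis pipeline) shows why an epoch spanning a whole number of oddball cycles places the tag on an exact FFT bin:

```python
# Sketch: reading out a frequency-tagged response from an amplitude spectrum.
# With a 12-Hz base rate and a category change every 9th image, the selective
# response falls at 12/9 Hz, an exact FFT bin when the epoch spans a whole
# number of oddball cycles (here 9 s -> 1/9-Hz frequency resolution).
import numpy as np

fs = 256          # sampling rate in Hz (illustrative)
epoch_s = 9       # 9-s epoch = 12 oddball cycles at 12/9 Hz
tag_hz = 12 / 9   # category-selective tagging frequency
t = np.arange(fs * epoch_s) / fs

# Synthetic "EEG": a 2-uV tagged component buried in Gaussian noise
rng = np.random.default_rng(0)
signal = 2.0 * np.sin(2 * np.pi * tag_hz * t) + rng.normal(0, 1, t.size)

amp = np.abs(np.fft.rfft(signal)) / t.size * 2   # single-sided amplitude
freqs = np.fft.rfftfreq(t.size, 1 / fs)

bin_idx = int(round(tag_hz * epoch_s))           # exact-bin index = f * T
print(f"amplitude at {freqs[bin_idx]:.2f} Hz: {amp[bin_idx]:.2f}")
```

The tagged component stands far above the noise floor at its bin, which is what makes the response objective: it is quantified at a frequency defined in advance by the stimulation, not found by searching the spectrum.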
Subject(s)
Brain Mapping, Smell, Brain/physiology, Electroencephalography, Humans, Odorants, Photic Stimulation/methods
ABSTRACT
Humans exhibit a marked specialization for processing the most experienced facial morphologies. In particular, nonhuman primate faces are poorly discriminated compared to human faces in behavioral tasks. So far, however, a clear and consistent marker quantifying our expertise for human over monkey face discrimination directly from brain activity has been lacking. Here, using scalp electroencephalography (EEG), we isolate a direct signature of individuation abilities for human and nonhuman (i.e., macaque) primate faces. Human or monkey faces were rapidly presented at a base rate of 12 Hz in upright or inverted orientations while participants performed an orthogonal behavioral task. In each stimulation sequence, eight face images of one individual were used as base stimuli, while images of other individuals were briefly introduced every 9th stimulus to quantify an identity-change response at 1.33 Hz and harmonics (i.e., integer multiples) in the EEG frequency spectrum. The brain response to upright human faces was twice as large as that to monkey faces, and was reduced by picture-plane inversion for human faces only. This reflects the disruption of the high-level face identity discrimination developed for the canonical upright human face. No difference was observed between upright monkey faces and inverted human faces, suggesting non-expert visual processes for these two face formats, both associated with little experience. In addition, the size of the inversion effect for human, but not monkey, faces predicted the expertise effect (i.e., the difference between upright human and monkey faces) at the individual level. This result suggests a selective ability to discriminate human faces that does not extend to the individuation of face morphologies with which observers have little experience, such as monkey faces. Overall, these findings indicate that human expertise for conspecific face discrimination can be isolated and quantified in individual human brains.
Subject(s)
Cerebral Cortex/physiology, Discrimination, Psychological/physiology, Facial Recognition/physiology, Practice, Psychological, Space Perception/physiology, Adult, Electroencephalography, Female, Humans, Male, Young Adult
ABSTRACT
To successfully interact with a rich and ambiguous visual environment, the human brain learns to differentiate visual stimuli and to produce the same response to subsets of these stimuli despite their physical difference. Although this visual categorization function is traditionally investigated from a unisensory perspective, its early development is inherently constrained by multisensory inputs. In particular, an early-maturing sensory system such as olfaction is ideally suited to support the immature visual system in infancy by providing stability and familiarity to a rapidly changing visual environment. Here, we test the hypothesis that rapid visual categorization of salient visual signals for the young infant brain, human faces, is shaped by another highly relevant human-related input from the olfactory system, the mother's body odor. We observe that a right-hemispheric neural signature of single-glance face categorization from natural images is significantly enhanced in the maternal versus a control odor context in individual 4-month-old infant brains. A lack of difference between odor conditions for the common brain response elicited by both face and non-face images rules out a mere enhancement of arousal or visual attention in the maternal odor context. These observations show that face-selective neural activity in infancy is modulated by the presence of a (maternal) body odor, providing strong support for multisensory inputs driving category acquisition in the developing human brain, with important implications for our understanding of human perceptual development.
Subject(s)
Brain/physiology, Mothers, Odorants, Attention/physiology, Brain Mapping, Child Development/physiology, Facial Recognition, Female, Humans, Infant, Male, Smell/physiology
ABSTRACT
Little is known about the effects of olfaction on visual processing during infancy. We investigated whether and how the body odor of an infant's own mother, or that of another mother, affects 4-month-old infants' looking at their mother's face when it is paired with a stranger's face. In Experiment 1, infants were exposed to their mother's body odor or to a control odor, while in Experiment 2, infants were exposed to a stranger mother's body odor, while their visual preferences were recorded. Results revealed that infants looked more at the unfamiliar female face in the presence of the control odor, but that they looked more at their mother's face in the context of any mother's body odor. This effect was due to a reduction of looking at the stranger's face. These findings suggest that infants react similarly to the body odor of any mother and add to the growing body of evidence indicating that olfactory stimulation is a pervasive aspect of infant multisensory perception.
Subject(s)
Face, Mother-Child Relations, Odorants, Analysis of Variance, Female, Humans, Infant, Male, Mothers, Photic Stimulation
ABSTRACT
Efficient decoding of even brief and slight changes of facial expression is important for social interactions. However, robust evidence for the human brain's ability to automatically detect brief and subtle changes of facial expression remains limited. Here we built on a recently developed paradigm in human electrophysiology using full-blown expressions (Dzhelyova et al., 2017) to isolate and quantify a neural marker of the detection of brief and subtle changes of facial expression. The scalp electroencephalogram (EEG) was recorded from 18 participants during stimulation with a neutral face changing randomly in size at a rapid rate of 6 Hz. Brief changes of expression appeared every fifth stimulation cycle (i.e., at 1.2 Hz), and expression intensity increased parametrically every 20 s in 20% steps during sweep sequences of 100 s. A significant 1.2-Hz response emerged in the EEG spectrum at as little as 40% of expression-change intensity for most of the five emotions tested (anger, disgust, fear, happiness, or sadness in different sequences), and increased with intensity steps, predominantly over right occipito-temporal regions. Given the high signal-to-noise ratio of the approach, thresholds for the automatic detection of brief changes of facial expression could be determined for every single individual brain. A time-domain analysis revealed three components, the first two increasing linearly with intensity as early as 100 ms after a change of expression, suggesting gradual low-level image-change detection prior to the visual coding of facial movements. In contrast, the third component showed abrupt sensitivity to increasing expression intensity beyond 300 ms after the expression change, suggesting categorical emotion perception. Overall, this characterization of the detection of subtle changes of facial expression and its temporal dynamics opens promising avenues for the precise assessment of social perception ability during development and in clinical populations.
Subject(s)
Brain/physiology, Facial Expression, Pattern Recognition, Visual/physiology, Social Perception, Adult, Electroencephalography, Female, Humans, Male, Photic Stimulation, Young Adult
ABSTRACT
While there is an extensive literature on the tendency to mimic emotional expressions in adults, it is unclear how this skill emerges and develops over time. Specifically, it is unclear whether infants mimic discrete emotion-related facial actions, whether their facial displays are moderated by contextual cues, and whether infants' emotional mimicry is constrained by developmental changes in the ability to discriminate emotions. We therefore investigated these questions using Baby-FACS to code infants' facial displays and eye-movement tracking to examine infants' looking times at facial expressions. 3-, 7-, and 12-month-old participants were exposed to dynamic facial expressions (joy, anger, fear, disgust, sadness) of a virtual model which either looked at the infant or had an averted gaze. Infants did not match the emotion-specific facial actions shown by the model, but they produced valence-congruent facial responses to the distinct expressions. Furthermore, only the 7- and 12-month-olds displayed negative responses to the model's negative expressions, and they looked more at areas of the face recruiting the facial actions involved in specific expressions. Our results suggest that valence-congruent expressions emerge in infancy during a period when the decoding of facial expressions becomes increasingly sensitive to the social signal value of emotions.
Subject(s)
Child Development/physiology, Emotions/physiology, Eye Movements/physiology, Facial Expression, Cues, Female, Humans, Infant, Male, Photic Stimulation
ABSTRACT
Recognition of emotional facial expressions is a crucial skill for adaptive behavior. Past research suggests that at 5 to 7 months of age, infants look longer at an unfamiliar dynamic angry/happy face that emotionally matches a vocal expression, suggesting that they can match stimuli from distinct modalities on their emotional content. In the present study, olfaction-vision matching abilities were assessed across different age groups (3, 5 and 7 months) using dynamic expressive faces (happy vs. disgusted) and distinct hedonic odor contexts (pleasant, unpleasant and control) in a visual-preference paradigm. At all ages, the infants were biased toward the disgusted faces. In 3-month-old infants, this visual bias reversed into a bias for smiling faces in the pleasant odor context. In infants aged 5 and 7 months, no effect of the odor context appeared in the present conditions. This study highlights the role of the olfactory context in the modulation of visual behavior toward expressive faces in infants. The influence of olfaction took the form of a contingency effect in 3-month-old infants, but later vanished or took another form that could not be evidenced in the present study.
Subject(s)
Emotions, Eye Movement Measurements, Facial Expression, Odorants, Face, Female, Humans, Infant, Male, Smell
ABSTRACT
Horizontal information is crucial to face processing in adults. Yet the ontogeny of this preferential type of processing remains unknown. To clarify this issue, we tested 3-month-old infants' sensitivity to horizontal information within faces. Specifically, infants were exposed to the simultaneous presentation of a face and a car in upright or inverted orientation while their looking behavior was recorded. Face and car images were either broadband (UNF) or filtered to reveal only horizontal (H), vertical (V), or combined horizontal-vertical (HV) information. As expected, infants looked longer at upright faces than at upright cars, but critically, only when horizontal information was preserved in the stimulus (UNF, HV, H). These results indicate that horizontal information already drives upright face processing at 3 months of age. They also underscore the importance, for infants, of facial features arranged in a top-heavy configuration, which this band of information particularly reveals. © 2016 Wiley Periodicals, Inc. Dev Psychobiol 58: 536-542, 2016.
Subject(s)
Child Development/physiology, Facial Recognition/physiology, Pattern Recognition, Visual/physiology, Social Perception, Space Perception/physiology, Female, Humans, Infant, Male
ABSTRACT
Difficulties in the recognition of emotions in expressive faces have been reported in people with 22q11.2 deletion syndrome (22q11.2DS). However, while low-intensity expressive faces are frequent in everyday life, nothing is known about the ability of people with 22q11.2DS to perceive facial emotions as a function of expression intensity. In a visual matching task, children and adolescents with 22q11.2DS, as well as gender- and age-matched healthy participants, were asked to categorise the emotion of a target face among six possible expressions. Static pictures of morphs between neutrality and expressions were used to parametrically manipulate the intensity of the target face. In comparison to healthy controls, results showed higher perception thresholds (i.e., a more intense expression is needed to perceive the emotion) and lower accuracy for the most expressive faces, indicating reduced categorisation abilities in the 22q11.2DS group. The number of intrusions (i.e., instances where one emotion is perceived as another) and a more gradual perception performance indicated smooth boundaries between emotional categories. Correlational analyses with neuropsychological and clinical measures suggested that reduced visual skills may be associated with impaired categorisation of facial emotions. Overall, the present study indicates greater difficulty for children and adolescents with 22q11.2DS in perceiving an emotion in low-intensity expressive faces. This difficulty is underpinned by emotional categories that are not sharply organised. It also suggests that these difficulties may be associated with impaired visual cognition, a hallmark of the cognitive deficits observed in the syndrome. These data open promising avenues for future experimental and clinical investigations.
Subject(s)
DiGeorge Syndrome/psychology, Facial Expression, Pattern Recognition, Visual/physiology, Adolescent, Case-Control Studies, Child, Cognition Disorders/etiology, DiGeorge Syndrome/complications, Emotions, Female, Happiness, Humans, Male, Photic Stimulation, Social Perception
ABSTRACT
INTRODUCTION: Many studies have shown that the recollection process is impaired in patients with schizophrenia, whereas familiarity is generally spared. However, in these studies, the Receiver Operating Characteristic (ROC) curves presented are group averages, which are likely to mask individual differences. METHODS: In the present study, using a face-recognition task, we computed individual ROC curves for patients with schizophrenia and control participants. Each group was divided into two subgroups on the basis of the type of recognition processes implemented: recognition based on familiarity only, and recognition based on familiarity and recollection. RESULTS: The recognition performance of the schizophrenia patients was below that of the control participants only when recognition was based solely on familiarity. For the familiarity-alone patients, the score obtained on the Scale for the Assessment of Positive Symptoms (SAPS) was correlated with the variance of old-face familiarity. For the familiarity-recollection patients, the score obtained on the Scale for the Assessment of Negative Symptoms (SANS) was correlated with the decision criterion and with the old-face recollection probability. CONCLUSIONS: These results show that the impaired recognition observed in patients with schizophrenia cannot be ascribed to a recollection deficit alone. They also show that individual ROC curves can be used to distinguish between subtypes of schizophrenia and could serve as a basis for setting up specific cognitive remediation therapy for individuals with schizophrenia.
Subject(s)
Face, Memory Disorders/diagnosis, Memory Disorders/psychology, Mental Recall, Recognition, Psychology, Schizophrenic Psychology, Adult, Female, France, Humans, Intelligence, Male, Middle Aged, ROC Curve, Schizophrenia
ABSTRACT
This study examined the sensory profile of three groups of children aged 3 to 12: those with visual impairment, typical development, or autism spectrum disorder. The principal aim was to find out whether the Sensory Profile (SP) of children with visual impairment was a good predictor of behaviors typical of ASD. The data were collected through a sensory profile filled out by the parents of 37 visually impaired children, 30 children with autism spectrum disorder (ASD), and 42 children with typical development (TD). To assess the risk of ASD, the Social Communication Questionnaire (SCQ) was also administered. The results indicate that children with visual impairment are at increased risk of exhibiting signs of ASD, and that the sensory profile is a good predictor of the risk of autistic signs in children with visual impairment. This study provides the first strong evidence of the need to systematically assess the sensory profile of children with visual impairment.
ABSTRACT
In the context of blindness, studies on the tactile recognition of emotional facial expressions are essential to define compensatory touch abilities and to create adapted emotion-related tools. This study is the first to examine the effect of visual experience on the recognition of tactile drawings of emotional facial expressions by children with different visual experiences. To this end, we compared the recognition rates of tactile drawings of emotions between blind children, children with low vision, and sighted children aged 6-12 years. Results revealed no overall effect of visual experience on recognition rates. However, an effect of emotion and an interaction between emotion and visual experience were found. While all children had a low average recognition rate, the drawings of fear, anger and disgust were particularly poorly recognized. Moreover, sighted children were significantly better at recognizing the drawings of surprise and sadness than the blind children, who only showed high recognition rates for joy. The results of this study support the importance of developing emotion tools that can be understood by children with different visual experiences.
Subject(s)
Blindness, Emotions, Facial Expression, Humans, Child, Male, Female, Blindness/physiopathology, Blindness/psychology, Emotions/physiology, Vision, Low/physiopathology, Recognition, Psychology/physiology, Touch Perception/physiology, Facial Recognition/physiology
ABSTRACT
We investigated the psychophysical factors underlying the identity-emotion interaction in face perception. Visual field and sex were also taken into account. Participants had to judge whether a probe face, presented in either the left or the right visual field, and a central target face belonged to the same person while emotional expression varied (Experiment 1), or to judge whether the probe and target faces expressed the same emotion while identity was manipulated (Experiment 2). For accuracy, we replicated the mutual facilitation effect between identity and emotion; no sex or hemispheric differences were found. Processing speed measurements, however, showed less interference in women than in men, especially when matching identity across faces expressing different emotions after a probe face presented in the left visual field. Psychophysical indices can be used to determine whether these effects are perceptual (A') or instead arise at a post-perceptual decision-making stage (B"). The influence of identity on the processing of facial emotion seems to be due to perceptual factors, whereas the influence of emotion changes on identity processing seems to be related to decisional factors. In addition, men seem to be more "conservative" when processing identity after an LVF/RH (left visual field/right hemisphere) probe-face presentation. Women seem to benefit from a better ability to extract invariant facial aspects related to identity.
Subject(s)
Discrimination, Psychological/physiology, Face, Facial Expression, Pattern Recognition, Visual/physiology, Sex Characteristics, Visual Fields/physiology, Adolescent, Adult, Bias, Female, Functional Laterality, Humans, Male, Photic Stimulation, Psychophysics, Reaction Time, Young Adult
ABSTRACT
In the current study, we examined the role of task-related top-down mechanisms in the recognition of facial expressions. An expression of increasing intensity was displayed at a frequency of 1.5 Hz among neutral faces of the same model presented at a frequency of 12 Hz (i.e., 12 frames per second, with the expression occurring every eighth frame). Twenty-two participants were asked either to recognize the emotion at the expression-specific frequency (1.5 Hz) or to perform an orthogonal task in separate blocks, while the scalp electroencephalogram (EEG) was recorded. A significant 1.5-Hz response emerged with increasing expressive intensity over the medial occipital, right and left occipitotemporal, and centro-frontal regions. In these three regions, the magnitude of this response was greater when participants were engaged in expression recognition, especially when the intensity of the expression was low and ambiguous. Time-domain analysis revealed that engagement in the explicit recognition of facial expression modulated the response even before the onset of the expression over centro-frontal regions. The response was then amplified over the medial occipital and right and left occipitotemporal regions. Overall, the procedure developed in the present study allowed us to document different stages of the voluntary recognition of facial expressions, from detection to recognition, through the implementation of task-related top-down mechanisms that modulated the incoming information flow.
Subject(s)
Facial Expression, Facial Recognition, Humans, Electroencephalography/methods, Emotions/physiology, Frontal Lobe, Recognition, Psychology, Facial Recognition/physiology
ABSTRACT
The human brain rapidly and automatically categorizes faces vs. other visual objects. However, whether face-selective neural activity predicts the subjective experience of a face - perceptual awareness - is debated. To clarify this issue, here we use face pareidolia, i.e., the illusory perception of a face, as a proxy to relate the neural categorization of a variety of facelike objects to conscious face perception. In Experiment 1, the scalp electroencephalogram (EEG) is recorded while pictures of human faces or facelike objects - in different stimulation sequences - are interleaved every second (i.e., at 1 Hz) in a rapid 6-Hz train of natural images of nonface objects. Participants do not perform any explicit face categorization task during stimulation, and report whether they perceived illusory faces post-stimulation. A robust categorization response to facelike objects is identified at 1 Hz and harmonics in the EEG frequency spectrum with a facelike occipito-temporal topography. Across all individuals, the facelike categorization response is about 20% of the response to human faces, but more strongly right-lateralized. Critically, its amplitude is much larger in participants who report having perceived illusory faces. In Experiment 2, facelike or matched nonface objects from the same categories appear at 1 Hz in sequences of nonface objects presented at variable stimulation rates (60 Hz to 12 Hz), and participants explicitly report after each sequence whether they perceived illusory faces. The facelike categorization response already emerges at the shortest stimulus duration (i.e., 17 ms at 60 Hz) and predicts the behavioral report of conscious perception. Strikingly, neural facelike selectivity emerges exclusively when participants report illusory faces.
Collectively, these experiments characterize a neural signature of face pareidolia in the context of rapid categorization, supporting the view that face-selective brain activity reliably predicts the subjective experience of a face from a single glance at a variety of stimuli.
Subject(s)
Facial Recognition, Illusions, Brain/physiology, Brain Mapping, Electroencephalography, Facial Recognition/physiology, Humans, Photic Stimulation/methods
ABSTRACT
Emotion regulation develops from the earliest years of a child's life, mostly through visual information. Considering the importance of emotion regulation in daily life, it is important to study the effect of visual experience on the development of this ability. This study is the first to examine the effects of visual experience and age on emotion regulation by comparing groups of children of different visual status and age. For this purpose, after testing the reliability and consistency of the French version of the Emotion Regulation Checklist (ERC-vf) with 245 parents of blind, visually impaired and sighted children aged 3-5, 6-8 or 9-12 years, we analyzed the effects of visual status and age on emotion regulation (ER) composite scores. The first result confirmed that the ERC-vf can be reliably used with populations of blind and visually impaired children. The second result revealed an effect of visual status on ER composite scores: blind children and visually impaired children each had significantly lower composite scores than sighted children. Moreover, the effect of age and the interaction between age and visual status were not significant for ER composite scores. The ER subscale results suggest, however, that age may have a variable effect for blind and visually impaired children, as blind children's scores become lower with age while those of visually impaired children become equal to those of sighted children. The results of our study may help children's entourages to better adapt their interactions in a context of visual impairment.
Subject(s)
Emotional Regulation, Visually Impaired Persons, Blindness, Child, Child, Preschool, Humans, Parents, Reproducibility of Results, Surveys and Questionnaires, Visually Impaired Persons/psychology
ABSTRACT
Infants' ability to discriminate facial expressions has been widely explored, but little is known about their rapid and automatic ability to discriminate a given expression from many others in a single experiment. Here we investigated the development of facial expression discrimination in infancy using fast periodic visual stimulation coupled with scalp electroencephalography (EEG). EEG was recorded in eighteen 3.5- and eighteen 7-month-old infants presented with a female face expressing disgust, happiness, or a neutral emotion (in different stimulation sequences) at a base stimulation frequency of 6 Hz. Pictures of the same individual expressing other emotions (anger, disgust, fear, happiness, sadness, or neutrality, randomly selected and excluding the expression presented at the base frequency) were introduced every sixth stimulus (i.e., at 1 Hz). Frequency-domain analysis revealed an objective (i.e., at the predefined 1-Hz frequency and harmonics) expression-change brain response in both 3.5- and 7-month-olds, indicating the visual discrimination of various expressions from disgust, happiness and neutrality at these early ages. At 3.5 months, the responses to the discrimination from disgust and happiness expressions were located mainly over medial occipital sites, whereas a more lateral topography was found for the discrimination from neutrality, suggesting that expression discrimination from an emotionally neutral face relies on different visual cues than discrimination from a disgusted or happy face. Finally, expression discrimination from happiness was associated with reduced activity over posterior areas and an additional response over central frontal scalp regions at 7 months compared to 3.5 months. This result suggests developmental changes in the processing of happiness expressions as compared to negative/neutral ones within this age range.
ABSTRACT
Individuals with schizophrenia have difficulty recognizing facial emotions in others. This study investigated whether this impairment also exists for self-generated expressions. Nineteen patients with schizophrenia and 19 comparison subjects were filmed while producing facial expressions in response to a visual model or a written sentence. After 2 months, all subjects were asked to rate their own emotional expressions. These ratings were compared with the evaluations of 12 healthy independent raters. Relative to the comparison subjects, the patients produced less expressive responses and were less able to recognize their own expressions. Moreover, patients were totally unaware of these impairments.