1.
Cogn Emot ; : 1-10, 2023 Sep 21.
Article in English | MEDLINE | ID: mdl-37732611

ABSTRACT

While the recognition of ambiguous emotions is crucial for successful social interactions, previous work has shown that they are perceived differently depending on whether they are viewed on male or female faces. The present paper aims to shed light on this phenomenon by exploring two hypotheses: the confounded signal hypothesis, which posits the existence of perceptual overlaps between emotions and gendered morphotypes, and the social role hypothesis, according to which the observer's responses are biased by stereotypes. Participants were asked to categorise blended faces (i.e., artificial faces made ambiguous by mixing two emotions) in a forced-choice task. Six emotions were used to create each blend (neutral, surprise, sadness, fear, happiness, anger), for a total of 15 expressions. We then applied signal detection theory, considering both the morphotype of the stimuli and the participants' gender, to distinguish participants' perceptual processes from their response biases. The results showed a perceptual advantage for anger on male faces and for sadness on female faces. However, different strategies were deployed when labelling emotions on gendered morphotypes. In particular, a response bias towards angry male faces established their special status: they were both detected very well and tended to be over-reported, especially by women.
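The sensitivity/response-bias distinction used above comes from signal detection theory. Below is a minimal sketch of the standard indices (d' for perceptual sensitivity, c for response bias); the counts, the function name and the log-linear correction are illustrative assumptions, not the authors' analysis code.

```python
from scipy.stats import norm

def dprime_and_criterion(hits, misses, false_alarms, correct_rejections):
    # Log-linear correction keeps z-scores finite when rates hit 0 or 1.
    hit_rate = (hits + 0.5) / (hits + misses + 1)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
    z_hit, z_fa = norm.ppf(hit_rate), norm.ppf(fa_rate)
    d_prime = z_hit - z_fa               # perceptual sensitivity
    criterion = -0.5 * (z_hit + z_fa)    # response bias (negative = liberal)
    return d_prime, criterion

# Hypothetical counts for "anger" responses to angry male-morphotype blends
print(dprime_and_criterion(hits=42, misses=8, false_alarms=12, correct_rejections=38))
```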

2.
Exp Brain Res ; 238(12): 2877-2886, 2020 Dec.
Article in English | MEDLINE | ID: mdl-33057868

ABSTRACT

During social interactions, the perception of emotions affects motor behaviour by triggering responses such as freezing or approach and avoidance reactions. It is, however, difficult to get a clear picture of the relationship between emotion and posture, as previous studies have shown inconsistent results due to methodological differences in the stimuli and/or postural measures used. In this study, we thoroughly investigate how the perception of emotions affects postural control and action tendencies by contrasting two types of stimuli (static emotional faces or emotional videos) expressing different basic emotions (happiness, fear, anger, sadness, disgust and neutral). We also take into account other contributing factors relying on stable individual traits (extraversion, neuroticism, conscientiousness, empathy, etc.) and emotional state (e.g., anxiety). Our results show that dynamic stimuli have a greater impact than static stimuli on postural control. Moreover, a crucial aspect of our work lies in the modulation of the relationship between emotions and posture by stable individual traits.


Subjects
Facial Expression; Postural Balance; Anger; Emotions; Extraversion, Psychological; Humans
3.
Brain Cogn ; 92C: 92-100, 2014 12.
Article in English | MEDLINE | ID: mdl-25463143

ABSTRACT

The relevance of emotional perception in interpersonal relationships and social cognition has been well documented. Although brain diseases might impair emotional processing, studies concerning emotion recognition in patients with brain tumours are relatively rare. The aim of this study was to explore emotion recognition in patients with gliomas in three conditions (visual, auditory and crossmodal) and to analyse how tumour-related variables (notably tumour localisation) and patient-related variables influence emotion recognition. Twenty-six patients with gliomas and 26 matched healthy controls were instructed to identify 5 basic emotions and a neutral expression, which were displayed through visual, auditory and crossmodal stimuli. Relative to the controls, recognition was mildly impaired in the patient group under both the visual and auditory conditions, whereas performance was comparable in the crossmodal condition. Additional analyses using the 'race model' suggest differences in multisensory emotional integration abilities between the groups, potentially correlated with the executive disorders observed in the patients. These observations support the view of compensatory mechanisms in the case of gliomas, which might preserve quality of life and help maintain the normal social and professional lives often observed in these patients.
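The 'race model' analysis mentioned above typically refers to Miller's race-model inequality, which bounds the crossmodal reaction-time distribution by the sum of the two unimodal ones. The sketch below is a minimal illustration under that assumption; the reaction-time data and time grid are simulated placeholders.

```python
import numpy as np

def ecdf(rts, t_grid):
    """Empirical cumulative distribution of reaction times on a time grid."""
    rts = np.sort(np.asarray(rts))
    return np.searchsorted(rts, t_grid, side="right") / rts.size

def race_model_violation(rt_visual, rt_auditory, rt_crossmodal, t_grid):
    # Miller's bound: P(RT_cross <= t) should not exceed the summed unimodal CDFs
    bound = np.minimum(ecdf(rt_visual, t_grid) + ecdf(rt_auditory, t_grid), 1.0)
    # Positive values = violation, taken as evidence of multisensory integration
    return ecdf(rt_crossmodal, t_grid) - bound

t = np.linspace(300, 1200, 50)  # ms
rng = np.random.default_rng(0)
violation = race_model_violation(rng.normal(650, 80, 60),
                                 rng.normal(700, 90, 60),
                                 rng.normal(560, 70, 60), t)
print("max violation:", violation.max())
```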

4.
Sci Rep ; 14(1): 15320, 2024 07 03.
Article in English | MEDLINE | ID: mdl-38961132

ABSTRACT

Age-related changes in emotional processing are complex, with a bias toward positive information. However, the impact of aging on emotional responses in positive everyday situations remains unclear. Virtual Reality (VR) has emerged as a promising tool for investigating emotional processing, offering a unique balance between ecological validity and experimental control. Yet, limited evidence exists regarding its efficacy in eliciting positive emotions in older adults. Our study aimed to explore age-related differences in positive emotional responses to immersion in both social and nonsocial virtual emotional environments. We exposed 34 younger adults and 24 older adults to natural and social 360-degree video content through a low immersive computer screen and a highly immersive Head-Mounted Display, while recording participants' physiological reactions. Participants also provided self-reports of their emotions and sense of presence. The findings support VR's efficacy in eliciting positive emotions in both younger and older adults, with age-related differences in emotional responses driven by the specific video content rather than by the immersion level. These findings underscore the potential of VR as a valuable tool for examining age-related differences in emotional responses and for developing VR applications to enhance emotional wellbeing across diverse user populations.


Subjects
Aging; Emotions; Virtual Reality; Humans; Emotions/physiology; Male; Female; Aged; Adult; Young Adult; Middle Aged; Aging/physiology; Aging/psychology; Age Factors
5.
PLoS One ; 19(2): e0298069, 2024.
Article in English | MEDLINE | ID: mdl-38306322

ABSTRACT

Understanding the influence of emotions on social interactions is important for a global view of the dynamics of human behavior. In this study, we investigated the interplay between emotions, spontaneous approach or avoidance tendencies, and the regulation of interpersonal distance. Fifty-seven healthy adults participated in a three-part experiment involving exposure to approaching or withdrawing emotional faces (neutral, happy, sad, fearful, disgusted, angry). The sequence began with an initial computerized stop-distance task, followed by a postural task in which participants' approach or avoidance tendencies were quantified via center-of-pressure (CoP-Y) displacements on a force platform, and concluded with a final computerized stop-distance task. Our findings revealed a gradient in postural responses, with the most forward CoP-Y displacements for neutral and happy faces, indicative of approach tendencies, followed by smaller forward displacements for sad and fearful faces, and the most pronounced backward displacements for disgusted and angry faces, indicating avoidance. Furthermore, we observed modulations in participants' preferred interpersonal distance based on emotional cues, with neutral and happy faces associated with shorter distances, and disgusted and angry faces linked to larger distances. Despite these parallel patterns, no direct correlation was found between CoP-Y and preferred interpersonal distance, underscoring a dissociation between spontaneous and voluntary social behaviors. These results contribute to a better understanding of how emotional expressions shape social interactions and underscore the importance of considering emotional cues, postural action tendencies, and interpersonal distance in facilitating successful social interactions.
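As a rough illustration of how anteroposterior centre-of-pressure displacement (CoP-Y) can be turned into an approach/avoidance index per emotion, here is a minimal sketch. The column names, sampling rate and baseline window are assumptions for illustration, not the study's processing pipeline.

```python
import numpy as np
import pandas as pd

def mean_cop_shift(cop_y, fs, baseline_s=1.0):
    """Mean CoP-Y during stimulus minus mean CoP-Y during the baseline window (mm)."""
    n_base = int(baseline_s * fs)
    return np.mean(cop_y[n_base:]) - np.mean(cop_y[:n_base])

fs = 100  # Hz, assumed force-platform sampling rate
trials = pd.DataFrame({
    "emotion": ["happy", "angry", "neutral"],
    "cop_y": [np.random.default_rng(i).normal(0, 2, 600) for i in range(3)],
})
trials["shift_mm"] = trials["cop_y"].apply(lambda y: mean_cop_shift(y, fs))
# Positive shifts ~ approach (forward lean), negative shifts ~ avoidance
print(trials.groupby("emotion")["shift_mm"].mean())
```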


Subjects
Anger; Emotions; Adult; Humans; Emotions/physiology; Happiness; Social Behavior; Fear; Facial Expression
6.
Neurocase ; 19(3): 302-12, 2013.
Article in English | MEDLINE | ID: mdl-22554225

ABSTRACT

The present study aimed to analyze the multimodal skills that are spared, altered, or impaired by gliomas slowly infiltrating various, diversely localized areas of the cerebral hemispheres. Ten patients and 60 healthy controls were evaluated using four multimodal processing paradigms across 11 tasks. Our objectives were as follows: (a) to describe the strengths and weaknesses of the glioma patients' multimodal processing performance after accounting for task specificity, comparing their individual performances to those of the control group; (b) to determine the correlation between lesion localization and impairments; and (c) to identify the tasks that were most sensitive to tumour infiltration and to the limits of plasticity. Our results show that patients as a whole were efficient at most tasks; however, they exhibited difficulties in the productive picture-naming task, the receptive verbal judgment task, and the visual/graphic portion of the dual-attention task. The individual case reports show that the difficulties were distributed across the patients and did not correlate with lesion localization or tumour type.


Subjects
Brain Neoplasms/physiopathology; Glioma/physiopathology; Adult; Attention; Electric Stimulation Therapy; Female; Glioma/therapy; Humans; Male; Neuropsychological Tests; Neurosurgery; Photic Stimulation; Reaction Time; Severity of Illness Index; Statistics, Nonparametric; Young Adult
7.
Cyberpsychol Behav Soc Netw ; 26(4): 238-245, 2023 Apr.
Article in English | MEDLINE | ID: mdl-37001171

ABSTRACT

Immersive technologies such as Virtual Reality (VR) have great potential for enhancing users' emotions and wellbeing. However, how immersion, Virtual Environment contents, and sense of presence (SoP) influence emotional responses remains to be clarified in order to efficiently foster positive emotions. Consequently, a total of 26 participants (16 women, 10 men, 22.73 ± 2.69 years old) were exposed to 360-degree videos of natural and social contents on both a highly immersive Head-Mounted Display and a low immersive computer screen. Subjective emotional responses and SoP were assessed after each video using self-reports, while a wearable wristband continuously collected electrodermal activity and heart rate to record physiological emotional responses. Findings supported the added value of immersion, as more positive emotions and greater subjective arousal were reported after viewing the videos in the highly immersive setting, regardless of the video contents. In addition to the natural contents usually employed, the findings also provide initial evidence for the effectiveness of social contents in eliciting positive emotions. Finally, structural equation models shed light on an indirect effect of immersion on subjective arousal through spatial SoP. Overall, these are encouraging results regarding the effectiveness of VR for fostering positive emotions. Future studies should further investigate the influence of user characteristics on VR experiences in order to efficiently foster positive emotions among a broad range of potential users.
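The indirect effect reported above (immersion acting on subjective arousal through spatial SoP) is the classic mediation structure. Below is a minimal regression-based sketch of that idea on simulated data; a full structural equation model would be fitted with dedicated SEM software, and all variable names and coefficients here are illustrative assumptions.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 52                                                # illustrative sample
immersion = rng.integers(0, 2, n)                     # 0 = screen, 1 = HMD
sop = 2.0 * immersion + rng.normal(0, 1, n)           # spatial sense of presence
arousal = 1.5 * sop + 0.2 * immersion + rng.normal(0, 1, n)
df = pd.DataFrame({"immersion": immersion, "sop": sop, "arousal": arousal})

a = smf.ols("sop ~ immersion", df).fit().params["immersion"]       # path a
b = smf.ols("arousal ~ sop + immersion", df).fit().params["sop"]   # path b
print("indirect effect of immersion via SoP (a*b):", a * b)
```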


Subjects
Immersion; Virtual Reality; Male; Humans; Female; Young Adult; Adult; Emotions/physiology; Happiness; Models, Theoretical
8.
Q J Exp Psychol (Hove) ; 74(6): 1128-1139, 2021 Jun.
Article in English | MEDLINE | ID: mdl-33283649

ABSTRACT

Previous research has highlighted age-related differences in social perception, in particular emotional expression processing. To date, such studies have largely focused on approaches that use static emotional stimuli that the participant has to identify passively without the possibility of any interaction. In this study, we propose an interactive virtual environment to better address age-related variations in social and emotional perception. A group of 22 young (18-30 years) and 20 older (60-80 years) adults were engaged in a face-to-face conversation with an embodied conversational agent. Participants were invited to interact naturally with the agent and to identify his facial expression. Their gaze behaviour was captured by an eye-tracking device throughout the interaction. We also explored whether the Big Five personality traits (particularly extraversion) and anxiety modulated gaze during the social interaction. Findings suggested that age-related differences in gaze behaviour were only apparent when decoding social signals (i.e., listening to a partner's question, identifying facial expressions) and not when communicating social information (i.e., when speaking). Furthermore, higher extraversion levels consistently led to a shorter amount of time gazing towards the eyes, whereas higher anxiety levels led to slight modulations of gaze only when participants were listening to questions. Face-to-face conversation with virtual agents can provide a more naturalistic framework for the assessment of online socio-emotional interaction in older adults, which is not easily observable in classical offline paradigms. This study provides novel and important insights into the specific circumstances in which older adults may experience difficulties in social interactions.
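One simple way to quantify "time gazing towards the eyes" from eye-tracking samples is the proportion of samples falling inside an eyes area of interest (AOI). The sketch below illustrates that measure; the AOI rectangle, coordinate system and simulated gaze data are assumptions, not the authors' pipeline.

```python
import numpy as np

def dwell_proportion(gaze_xy, aoi):
    """gaze_xy: (n, 2) screen coordinates; aoi: (x0, y0, x1, y1) rectangle."""
    x, y = gaze_xy[:, 0], gaze_xy[:, 1]
    inside = (x >= aoi[0]) & (x <= aoi[2]) & (y >= aoi[1]) & (y <= aoi[3])
    return inside.mean()

eyes_aoi = (420, 180, 620, 260)  # pixels, assumed position of the agent's eyes
gaze = np.random.default_rng(2).uniform([300, 100], [700, 500], size=(1000, 2))
print(f"proportion of time on the eyes: {dwell_proportion(gaze, eyes_aoi):.1%}")
```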


Subjects
Eye-Tracking Technology; Facial Expression; Aged; Emotions; Eye Movements; Humans; Social Perception
9.
Front Psychol ; 12: 730953, 2021.
Article in English | MEDLINE | ID: mdl-35002834

ABSTRACT

In everyday life, interactions between humans are generally modulated by the value attributed to the situation, which partly relies on the partner's behavior. A pleasant or cooperative partner may trigger an approach behavior in the observer, while an unpleasant or threatening partner may trigger an avoidance behavior. In this context, the correct interpretation of others' intentions is crucial for achieving satisfying social interactions. Social cues such as gaze direction and facial expression are both fundamental and interrelated. Typically, when the gaze direction and facial expression of another person communicate the same intention, the perception of both is enhanced (the shared signal hypothesis). For instance, an angry face with a direct gaze is perceived as more intense because it represents a threat to the observer. In this study, we examine how the combination of others' gaze direction (direct or deviated) and emotional facial expression (happiness, fear, anger, sadness, disgust, or neutrality) influences the observer's gaze perception and postural control. Gaze perception was indexed by the cone of direct gaze (CoDG), the width over which an observer feels someone's gaze is directed at them. A wider CoDG indicates that the observer perceived the face as looking at them over a wider range of gaze directions; conversely, a narrower CoDG indicates a decrease in the range of gaze directions perceived as direct. Postural control was examined through center-of-pressure displacements reflecting postural stability and approach-avoidance tendencies. We also investigated how both gaze perception and postural control vary according to participants' personality traits and emotional states (e.g., openness, anxiety). Our results confirmed that gaze perception is influenced by emotional faces: a wider CoDG was observed with angry and disgusted faces, while a narrower CoDG was observed for fearful faces. Furthermore, facial expressions combined with gaze direction influenced participants' postural stability but not their approach-avoidance behaviors. Results are discussed in the light of the approach-avoidance model, considering how certain personality traits modulate the relation between emotion and posture.
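The cone of direct gaze (CoDG) is commonly quantified as the range of gaze deviations judged as "directed at me" on more than half of the trials. A minimal sketch of that computation follows; the angles and response proportions are invented for illustration only.

```python
import numpy as np

# Gaze deviations tested (degrees) and proportion of "looking at me" responses
angles = np.array([-10, -8, -6, -4, -2, 0, 2, 4, 6, 8, 10])
p_direct = np.array([.05, .10, .30, .70, .95, 1.0, .96, .72, .35, .12, .04])

def codg_width(angles, p_direct, threshold=0.5):
    # Interpolate the response curve and take the span where it exceeds 50%
    grid = np.linspace(angles.min(), angles.max(), 2001)
    p = np.interp(grid, angles, p_direct)
    direct = grid[p >= threshold]
    return direct.max() - direct.min()

print(f"CoDG width: {codg_width(angles, p_direct):.1f} degrees")
```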

10.
Mol Autism ; 11(1): 5, 2020.
Article in English | MEDLINE | ID: mdl-31956394

ABSTRACT

Background: Computer vision combined with human annotation could offer a novel method for exploring facial expression (FE) dynamics in children with autism spectrum disorder (ASD).
Methods: We recruited 157 children with typical development (TD) and 36 children with ASD in Paris and Nice to perform two experimental tasks designed to elicit FEs with emotional valence. FEs were explored through human judges' ratings and through random forest (RF) classifiers. To do so, we located a set of 49 facial landmarks in the task videos, generated a set of geometric and appearance features, and used RF classifiers to explore how children with ASD differed from TD children when producing FEs.
Results: Using multivariate models including other factors known to predict FEs (age, gender, intellectual quotient, emotion subtype, cultural background), ratings from expert raters showed that children with ASD had more difficulty producing FEs than TD children. In addition, when we explored how the RF classifiers performed, we found that the classification tasks, except for sadness, were highly accurate and that the classifiers needed more facial landmarks to achieve the best classification for children with ASD. Confusion matrices showed that when RF classifiers were tested on children with ASD, anger was often confused with happiness.
Limitations: The sample size of the ASD group was smaller than that of the TD group; we tried to compensate for this limitation through several control calculations.
Conclusion: Children with ASD have more difficulty producing socially meaningful FEs. The computer vision methods we used to explore FE dynamics also highlight that the production of FEs in children with ASD carries more ambiguity.
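As a sketch of the classification step described above (facial landmarks turned into features and fed to a random forest), the following is a minimal, self-contained illustration. The feature construction (pairwise landmark distances) and the simulated data are assumptions; they are not the feature set or models actually used in the study.

```python
import numpy as np
from itertools import combinations
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

def geometric_features(landmarks):
    """landmarks: (n_frames, 49, 2) -> mean pairwise landmark distances per video."""
    pairs = list(combinations(range(landmarks.shape[1]), 2))
    d = np.linalg.norm(landmarks[:, [i for i, _ in pairs]] -
                       landmarks[:, [j for _, j in pairs]], axis=-1)
    return d.mean(axis=0)

rng = np.random.default_rng(3)
X = np.stack([geometric_features(rng.normal(size=(30, 49, 2))) for _ in range(40)])
y = np.tile(np.arange(4), 10)  # four expression labels, balanced, for illustration
clf = RandomForestClassifier(n_estimators=200, random_state=0)
print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```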


Subjects
Autism Spectrum Disorder/psychology; Facial Expression; Child; Emotions; Female; Humans; Male
11.
Psychol Neuropsychiatr Vieil ; 7(1): 31-42, 2009 Mar.
Article in French | MEDLINE | ID: mdl-19251570

ABSTRACT

The ability to recognize facial identity and emotional facial expressions is central to social relationships. This paper reviews studies concerning face recognition and emotional facial expression processing during normal aging as well as in neurodegenerative diseases occurring in the elderly, focusing on Alzheimer's disease (AD), frontotemporal dementia (FTD), semantic dementia, and Parkinson's disease. Studies of healthy elderly individuals show subtle alterations in the recognition of facial identity and emotional facial expressions from the age of 50 years, increasing after 70. Studies in neurodegenerative diseases show that, in their initial stages, face recognition and facial expression processing can be specifically affected. Little has been done to assess these difficulties in clinical practice, although they could constitute a useful marker for differential diagnosis, especially for the clinical differentiation of AD from FTD. Social difficulties and some behavioural problems observed in these patients may, at least partly, result from these deficits in face processing. It is therefore important to specify the possible underlying anatomofunctional substrates of these deficits and to plan suitable remediation programs.


Subjects
Aging/psychology; Expressed Emotion/physiology; Face; Facial Expression; Neurodegenerative Diseases/psychology; Recognition, Psychology/physiology; Aged; Alzheimer Disease/psychology; Dementia/psychology; Humans; Middle Aged
12.
Front Psychol ; 9: 446, 2018.
Article in English | MEDLINE | ID: mdl-29670561

ABSTRACT

The production of facial expressions (FEs) is an important skill that allows children to share and adapt emotions with their relatives and peers during social interactions. These skills are impaired in children with Autism Spectrum Disorder. However, the way in which typical children develop and master their production of FEs has not yet been clearly assessed. This study aimed to explore factors that could influence the production of FEs in childhood, such as age, gender, emotion subtype (sadness, anger, joy, and neutral), elicitation task (on request, imitation), area of recruitment (French Riviera and Paris region) and emotion multimodality. A total of 157 children aged 6-11 years were enrolled in Nice and Paris, France. We asked them to produce FEs in two different tasks: imitation with an avatar model and production on request without a model. Results from a multivariate analysis revealed that (1) children performed better with age; (2) positive emotions were easier to produce than negative emotions; (3) children produced FEs better on request than in imitation; and (4) Riviera children performed better than Parisian children, suggesting regional influences on emotion production. We conclude that facial emotion production is a complex developmental process influenced by several factors that need to be acknowledged in future research.

13.
Front Psychol ; 8: 548, 2017.
Article in English | MEDLINE | ID: mdl-28450841

ABSTRACT

The identification of non-verbal emotional signals, and especially of facial expressions, is essential for successful social communication among humans. Previous research has reported an age-related decline in facial emotion identification and has argued for socio-emotional or aging-brain model explanations. However, perceptual differences in the gaze strategies that accompany facial emotion processing with advancing age remain under-explored. In this study, 22 young (22.2 years) and 22 older (70.4 years) adults were instructed to look at basic facial expressions while their gaze movements were recorded by an eye-tracker. Participants were then asked to identify each emotion, and the unbiased hit rate was used as the performance measure. Gaze data were first analyzed using traditional measures of fixations over two preferential regions of the face (upper and lower areas) for each emotion. Then, to better capture core gaze changes with advancing age, spatio-temporal gaze behaviors were examined in greater depth using data-driven analyses (dimension reduction, clustering). The results first confirmed that older adults performed worse than younger adults at identifying facial expressions, except for "joy" and "disgust," and that this was accompanied by a gaze preference toward the lower face. Interestingly, this phenomenon was maintained during the whole time course of stimulus presentation. More importantly, trials from older adults were more tightly clustered, suggesting that the gaze behavior patterns of older adults are more consistent than those of younger adults. This study demonstrates that, when confronted with emotional faces, younger and older adults do not prioritize or ignore the same facial areas. Older adults mainly adopted a focused-gaze strategy, consisting of focusing only on the lower part of the face throughout the whole stimulus display time. This consistency may constitute a robust and distinctive "social signature" of emotion identification in aging. Younger adults, however, were more dispersed in terms of gaze behavior and used a more exploratory-gaze strategy, repeatedly visiting both facial areas.
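The unbiased hit rate (Wagner's Hu) used as the performance measure above corrects raw hit rates for how often each response label is used: for each emotion it is the squared number of correct responses divided by the product of the stimulus-row and response-column totals of the confusion matrix. A minimal sketch, with an invented confusion matrix, is shown below.

```python
import numpy as np

def unbiased_hit_rate(confusion):
    hits = np.diag(confusion).astype(float)
    row_totals = confusion.sum(axis=1)   # how often each emotion was presented
    col_totals = confusion.sum(axis=0)   # how often each response label was used
    return hits ** 2 / (row_totals * col_totals)

# rows = presented emotion, columns = response (e.g. joy, anger, sadness)
conf = np.array([[18, 1, 1],
                 [3, 14, 3],
                 [2, 4, 14]])
print(unbiased_hit_rate(conf))  # typically arcsine-transformed before ANOVA
```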

14.
J Physiol Paris ; 110(4 Pt B): 420-426, 2016 11.
Article in English | MEDLINE | ID: mdl-28625683

ABSTRACT

Imitation plays a critical role in the development of intersubjectivity and serves as a prerequisite for understanding the emotions and intentions of others. In this review, we consider spontaneous motor imitation between children and their peers as a developmental process involving repetition and perspective-taking as well as flexibility and reciprocity. During childhood, this playful dynamic challenges developing visuospatial abilities and requires temporal coordination between partners. As such, we address synchrony as a form of communication and a social signal per se that leads, from an experience of similarity, to the interconnection of minds. In this way, we argue that, from a developmental perspective, rhythmic interpersonal coordination through childhood imitative interactions serves as a precursor to higher-level social and cognitive abilities, such as theory of mind (ToM) and empathy. Finally, to illustrate our idea clinically, we focus on developmental coordination disorder (DCD), a condition characterized not only by learning difficulties but also by childhood deficits in motor imitation. We address the challenges faced by these children at the emotional and socio-interactional level through the perspective of their impairments in intra- and interpersonal synchrony.


Subjects
Child Development/physiology; Empathy/physiology; Imitative Behavior/physiology; Interpersonal Relations; Periodicity; Child; Humans; Peer Group
15.
Brain Res Cogn Brain Res ; 24(3): 663-73, 2005 Aug.
Article in English | MEDLINE | ID: mdl-15890502

ABSTRACT

We investigated the ERP correlates of the subjective perception of upright and upside-down ambiguous pictures as faces, using two-tone Mooney stimuli in an explicit facial decision task (deciding whether or not a face is perceived in the display). The difficulty in perceiving upside-down Mooneys as faces was reflected by both lower rates of "Face" responses and delayed "Face" reaction times for upside-down relative to upright stimuli. The N170 was larger for the stimuli reported as faces. It was also larger for upright than for upside-down stimuli, but only when they were reported as faces. Furthermore, facial decision as well as stimulus orientation effects spread from 140-190 ms to 390-440 ms. The behavioural delay in "Face" responses to upside-down stimuli was reflected in the ERPs by a later effect of facial decision for upside-down relative to upright Mooneys over occipito-temporal electrodes. Moreover, an orientation effect was observed only for the stimuli reported as faces; it yielded a marked hemispheric asymmetry, lasting from 140-190 ms to 390-440 ms post-stimulus onset in the left hemisphere but only from 340-390 to 390-440 ms in the right hemisphere. Taken together, the results support a preferential involvement of the right hemisphere in the detection of faces, whatever their orientation. By contrast, the early orientation effect in the left hemisphere suggests that upside-down Mooney stimuli were processed as non-face objects until a facial decision was reached in this hemisphere. The present data show that face perception involves not only spatially but also temporally distributed activities in occipito-temporal regions.
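ERP effects in fixed latency windows such as 140-190 ms are typically quantified as mean amplitudes over selected electrodes. The following is a minimal numpy sketch of such a measure; the epochs layout, sampling rate and channel indices are assumptions, not the study's actual recording parameters.

```python
import numpy as np

def mean_amplitude(epochs, sfreq, tmin_s, window=(0.140, 0.190), channels=None):
    """epochs: (n_trials, n_channels, n_times); returns one value per trial."""
    times = tmin_s + np.arange(epochs.shape[-1]) / sfreq
    mask = (times >= window[0]) & (times <= window[1])
    data = epochs if channels is None else epochs[:, channels]
    return data[..., mask].mean(axis=(-1, -2))

sfreq, tmin = 500.0, -0.2                         # Hz, epoch start in seconds
epochs = np.random.default_rng(4).normal(size=(80, 64, 450))
occipito_temporal = [56, 57, 62, 63]              # assumed channel indices
print(mean_amplitude(epochs, sfreq, tmin, channels=occipito_temporal).shape)
```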


Subjects
Face; Orientation/physiology; Visual Perception/physiology; Adult; Cues; Data Interpretation, Statistical; Decision Making/physiology; Electroencephalography; Electrophysiology; Evoked Potentials/physiology; Evoked Potentials, Visual/physiology; Female; Functional Laterality/physiology; Humans; Male; Occipital Lobe/physiology; Photic Stimulation; Reaction Time/physiology; Temporal Lobe/physiology
16.
Front Psychol ; 6: 691, 2015.
Article in English | MEDLINE | ID: mdl-26074845

ABSTRACT

Social interactions in daily life necessitate the integration of social signals from different sensory modalities. In the aging literature, it is well established that the recognition of emotion in facial expressions declines with advancing age, and this also occurs with vocal expressions. By contrast, crossmodal integration processing in healthy aging individuals is less well documented. Here, we investigated age-related effects on emotion recognition when faces and voices were presented alone or simultaneously, allowing for crossmodal integration. In this study, 31 young adults (M = 25.8 years) and 31 older adults (M = 67.2 years) were instructed to identify several basic emotions (happiness, sadness, anger, fear, disgust) and a neutral expression, which were displayed as visual (facial expressions), auditory (non-verbal affective vocalizations) or crossmodal (simultaneous, congruent facial and vocal affective expressions) stimuli. The results showed that older adults performed more slowly and worse than younger adults at recognizing negative emotions from isolated faces and voices. In the crossmodal condition, although slower, older adults were as accurate as younger adults, except for anger. Importantly, additional analyses using the "race model" demonstrate that older adults benefited to the same extent as younger adults from the combination of facial and vocal emotional stimuli. These results help explain some conflicting results in the literature and may clarify emotional abilities related to daily life that are partially spared among older adults.

17.
Front Psychol ; 6: 1954, 2015.
Article in English | MEDLINE | ID: mdl-26733928

ABSTRACT

Although deficits in emotion recognition have been widely reported in autism spectrum disorder (ASD), experiments have been restricted to either facial or vocal expressions. Here, we explored multimodal emotion processing in children with ASD (N = 19) and with typical development (TD, N = 19), considering unimodal (faces or voices) and multimodal (faces and voices simultaneously) stimuli as well as developmental comorbidities (neuro-visual, language and motor impairments). Compared to TD controls, children with ASD had rather high and heterogeneous emotion recognition scores but also showed several significant differences: lower emotion recognition scores for visual stimuli and for the neutral emotion, and a greater number of saccades during the visual task. Multivariate analyses showed that (1) the difficulties they experienced with visual stimuli were partially alleviated with multimodal stimuli; (2) developmental age was significantly associated with emotion recognition in TD children, whereas in children with ASD this was the case only for the multimodal task; and (3) language impairments tended to be associated with the emotion recognition scores of children with ASD in the auditory modality. Conversely, in the visual and bimodal (visuo-auditory) tasks, no impact of developmental coordination disorder or neuro-visual impairments was found. We conclude that impaired emotion processing constitutes a dimension to explore in the field of ASD, as research has the potential to define more homogeneous subgroups and tailored interventions. However, it is clear that developmental age, the nature of the stimuli, and other developmental comorbidities must also be taken into account when studying this dimension.

18.
Schizophr Res ; 168(1-2): 252-9, 2015 Oct.
Article in English | MEDLINE | ID: mdl-26297473

ABSTRACT

Recognition of emotional expressions plays an essential role in children's healthy development. Anomalies in these skills may result in empathy deficits, social interaction difficulties and premorbid emotional problems in children and adolescents with schizophrenia. Twenty-six subjects with early onset schizophrenia spectrum (EOSS) disorders and twenty-eight matched healthy controls (HC) were instructed to identify five basic emotions and a neutral expression. The assessment entailed presenting visual, auditory and congruent cross-modal stimuli. Using a generalized linear mixed model, we found no significant association for handedness, age or gender. However, significant associations emerged for emotion type, perception modality, and group. EOSS patients performed worse than HC in uni- and cross-modal emotional tasks, with a specific impairment pattern for the processing of negative emotions. There was no relationship between emotion identification scores and positive or negative symptoms, self-reported empathy traits or a positive history of developmental disorders. However, we found a significant association between emotion identification scores and nonverbal communication impairments. We conclude that cumulative dysfunctions in both nonverbal communication and emotion processing contribute to the social vulnerability and morbidity found in youths with EOSS disorders.


Subjects
Emotions; Facial Recognition; Psychotic Disorders/psychology; Schizophrenic Psychology; Speech Perception; Adolescent; Age of Onset; Child; Facial Expression; Female; Humans; Linear Models; Male; Psychiatric Status Rating Scales; Psychological Tests; Psychotic Disorders/drug therapy; Recognition, Psychology; Schizophrenia/drug therapy; Social Perception
19.
Geriatr Psychol Neuropsychiatr Vieil ; 13(1): 106-15, 2015 Mar.
Article in French | MEDLINE | ID: mdl-25786430

ABSTRACT

Patients with Alzheimer's disease (AD) show cognitive and behavioral disorders which they and their caregivers have difficulty coping with in daily life. Psychological symptoms seem to be increased by impaired emotion processing in patients, an ability that is linked to social cognition and thus essential for maintaining good interpersonal relationships. Non-verbal emotion processing is a genuine means of communication, especially for patients whose language may be rapidly impaired. Many studies focus on emotion identification in AD patients, mostly by means of facial expressions rather than emotional prosody; even fewer consider the production of emotional prosody, despite its key role in interpersonal exchanges, and the literature on this subject is scarce, with contradictory results. The present study compared the performance of 14 AD patients (88.4±4.9 yrs; MMSE: 19.9±2.7) with that of 14 control subjects (87.5±5.1 yrs; MMSE: 28.1±1.4) on tasks of emotion identification from faces and voices (non-linguistic vocal emotion or emotional prosody) and on a task of emotional prosody production (12 sentences to be pronounced in a neutral, positive, or negative tone after a context was read). The AD patients performed worse than the control subjects on all emotion recognition tasks, particularly when identifying emotional prosody. A negative relation was found between the identification scores and the NPI scores (rated by professional caregivers), which underlines their link with psychological and behavioral disorders. The production of emotional prosody seems relatively preserved at the mild to moderate stage of the disease: we found subtle differences in acoustic parameters, but qualitatively the judges rated the patients' productions as being as good as those of the control subjects. These results suggest interesting new directions for improving patients' care.
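The acoustic comparison of prosody productions usually relies on fundamental-frequency (F0) statistics such as mean and range. The sketch below shows one way to extract them with librosa's pYIN pitch tracker; the file path and pitch-range settings are placeholders, not the measures actually reported in the study.

```python
import numpy as np
import librosa

y, sr = librosa.load("patient_sentence_01.wav", sr=None)   # placeholder path
f0, voiced_flag, _ = librosa.pyin(y, fmin=75, fmax=400, sr=sr)
f0_voiced = f0[voiced_flag & ~np.isnan(f0)]

print(f"mean F0: {f0_voiced.mean():.1f} Hz, "
      f"F0 range: {f0_voiced.max() - f0_voiced.min():.1f} Hz")
```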


Subjects
Alzheimer Disease/psychology; Emotions; Aged; Aged, 80 and over; Disease Progression; Facial Expression; Female; Humans; Male; Neuropsychological Tests; Nonverbal Communication; Recognition, Psychology
20.
Neurosci Lett ; 349(2): 125-9, 2003 Oct 02.
Article in English | MEDLINE | ID: mdl-12946568

ABSTRACT

The midlife period has so far not been investigated with regard to associations between brain responses and spared face-processing abilities. This study examines the effects of midlife aging on behavioural performance and event-related potentials (ERPs) during the perception of personally known faces. Ten middle-aged adults (aged 45-60) and 12 young adults (aged 20-30) performed a visual discrimination task based on the detection of modified eye colours. We found that this task was performed as accurately by middle-aged as by young adults. However, midlife aging was associated with specific ERP latency delays and substantial changes in scalp ERP distribution. These results, interpreted according to a compensation hypothesis, indicate that the changes in brain activity observed in middle-aged adults, compared to young adults, may contribute to their maintained behavioural performance.
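ERP latency delays of the kind reported above are often quantified as peak latencies within a search window on the averaged waveform. A minimal sketch follows; the waveform, sampling rate and window are illustrative assumptions.

```python
import numpy as np

def peak_latency(erp, sfreq, tmin_s, window=(0.10, 0.25)):
    """erp: 1-D averaged waveform for one channel; returns peak latency in seconds."""
    times = tmin_s + np.arange(erp.size) / sfreq
    mask = (times >= window[0]) & (times <= window[1])
    return times[mask][np.argmax(np.abs(erp[mask]))]

sfreq, tmin = 250.0, -0.1
erp = np.sin(np.linspace(0, np.pi, 150)) * np.exp(-np.linspace(0, 3, 150))
print(f"peak at {peak_latency(erp, sfreq, tmin) * 1000:.0f} ms")
```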


Subjects
Aging; Brain Mapping; Brain/physiology; Evoked Potentials, Visual/physiology; Pattern Recognition, Visual/physiology; Adult; Behavior/physiology; Discrimination, Psychological/physiology; Electroencephalography; Face; Female; Humans; Male; Middle Aged; Reaction Time