Results 1 - 20 of 37
1.
Cardiol Young ; 34(3): 628-633, 2024 Mar.
Article in English | MEDLINE | ID: mdl-37681464

ABSTRACT

BACKGROUND: Warfarin remains the preferred anticoagulant for many patients with CHD. The complexity of management led our centre to shift from a nurse-physician-managed model with many providers to a pharmacist-managed model with a centralized anticoagulation team. We aim to describe the patient cohort managed by our Anticoagulation Program and evaluate the impact of implementation of this consistent, pharmacist-managed model on time in therapeutic range, an evidence-based marker for clinical outcomes. METHODS: A single-centre retrospective cohort study was conducted to evaluate the impact of the transition to a pharmacist-managed model to improve anticoagulation management at a tertiary pediatric heart centre. The percent time in therapeutic range for a cohort managed by both models was compared using a paired t-test. Patient characteristics and time in therapeutic range of the program were also described. RESULTS: After implementing the pharmacist-managed model, the time in therapeutic range for a cohort of 58 patients increased from 65.7% to 80.2% (p < .001), and our Anticoagulation Program consistently maintained this improvement from 2013 to 2022. The cohort of patients managed by the Anticoagulation Program in 2022 included 119 patients with a median age of 24 years (range 19 months-69 years), with the most common indication for warfarin being mechanical valve replacement (n = 81, 68%). CONCLUSIONS: Through a practice change incorporating a collaborative, centralized, pharmacist-managed model, this cohort of CHD patients on warfarin showed an approximately 15-percentage-point increase in time in therapeutic range, which was sustained for nine years.


Subject(s)
Heart Defects, Congenital , Pharmacists , Child , Humans , Infant , Retrospective Studies , Warfarin/therapeutic use , Heart Defects, Congenital/complications , Heart Defects, Congenital/drug therapy , Anticoagulants/therapeutic use
2.
Psychol Health ; : 1-21, 2023 Apr 05.
Article in English | MEDLINE | ID: mdl-37017223

ABSTRACT

OBJECTIVE: Moving overseas to study can be exciting; however, many international students find this transition stressful. Therefore, empirically supported strategies to assist with managing stress and supporting well-being are needed. Motivated music listening is an effective stress management strategy, and is linked with international student well-being. Tuned In is a group program designed to increase emotion awareness and regulation using motivated music listening. METHODS AND MEASURES: We evaluated a 4-session online version of Tuned In for motivated music use, emotion regulation, and well-being in international students. The study used a 2 (Treatment; Waitlist) x 3 (timepoints: pre = T1; +4 weeks = T2; +8 weeks = T3) randomised controlled cross-over design. Treatment participants (n = 23) completed Tuned In between T1 and T2; Waitlist participants (n = 27) completed Tuned In between T2 and T3. RESULTS: Between T1 and T2, motivated music use increased in Treatment participants but not in Waitlist participants. Treatment participants were also more confident in maintaining happiness and in having healthy ways of managing emotions at T2. All participants enjoyed Tuned In. CONCLUSIONS: Tuned In, a group-based music listening program, even when delivered online, provides benefits for international students. With student well-being at risk as they begin university, enjoyable programs that help develop skills for students' academic journey should be a priority.

3.
Sci Rep ; 12(1): 5911, 2022 04 08.
Article in English | MEDLINE | ID: mdl-35396450

ABSTRACT

Human visual systems have evolved to extract ecologically relevant information from complex scenery. In some cases, the face in the crowd visual search task demonstrates an anger superiority effect, where anger is allocated preferential attention. Across three studies (N = 419), we tested whether facial hair guides attention in visual search and influences the speed of detecting angry and happy facial expressions in large arrays of faces. In Study 1, participants were faster to search through clean-shaven crowds and detect bearded targets than to search through bearded crowds and detect clean-shaven targets. In Study 2, targets were angry and happy faces presented in neutral backgrounds. Facial hair of the target faces was also manipulated. An anger superiority effect emerged that was augmented by the presence of facial hair, which was due to the slower detection of happiness on bearded faces. In Study 3, targets were happy and angry faces presented in either bearded or clean-shaven backgrounds. Facial hair of the background faces was also systematically manipulated. A significant anger superiority effect was revealed, although this was not moderated by the target's facial hair. Rather, the anger superiority effect was larger in clean-shaven than bearded face backgrounds. Together, the results suggest that facial hair does influence detection of emotional expressions in visual search; however, rather than facilitating an anger superiority effect as part of a potential threat-detection system, facial hair may reduce detection of happy faces within the face in the crowd paradigm.


Subject(s)
Facial Expression , Happiness , Anger , Emotions , Face , Hair , Humans , Reaction Time
4.
J Child Lang ; 49(3): 503-521, 2022 05.
Article in English | MEDLINE | ID: mdl-33722310

ABSTRACT

Emotion can influence various cognitive processes. Communication with children often involves exaggerated emotional expressions and emotive language. Children with autism spectrum disorder often show a reduced tendency to attend to emotional information. Typically developing children aged 7 to 9 years who varied in their level of autism-like traits learned the nonsense word names of nine novel toys, which were presented with either happy, fearful, or neutral emotional cues. Emotional cues had no influence on word recognition or recall performance. Eye-tracking data showed differences in visual attention depending on the type of emotional cues and level of autism-like traits. The findings suggest that the influence of emotion on attention during word learning differs according to whether the children have lower or higher levels of autism-like traits, but this influence does not affect word learning outcomes.


Subject(s)
Autism Spectrum Disorder , Autistic Disorder , Autism Spectrum Disorder/psychology , Child , Cues , Emotions , Humans , Language Development
5.
J Exp Child Psychol ; 209: 105184, 2021 09.
Article in English | MEDLINE | ID: mdl-34051681

ABSTRACT

Research using posed emotional expressions is problematic because they lack ecological validity. Adults' recognition of spontaneous real-world expressions may require the inclusion of postural information. Whether posture improves children's recognition of real-world expressions was unknown. Younger children (n = 30; 5- to 7-year-olds), older children (n = 30; 8- to 10-year-olds), and adults (n = 30) judged whether tennis players had won or lost a point. Images showed one of three cue types: Head-only, Body-only, or Head-Body expressions. Recognition of expressions improved with age; older children and adults performed better than younger children. In addition, recognition of Body-only and Head-Body cues was better than Head-only cues for all ages. Spontaneous expression recognition improved throughout childhood and with the inclusion of postural information.


Subject(s)
Emotions , Facial Expression , Adolescent , Adult , Child , Cues , Humans , Posture , Recognition, Psychology
6.
Front Psychol ; 12: 647065, 2021.
Article in English | MEDLINE | ID: mdl-33868120

ABSTRACT

The COVID-19 pandemic brought rapid changes to travel, learning environments, work conditions, and social support, which caused stress for many university students. Research with young people has revealed music listening to be among their most effective strategies for coping with stress. As such, this survey of 402 first-year Australian university students (73.9% female, M age = 19.6; 75% domestic and 25% international) examined the effectiveness of music listening during COVID-19 compared with other stress management strategies, whether music listening for stress management was related to well-being, and whether differences emerged between domestic and international students. We also asked participants to nominate a song that helped them to cope with COVID-19 stress and analyzed its features. Music listening was among the most effective stress coping strategies, and was as effective as exercise, sleep, and changing location. Effectiveness of music listening as a coping strategy was related to better well-being but not to level of COVID-19 related stress. Although international students experienced higher levels of COVID-19 stress than domestic students, well-being was comparable in the two cohorts. Nominated songs tended to be negative in valence and moderate in energy. No correlations were found between any self-report measure and the valence and energy of nominated coping songs. These findings suggest that although domestic and international students experienced different levels of stress resulting from COVID-19, music listening remained an effective strategy for both cohorts, regardless of the type of music they used for coping.

7.
J Exp Child Psychol ; 201: 104969, 2021 01.
Article in English | MEDLINE | ID: mdl-32916594

ABSTRACT

Overclaiming is the phenomenon whereby people claim more knowledge of a topic than they actually have. In adults, this behavior is related to the extent to which they consider themselves an expert on that topic and may be related to impression management. We investigated the emergence of this phenomenon by developing a child-friendly overclaiming questionnaire (OCQ), the Child-OCQ. We measured the tendency of children (5-10 years of age; N = 94) to claim knowledge of items that did not exist for a variety of topics (places, characters, animals, food, and musical instruments). We also examined the relationship between children's overclaiming of knowledge and their self-perceived liking of, and expertise in, the topics. To validate our scale, an adult sample (N = 51) completed both the Child-OCQ and a standardized adult OCQ, the OCQ-150, showing similar overclaiming patterns on both measures. Although overclaiming behaviors decreased throughout childhood, even children as old as 10 years were not adult-like and were more likely to overclaim knowledge than adults. In addition, we did not find strong evidence that children's perceived expertise on a topic influenced their tendency to overclaim knowledge, suggesting that the mechanisms behind the overclaiming phenomenon are different in children and do not reflect impression management until later during adolescence or adulthood.


Subject(s)
Emotions , Knowledge , Child , Female , Humans , Male , Surveys and Questionnaires , Young Adult
8.
PLoS One ; 15(7): e0235390, 2020.
Article in English | MEDLINE | ID: mdl-32609780

ABSTRACT

Whether language information influences recognition of emotion from facial expressions remains the subject of debate. The current studies investigate how variations in the emotion labels that are paired with expressions influence participants' judgments of the emotion displayed. Static (Study 1) and dynamic (Study 2) facial expressions depicting eight emotion categories were paired with emotion labels that systematically varied in arousal (low and high). Participants rated the arousal, valence, and dominance of expressions paired with labels. Isolated faces and isolated labels were also rated. As predicted, the label presented influenced participants' judgments of the expressions. Across both studies, higher arousal labels were associated with: 1) higher ratings of arousal for sad, angry, and scared expressions, and 2) higher ratings of dominance for angry, proud, and disgust expressions. These results indicate that emotion labels influence judgments of facial expressions.


Subject(s)
Arousal , Facial Expression , Judgment , Pattern Recognition, Visual , Rage , Adult , Attentional Bias , Female , Humans , Male , Middle Aged , Young Adult
9.
J Exp Child Psychol ; 198: 104879, 2020 10.
Article in English | MEDLINE | ID: mdl-32590198

ABSTRACT

Research examining children's emotion judgments has generally used nonsocial tasks that do not resemble children's daily experiences in judging others' emotions. Here, younger children (4- to 6-year-olds) and older children (7- to 9-year-olds) participated in a socially interactive task where an experimenter opened boxes and made an expression (happy, sad, scared, or disgust) based on the object inside. Children guessed which of four objects (a sticker, a broken toy car, a spider, or toy poop) was in the box. Subsequently, children opened a set of boxes and generated facial expressions for the experimenter. Children also labeled the emotion elicited by the objects and static facial expressions. Children's ability to guess which object caused the experimenter's expression increased with age but did not predict their ability to generate a recognizable expression. Children's demonstration of emotion knowledge also varied across tasks, suggesting that when emotion judgment tasks more closely mimic their daily experiences, children demonstrate broader emotion knowledge.


Subject(s)
Child Development , Emotions , Facial Expression , Recognition, Psychology , Adolescent , Child , Child Development/physiology , Child, Preschool , Emotions/physiology , Female , Humans , Male , Recognition, Psychology/physiology
10.
Cogn Emot ; 34(5): 906-919, 2020 08.
Article in English | MEDLINE | ID: mdl-31805815

ABSTRACT

Previous research on the development of emotion recognition in music has focused on classical, rather than popular, music. Such research does not consider the impact of lyrics on judgements of emotion in music, an impact that may differ throughout development. We had 172 children, adolescents, and adults (7- to 20-year-olds) judge emotions in popular music. In song excerpts, the melody of the music and the lyrics had either congruent valence (e.g. happy lyrics and melody), or incongruent valence (e.g. scared lyrics, happy melody). We also examined participants' judgements of vocal bursts, and whether emotion identification was linked to emotion lexicon. Recognition of emotions in congruent music increased with age. For incongruent music, age was positively associated with judging the emotion in music by the melody. For incongruent music with happy or sad lyrics, younger participants were more likely to answer with the emotion of the lyrics. For scared incongruent music, older adolescents were more likely to answer with the lyrics than older and younger participants. Age groups did not differ on their emotion lexicons, nor recognition of emotion in vocal bursts. Whether children use lyrics or melody to determine the emotion of popular music may depend on the emotion conveyed.


Subject(s)
Auditory Perception/physiology , Emotions/physiology , Music/psychology , Recognition, Psychology/physiology , Singing/physiology , Adolescent , Adult , Age Factors , Child , Cues , Female , Humans , Judgment/physiology , Male , Young Adult
11.
J Exp Child Psychol ; 191: 104737, 2020 03.
Article in English | MEDLINE | ID: mdl-31783253

ABSTRACT

The ability to explicitly recognize emotions develops gradually throughout childhood, and children usually have greater difficulty in recognizing emotions from the voice than from the face. However, little is known about how children integrate vocal and facial cues to recognize an emotion, particularly during mid to late childhood. Furthermore, children with an autism spectrum disorder often show a reduced ability to recognize emotions, especially when integrating emotion from multiple modalities. The current preliminary study explored the ability of typically developing children aged 7-9 years to match emotional tones of voice to facial expressions and whether this ability varies according to the level of autism-like traits. Overall, children were the least accurate when matching happy and fearful voices to faces, commonly pairing happy voices with angry faces and fearful voices with sad faces. However, the level of autism-like traits was not associated with matching accuracy. These results suggest that 7- to 9-year-old children have difficulty in integrating vocal and facial emotional expressions but that differences in cross-modal emotion matching in relation to the broader autism phenotype are not evident in this task for this age group with the current sample.


Subject(s)
Auditory Perception/physiology , Autism Spectrum Disorder/physiopathology , Child Development/physiology , Emotions/physiology , Facial Expression , Facial Recognition/physiology , Social Perception , Child , Female , Humans , Male
12.
J Int Neuropsychol Soc ; 26(2): 226-240, 2020 02.
Article in English | MEDLINE | ID: mdl-31727185

ABSTRACT

BACKGROUND: Language and communication are fundamental to the human experience, and, traditionally, spoken language is studied as an isolated skill. However, before propositional language (i.e., spontaneous, voluntary, novel speech) can be produced, propositional content or 'ideas' must be formulated. OBJECTIVE: This review highlights the role of broader cognitive processes, particularly 'executive attention', in the formulation of propositional content (i.e., 'ideas') for propositional language production. CONCLUSIONS: Several key lines of evidence converge to suggest that the formulation of ideas for propositional language production draws on executive attentional processes. Larger-scale clinical research has demonstrated a link between attentional processes and language, while detailed case studies of neurological patients have elucidated specific idea formulation mechanisms relating to the generation, selection and sequencing of ideas for expression. Furthermore, executive attentional processes have been implicated in the generation of ideas for propositional language production. Finally, neuroimaging studies suggest that a widely distributed network of brain regions, including parts of the prefrontal and parietal cortices, supports propositional language production. IMPLICATIONS: Theoretically driven experimental research studies investigating mechanisms involved in the formulation of ideas are lacking. We suggest that novel experimental approaches are needed to define the contribution of executive attentional processes to idea formulation, from which comprehensive models of spoken language production can be developed. Clinically, propositional language impairments should be considered in the context of broader executive attentional deficits.


Subject(s)
Aging/physiology , Aphasia/physiopathology , Attention/physiology , Concept Formation/physiology , Executive Function/physiology , Psycholinguistics , Verbal Behavior/physiology , Humans
13.
Cell Immunol ; 345: 103962, 2019 11.
Article in English | MEDLINE | ID: mdl-31582169

ABSTRACT

Previous in vivo studies established that inactivated Francisella tularensis immune complexes (mAb-iFt) are a more protective vaccine against lethal tularemia than iFt alone. Subsequent in vitro studies revealed enhanced DC maturation marker expression with mAb-iFt stimulation. The goal of this study was to determine the mechanism of enhanced DC maturation. Multiparameter analysis of surface marker expression and cytokine secretion demonstrates a requirement for FcγR signaling in enhanced DC maturation. MyD88 was also found to be essential for heightened DC maturation, implicating MyD88-dependent TLRs in DC maturation. Upon further study, we discovered that TLRs 2 & 4 drive cytokine secretion, but surprisingly TLR9 is required for DC maturation marker upregulation. These studies reveal a separation of DC cytokine and maturation marker induction pathways and demonstrate that FcγR-TLR/MyD88 synergy underlies the enhanced dendritic cell maturation in response to the mAb-iFt vaccine.


Subject(s)
Cell Differentiation/immunology , Dendritic Cells/immunology , Myeloid Differentiation Factor 88/immunology , Receptors, IgG/immunology , Toll-Like Receptor 9/immunology , Animals , Antibodies, Monoclonal/immunology , Bacterial Vaccines/immunology , Cytokines/immunology , Cytokines/metabolism , Dendritic Cells/metabolism , Francisella tularensis/immunology , Humans , Mice, Inbred C57BL , Mice, Knockout , Mice, Transgenic , Myeloid Differentiation Factor 88/metabolism , Receptors, IgG/genetics , Toll-Like Receptor 2/genetics , Toll-Like Receptor 2/immunology , Toll-Like Receptor 2/metabolism , Toll-Like Receptor 4/genetics , Toll-Like Receptor 4/immunology , Toll-Like Receptor 4/metabolism , Toll-Like Receptor 9/genetics , Toll-Like Receptor 9/metabolism , Tularemia/immunology , Tularemia/microbiology
14.
Behav Processes ; 164: 193-200, 2019 Jul.
Article in English | MEDLINE | ID: mdl-31075385

ABSTRACT

Research examining children's understanding of emotional expressions has generally used static, isolated facial expressions presented in a non-interactive context. However, these tasks do not resemble children's experiences with expressions in daily life, where they must attend to a range of information, including others' facial expressions, movements, and the situation surrounding the expression. In this research, we examine the development of visual attention to another's emotional expressions during a live interaction. Via an eye-tracker, children (4-11 years old) and adults viewed an experimenter open a series of opaque boxes and make an expression (happiness, sadness, fear, or disgust) based on the object inside. Participants determined which of four possible objects (stickers, a broken toy, a spider, or dog poop) was in the box. We examined the proportion of the trial in which participants looked to three areas of interest (AOIs) on the face (the eyes, mouth, and nose area), and to the available contextual information (the box held by the experimenter, the four objects). Although children spent less time looking to the face than adults did, their pattern of visual attention within the face and to object AOIs did not differ from that of adults. Finally, for adults, increased accuracy was linked to spending less time looking to the objects, whereas increased accuracy for children was not strongly linked to any emotion cue. These data indicate that although children spend less time looking to the face during live interactions than adults do, the proportion of time spent looking to areas of the face and context is generally adult-like.


Subject(s)
Emotions/physiology , Eye Movements/physiology , Facial Expression , Judgment , Perception/physiology , Adult , Attention , Child , Child, Preschool , Female , Humans , Male
15.
Horm Behav ; 113: 55-66, 2019 07.
Article in English | MEDLINE | ID: mdl-30978339

ABSTRACT

Mating strategy theories assert that women's preferences for androgen dependent traits in men are stronger when the costs of reduced paternal investment are lowest. Past research has shown that preferences for facial masculinity are stronger among nulliparous and non-pregnant women than pregnant or parous women. In two studies, we examine patterns in women's preferences for men's facial hair - likely the most visually conspicuous and sexually dimorphic of men's secondary sexual traits - when evaluating men's masculinity, dominance, age, fathering, and attractiveness. Two studies were conducted among heterosexual pregnant women, mothers, non-contraceptive users, and contraceptive users. Study 1 used a between-subjects sample (N = 2103) and found that mothers had significantly higher preferences for beards when judging fathering than all other women. Pregnant women and mothers also judged beards as more masculine and older, but less attractive, than non-contraceptive and contraceptive users. Parous women judged beards higher for age, masculinity and fathering, but lower for attractiveness, than nulliparous women. Irrespective of reproductive status, beards were judged as looking more dominant than clean-shaven faces. Study 2 used a within-subjects design (N = 53) among women surveyed during pregnancy and three months post-partum. Judgments of parenting skills were higher for bearded stimuli during pregnancy among women having their first baby, whereas among parous women parenting skills judgments for bearded stimuli were higher post-partum. Our results suggest that mothers are sensitive to beardedness as a masculine secondary sexual characteristic that may denote parental investment, providing evidence that women's mate preferences could reflect sexual selection for direct benefits.


Subject(s)
Cues , Face , Hair , Judgment/physiology , Mothers/psychology , Paternal Behavior/psychology , Adult , Choice Behavior/physiology , Fathers/psychology , Female , Humans , Male , Masculinity , Parity/physiology , Pregnancy , Sexual Partners , Young Adult
16.
Psychol Sci ; 30(5): 728-738, 2019 05.
Article in English | MEDLINE | ID: mdl-30908116

ABSTRACT

The beard is arguably one of the most obvious signals of masculinity in humans. Almost 150 years ago, Darwin suggested that beards evolved to communicate formidability to other males, but no studies have investigated whether beards enhance recognition of threatening expressions, such as anger. We found that the presence of a beard increased the speed and accuracy with which participants recognized displays of anger but not happiness (Experiment 1, N = 219). This effect was not due to negative evaluations shared by beardedness and anger or to negative stereotypes associated with beardedness, as beards did not facilitate recognition of another negative expression, sadness (Experiment 2, N = 90), and beards increased the rated prosociality of happy faces in addition to the rated masculinity and aggressiveness of angry faces (Experiment 3, N = 445). A computer-based emotion classifier reproduced the influence of beards on emotion recognition (Experiment 4). The results suggest that beards may alter perceived facial structure, facilitating rapid judgments of anger in ways that conform to evolutionary theory.


Subject(s)
Aggression/psychology , Agonistic Behavior/physiology , Emotions/physiology , Facial Recognition/physiology , Adult , Anger/physiology , Facial Expression , Female , Humans , Male , Masculinity , Middle Aged , Reaction Time/physiology , Sex Characteristics , Socialization
17.
J Exp Child Psychol ; 180: 19-38, 2019 04.
Article in English | MEDLINE | ID: mdl-30611111

ABSTRACT

Adults' first impressions of others are influenced by subtle facial expressions; happy faces are perceived as high in trustworthiness, whereas angry faces are rated as low in trustworthiness and high in threat and dominance. Little is known about the influence of emotional expressions on children's first impressions. Here we examined the influence of subtle expressions of happiness, anger, and fear on children's implicit judgments of trustworthiness and dominance with the aim of providing novel insights about both the development of first impressions and whether children are able to utilize emotional expressions when making implicit, rather than explicit, judgments of traits. In the context of a computerized storybook, children (4- to 11-year-olds) and adults selected one of two twins (two images of the same identity displaying different emotional expressions) to help them face a challenge; some challenges required a trustworthy partner, and others required a dominant partner. One twin posed a neutral expression, and the other posed a subtle emotional expression of happiness, fear, or anger. Whereas adults were more likely to select a happy partner on trust trials than on dominance trials and were more likely to select an angry partner on dominance trials than on trust trials, we found no evidence that children's choices reflected a combined influence of desirable trait and emotion. Follow-up experiments involving explicit trait judgments, explicit emotion recognition, and implicit first impression judgments in the context of intense emotional expressions provide valuable insights into the slow development of implicit trait judgments based on first impressions.


Subject(s)
Emotions/physiology , Facial Expression , Trust/psychology , Adult , Anger/physiology , Attitude , Child , Child, Preschool , Fear/physiology , Female , Happiness , Humans , Judgment/physiology , Male , Problem Solving , Social Dominance
18.
Cogn Emot ; 33(6): 1144-1154, 2019 09.
Article in English | MEDLINE | ID: mdl-30563417

ABSTRACT

We examined the utility of a gaze cueing paradigm to examine sensitivity to differences among negatively valenced expressions. Participants judged target stimuli (dangerous or safe), the location of which was cued by the gaze direction of a central face. Dawel et al. reported that gaze cueing effects (faster response times on valid vs. invalid trials) were larger when the central face displayed fear than when it displayed happiness. Our aim was to determine whether this effect was specific to fear, to all threat-related expressions (fear, anger), or to all negatively valenced expressions (fear, anger, sadness, disgust) with the aim of using this protocol to study the development of implicit discrimination of negatively valenced expressions. Across five experiments in which we varied the number of models (1 vs. 4), the number of expressions (2 vs. 5), and the country of residence of participants (Canada vs. Australia) we found no evidence that the magnitude of gaze cueing effects is modulated by expression. We discuss our failure to replicate in the context of the broader literature.


Subject(s)
Cues , Emotions/physiology , Facial Expression , Fixation, Ocular/physiology , Judgment/physiology , Adolescent , Adult , Anger/physiology , Australia , Canada , Disgust , Fear/physiology , Fear/psychology , Female , Happiness , Humans , Male , Reaction Time/physiology , Students/psychology , Students/statistics & numerical data , Young Adult
19.
Neuropsychologia ; 119: 349-362, 2018 10.
Article in English | MEDLINE | ID: mdl-30195029

ABSTRACT

Energization is the process of initiating and sustaining a response over time. It has been described as one of three key "supervisory" attentional control processes associated with the frontal lobes. Attentional mechanisms, such as energization, are critical for a range of cognitive functions, such as spontaneous speech and other higher-order tasks. We aimed to investigate the process of energization in a case series of patients with progressive supranuclear palsy (PSP). Patients with a diagnosis of PSP (N = 5), patient controls with a neurodegenerative condition (Alzheimer's disease N = 3, frontotemporal dementia N = 2) and healthy older adult controls (N = 30) were assessed on a standard neuropsychological battery, including executive tasks and standard attention and language tests. Energization was investigated using word fluency tasks, samples of spontaneous speech and an experimental button-pressing concentration task. Response rates for the word fluency, spontaneous speech and concentration tasks were separated into time periods, in order to compare response rates at different points across the tasks (e.g., first 15 s vs. last 45 s in a 60 s task). Four PSP patients showed a clear response pattern indicative of a decrease in energization. Healthy and patient controls remained consistent in their responding over time. Understanding how these underlying processes are impaired in PSP can ultimately inform intervention and management strategies, and has theoretical implications for models of spoken language production.


Subject(s)
Attention , Speech , Supranuclear Palsy, Progressive/psychology , Aged , Aged, 80 and over , Alzheimer Disease/psychology , Brain/diagnostic imaging , Executive Function , Female , Frontotemporal Dementia/psychology , Humans , Male , Middle Aged , Supranuclear Palsy, Progressive/diagnostic imaging
20.
J Autism Dev Disord ; 48(8): 2611-2618, 2018 08.
Article in English | MEDLINE | ID: mdl-29492733

ABSTRACT

The current study investigated whether those with higher levels of autism-like traits process emotional information from speech differently to those with lower levels of autism-like traits. Neurotypical adults completed the autism-spectrum quotient and an emotional priming task. Vocal primes with varied emotional prosody, semantics, or a combination, preceded emotional target faces. Prime-target pairs were congruent or incongruent in their emotional content. Overall, congruency effects were found for combined prosody-semantic primes, however no congruency effects were found for semantic or prosodic primes alone. Further, those with higher levels of autism-like traits were not influenced by the prime stimuli. These results suggest that failure to integrate emotional information across modalities may be characteristic of the broader autism phenotype.


Subject(s)
Autistic Disorder/psychology , Facial Recognition , Semantics , Speech Perception , Adult , Cognition , Cues , Emotions , Female , Humans , Male , Phenotype , Task Performance and Analysis , Young Adult