1 - 16 of 16
1.
Cogn Affect Behav Neurosci ; 23(5): 1322-1345, 2023 10.
Article En | MEDLINE | ID: mdl-37526901

While a delicious dessert presented to us may elicit strong feelings of happiness and excitement, the same treat slowly falling away can lead to sadness and disappointment. Our emotional response to the item depends on its direction of visual motion. Despite this importance, it remains unclear whether (and how) cortical areas devoted to decoding motion direction represent or integrate emotion with perceived motion direction. Motion-selective visual area V5/MT+ sits, both functionally and anatomically, at the nexus of the dorsal and ventral visual streams. These pathways, however, differ in how they are modulated by emotional cues. The current study was designed to disentangle how emotion and motion perception interact, and to use emotion-dependent modulation of visual cortices to clarify the relation of V5/MT+ to the canonical processing streams. During functional magnetic resonance imaging (fMRI), approaching, receding, or static motion after-effects (MAEs) were induced on stationary positive, negative, and neutral stimuli. An independent localizer scan was conducted to identify visual-motion area V5/MT+. Through univariate and multivariate analyses, we demonstrated that emotion representations in V5/MT+ share a more similar response profile with ventral than with dorsal visual structures. Specifically, V5/MT+ and ventral structures were sensitive to the emotional content of visual stimuli, whereas dorsal visual structures were not. Overall, this work highlights the critical role of V5/MT+ in the representation and processing of visually acquired emotional content. It further suggests a role for this region in utilizing affectively salient visual information to augment motion perception of biologically relevant stimuli.
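To make the kind of ROI-level univariate comparison described above concrete, here is a minimal sketch (not the authors' pipeline) of contrasting mean ROI responses to emotional versus neutral conditions with a paired t-test; the sample size and beta values are simulated placeholders.

```python
# Hedged sketch: ROI-level univariate test of emotional sensitivity,
# in the spirit of the abstract above (not the authors' actual pipeline).
# Per-subject mean betas for each emotion condition within an ROI
# (e.g., V5/MT+) are compared with paired t-tests.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_subjects = 24  # placeholder sample size

# Placeholder data: one mean GLM beta per subject and condition.
betas = {
    "negative": rng.normal(0.6, 0.3, n_subjects),
    "positive": rng.normal(0.5, 0.3, n_subjects),
    "neutral":  rng.normal(0.3, 0.3, n_subjects),
}

# Emotional (average of negative and positive) vs. neutral contrast.
emotional = (betas["negative"] + betas["positive"]) / 2
t, p = stats.ttest_rel(emotional, betas["neutral"])
print(f"Emotional vs neutral in ROI: t({n_subjects - 1}) = {t:.2f}, p = {p:.3f}")
```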


Motion Perception , Visual Cortex , Humans , Motion Perception/physiology , Magnetic Resonance Imaging/methods , Visual Cortex/diagnostic imaging , Visual Cortex/physiology , Emotions , Happiness , Photic Stimulation/methods , Visual Pathways/physiology
2.
Ann N Y Acad Sci ; 1528(1): 29-41, 2023 10.
Article En | MEDLINE | ID: mdl-37596987

An emerging view in cognitive neuroscience holds that the extraction of emotional relevance from sensory experience extends beyond the centralized appraisal of sensation in associative brain regions, including frontal and medial-temporal cortices. On this view, sensory information can be emotionally valenced from the point of contact with the world. This is supported by recent research characterizing the human affiliative touch system, which carries signals of soft, stroking touch to the central nervous system and is mediated by dedicated C-tactile afferent receptors. This basic scientific research on the human affiliative touch system is informed by, and informs, technology design for communicating and regulating emotion through touch. Here, we review recent research on the basic biology and cognitive neuroscience of affiliative touch, its regulatory effects across the lifespan, and the factors that modulate it. We further review recent work on the design of haptic technologies - devices that stimulate the affiliative touch system, such as wearable technologies that apply the sensation of soft stroking or other skin-to-skin contact - to promote physiological regulation. We then point to future directions in interdisciplinary research aimed at furthering both scientific understanding and the application of haptic technology for health and wellbeing.


Stroke , Touch Perception , Humans , Touch/physiology , Haptic Technology , Touch Perception/physiology , Skin , Emotions/physiology , Physical Stimulation
3.
eNeuro ; 10(1)2023 01.
Article En | MEDLINE | ID: mdl-36549914

The ability to interrogate specific representations in the brain, determining how, and where, different sources of information are instantiated, can provide invaluable insight into neural functioning. Pattern component modeling (PCM) is a recent analytic technique for human neuroimaging that allows the decomposition of representational patterns in the brain into contributing subcomponents. In the current study, we present a novel PCM variant that tracks the contribution of prespecified representational patterns to brain representation across areas, thus allowing hypothesis-guided application of the technique. We apply this technique to investigate the contributions of hedonic and nonhedonic information to the neural representation of tactile experience. We applied aversive pressure (AP) and appetitive brush (AB) to stimulate distinct peripheral nerve pathways for tactile information (C-/CT-fibers, respectively) while participants underwent functional magnetic resonance imaging (fMRI) scanning. We performed representational similarity analyses (RSAs) with pattern component modeling to dissociate how discriminatory versus hedonic tactile information contributes to population code representations in the human brain. Results demonstrated that information about appetitive and aversive tactile sensation is represented separately from nonhedonic tactile information across cortical structures. These findings also demonstrate the potential of new hypothesis-guided PCM variants to help delineate how information is instantiated in the brain.
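As an illustration of the general idea, the following is a hedged, RSA-style sketch of regressing an observed representational dissimilarity matrix onto prespecified model components; the condition labels, patterns, and model matrices are illustrative assumptions, not the authors' PCM implementation.

```python
# Hedged sketch of an RSA-style decomposition, loosely analogous to the
# hypothesis-guided pattern component approach described above (not the
# authors' implementation). Condition labels and data are placeholders.
import numpy as np
from scipy.spatial.distance import pdist

rng = np.random.default_rng(1)

# Four conditions: aversive pressure, appetitive brush, neutral pressure, neutral brush.
conditions = ["AP", "AB", "neutral_pressure", "neutral_brush"]
n_voxels = 200
patterns = rng.normal(size=(len(conditions), n_voxels))  # placeholder ROI patterns

# Observed representational dissimilarity matrix (correlation distance),
# vectorized over the upper triangle.
observed_rdm = pdist(patterns, metric="correlation")

# Model RDMs: 1 where two conditions differ on that dimension, 0 otherwise.
hedonic  = np.array([1, 1, 0, 0])   # carries hedonic (affective) content
modality = np.array([0, 1, 0, 1])   # brush vs. pressure (discriminative touch)

def model_rdm(labels):
    return pdist(labels.reshape(-1, 1), metric="cityblock") > 0

X = np.column_stack([model_rdm(hedonic), model_rdm(modality), np.ones_like(observed_rdm)])
weights, *_ = np.linalg.lstsq(X.astype(float), observed_rdm, rcond=None)
print(dict(zip(["hedonic", "discriminative", "intercept"], np.round(weights, 3))))
```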


Brain Mapping , Brain , Humans , Brain Mapping/methods , Brain/diagnostic imaging , Brain/physiology , Touch , Magnetic Resonance Imaging/methods , Neuroimaging
4.
eNeuro ; 2022 Aug 25.
Article En | MEDLINE | ID: mdl-36028330

In times of stress or danger, the autonomic nervous system (ANS) signals the fight or flight response. A canonical function of ANS activity is to globally mobilize metabolic resources, preparing the organism to respond to threat. Yet a body of research has demonstrated that, rather than displaying a homogeneous pattern across the body, autonomic responses to arousing events - as measured through changes in electrodermal activity (EDA) - can differ between right and left body locations. Surprisingly, a function of ANS asymmetry consistent with its metabolic role has not been investigated. In the current study, we investigated whether asymmetric autonomic responses could be induced through limb-specific aversive stimulation. Participants were given mild electric stimulation to either the left or right arm while EDA was monitored bilaterally. In group-level analyses, an ipsilateral EDA response bias was observed, with an increased EDA response in the hand adjacent to the stimulation. This effect was observable in ∼50% of individual participants. These results demonstrate that autonomic output is more complex than canonical interpretations suggest. We suggest that, in stressful situations, autonomic outputs can prepare either a whole-body fight or flight response or simply a limb-localized flick, which can effectively neutralize the threat while minimizing global resource consumption. These findings are consistent with recent theories proposing evolutionary leveraging of neural structures organized to mediate sensory responses for the processing of cognitive emotional cues.

Significance statement: The present study constitutes novel evidence for an autonomic nervous response specific to the side of the body exposed to direct threat. We identify a robust pattern of electrodermal response at the body location that directly receives aversive tactile stimulation. Thus, we demonstrate for the first time in contemporary research that the ANS is capable of location-specific outputs within single effector organs in response to small-scale threat. This extends the canonical view of the role of ANS responses in stressful or dangerous situations - that of provoking a 'fight or flight' response - suggesting a further role of this system: preparation of targeted, limb-specific action, i.e., a flick.
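For illustration, here is a minimal sketch of how an ipsilateral EDA bias might be quantified from bilateral skin conductance responses; the trial structure, amplitudes, and bias index are assumptions for the example, not the study's pipeline.

```python
# Hedged sketch: quantifying an ipsilateral EDA bias from bilateral skin
# conductance responses (SCRs). Data and trial counts are placeholders.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n_trials = 40
stim_side = rng.choice(["left", "right"], n_trials)   # arm receiving the shock

# Placeholder SCR amplitudes (microsiemens) recorded from each hand per trial.
scr_left  = rng.gamma(2.0, 0.2, n_trials)
scr_right = rng.gamma(2.0, 0.2, n_trials)

# Ipsilateral minus contralateral response on each trial.
ipsi   = np.where(stim_side == "left", scr_left, scr_right)
contra = np.where(stim_side == "left", scr_right, scr_left)
bias = ipsi - contra

t, p = stats.ttest_1samp(bias, 0.0)   # group-level test of an ipsilateral bias
print(f"Mean ipsilateral bias = {bias.mean():.3f} uS, t = {t:.2f}, p = {p:.3f}")
print(f"Trials with ipsilateral > contralateral: {(bias > 0).mean():.0%}")
```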

5.
J Exp Psychol Hum Percept Perform ; 47(8): 1113-1131, 2021 Aug.
Article En | MEDLINE | ID: mdl-34516217

There is substantial evidence demonstrating that emotional information influences perception. Yet across studies, findings of how it does so have been highly inconsistent. In particular, emotional context (task-unrelated emotional information in the environment) has a variable influence on spatial perceptual accuracy, sometimes improving and sometimes impairing the ability to localize objects. Here, we tested the hypothesis that the heterogeneous nature of emotional influences on target localization is shaped by the specific combination of sensory modalities used in the task. In the present series of experiments, we used a cross-modal localization task to identify how emotional context influences the accuracy of spatial perception. By presenting nonemotional target stimuli alongside emotional nonspatial distractor items (facial expressions or vocalizations), we were able to systematically investigate how emotional stimuli presented to individual sensory modalities modulate spatial perception at distinct stages of perception and action. In three separate experiments, distractor items were presented prior to target presentation, during target presentation, or afterward during the localization response. Intramodal emotional distractors influenced localization accuracy when they overlapped in timing with targets, and the direction of this effect was both modality and valence specific (Experiment I). Additionally, targeted contrasts revealed that auditory but not visual emotional distractors influenced localization of visual targets when presented during the behavioral response, with negative cues improving localization accuracy compared to neutral or positive cues (Experiment II). We suggest such effects reflect distinct patterns of unimodal versus multimodal processing in brain regions involved in early versus late stages of perceptual processing. (PsycInfo Database Record (c) 2021 APA, all rights reserved).


Attention , Emotions , Cues , Facial Expression , Humans , Space Perception
6.
PLoS One ; 16(1): e0245330, 2021.
Article En | MEDLINE | ID: mdl-33444407

Nurses and surgeons must identify and handle specialized instruments with high temporal and spatial precision, so it is crucial that they are trained effectively. Traditional training methods include supervised practice and text-based study, which may expose patients to undue risk during practice procedures and lack motor/haptic training, respectively. Tablet-based simulations have been proposed to mitigate some of these limitations. We implemented a learning task that simulates the surgical instrumentation nomenclature encountered by novice perioperative nurses. Learning was assessed following training in three distinct conditions: tablet-based simulations, text-based study, and real-world practice. Immediately following a 30-minute training period, instrument identification was performed with comparable accuracy and response times after tablet-based versus text-based training, with both being inferior to real-world practice. Following a week without practice, response times were equivalent between real-world and tablet-based practice. While tablet-based training does not achieve instrument identification accuracy equivalent to real-world practice, more practice repetitions in simulated environments may help reduce performance decline. This project has established a technological framework for assessing how simulated educational environments can be implemented in a maximally beneficial manner.


Clinical Competence , General Surgery , Health Personnel/education , Virtual Reality , Adolescent , Adult , Female , General Surgery/education , General Surgery/instrumentation , Humans , Learning , Male , Young Adult
7.
Trends Cogn Sci ; 24(11): 916-929, 2020 11.
Article En | MEDLINE | ID: mdl-32917534

Emotional appraisal in humans is often considered a centrally mediated process by which sensory signals, devoid of emotional meaning, are assessed by integrative brain structures steps removed from raw sensation. We review emerging evidence that the emotional value of the environment is coded by nonvisual sensory systems as early as the sensory receptors, and that these signals inform the emotional state of an organism independently of sensory cortical processes. We further present evidence for cross-species conservation of sensory projections to central emotion-processing brain regions. On this basis, we argue not only that emotional appraisal is a decentralized process, but also that all human emotional experience may reflect the sensory experience of our ancestors.


Emotions , Sensation , Animals , Brain/physiology , Humans
8.
Neurology ; 95(19): e2635-e2647, 2020 11 10.
Article En | MEDLINE | ID: mdl-32963103

OBJECTIVE: To determine whether intranasal oxytocin, alone or in combination with instructed mimicry of facial expressions, would augment neural activity in patients with frontotemporal dementia (FTD) in brain regions associated with empathy, emotion processing, and the simulation network, as indexed by the blood oxygen level-dependent (BOLD) signal during fMRI. METHODS: In a placebo-controlled, randomized crossover design, 28 patients with FTD received 72 IU intranasal oxytocin or placebo and then completed an fMRI facial expression mimicry task. RESULTS: Oxytocin, alone and in combination with instructed mimicry, increased activity in regions of the simulation network and in limbic regions associated with emotional expression processing. CONCLUSIONS: The findings demonstrate a latent capacity to augment neural activity in affected limbic and other frontal and temporal regions during social cognition in patients with FTD, and support the promise of, and need for, further investigation of these interventions as therapeutics in FTD. CLINICALTRIALSGOV IDENTIFIER: NCT01937013. CLASSIFICATION OF EVIDENCE: This study provides Class III evidence that a single dose of 72 IU intranasal oxytocin augments the BOLD signal in patients with FTD during viewing of emotional facial expressions.


Brain/diagnostic imaging , Emotions , Facial Expression , Frontotemporal Dementia/diagnostic imaging , Imitative Behavior/physiology , Oxytocics/pharmacology , Oxytocin/pharmacology , Administration, Intranasal , Aged , Brain/drug effects , Brain/physiopathology , Cross-Over Studies , Empathy , Female , Frontotemporal Dementia/physiopathology , Functional Neuroimaging , Humans , Magnetic Resonance Imaging , Male , Middle Aged
9.
Anat Sci Educ ; 12(1): 32-42, 2019 Jan.
Article En | MEDLINE | ID: mdl-29603656

Research suggests that spatial ability may predict success in complex disciplines, including anatomy, where mastery requires a firm understanding of the intricate relationships that occur along the course of veins, arteries, and nerves as they traverse through and around bones, muscles, and organs. Debate exists on the malleability of spatial ability, and some suggest that spatial ability can be enhanced through training. It is hypothesized that spatial ability can be trained in low-performing individuals through visual guidance. To address this, training was completed through a visual guidance protocol based on the eye-movement patterns of high-performing individuals, collected via eye tracking as they completed an Electronic Mental Rotations Test (EMRT). The effects of guidance were evaluated in 33 individuals with low mental rotation ability using a counterbalanced crossover design. Individuals were placed in one of two treatment groups (late or early guidance) and completed both a guided and an unguided EMRT. A third group (no guidance/control) completed two unguided EMRTs. All groups demonstrated an increase in EMRT scores on their second test (P < 0.001); however, an interaction was observed between treatment and test iteration (P = 0.024): the effect of guidance on scores was contingent on when the guidance was applied. When guidance was applied early, scores were significantly greater than expected (P = 0.028). These findings suggest that by guiding individuals with low mental rotation ability "where" to look early in training, better search approaches may be adopted, yielding improvements in spatial reasoning scores. It is proposed that visual guidance may be applied in spatial fields such as STEMM (science, technology, engineering, mathematics, and medicine), surgery, and anatomy to improve students' interpretation of visual content. Anat Sci Educ. © 2018 American Association of Anatomists.


Anatomy/education , Cues , Health Occupations/education , Problem Solving/physiology , Students, Health Occupations/psychology , Academic Performance/statistics & numerical data , Comprehension/physiology , Cross-Over Studies , Eye Movements/physiology , Female , Humans , Male , Spatial Navigation/physiology , Students, Health Occupations/statistics & numerical data , Time Factors , Visual Perception/physiology
10.
Exp Brain Res ; 236(4): 945-953, 2018 04.
Article En | MEDLINE | ID: mdl-29374776

Emotion can have diverse effects on behaviour and perception, modulating function in some circumstances and having little effect in others. Recently, it was identified that part of the heterogeneity of emotional effects could be due to a dissociable representation of emotion in dual pathway models of sensory processing. Our previous fMRI experiment using traditional univariate analyses showed that emotion modulated processing in the auditory 'what' but not 'where' processing pathway. The current study aims to further investigate this dissociation using the more recently emerging multi-voxel pattern analysis (MVPA) searchlight approach. While undergoing fMRI, participants localized sounds of varying emotional content. A searchlight multi-voxel pattern analysis was conducted to identify activity patterns predictive of sound location and/or emotion. Relative to the prior univariate analysis, MVPA indicated larger overlapping spatial and emotional representations of sound within early secondary regions associated with auditory localization. However, consistent with the univariate analysis, these two dimensions were increasingly segregated in late secondary and tertiary regions of the auditory processing streams. These results, while complementary to our original univariate analyses, highlight the utility of multiple analytic approaches for neuroimaging, particularly for neural processes whose representations are known to depend on population coding.
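The following is a hedged sketch of a searchlight MVPA using nilearn's SearchLight on synthetic data; the volume size, labels, cross-validation scheme, and injected signal are placeholders and do not reproduce the study's analysis.

```python
# Hedged sketch of a searchlight MVPA in the spirit of the analysis above,
# run on a tiny synthetic volume rather than the study's images.
import numpy as np
import nibabel as nib
from nilearn.decoding import SearchLight
from sklearn.model_selection import KFold

rng = np.random.default_rng(3)
shape, n_scans = (8, 8, 8), 40                     # tiny synthetic volume
data = rng.normal(size=shape + (n_scans,))
labels = np.tile([0, 1], n_scans // 2)             # e.g., emotional vs. neutral sounds

# Inject a weak signal into one corner so the searchlight has something to find.
data[:3, :3, :3, labels == 1] += 0.5

affine = np.eye(4)
imgs = nib.Nifti1Image(data.astype("float32"), affine)
mask = nib.Nifti1Image(np.ones(shape, dtype="uint8"), affine)

# Default estimator is a linear SVC; each sphere is decoded with cross-validation.
sl = SearchLight(mask_img=mask, radius=2.0, cv=KFold(n_splits=4), n_jobs=1)
sl.fit(imgs, labels)
print("Peak cross-validated accuracy:", sl.scores_.max().round(2))
```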


Auditory Pathways/physiology , Auditory Perception/physiology , Brain Mapping/methods , Cerebral Cortex/physiology , Emotions/physiology , Space Perception/physiology , Adult , Auditory Pathways/diagnostic imaging , Cerebral Cortex/diagnostic imaging , Female , Humans , Magnetic Resonance Imaging , Male , Sound Localization/physiology , Young Adult
11.
Anat Sci Educ ; 10(6): 528-537, 2017 Nov.
Article En | MEDLINE | ID: mdl-28371467

Individuals with an aptitude for interpreting spatial information (high mental rotation ability: HMRA) typically master anatomy with more ease, and more quickly, than those with low mental rotation ability (LMRA). This article explores how visual attention differs with time limits on spatial reasoning tests. Participants were sorted into two groups based on their mental rotation ability scores, and their eye movements were collected during these tests. Analysis of salience during testing revealed similarities between MRA groups in the untimed condition but significant differences between the groups in the timed one. Question-by-question analyses demonstrate that HMRA individuals were more consistent across the two timing conditions (κ = 0.25) than the LMRA (κ = 0.013). It is clear that the groups respond to time limits differently and that their apprehension of images during spatial problem solving differs significantly. Without time restrictions, salience analysis suggests that LMRA individuals attended to similar aspects of the images as HMRA individuals, and their test scores rose concomitantly. Under timed conditions, however, LMRA individuals diverged from HMRA attention patterns, adopting inflexible approaches to visual search and attaining lower test scores. With this in mind, anatomical educators may wish to revisit some evaluations and teaching approaches in their own practice. Although examinations need to evaluate understanding of anatomical relationships, the addition of time limits may induce an unforeseen interaction of spatial reasoning and anatomical knowledge. Anat Sci Educ 10: 528-537. © 2017 American Association of Anatomists.
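As a concrete illustration of the consistency metric reported above, here is a minimal sketch computing Cohen's kappa across timed and untimed conditions for one participant's item-level responses; the responses and the rate at which answers change are simulated assumptions.

```python
# Hedged sketch: quantifying response consistency across timed and untimed
# test conditions with Cohen's kappa. Item-level responses are placeholders.
import numpy as np
from sklearn.metrics import cohen_kappa_score

rng = np.random.default_rng(4)
n_items = 30

# Placeholder per-question correctness (1 = correct) for one participant.
untimed = rng.integers(0, 2, n_items)
changed = rng.random(n_items) < 0.2                 # assume 20% of answers change
timed = np.where(changed, 1 - untimed, untimed)

kappa = cohen_kappa_score(untimed, timed)
print(f"Consistency across timing conditions: kappa = {kappa:.2f}")
```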


Anatomy/education , Attention/physiology , Educational Measurement/methods , Eye Movements/physiology , Problem Solving/physiology , Adult , Comprehension , Eye Movement Measurements , Female , Humans , Male , Students, Health Occupations/psychology , Time Factors
12.
Anat Sci Educ ; 10(3): 224-234, 2017 Jun.
Article En | MEDLINE | ID: mdl-27706927

Learning in anatomy can be both spatially and visually complex. Pedagogical investigations have begun to explore how spatial ability may influence learning. Emerging hypotheses suggest that individuals with higher spatial reasoning may attend to images differently than those with lower ability. To elucidate the attentional patterns associated with different levels of spatial ability, eye movements were measured in individuals completing a timed electronic mental rotation test (EMRT) based on the line drawings of Shepard and Metzler. Individuals deduced whether image pairs were rotations (same) or mirror images (different). It was hypothesized that individuals with high spatial ability (HSA) would demonstrate shorter average fixation durations during problem solving and attend to different features of the EMRT than their low spatial ability (LSA) counterparts. Moreover, question response accuracy would be associated with fewer fixations and shorter average response times, regardless of spatial reasoning ability. Average fixation duration in the HSA group was shorter than in the LSA group (F(1,8) = 7.99; P = 0.022). Importantly, HSA and LSA individuals looked to different regions of the EMRT images (Fisher exact test: 12.47; P = 0.018), attending to the same locations only 34% of the time. Correctly answered questions were characterized by fewer fixations per question (F(1, 8) = 18.12; P = 0.003) and shorter average response times (F(1, 8) = 23.89; P = 0.001). The results indicate that spatial ability may influence visual attention to salient areas of images, and this may be key to the problem-solving processes of low-spatial individuals. Anat Sci Educ 10: 224-234. © 2016 American Association of Anatomists.


Anatomy/education , Eye Movements , Reaction Time , Spatial Navigation , Students, Medical/psychology , Attention , Educational Measurement/methods , Eye Movement Measurements , Female , Humans , Learning , Male , Problem Solving , Visual Perception
13.
Anat Sci Educ ; 9(4): 357-66, 2016 Jul 08.
Article En | MEDLINE | ID: mdl-26599398

Mental rotation ability (MRA) is linked to academic success in the spatially complex Science, Technology, Engineering, Medicine, and Mathematics (STEMM) disciplines and the anatomical sciences. The mental rotation literature suggests that MRA may manifest in the movement of the eyes, so quantification of eye-movement data may serve to distinguish MRA across individuals and inform the design of visualizations for instruction. It was hypothesized that high-MRA individuals would demonstrate fewer eye fixations, shorter average fixation durations (AFDs), and shorter response times than low-MRA individuals, and that individuals with different levels of MRA would attend to different features of the block figures presented in the electronic mental rotations test (EMRT). All participants (n = 23) completed the EMRT while metrics of eye movement were collected. The test required participants to view pairs of three-dimensional (3D) shapes and identify whether the pair was rotated but identical, or two different structures. Temporal analysis revealed no significant correlations between response time, average fixation duration, or number of fixations and mental rotation ability. Further analysis of within-participant variability yielded a significant correlation for response-time variability, but no correlation between AFD variability and variability in the number of fixations. Additional analysis of salience revealed that, during problem solving, individuals of differing MRA attended to different features of the block images, suggesting that eye movements directed at salient features may contribute to differences in mental rotation ability and may ultimately serve to predict success in anatomy. Anat Sci Educ 9: 357-366. © 2015 American Association of Anatomists.
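To illustrate the eye-movement metrics involved, here is a short sketch (simulated data, not the study's recordings) that computes fixation count, AFD, and response time per trial and then correlates participant means with mental rotation scores; the trial counts and score range are placeholders.

```python
# Hedged sketch: deriving fixation count, average fixation duration (AFD),
# and response time per trial from event-level eye-tracking data, then
# correlating participant means with mental rotation scores. Simulated data.
import numpy as np
import pandas as pd
from scipy import stats

rng = np.random.default_rng(5)
rows = []
for pid in range(23):                                   # sample size from the abstract
    for trial in range(20):
        n_fix = rng.integers(3, 12)
        durations = rng.gamma(2.0, 120, n_fix)          # fixation durations (ms)
        rows.append({"participant": pid, "trial": trial,
                     "n_fixations": n_fix,
                     "afd_ms": durations.mean(),
                     "rt_ms": durations.sum() + rng.normal(500, 100)})
fix = pd.DataFrame(rows)

per_participant = fix.groupby("participant")[["n_fixations", "afd_ms", "rt_ms"]].mean()
mra_score = rng.integers(5, 25, size=len(per_participant))   # placeholder EMRT scores

for metric in per_participant:
    r, p = stats.pearsonr(per_participant[metric], mra_score)
    print(f"{metric}: r = {r:.2f}, p = {p:.3f}")
```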


Eye Movements , Spatial Processing , Adult , Female , Humans , Male , Rotation , Young Adult
14.
Exp Brain Res ; 232(12): 3719-26, 2014 Dec.
Article En | MEDLINE | ID: mdl-25113129

Considerable evidence suggests that emotional cues influence processing prioritization and the neural representations of stimuli. Specifically, within the visual domain, emotion is known to impact ventral stream processes and ventral stream-mediated behaviours; the extent to which emotion impacts dorsal stream processes, however, remains unclear. In the present study, participants localized a visual target stimulus embedded within a background array using allocentric localization (requiring an object-centred representation of visual space to perform an action) and egocentric localization (requiring purely target-directed actions), which are thought to rely differentially on the ventral and dorsal visual streams, respectively. Simultaneously, a task-irrelevant negative, positive, or neutral sound was presented to produce an emotional context. In line with predictions, we found that during allocentric localization, response accuracy was enhanced in the context of negative compared to either neutral or positive sounds. In contrast, no significant effects of emotion were identified during egocentric localization. These results raise the possibility that negative emotional auditory contexts enhance ventral stream, but not dorsal stream, processing in the visual domain. Furthermore, this study highlights the complexity of emotion-cognition interactions, indicating how emotion can have a differential impact on almost identical overt behaviours that may be governed by distinct neurocognitive systems.


Emotions/physiology , Space Perception/physiology , Visual Perception/physiology , Adolescent , Adult , Cues , Female , Humans , Male , Neuropsychological Tests , Photic Stimulation , Psychomotor Performance/physiology , Young Adult
15.
Behav Brain Res ; 252: 396-404, 2013 Sep 01.
Article En | MEDLINE | ID: mdl-23769997

There are two current models of amygdala functioning with regard to the identification of emotional expression. Classic models propose that the amygdala contributes to emotional expression recognition and empathy by encoding the level of threat or distress and, as such, responds most strongly to more potent fearful cues. However, recent evidence suggests that the amygdala directs attention to relevant object features to disambiguate the stimulus (e.g., the eyes of a fearful face). The present study used fMRI to investigate amygdala functioning during the perception and identification of emotion in complex visual scenes. Participants later rated the images on levels of fear, disgust, and arousal. These ratings were used to identify stimuli that were emotionally ambiguous, emotionally discrete, and non-emotional for each individual. A whole-brain and ROI approach was used to identify the nature of the amygdala response to visual scenes. Amygdala activity was associated with higher levels of fear in the stimuli and was found to reflect the level of arousal in complex visual scenes. In contrast, no activity was observed that would indicate that the amygdala was modulated by emotional ambiguity when discriminating between fearful and disgusting visual scenes. These results are consistent with models that implicate the amygdala in the evaluation and representation of the intensity of fear, and imply that the functional contribution of the amygdala to deciphering threat in visual scenes likely extends beyond the search for emotionally salient features. The results also suggest that using attention to remedy emotion-recognition abnormalities in at-risk populations with amygdala dysfunction may not address all key deficits associated with the amygdala's contributions to emotion and empathy.
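As an illustration of how per-individual ratings might be used to bin stimuli, here is a hedged sketch with an assumed classification rule; the rating scale, thresholds, and category criteria are arbitrary choices for the example, not the authors' criteria.

```python
# Hedged sketch: binning stimuli for one participant into emotionally discrete,
# emotionally ambiguous, and non-emotional sets from fear/disgust/arousal
# ratings. The cutoffs below are illustrative assumptions only.
import numpy as np
import pandas as pd

rng = np.random.default_rng(7)
ratings = pd.DataFrame({
    "stimulus": np.arange(60),
    "fear":    rng.integers(1, 8, 60),     # placeholder 1-7 ratings
    "disgust": rng.integers(1, 8, 60),
    "arousal": rng.integers(1, 8, 60),
})

peak = ratings[["fear", "disgust"]].max(axis=1)        # strongest emotion rating
gap = (ratings["fear"] - ratings["disgust"]).abs()     # separation between emotions

def label(p, g):
    if p <= 2:                 # low on both emotions
        return "non-emotional"
    if g >= 3:                 # clearly dominated by one emotion
        return "emotionally discrete"
    return "emotionally ambiguous"

ratings["category"] = [label(p, g) for p, g in zip(peak, gap)]
print(ratings["category"].value_counts())
```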


Amygdala/physiology , Emotions/physiology , Facial Expression , Fear/psychology , Pattern Recognition, Visual/physiology , Adolescent , Amygdala/blood supply , Brain Mapping , Female , Humans , Image Processing, Computer-Assisted , Magnetic Resonance Imaging , Male , Oxygen/blood , Photic Stimulation , Young Adult
16.
Neuroimage ; 82: 295-305, 2013 Nov 15.
Article En | MEDLINE | ID: mdl-23711533

Auditory cortices can be separated into dissociable processing pathways similar to those observed in the visual domain. Emotional stimuli elicit enhanced neural activation within sensory cortices when compared to neutral stimuli. This effect is particularly notable in the ventral visual stream. Little is known, however, about how emotion interacts with dorsal processing streams, and essentially nothing is known about the impact of emotion on auditory stimulus localization. In the current study, we used fMRI in concert with individualized auditory virtual environments to investigate the effect of emotion during an auditory stimulus localization task. Surprisingly, participants were significantly slower to localize emotional relative to neutral sounds. A separate localizer scan was performed to isolate neural regions sensitive to stimulus location independent of emotion. When applied to the main experimental task, a significant main effect of location, but not emotion, was found in this ROI. A whole-brain analysis of the data revealed that posterior-medial regions of auditory cortex were modulated by sound location; however, additional anterior-lateral areas of auditory cortex demonstrated enhanced neural activity to emotional compared to neutral stimuli. The latter region resembled areas described in dual pathway models of auditory processing as the 'what' processing stream, prompting a follow-up task to generate an identity-sensitive ROI (the 'what' pathway) independent of location and emotion. Within this region, significant main effects of location and emotion were identified, as well as a significant interaction. These results suggest that emotion modulates activity in the 'what,' but not the 'where,' auditory processing pathway.
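To make the ROI analysis concrete, here is a minimal sketch of a 2 x 2 repeated-measures ANOVA (location x emotion) on ROI responses using statsmodels; the subject count, condition labels, and effect sizes are simulated placeholders, not the study's data or pipeline.

```python
# Hedged sketch: testing main effects of sound location and emotion, plus
# their interaction, on mean ROI activation with a repeated-measures ANOVA.
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

rng = np.random.default_rng(6)
rows = []
for pid in range(20):
    for location in ["left", "right"]:
        for emotion in ["emotional", "neutral"]:
            # Simulated mean ROI beta, with a small built-in emotion effect.
            beta = rng.normal(0.4, 0.2) + (0.15 if emotion == "emotional" else 0.0)
            rows.append({"participant": pid, "location": location,
                         "emotion": emotion, "roi_beta": beta})
df = pd.DataFrame(rows)

anova = AnovaRM(data=df, depvar="roi_beta", subject="participant",
                within=["location", "emotion"]).fit()
print(anova)
```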


Auditory Cortex/physiology , Auditory Pathways/physiology , Brain Mapping , Emotions/physiology , Sound Localization/physiology , Acoustic Stimulation , Adult , Female , Humans , Image Processing, Computer-Assisted , Magnetic Resonance Imaging , Male , Young Adult