Results 1 - 20 of 57
1.
Neuroimage ; 289: 120561, 2024 Apr 01.
Article in English | MEDLINE | ID: mdl-38428551

ABSTRACT

Previous studies of vicarious touch suggest that we automatically simulate observed touch experiences in our own body representation, including primary and secondary somatosensory cortex (SCx). However, whether these early sensory areas are activated in a reflexive manner, and the extent to which such SCx activations represent touch qualities such as texture, remains unclear. We measured event-related potentials (ERPs) reflecting SCx's hierarchical processing stages, which map onto successive somatosensory ERP components, to investigate the timing of vicarious touch effects. In the first experiment, participants (n = 43) merely observed touch or no-touch to a hand; in the second, participants saw different touch textures (soft foam and hard rubber) either touching a hand (other-directed) or were instructed that the touch was self-directed and to feel the touch. Each touch sequence was followed by a go/no-go task. We probed SCx activity and isolated SCx vicarious touch activations from visual carry-over effects. We found that vicarious touch conditions (touch versus no-touch and soft versus hard) did not modulate early sensory ERP components (i.e., P50, N80), but we did find effects on behavioural responses to the subsequent go/no-go stimulus consistent with post-perceptual effects. When comparing other- with self-directed touch conditions, we found that early and mid-latency components (i.e., P50, N80, P100, N140) were modulated, consistent with early SCx activations. Importantly, these early sensory activations were not modulated by touch texture. Therefore, SCx is purposely recruited when participants are instructed to attend to touch, but such activation only situates, rather than fully simulates, the seen tactile experience in SCx.


Subject(s)
Somatosensory Cortex , Touch Perception , Humans , Somatosensory Cortex/physiology , Evoked Potentials/physiology , Hand , Skin , Electroencephalography
2.
Cortex ; 167: 223-234, 2023 10.
Article in English | MEDLINE | ID: mdl-37573853

ABSTRACT

Somatosensory cortex (SCx) has been shown to contribute crucially to early perceptual processes when judging others' emotional facial expressions. Here, we investigated the specificity of SCx activity to angry, happy, sad and neutral expressions and the role of personality factors. We assessed participants' alexithymia (TAS-20) and depression (BDI) levels and their cardioceptive abilities, and recorded changes in neural activity during a facial emotion judgment task. During the task, we presented tactile probes to reveal neural activity in SCx, which was then isolated from visual carry-over responses. We further obtained SCx emotion effects by subtracting SCx activity elicited by neutral expressions from that elicited by angry, happy, and sad expressions. We found preliminary evidence for distinct modulations of SCx activity to angry and happy expressions. Moreover, the SCx anger response was predicted by individual differences in trait alexithymia. Thus, the emotional expressions of others may be distinctly represented in the observer's neural body representation and may be shaped by the observer's personality traits.


Subject(s)
Affective Symptoms , Facial Expression , Humans , Somatosensory Cortex , Emotions/physiology , Anger , Perception
3.
J Neurosci ; 42(11): 2298-2312, 2022 03 16.
Article in English | MEDLINE | ID: mdl-35064001

ABSTRACT

Consistent with current models of embodied emotions, this study investigates whether the somatosensory system shows reduced sensitivity to facial emotional expressions in autistic compared with neurotypical individuals, and whether these differences are independent of between-group differences in visual processing of facial stimuli. To investigate the dynamics of somatosensory activity over and above visual carry-over effects, we recorded EEG activity from two groups of human participants (male and female), with autism spectrum disorder (ASD) or typically developing (TD), while they performed a facial emotion discrimination task and a control gender task. To probe the state of the somatosensory system during face processing, in 50% of trials we evoked somatosensory activity by delivering task-irrelevant tactile taps on participants' index finger, 105 ms after visual stimulus onset. Importantly, we isolated somatosensory from concurrent visual activity by subtracting visual responses from activity evoked by combined somatosensory and visual stimuli. Results revealed significant task-dependent group differences in mid-latency components of somatosensory evoked potentials (SEPs). ASD participants showed a selective reduction of SEP amplitudes (P100) compared with TD participants during the emotion task, and TD, but not ASD, participants showed increased somatosensory responses during emotion compared with gender discrimination. Interestingly, autistic traits, but not alexithymia, significantly predicted SEP amplitudes evoked during the emotion, but not the gender, task. Importantly, we did not observe the same pattern of group differences in visual responses. Our study provides direct evidence of reduced recruitment of the somatosensory system during emotion discrimination in ASD and suggests that this effect is not a byproduct of differences in visual processing. SIGNIFICANCE STATEMENT: The somatosensory system is involved in the embodiment of visually presented facial expressions of emotion. Despite autism being characterized by difficulties in emotion-related processing, no studies have addressed whether this extends to embodied representations of others' emotions. By dissociating somatosensory activity from visual evoked potentials, we provide the first evidence of reduced recruitment of the somatosensory system during emotion discrimination in autistic participants, independently of differences in visual processing between typically developing and autism spectrum disorder participants. Our study uses a novel methodology to reveal the neural dynamics underlying difficulties in emotion recognition in autism spectrum disorder and provides direct evidence that embodied simulation of others' emotional expressions operates differently in autistic individuals.


Subject(s)
Autism Spectrum Disorder , Autistic Disorder , Autism Spectrum Disorder/psychology , Autistic Disorder/psychology , Emotions/physiology , Evoked Potentials/physiology , Evoked Potentials, Somatosensory , Evoked Potentials, Visual , Facial Expression , Female , Humans , Male
4.
Multisens Res ; : 1-18, 2021 Feb 01.
Article in English | MEDLINE | ID: mdl-33535162

ABSTRACT

The concept of embodiment has been used in multiple scenarios, but in cognitive neuroscience it normally refers to the understanding of the role of one's own body in the cognition of everyday situations and of the processes involved in that perception. Multisensory research is gradually embracing the concept of embodiment, but the focus has mostly been on audiovisual integration. In two experiments, we evaluated how the likelihood that a perceived stimulus is embodied modulates visuotactile interaction in a Simultaneity Judgement task. Experiment 1 compared the perception of two visual stimuli with and without biological attributes (hands and geometrical shapes) moving towards each other, while tactile stimuli were delivered to the palm of the participants' hand. Participants judged whether the meeting point of the two periodically moving visual stimuli was synchronous with the tactile stimulation on their own hand. Results showed that in the hand condition, the Point of Subjective Simultaneity (PSS) was significantly more distant from real synchrony (60 ms after the Stimulus Onset Asynchrony, SOA) than in the geometrical-shape condition (45 ms after the SOA). In Experiment 2, we further explored the impact of biological attributes by comparing performance on two biological visual stimuli (hands and ears) that also vary in their motor and visuotactile properties. Results showed that the PSS was equally distant from real synchrony in the hands and ears conditions. Overall, the findings suggest that embodied biological visual stimuli may modulate visual and tactile multisensory interaction in simultaneity judgements.
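For illustration, a minimal Python sketch (with hypothetical data and function names, not the authors' analysis code) of how a Point of Subjective Simultaneity of the kind reported above can be estimated: the proportion of "simultaneous" responses is modelled as a Gaussian function of SOA, and the PSS is the SOA at which that function peaks.

import numpy as np
from scipy.optimize import curve_fit

def gaussian(soa, amp, pss, sigma):
    # Proportion of "simultaneous" responses as a function of SOA (ms).
    return amp * np.exp(-((soa - pss) ** 2) / (2 * sigma ** 2))

# Hypothetical data: visuotactile SOAs (ms) and the proportion of trials
# judged simultaneous at each SOA.
soas = np.array([-200, -100, -50, 0, 50, 100, 200], dtype=float)
p_simultaneous = np.array([0.10, 0.35, 0.60, 0.80, 0.85, 0.55, 0.15])

params, _ = curve_fit(gaussian, soas, p_simultaneous, p0=[1.0, 30.0, 80.0])
amp, pss, sigma = params
print(f"PSS = {pss:.1f} ms, width (sigma) = {sigma:.1f} ms")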

5.
Cortex ; 134: 239-252, 2021 01.
Article in English | MEDLINE | ID: mdl-33307269

ABSTRACT

The ability to identify one's own body is considered a pivotal marker of self-awareness. Previous research demonstrated that subjects are more efficient at recognising images representing their own rather than other people's body effectors (the self-advantage). Here, we examined whether, at an electrophysiological level, bodily-self recognition modulates change-detection responses. In a first EEG experiment (discovery sample), event-related potentials (ERPs) were elicited by a pair of sequentially presented visual stimuli (vS1, vS2) representing either the self-hand or other people's hands. In a second EEG experiment (replication sample), a familiar hand was presented in addition to the previously described visual stimuli. Participants were asked to decide whether vS2 was identical to or different from vS1. Accuracy and response times were collected. In both experiments, the results confirmed the presence of the self-advantage: participants responded faster and more accurately when the self-hand was presented. ERP results paralleled the behavioural findings. Whenever the self-hand was presented, we observed significant change-detection responses, with a larger N270 component for vS2 different from rather than identical to vS1. Conversely, when the self-hand was not included, and even in response to the familiar hand in Experiment 2, we did not find any significant modulation of the change-detection responses. Overall, our findings, showing a behavioural self-advantage and the selective modulation of the N270 for the self-hand, support the existence of a specific mechanism devoted to bodily-self recognition, likely relying on the multimodal (visual and sensorimotor) dimension of the bodily-self representation. We propose that such a multimodal self-representation may activate the salience network, boosting change-detection effects specifically for the self-hand.


Subject(s)
Hand , Recognition, Psychology , Evoked Potentials , Humans , Reaction Time
7.
Neurosci Biobehav Rev ; 116: 508-518, 2020 09.
Article in English | MEDLINE | ID: mdl-32544541

ABSTRACT

Examining how others' body-related information is processed in the perceiver's brain (action observation) is a key topic in cognitive neuroscience. However, what happens beyond the perceptual stage, when the body is no longer in view and is transformed into an associative form that can be stored, updated, and later recalled, remains poorly understood. Here we examine neurobehavioural evidence on the memory processing of visually perceived bodily stimuli (dynamic actions and images of bodies). The reviewed studies indicate that encoding and maintaining bodily stimuli in memory recruits the sensorimotor system. This occurs whether bodily stimuli are recalled through action recognition or through reproduction. Interestingly, the memory capacity for these stimuli is rather limited: only 2 or 3 bodily stimuli can be held in memory simultaneously. Moreover, this process is disrupted by increasing the number of concurrent bodily operations, e.g., moving one's own body, or seeing or memorising additional bodies. Overall, the evidence suggests that the neural circuitry that allows us to move and feel our own bodies supports the encoding, retention, and recall of others' visually perceived bodies.


Subject(s)
Memory , Recognition, Psychology , Brain , Emotions
8.
Cogn Psychol ; 122: 101321, 2020 11.
Article in English | MEDLINE | ID: mdl-32592971

ABSTRACT

Decision-making is a fundamental human activity requiring explanation at the neurocognitive level. Current theoretical frameworks assume that, during sensory-based decision-making, the stimulus is sampled sequentially. The resulting evidence is accumulated over time as a decision variable until a threshold is reached and a response is initiated. Several neural signals, including the centroparietal positivity (CPP) measured from the human electroencephalogram (EEG), appear to display the accumulation-to-bound profile associated with the decision variable. Here, we evaluate the putative computational role of the CPP as a model-derived accumulation-to-bound signal, focussing on point-by-point correspondence between model predictions and data in order to go beyond simple summary measures like average slope. In two experiments, we explored the CPP under two manipulations (namely non-stationary evidence and probabilistic decision biases) that complement one another by targeting the shape and amplitude of accumulation respectively. We fit sequential sampling models to the behavioural data, and used the resulting parameters to simulate the decision variable, before directly comparing the simulated profile to the CPP waveform. In both experiments, model predictions deviated from our naïve expectations, yet showed similarities with the neurodynamic data, illustrating the importance of a formal modelling approach. The CPP appears to arise from brain processes that implement a decision variable (as formalised in sequential-sampling models) and may therefore inform our understanding of decision-making at both the representational and implementational levels of analysis, but at this point it is uncertain whether a single model can explain how the CPP varies across different kinds of task manipulation.
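A minimal Python sketch (with assumed generic parameters, not the paper's fitted values) of simulating an accumulation-to-bound decision variable of the kind compared point by point with the CPP waveform.

import numpy as np

def simulate_accumulator(drift=0.8, threshold=1.0, noise_sd=0.35,
                         dt=0.001, max_t=2.0, rng=None):
    # One noisy evidence accumulator, stepped until it hits the decision
    # threshold or until max_t elapses.
    rng = rng or np.random.default_rng()
    n_steps = int(max_t / dt)
    x = np.zeros(n_steps)
    for t in range(1, n_steps):
        x[t] = x[t - 1] + drift * dt + noise_sd * np.sqrt(dt) * rng.standard_normal()
        if x[t] >= threshold:
            return x[:t + 1]   # decision reached: truncate at the bound
    return x                   # no decision within max_t

# Averaging many simulated trials (e.g., response-locked at the bound crossing)
# yields a profile that can be compared directly with the measured CPP.
trials = [simulate_accumulator() for _ in range(500)]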


Subject(s)
Decision Making/physiology , Electroencephalography/methods , Recognition, Psychology/physiology , Adolescent , Adult , Brain/physiology , Brain Mapping/methods , Female , Humans , Male , Models, Neurological , Reaction Time/physiology , Young Adult
9.
Cortex ; 129: 11-22, 2020 08.
Article in English | MEDLINE | ID: mdl-32422421

ABSTRACT

The ability to experience others' emotional states is a key component of social interactions. Uniquely among sensorimotor regions, the somatosensory cortex (SCx) plays an especially important role in human emotion understanding. While distinct emotions are experienced in specific parts of the body, it remains unknown whether SCx exhibits somatotopic activations to different emotional expressions. In the current study, we investigated whether the affective response triggered by observing others' emotional facial expressions leads to differential activations in SCx. Participants performed a visual facial emotion discrimination task while we measured changes in SCx topographic EEG activity by tactually stimulating two body parts representative of the upper and lower limbs, the finger and the toe, respectively. The results showed an emotion-specific response in the finger SCx when observing angry as opposed to sad expressions, after controlling for carry-over effects of visually evoked activity. This dissociation between observed emotions was not present in toe somatosensory responses. Our results suggest that somatotopic activations of SCx to discrete emotions might play a crucial role in understanding others' emotions.


Subject(s)
Emotions , Facial Expression , Discrimination, Psychological , Humans , Somatosensory Cortex
10.
Cortex ; 125: 332-344, 2020 04.
Article in English | MEDLINE | ID: mdl-32120169

ABSTRACT

Examining how others' body-related information is processed in the perceiver's brain across neurotypical and clinical populations is a key topic in cognitive neuroscience. We argue that, beyond classical neuroimaging techniques and frequency analyses, methods are needed that can be easily adapted to capture the fast processing of body-related information in the brain. Here we introduce a novel method that achieves this by measuring event-related potentials recorded with electroencephalography (ERPs-EEG). This method retains the known advantages of EEG (low cost, high temporal resolution, established paradigms) while addressing its main limitation, namely spatiotemporally smoothed resolution due to mixed neural sources. This mixing occurs when participants are presented with and process images of bodies/actions that recruit posterior visual cortices. Such stimulus-evoked activity may spread and mask the recording of simultaneous activity arising from sensorimotor brain areas, which also process body-related information; it is therefore difficult to dissociate the contributing role of different brain regions. To overcome this, we propose eliciting a combination of somatosensory, motor, and visual evoked potentials during processing of body-related (versus non-body-related) information. Brain activity from the sensorimotor and visual systems can then be dissociated by subtracting activity in trials containing only visual evoked potentials from activity in trials containing either a mixture of visual and somatosensory or visual and motor-cortical potentials. This allows isolating visually driven neural activity in areas other than visual cortex. To introduce this method, we review recent work using it, consider the processing of body-related stimuli in the brain, and outline key methodological aspects to be considered. This work provides a clear guideline for researchers interested in, or transitioning from, behavioural to ERP studies, offering the possibility of adapting well-established paradigms in the EEG realm to study the processing of others' bodies in the perceiver's own cortical body representation (e.g., examining classical EEG components within social and embodiment frameworks).
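A compact sketch of the subtraction logic described above, written with MNE-Python. The epochs file name and condition labels ("visual_tactile", "visual_only") are placeholders, not labels from the reviewed studies; the essential step is subtracting the visual-only evoked response from the compound response to isolate the probe-evoked sensorimotor activity.

import mne

epochs = mne.read_epochs("body_task-epo.fif")   # hypothetical preprocessed epochs

# Average the compound trials (visual stimulus plus tactile probe) and the
# visual-only trials separately.
evoked_compound = epochs["visual_tactile"].average()
evoked_visual = epochs["visual_only"].average()

# Subtract the purely visual response from the compound response to isolate
# the somatosensory contribution over and above visual carry-over activity.
evoked_somatosensory = mne.combine_evoked(
    [evoked_compound, evoked_visual], weights=[1, -1]
)
evoked_somatosensory.plot_joint()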


Subject(s)
Visual Cortex , Visual Perception , Brain , Brain Mapping , Electroencephalography , Evoked Potentials , Humans
11.
Clin Neurophysiol ; 130(1): 85-92, 2019 01.
Article in English | MEDLINE | ID: mdl-30481650

ABSTRACT

OBJECTIVE: We investigated changes in attention mechanisms in people who report a high number of somatic symptoms that cannot be attributed to a physical cause. METHOD: Based on scores on the Somatoform Dissociation Questionnaire (SDQ-20; Nijenhuis et al., 1996), we compared two non-clinical groups: one with high scores on the SDQ-20 and a control group with low or no symptoms. We recorded EEG whilst participants performed an exogenous tactile attention task in which they had to discriminate between tactile targets following a tactile cue to the same or the opposite hand. RESULTS: The neural marker of attentional orienting to the body, the Late Somatosensory Negativity (LSN), was diminished in the high-symptom group, and in this group attentional modulation of touch processing was prolonged at mid-latency stages and enhanced at later latency stages. CONCLUSION: These results confirm that attentional processes are altered in people with somatic symptoms, even in a non-clinical group. Furthermore, the observed pattern fits explanations in terms of changes in prior beliefs or expectations, leading to diminished amplitudes of the marker of attentional orienting to the body (i.e., the LSN) and enhanced attentional gain of touch processing. SIGNIFICANCE: This study shows that high levels of somatic symptoms are associated with neurocognitive changes in attention.


Subject(s)
Attention/physiology , Electroencephalography/methods , Medically Unexplained Symptoms , Orientation/physiology , Touch/physiology , Adolescent , Adult , Electrophysiological Phenomena/physiology , Female , Functional Laterality/physiology , Humans , Male , Middle Aged , Random Allocation , Surveys and Questionnaires , Young Adult
12.
J Cogn Neurosci ; 31(2): 262-277, 2019 02.
Article in English | MEDLINE | ID: mdl-30277429

ABSTRACT

The neural dynamics underpinning binary perceptual decisions and their transformation into actions are well studied, but real-world decisions typically offer more than two response alternatives. How does decision-related evidence accumulation dynamically influence multiple action representations in humans? The heightened conservatism required in multiple-choice compared with binary-choice scenarios suggests a mechanism that compensates for the increased uncertainty of multiple alternatives by suppressing baseline activity. Here, we tracked action representations using corticospinal excitability during four- and two-choice perceptual decisions and modeled them using a sequential sampling framework. We found that the predictions made by leaky competing accumulator models to accommodate multiple choices (i.e., reduced baseline activity to compensate for increased uncertainty) were borne out by dynamic changes in human action representations. This suggests a direct and continuous influence of interacting evidence accumulators, each favoring a different decision alternative, on downstream corticospinal excitability during complex choice.
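A toy Python sketch (illustrative parameter values, not the model parameters fitted in the study) of the prediction tested above: in a leaky competing accumulator, more choice alternatives are accommodated by lowering the accumulators' starting point.

import numpy as np

def simulate_lca(n_choices, inputs, leak=0.2, inhibition=0.3,
                 baseline_2choice=0.3, dt=0.01, n_steps=300,
                 noise_sd=0.05, rng=None):
    rng = rng or np.random.default_rng()
    # Assumed scaling rule: a lower starting point for more alternatives
    # compensates for the greater chance that noise alone drives some
    # accumulator over the bound.
    baseline = baseline_2choice * (2.0 / n_choices)
    x = np.full(n_choices, baseline)
    trajectory = [x.copy()]
    for _ in range(n_steps):
        inhib = inhibition * (x.sum() - x)        # lateral inhibition from rivals
        dx = (inputs - leak * x - inhib) * dt \
             + noise_sd * np.sqrt(dt) * rng.standard_normal(n_choices)
        x = np.clip(x + dx, 0, None)              # activations cannot go negative
        trajectory.append(x.copy())
    return np.array(trajectory)

# Four alternatives, one of which is favoured by the sensory evidence.
traj_4 = simulate_lca(4, inputs=np.array([1.0, 0.4, 0.4, 0.4]))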


Subject(s)
Cerebral Cortex/physiology , Choice Behavior/physiology , Color Perception/physiology , Evoked Potentials, Motor/physiology , Motor Activity/physiology , Psychomotor Performance/physiology , Pyramidal Tracts/physiology , Transcranial Magnetic Stimulation , Adult , Electromyography , Female , Humans , Male , Young Adult
13.
eNeuro ; 5(3)2018.
Article in English | MEDLINE | ID: mdl-29951578

ABSTRACT

Evolutionary pressures suggest that choices should be optimized to maximize rewards, by appropriately trading speed for accuracy. This speed-accuracy tradeoff (SAT) is commonly explained by variation in just the baseline-to-boundary distance, i.e., the excursion, of accumulation-to-bound models of perceptual decision-making. However, neural evidence is not consistent with this explanation. A compelling account of speeded choice should explain both overt behavior and the full range of associated brain signatures. Here, we reconcile seemingly contradictory behavioral and neural findings. In two variants of the same experiment, we triangulated upon the neural underpinnings of the SAT in the human brain using both EEG and transcranial magnetic stimulation (TMS). We found that distinct neural signals, namely the event-related potential (ERP) centroparietal positivity (CPP) and a smoothed motor-evoked potential (MEP) signal, which have both previously been shown to relate to decision-related accumulation, revealed qualitatively similar average neurodynamic profiles with only subtle differences between SAT conditions. These signals were then modelled from behavior by either incorporating traditional boundary variation or utilizing a forced excursion. These model variants are mathematically equivalent, in terms of their behavioral predictions, hence providing identical fits to correct and erroneous reaction time distributions. However, the forced-excursion version instantiates SAT via a more global change in parameters and implied neural activity, a process conceptually akin to, but mathematically distinct from, urgency. This variant better captured both ERP and MEP neural profiles, suggesting that the SAT may be implemented via neural gain modulation, and reconciling standard modelling approaches with human neural data.


Subject(s)
Decision Making/physiology , Models, Neurological , Psychomotor Performance , Reaction Time , Adult , Electroencephalography , Evoked Potentials , Evoked Potentials, Motor , Female , Humans , Male , Transcranial Magnetic Stimulation , Young Adult
14.
Neuropsychologia ; 117: 75-83, 2018 08.
Article in English | MEDLINE | ID: mdl-29738793

ABSTRACT

Recent studies suggest that brain regions engaged in perception are also recruited during the consolidation interval of the percept in working memory (WM). Evidence for this comes from studies showing that maintaining arbitrary visual, auditory, and tactile stimuli in WM recruits the corresponding sensory cortices. Here we investigate whether encoding and WM maintenance of visually perceived body-related stimuli engage only visual regions, or also additional sensorimotor regions classically associated with embodiment processes in studies of body and action perception. We developed a novel WM paradigm in which participants were asked to remember body-related and control non-body-related images. In half of the trials, visual evoked activity time-locked to the sight of the stimuli allowed us to examine visual processing of the to-be-remembered stimuli (visual-only trials). In the other half of the trials, we additionally elicited a task-irrelevant key press during the consolidation interval of the stimuli in WM. This manipulation elicited motor-cortical potentials (MCPs) concomitant with visual processing (visual-motor trials). This design allowed us to dissociate motor activity reflected in the MCPs from concurrent visual processing by subtracting activity in the visual-only trials from the compound activity found in the visual-motor trials. After dissociating the MCPs from concomitant visual activity, the results show that only the body-related images elicited recruitment of sensorimotor regions over and above visual effects. Importantly, the number of body stimuli to be remembered (memory load) modulated this later motor-cortical activity. The current observations link research on embodiment and WM by suggesting that neural recruitment is driven by the nature of the information embedded in the percept.


Subject(s)
Evoked Potentials/physiology , Hand , Imagination , Memory, Short-Term/physiology , Motor Activity/physiology , Pattern Recognition, Visual/physiology , Adult , Analysis of Variance , Brain Mapping , Electroencephalography , Female , Humans , Male , Photic Stimulation , Reaction Time/physiology , Young Adult
15.
Neuropsychologia ; 84: 158-66, 2016 Apr.
Article in English | MEDLINE | ID: mdl-26898371

ABSTRACT

Our brain constantly receives tactile information from the body's surface, yet we often only become aware of this information when directing our attention towards the body. Here, we report a study investigating the behavioural and neural responses involved in selecting a target amongst distractor vibrations presented simultaneously at several locations, either across the hands or across the body. Comparable visual search studies have revealed the N2pc as the neural correlate of visual selective attention. Analogously, we describe an enhanced negativity contralateral to the side of the tactile target. This effect is strongest over somatosensory areas and lasts approximately 200 ms from the onset of the somatosensory N140 ERP component. Based on these characteristics, we named this electrophysiological signature of attentional target selection during tactile search the N140-central-contralateral (N140cc). Furthermore, we present supporting evidence that the N140cc reflects attentional enhancement of target locations rather than suppression of distractor locations: the component was reliably altered by changes in target, but not distractor, location. Taken together, our findings present a novel electrophysiological marker of tactile search and show that attentional selection of touch operates mainly by enhancing task-relevant locations within the somatosensory homunculus.
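For illustration, a minimal MNE-Python sketch of how a lateralised difference wave such as the N140cc is typically derived: contralateral minus ipsilateral activity relative to the target side, averaged across left- and right-target trials. The epochs file, condition labels, and channel picks are hypothetical placeholders.

import numpy as np
import mne

epochs = mne.read_epochs("tactile_search-epo.fif")   # hypothetical epochs file

def lateralised_wave(epochs, cond_left, cond_right, chan_left, chan_right):
    # Contralateral minus ipsilateral activity, averaged over target sides.
    left_t = epochs[cond_left].average()
    right_t = epochs[cond_right].average()
    contra = np.mean([right_t.copy().pick(chan_left).data,
                      left_t.copy().pick(chan_right).data], axis=0)
    ipsi = np.mean([right_t.copy().pick(chan_right).data,
                    left_t.copy().pick(chan_left).data], axis=0)
    return contra - ipsi

n140cc = lateralised_wave(epochs, "target_left", "target_right", ["C3"], ["C4"])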


Subject(s)
Attention/physiology , Brain/physiology , Touch Perception/physiology , Adolescent , Adult , Brain Mapping , Electroencephalography , Evoked Potentials, Somatosensory , Female , Functional Laterality/physiology , Hand/physiology , Humans , Male , Neuropsychological Tests , Physical Stimulation/methods , Toes/physiology , Young Adult
16.
Psychophysiology ; 53(4): 507-17, 2016 Apr.
Article in English | MEDLINE | ID: mdl-26695445

ABSTRACT

ERP studies investigating the control processes responsible for spatial orienting in touch have consistently observed that the anterior directing attention negativity (ADAN) elicited by an attention-directing cue is followed by a sustained negativity contralateral to the cued hand. Recent evidence suggests that this later negativity, labelled the late somatotopic negativity (LSN), might reflect neurocognitive processes distinct from those associated with the ADAN. To investigate the functional meaning of the ADAN and LSN components, we measured ERPs elicited by bilateral tactile cues instructing participants to covertly shift tactile attention to the left or right hand. Participants performed two spatial attention tasks that differed only in the difficulty of the target/non-target discrimination at attended locations. The LSN, but not the ADAN, was sensitive to our experimental manipulation of task difficulty, suggesting that this component might reflect sensory-specific preparatory processes prior to a forthcoming tactile stimulus.


Subject(s)
Attention/physiology , Brain/physiology , Functional Laterality/physiology , Orientation, Spatial/physiology , Space Perception/physiology , Touch Perception/physiology , Adolescent , Adult , Cues , Electroencephalography , Female , Humans , Male , Photic Stimulation , Psychomotor Performance , Young Adult
17.
Biol Psychol ; 109: 239-47, 2015 Jul.
Article in English | MEDLINE | ID: mdl-26101088

ABSTRACT

Directing one's gaze at a body part reduces detection times and enhances the processing of tactile stimuli presented at the gazed location. Given the close links between spatial attention and the oculomotor system, it is possible that these gaze-dependent modulations of touch are mediated by attentional mechanisms. To investigate this possibility, gaze direction and sustained tactile attention were orthogonally manipulated in the present study. Participants covertly attended to one hand to perform a tactile target/non-target discrimination while they gazed at the same or the opposite hand. Spatial attention resulted in enhancements of the somatosensory P100 and Nd components. In contrast, gaze resulted in modulations of the N140 component, with more positive ERPs for gazed than non-gazed stimuli. This dissociation in the pattern and timing of the effects of gaze and attention on somatosensory processing reveals that gaze and attention have independent effects on touch.


Subject(s)
Attention/physiology , Evoked Potentials, Somatosensory/physiology , Evoked Potentials/physiology , Space Perception/physiology , Touch Perception/physiology , Visual Perception/physiology , Adult , Electroencephalography , Eye Movements/physiology , Female , Humans , Male , Young Adult
18.
Front Psychol ; 6: 56, 2015.
Article in English | MEDLINE | ID: mdl-25688229
19.
Soc Cogn Affect Neurosci ; 10(10): 1316-22, 2015 Oct.
Article in English | MEDLINE | ID: mdl-25717074

ABSTRACT

Current models of emotion simulation propose that intentionally posing a facial expression can change one's subjective feelings, which in turn influences the processing of visual input. However, the neural mechanism whereby one's own facial emotion modulates visual cortical responses to others' facial expressions remains unknown. To understand how one's own facial expression affects visual processing, we measured participants' visual evoked potentials (VEPs) during a facial emotion judgment task with positive and neutral faces. To control for the effects of facial muscle activity on VEPs, we asked participants, in separate blocks, to smile (adopting an expression of happiness), to purse their lips (incompatible with smiling), or to pose a neutral face. Results showed that the smiling expression modulates face-specific visual processing components (N170/vertex positive potential) in response to watching others' facial expressions. Specifically, when making a happy expression, neutral faces are processed similarly to happy faces; when making a neutral expression or pursing the lips, however, responses to neutral and happy faces differ significantly. This effect was source-localized to multisensory associative areas, the angular gyrus, associative visual cortex and somatosensory cortex. We provide novel evidence that one's own emotional expression acts as a top-down influence modulating low-level neural encoding during facial perception.


Subject(s)
Emotions/physiology , Evoked Potentials, Visual , Facial Expression , Happiness , Visual Cortex/physiology , Visual Perception , Adult , Electroencephalography , Female , Humans , Male , Somatosensory Cortex
20.
Front Psychol ; 5: 683, 2014.
Article in English | MEDLINE | ID: mdl-25071653

ABSTRACT

Attentional selectivity in touch is modulated by the position of the body in external space. For instance, during endogenous attention tasks in which tactile stimuli are presented to the hands, the effect of attention is reduced when the hands are placed far apart compared to when they are close together, and when the hands are crossed compared to when they are in their anatomical position. This suggests that both somatotopic and external spatial reference frames coding the hands' locations contribute to the spatial selection of the relevant hand. Here we investigated whether tactile selection of the hands is also modulated by the position of other body parts not directly involved in tactile perception, such as the eye in the orbit (gaze direction). We asked participants to perform a sustained tactile attention task while gazing laterally toward an eccentric fixation point (Eccentric gaze) or toward a central fixation point (Central gaze). Event-related potentials recorded in response to tactile non-target stimuli presented to the attended or unattended hand were compared as a function of gaze direction (Eccentric vs. Central conditions). Results revealed that attentional modulations were reduced in the Eccentric gaze condition compared to the Central gaze condition in the time range of the Nd component (200-260 ms post-stimulus), demonstrating for the first time that the attentional selection of one of the hands is affected by the position of the eye in the orbit. Directing the eyes toward an eccentric position might be sufficient to create a misalignment between external and somatotopic frames of reference, thereby reducing tactile attention. This suggests that the eye-in-orbit position contributes to the spatial selection of the task-relevant body part.
