ABSTRACT
In the study of bodily awareness, predictive coding theory holds that the brain continuously modulates sensory experiences to integrate them into a unitary body representation. Indeed, during multisensory illusions (e.g., the rubber hand illusion, RHI), the synchronous stroking of the participant's concealed hand and a fake visible one creates a visuotactile conflict, generating a prediction error. Within the predictive coding framework, prediction errors are resolved through the modulation of sensory processing, inducing participants to feel as if touches originated from the fake hand and thus to ascribe the fake hand to their own body. Here, we aimed to characterize this sensory processing modulation under multisensory conflict by disentangling the processing of somatosensory and visual stimuli, which are intrinsically associated during illusion induction. To this aim, we designed two EEG experiments in which somatosensory- (SEPs; Experiment 1; N = 18; F = 10) and visual-evoked potentials (VEPs; Experiment 2; N = 18; F = 9) were recorded in human males and females following the RHI. Our results show that, in both experiments, ERP amplitude is significantly modulated in the illusion condition as compared with both control and baseline conditions, with a modality-dependent diametrical pattern: decreased SEP amplitude and increased VEP amplitude. Importantly, both somatosensory and visual modulations occur in long-latency time windows previously associated with tactile and visual awareness, thus explaining the illusion of perceiving touch at the seen (fake-hand) location. In conclusion, we describe a diametrical modulation of somatosensory and visual processing as the neural mechanism that maintains a stable body representation by restoring visuotactile congruency when multisensory conflicts occur.
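As an illustration of the kind of contrast reported above (a sketch, not the authors' actual pipeline), the Python snippet below averages the ERP amplitude in a long-latency window for each participant and compares the illusion and control conditions with a paired test; the array shapes, sampling parameters, and the 200-400 ms window are assumptions made only for the example.

```python
import numpy as np
from scipy import stats

# Hypothetical per-subject ERPs: shape (n_subjects, n_timepoints),
# sampled at `sfreq` Hz, epochs starting `tmin` seconds relative to stimulus onset.
sfreq, tmin = 1000.0, -0.1

def mean_amplitude(erp, t_start, t_end):
    """Mean amplitude in a latency window (seconds), one value per subject."""
    i0 = int(round((t_start - tmin) * sfreq))
    i1 = int(round((t_end - tmin) * sfreq))
    return erp[:, i0:i1].mean(axis=1)

def compare_conditions(erp_illusion, erp_control, window=(0.2, 0.4)):
    """Paired comparison of mean amplitudes in a long-latency window."""
    amp_ill = mean_amplitude(erp_illusion, *window)
    amp_ctl = mean_amplitude(erp_control, *window)
    t, p = stats.ttest_rel(amp_ill, amp_ctl)
    return amp_ill.mean() - amp_ctl.mean(), t, p

# Placeholder data for 18 subjects and 0.6-s epochs, just to show the call.
rng = np.random.default_rng(0)
print(compare_conditions(rng.standard_normal((18, 600)), rng.standard_normal((18, 600))))
```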
Subject(s)
Electroencephalography, Somatosensory Evoked Potentials, Visual Evoked Potentials, Illusions, Visual Perception, Humans, Male, Female, Adult, Visual Perception/physiology, Somatosensory Evoked Potentials/physiology, Young Adult, Illusions/physiology, Visual Evoked Potentials/physiology, Touch Perception/physiology, Photic Stimulation/methods, Psychological Conflict, Somatosensory Cortex/physiology, Body Image
ABSTRACT
Peripersonal space (PPS) is a highly plastic "invisible bubble" surrounding the body whose boundaries are mapped through multisensory integration. Yet, it is unclear how the spatial proximity to others alters PPS boundaries. Across five experiments (N = 80), by recording behavioral and electrophysiological responses to visuo-tactile stimuli, we demonstrate that the proximity to others induces plastic changes in the neural PPS representation. The spatial proximity to someone else's hand shrinks the portion of space within which multisensory responses occur, thus reducing the PPS boundaries. This suggests that PPS representation, built from bodily and multisensory signals, plastically adapts to the presence of conspecifics to define the self-other boundaries, so that what is usually coded as "my space" is recoded as "your space". When the space is shared with conspecifics, it seems adaptive to move the other-space away from the self-space to discriminate whether external events pertain to the self-body or to other-bodies.
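In the PPS literature, the boundary is often estimated by fitting a sigmoid to responses collected at graded stimulus distances and taking its central point. The sketch below illustrates that general approach with made-up distances and RTs for an "alone" and a "near-other" condition; it is not the analysis used in this study.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical mean tactile RTs (ms) as a function of visual-stimulus distance (cm).
distances = np.array([5, 15, 30, 45, 60, 75], dtype=float)
rts_alone = np.array([392, 398, 414, 428, 431, 433], dtype=float)
rts_near_other = np.array([388, 402, 421, 429, 430, 432], dtype=float)

def sigmoid(d, ymin, ymax, d0, slope):
    """RT as a sigmoid of distance; the central point d0 approximates the PPS boundary."""
    return ymin + (ymax - ymin) / (1.0 + np.exp(-(d - d0) / slope))

def fit_boundary(distances, rts):
    p0 = [rts.min(), rts.max(), np.median(distances), 5.0]
    params, _ = curve_fit(sigmoid, distances, rts, p0=p0, maxfev=10000)
    return params[2]

# A smaller boundary in the near-other condition would indicate a shrunken PPS.
print(fit_boundary(distances, rts_alone), fit_boundary(distances, rts_near_other))
```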
ABSTRACT
Compelling evidence from human and non-human studies suggests that responses to multisensory events are faster when stimuli occur within the space surrounding the bodily self (i.e., peripersonal space; PPS). However, some human studies did not find such an effect. We propose that these discrepant findings might actually uncover a specific mechanism that modulates PPS boundaries according to sensory regularities. We exploited a visuo-tactile paradigm wherein participants provided speeded responses to tactile stimuli and rated their perceived intensity while ignoring simultaneous visual stimuli appearing near the stimulated hand (VTNear) or far from it (VTFar; near the non-stimulated hand). Tactile stimuli could be delivered to one hand only (unilateral task) or to both hands randomly (bilateral task). Results revealed that a space-dependent multisensory enhancement (i.e., faster responses and higher perceived intensity in VTNear than VTFar) was present when highly predictable tactile stimulation led the PPS to be circumscribed around the stimulated hand (unilateral task). Conversely, when stimulus location was unpredictable (bilateral task), participants showed a comparable multisensory enhancement in both bimodal conditions, suggesting a widening of the PPS to include both hands. We propose that the detection of environmental regularities actively shapes PPS boundaries, thus optimizing the detection of, and reaction to, incoming sensory stimuli.
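A minimal sketch of the spatial facilitation measure described above, assuming hypothetical per-subject median RTs: the multisensory enhancement is the RT difference between VTFar and VTNear, expected to be positive in the unilateral task and close to zero in the bilateral task.

```python
import numpy as np
from scipy import stats

# Hypothetical per-subject median RTs (ms); columns: [VTNear, VTFar].
rt_unilateral = np.array([[402, 431], [388, 415], [420, 447], [395, 419]], dtype=float)
rt_bilateral  = np.array([[410, 413], [401, 405], [432, 429], [407, 411]], dtype=float)

def spatial_facilitation(rt):
    """Multisensory enhancement: RT(VTFar) - RT(VTNear), per subject."""
    return rt[:, 1] - rt[:, 0]

fac_uni = spatial_facilitation(rt_unilateral)  # expected > 0: PPS circumscribed around the hand
fac_bil = spatial_facilitation(rt_bilateral)   # expected ~ 0: PPS widened to both hands

# Task-by-space interaction approximated as a paired test on the facilitation scores.
print(stats.ttest_rel(fac_uni, fac_bil))
```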
Subject(s)
Biological Phenomena, Touch Perception, Humans, Personal Space, Motivation, Touch/physiology, Touch Perception/physiology
ABSTRACT
The ability to discriminate between one's own and others' body parts can be lost after brain damage, as in patients who misidentify someone else's hand as their own (pathological embodiment). Surprisingly, these patients do not use visual information to discriminate between their own and the alien hand. We asked whether this impaired visual discrimination emerges only in the ecological evaluation, when the pathological embodiment is triggered by the physically present alien hand (the examiner's hand), or whether it also emerges when hand images are displayed on a screen. Forty right-brain-damaged patients, with (E+ = 20) and without (E- = 20) pathological embodiment, and 24 healthy controls underwent two tasks in which stimuli depicting self and other hands were used. In the Implicit task, where participants judged which of two images matched a central target, the self-advantage (better performance with Self than Other stimuli) selectively emerged in controls, but not in patients. Moreover, E+ patients showed significantly lower performance than both controls and E- patients, whereas E- patients were comparable to controls. In the Explicit task, where participants judged which stimuli belonged to themselves, both E- and E+ patients performed worse than controls, but only E+ patients hyper-attributed others' hands to themselves (i.e., false alarms), as observed during the ecological evaluation. Voxel-based lesion-symptom mapping (VLSM) revealed that damage to the superior longitudinal fasciculus (SLF) was significantly associated with the tendency to commit false-alarm errors. We demonstrate that, in E+ patients, the ability to visually recognize one's own body is lost, at both the implicit and the explicit level.
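The hyper-attribution of others' hands described above is a false-alarm effect, which lends itself to signal detection measures. The snippet below is a generic sketch (hypothetical counts, not patient data) of how sensitivity (d') and response criterion could be derived from explicit-task responses.

```python
from scipy.stats import norm

def sdt_measures(hits, misses, fas, crs):
    """Sensitivity (d') and criterion (c) from hit and false-alarm counts.

    A log-linear correction keeps rates away from 0 and 1.
    """
    hit_rate = (hits + 0.5) / (hits + misses + 1.0)
    fa_rate = (fas + 0.5) / (fas + crs + 1.0)
    d = norm.ppf(hit_rate) - norm.ppf(fa_rate)
    c = -0.5 * (norm.ppf(hit_rate) + norm.ppf(fa_rate))
    return d, c

# Hypothetical counts for one patient: "mine" responses to self-hand images (hits/misses)
# and to other people's hands (false alarms/correct rejections).
print(sdt_measures(hits=18, misses=2, fas=12, crs=8))
```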
Subject(s)
Body Image, Brain Injuries, Hand, Humans, Visual Perception
ABSTRACT
Individuals with autism spectrum conditions (ASC) are less susceptible to multisensory illusions, such as the rubber hand illusion (RHI). Here, we investigate whether a monochannel variant of the RHI is more effective in inducing an illusory feeling of ownership in ASC. To this aim, we exploit a non-visual variant of the RHI that, excluding vision, leverages only the somatosensory channel. While the visual-tactile RHI does not alter the perceived hand position in ASC individuals, the tacto-tactile RHI effectively modulates proprioception to a similar extent as that found in typically developing individuals. These findings suggest a more effective integration of multiple inputs originating from the same sensory channel in ASC, revealing a monochannel preference in this population.
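Proprioceptive modulation in RHI-type paradigms is typically quantified as proprioceptive drift, i.e., the shift in the perceived hand position from before to after stimulation. A minimal sketch with hypothetical values, assuming one pointing judgment per participant before and after synchronous stroking:

```python
import numpy as np
from scipy import stats

# Hypothetical perceived hand positions (cm, signed toward the fake hand).
pre_sync = np.array([0.4, -0.2, 0.1, 0.6, 0.0, 0.3])
post_sync = np.array([2.1, 1.4, 1.8, 2.6, 1.1, 1.9])

drift = post_sync - pre_sync          # proprioceptive drift, per participant
t, p = stats.ttest_1samp(drift, 0.0)  # is the drift reliably different from zero?
print(drift.mean(), t, p)
```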
Subject(s)
Autism Spectrum Disorder, Autistic Disorder, Illusions, Touch Perception, Body Image, Hand, Humans, Proprioception, Visual Perception
ABSTRACT
During the rubber hand illusion (RHI), the synchronous stroking of the participant's concealed hand and a visible rubber hand induces a conflict between visuo-tactile inputs, leading healthy subjects to perceive the illusion of being touched on the rubber hand, as if it were part of their body. Predictive coding theory suggests that the RHI emerges to settle this conflict, attenuating somatosensory inputs in favour of visual ones, which "capture" tactile sensations. Here, we used the psychophysical measure of perceptual threshold to obtain a behavioural correlate of the somatosensory and visual modulations and better understand the mechanisms underpinning the illusion. Before and after the RHI, participants underwent a tactile (Experiment 1) and a visual (Experiment 2) task, wherein they had to detect stimuli slightly above the perceptual threshold. Consistent with the predictive coding framework, we found a significant decrease in tactile detection (i.e., an increased tactile perceptual threshold) and a significant increase in visual detection (i.e., a decreased visual perceptual threshold), suggesting a diametrical modulation of somatosensory and visual perceptual processes. These findings provide evidence of how our system plastically adapts to uncertainty, attributing different weights to sensory inputs to restore a coherent representation of one's own body.
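Perceptual thresholds of the kind used here are commonly estimated with adaptive staircases. The sketch below implements a generic one-up/one-down staircase run on a simulated observer; the procedure, step size, and psychometric function are illustrative assumptions, not the authors' protocol.

```python
import numpy as np

def staircase_threshold(respond, start=1.0, step=0.05, n_reversals=8):
    """One-up/one-down staircase: lower the intensity after a detection, raise it after a miss.

    `respond(intensity)` returns True if the stimulus was detected. The threshold is
    estimated as the mean of the last reversal points.
    """
    intensity, going_down = start, None
    reversals = []
    while len(reversals) < n_reversals:
        detected = bool(respond(intensity))
        if going_down is not None and detected != going_down:
            reversals.append(intensity)
        going_down = detected
        intensity = max(step, intensity - step if detected else intensity + step)
    return float(np.mean(reversals[-6:]))

# Simulated observer with a logistic psychometric function centred on 0.6 (arbitrary units).
rng = np.random.default_rng(0)
observer = lambda i: rng.random() < 1.0 / (1.0 + np.exp(-(i - 0.6) / 0.05))
print(staircase_threshold(observer))  # should land near 0.6
```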
Subject(s)
Illusions, Touch Perception, Body Image, Hand, Humans, Proprioception, Touch, Visual Perception
ABSTRACT
The peripersonal space (PPS) is a special portion of space immediately surrounding the body, where tactile stimuli delivered to the body are integrated with auditory or visual events emanating from the environment. Interestingly, the PPS can widen if a tool is employed to interact with objects in far space. However, electrophysiological evidence of such tool-use-dependent plasticity in the human brain is scarce. Here, in a series of three experiments, participants were asked to respond to tactile stimuli, delivered to their right hand, either in isolation (unimodal condition) or combined with auditory stimulation, which could occur near (bimodal-near) or far from the stimulated hand (bimodal-far). According to the spatial rule of multisensory integration, when bimodal stimuli are presented at the same location, we expected a response enhancement (response time (RT) facilitation and event-related potential (ERP) super-additivity). In Experiment 1, we verified that RT facilitation was driven by the spatial congruency of the bimodal inputs, independently of auditory stimulus intensity. In Experiment 2, we showed that our bimodal task was effective in eliciting the magnification of ERPs in bimodal conditions, with significantly larger responses in the near as compared to the far condition. In Experiment 3 (main experiment), we explored tool-use-driven PPS plasticity. Our audio-tactile task was performed either following tool-use (a 20-min reaching task performed using a 145 cm-long rake) or after a control cognitive training (a 20-min visual discrimination task) performed in far space. Following the control training, faster RTs and greater super-additive ERPs were found in the bimodal-near as compared to the bimodal-far condition (replicating the results of Experiment 2). Crucially, this far-near differential response was significantly reduced after tool-use. Altogether, our results indicate a selective effect of tool-use remapping in extending the boundaries of the PPS. The present finding can be considered electrophysiological evidence of tool-use-dependent plasticity in the human brain.
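ERP super-additivity is conventionally assessed by testing whether the bimodal response exceeds the sum of the corresponding unimodal responses. The sketch below, with hypothetical per-participant mean amplitudes, illustrates that logic together with the near/far spatial modulation; it is not the study's statistical model.

```python
import numpy as np
from scipy import stats

# Hypothetical mean amplitudes (microvolts) over a post-stimulus window, one value per participant.
tactile_alone  = np.array([1.1, 0.9, 1.4, 1.2, 1.0, 1.3])
auditory_alone = np.array([0.8, 0.7, 1.0, 0.9, 0.6, 0.8])
bimodal_near   = np.array([2.6, 2.2, 3.1, 2.8, 2.1, 2.7])
bimodal_far    = np.array([2.0, 1.7, 2.4, 2.2, 1.6, 2.1])

summed_unimodal = tactile_alone + auditory_alone

# Super-additivity: bimodal response larger than the sum of the unimodal responses.
print("near vs sum:", stats.ttest_rel(bimodal_near, summed_unimodal))
print("far vs sum: ", stats.ttest_rel(bimodal_far, summed_unimodal))
# Spatial modulation: a larger super-additive effect near than far from the stimulated hand.
print("near vs far:", stats.ttest_rel(bimodal_near, bimodal_far))
```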
Subject(s)
Tool Use Behavior, Touch Perception, Humans, Personal Space, Reaction Time, Space Perception, Touch
ABSTRACT
The ability to identify our own body and its boundaries is crucial for survival. Ideally, the sooner we learn to discriminate external stimuli occurring close to our body from those occurring far from it, the better (and safer) we may interact with the sensory environment. However, when this mechanism emerges during ontogeny is unknown. Is it acquired throughout infancy, or is it already present soon after birth? The presence of a spatial modulation of multisensory integration (MSI) is considered a hallmark of a functioning representation of the body's position in space. Here, we investigated whether MSI is present and spatially organized in 18- to 92-h-old newborns. We compared electrophysiological responses to tactile stimulation when concurrent auditory events were delivered close to, as opposed to far from, the body in healthy newborns and in a control group of adult participants. In accordance with previous studies, adult controls showed a clear spatial modulation of MSI, with greater superadditive responses for multisensory stimuli close to the body. In newborns, we demonstrated the presence of a genuine electrophysiological pattern of MSI, with older newborns showing a larger MSI effect. Importantly, as in adults, multisensory superadditive responses were modulated by proximity to the body. This finding may represent the electrophysiological mechanism responsible for a primitive coding of bodily-self boundaries, suggesting that even just a few hours after birth, human newborns identify their own body as an entity distinct from the environment.
Subject(s)
Brain/physiology, Electrophysiological Phenomena, Physical Stimulation, Space Perception/physiology, Electroencephalography, Humans, Newborn, Learning, Reaction Time
ABSTRACT
The human face is one of the most salient stimuli in the environment. It has been suggested that even basic face-like configurations (three dots composing a downward-pointing triangle) may convey salience. Interestingly, stimulus salience can be signaled by mismatch detection phenomena, characterized by greater amplitudes of event-related potentials (ERPs) in response to relevant novel stimulation as compared to non-relevant repeated events. Here, we investigate whether basic face-like stimuli are salient enough to modulate mismatch detection phenomena. ERPs are elicited by a pair of sequentially presented visual stimuli (S1-S2), delivered at a constant 1-s interval, representing either a face-like stimulus (Upright configuration) or one of three neutral configurations (Inverted, Leftwards, and Rightwards), obtained by rotating the Upright configuration along three different axes. In pairs including the canonical face-like stimulus, we observe a more effective mismatch detection mechanism, with significantly larger N270 and P300 components when S2 differs from S1 than when S2 is identical to S1. This ERP modulation, which is not significant in pairs excluding face-like stimuli, reveals that mismatch detection phenomena are significantly affected by basic face-like configurations. Even though further experiments are needed to ascertain whether this effect is specifically elicited by the face-like configuration rather than by particular orientation changes, our findings suggest that essential structural attributes of faces are salient enough to affect change detection processes.
Subject(s)
Evoked Potentials, Face, Electroencephalography, Humans, Orientation, Photic Stimulation, Reaction Time
ABSTRACT
The ability to identify our own body is considered a pivotal marker of self-awareness. Previous research demonstrated that subjects are more efficient at recognizing images of their own rather than others' body effectors (self-advantage). Here, we tested whether, at the electrophysiological level, bodily-self recognition modulates change detection responses. In a first EEG experiment (discovery sample), event-related potentials (ERPs) were elicited by a pair of sequentially presented visual stimuli (vS1; vS2) representing either the self-hand or other people's hands. In a second EEG experiment (replication sample), a familiar hand was presented in addition to the previously described stimuli. Participants were asked to decide whether vS2 was identical to or different from vS1. Accuracy and response times were collected. In both experiments, results confirmed the presence of the self-advantage: participants responded faster and more accurately when the self-hand was presented. ERP results paralleled the behavioral findings. Whenever the self-hand was presented, we observed significant change detection responses, with a larger N270 component when vS2 was different from, rather than identical to, vS1. Conversely, when the self-hand was not included, and even in response to the familiar hand in Experiment 2, we did not find any significant modulation of the change detection responses. Overall, our findings, showing a behavioral self-advantage and the selective modulation of the N270 for the self-hand, support the existence of a specific mechanism devoted to bodily-self recognition, likely relying on the multimodal (visual and sensorimotor) dimension of the bodily-self representation. We propose that such a multimodal self-representation may activate the salience network, boosting change detection effects specifically for the self-hand.
Subject(s)
Hand, Recognition (Psychology), Evoked Potentials, Humans, Reaction Time
ABSTRACT
The effect of long-term immobilization on the motor system has been described during motor preparation, imagination, or execution, when the movement has to be performed. But what happens when a movement has to be suppressed? Does long-term limb immobilization modulate the physiological responses underlying motor inhibition? Event-related potentials (ERPs) were recorded in healthy participants performing a Go/Nogo task, either with both hands free to respond (T1/T4: before/after the immobilization) or with left-hand movements prevented by a cast (T2: as soon as the cast was positioned; T3: after one week of immobilization). On the right (control) side, the N140, N2, and P3 components showed the expected greater amplitude in Nogo than in Go trials, irrespective of timepoint. By contrast, on the left (manipulated) side, each component of the ERP response to Nogo trials showed specific differences across timepoints, suggesting that inhibition-related EEG activity is significantly reduced by the presence of the cast and by the duration of the immobilization. Furthermore, inhibition-related theta-band activity to Nogo stimuli decreased in the post-immobilization blocked session (T3-blocked). Altogether, these findings can be interpreted as a consequence of the plastic changes induced by the immobilization, as also demonstrated by the cast-related modulation of corticospinal excitability (assessed using TMS) and by the decreased beta-band activity in response to Go and Nogo trials. Thus, inhibitory responses are fully implemented only when we are free to move. After one week of immobilization, the amount of inhibition necessary to block the movement is lower and, consequently, inhibition-related responses are reduced.
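Band-limited activity such as the theta effect reported above can be quantified as mean spectral power within a frequency band across epochs. The snippet below is a generic sketch using Welch's method on placeholder single-channel epochs; the channel, band limits, and data are assumptions made only for illustration.

```python
import numpy as np
from scipy.signal import welch

def band_power(epochs, sfreq, band=(4.0, 7.0)):
    """Mean power in a frequency band (theta by default) for each epoch.

    `epochs`: array of shape (n_epochs, n_samples), single channel.
    """
    freqs, psd = welch(epochs, fs=sfreq, nperseg=min(256, epochs.shape[-1]), axis=-1)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[:, mask].mean(axis=-1)

# Placeholder Nogo epochs (1 s at 500 Hz) from a fronto-central channel at T2 vs T3.
sfreq = 500.0
rng = np.random.default_rng(1)
nogo_t2 = rng.standard_normal((40, 500))
nogo_t3 = rng.standard_normal((40, 500))
print(band_power(nogo_t2, sfreq).mean(), band_power(nogo_t3, sfreq).mean())
```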