1 - 20 of 75
1.
Brain Sci ; 14(4)2024 Apr 20.
Article En | MEDLINE | ID: mdl-38672051

The relationship between cerebral rhythms and early sensorimotor development is not clear. In recent decades, evidence has revealed a rhythmic modulation of sensorimotor processing. A widely corroborated functional role of oscillatory activity is to coordinate the information flow across sensorimotor networks. This activity is coordinated by event-related synchronisation and desynchronisation of different sensorimotor rhythms, which indicates that parallel processes may occur in the neuronal network during movement. To date, the relationship between these brain oscillations and early sensorimotor development remains unexplored. Our study investigates the relationship between cerebral rhythms, measured with EEG, and a typical rhythmic movement of infants: non-nutritive sucking (NNS). NNS is an endogenous behaviour that originates from the suck central pattern generator in the brainstem. We find, in 17 infants, that sucking frequency correlates with beta synchronisation within the sensorimotor area in two phases: one strongly anticipating (~3 s) and the other encompassing the start of the movement. These findings suggest that beta synchronisation of the sensorimotor cortex may influence the sensorimotor dynamics of NNS activity. Our results highlight the importance of rapid brain oscillations in infants and the possible role of beta synchronisation in the communication between cortical and deep generators.

2.
Curr Biol ; 34(6): R235-R236, 2024 03 25.
Article En | MEDLINE | ID: mdl-38531313

An important task for the visual system is to identify objects and segregate them from the background. Figure-ground illusions, such as Edgar Rubin's bistable 'vase-faces illusion'1, make the point clearly: we see either a central vase or lateral faces, alternating spontaneously, but never both images simultaneously. The border is perceptually assigned to either the faces or the vase, which then becomes figure while the other becomes shapeless background2. The stochastic alternation between figure and ground probably reflects mutual inhibitory processes that ensure a single perceptual outcome3. Which shape dominates perception depends on many factors, such as size, symmetry, convexity, enclosure, and so on, as well as attention and intention4. Here we show that the assignment of the visual border can be strongly influenced by auditory input, far more than is possible by voluntary intention. VIDEO ABSTRACT.


Illusions; Pattern Recognition, Visual; Humans; Photic Stimulation/methods; Attention; Face
3.
Neuroimage ; 286: 120508, 2024 Feb 01.
Article En | MEDLINE | ID: mdl-38181867

Sleep plays a crucial role in brain development, sensory information processing, and consolidation. Sleep spindles are markers of these mechanisms, as they mirror the activity of the thalamocortical circuits. Spindles can be subdivided into two groups, slow (10-13 Hz) and fast (13-16 Hz), each associated with different functions. Specifically, fast spindles oscillate in the high-sigma band and are associated with sensorimotor processing, which is affected by visual deprivation. However, how blindness influences spindle development has not yet been investigated. We recorded nap video-EEG from 50 blind/severely visually impaired (BSI) and 64 sighted children aged 5 months to 6 years. We considered both macro- and micro-structural aspects of spindles. The BSI children lacked the typical developmental evolution of spindles within the central area. Specifically, young BSI children presented low central high-sigma and high-beta (25-30 Hz) event-related spectral perturbation and showed no signs of the maturational decrease. High-sigma and high-beta activity in the BSI group correlated with clinical indices predicting perceptual and motor disorders. Our findings suggest that fast spindles are pivotal biomarkers for identifying early developmental deviation in BSI children. These findings are critical for guiding early therapeutic intervention.


Brain; Sleep; Child; Humans; Electroencephalography; Cognition; Blindness; Sleep Stages
4.
J Exp Child Psychol ; 238: 105774, 2024 02.
Article En | MEDLINE | ID: mdl-37703720

Cross-sectioning is a shape-understanding task in which participants must infer and interpret the spatial features of three-dimensional (3D) solids by depicting their internal two-dimensional (2D) arrangement. An increasing body of research provides evidence of the crucial role of sensorimotor experience in acquiring these complex geometrical concepts. Here, we focused on how cross-sectioning ability emerges in young children and on the influence of multisensory visuo-haptic experience on geometrical learning, through two experiments. In Experiment 1, we compared a 3D-printed version of the Santa Barbara Solids Test (SBST) with its classical paper version; in Experiment 2, we contrasted children's performance in the SBST before and after visual or visuo-haptic experience. In Experiment 1, we did not identify an advantage of visualizing 3D shapes over the classical 2D paper test. In contrast, in Experiment 2, we found that children who experienced a combination of visual and tactile information during the exploration phase improved their performance in the SBST compared with children who were limited to visual exploration. Our study demonstrates how practicing novel multisensory strategies improves children's understanding of complex geometrical concepts. This outcome highlights the importance of introducing multisensory experience into educational training and the need to develop new technologies that could improve learning abilities in children.


Touch Perception; Visual Perception; Child; Humans; Child, Preschool; Haptic Technology; Touch; Learning
5.
Front Neurosci ; 17: 1267700, 2023.
Article En | MEDLINE | ID: mdl-37954876

Introduction: The ability to process sensory information is an essential adaptive function, and hyper- or hypo-sensitive maladaptive profiles of responses to environmental stimuli generate sensory processing disorders linked to cognitive, affective, and behavioral alterations. Consequently, assessing sensory processing profiles might help in researching vulnerability and resilience to mental disorders. Research on the neuroradiological correlates of sensory processing profiles is mainly limited to young populations or neurodevelopmental disorders. This study therefore aims to examine the structural MRI correlates of sensory profiles in a sample of typically developed adults. Methods: We investigated cortical thickness (CT) and white matter integrity, through Diffusion Tensor Imaging (DTI), as correlates of the Adolescent/Adult Sensory Profile (AASP) questionnaire subscales in 57 typically developing subjects (34 F; mean age: 32.7 ± 9.3 years). Results: We found significant results only for the sensation seeking (STS) subscale. Positive and negative correlations emerged with fractional anisotropy (FA) and radial diffusivity (RD) in the anterior thalamic radiation, optic radiation, superior longitudinal fasciculus, corpus callosum, and cingulum bundle. No correlation between sensation seeking and whole-brain cortical thickness was found. Discussion: Overall, our results suggest a positive correlation between sensation seeking and higher white matter structural integrity in tracts mainly involved in visuospatial processing, but no correlation with gray matter structure. The enhanced structural integrity associated with sensation seeking may reflect a neurobiological substrate linked to active search for sensory stimuli and resilience to major psychiatric disorders such as schizophrenia, bipolar disorder, and depression.
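The FA and RD measures reported above are standard scalar summaries of the diffusion tensor. As a minimal sketch, assuming the three tensor eigenvalues are already available (the example eigenvalues below are illustrative white-matter-like values, not data from the study), they can be computed as:

```python
import math

def dti_metrics(eigenvalues):
    # Fractional anisotropy (FA) and radial diffusivity (RD) from the three
    # diffusion-tensor eigenvalues, sorted lambda1 >= lambda2 >= lambda3.
    l1, l2, l3 = eigenvalues
    md = (l1 + l2 + l3) / 3.0                    # mean diffusivity
    num = (l1 - md) ** 2 + (l2 - md) ** 2 + (l3 - md) ** 2
    den = l1 ** 2 + l2 ** 2 + l3 ** 2
    fa = math.sqrt(1.5 * num / den)              # 0 = isotropic, 1 = fully anisotropic
    rd = (l2 + l3) / 2.0                         # mean of the two minor eigenvalues
    return fa, rd

# illustrative eigenvalues in mm^2/s (hypothetical, not values from the study)
fa, rd = dti_metrics((1.7e-3, 0.3e-3, 0.3e-3))
```

Higher FA with lower RD is the usual pattern read as greater white matter structural integrity, as in the sensation seeking correlations above.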

6.
Sci Rep ; 13(1): 16553, 2023 10 02.
Article En | MEDLINE | ID: mdl-37783746

When we perform an action, self-elicited movement induces suppression of somatosensory information to the cortex, requiring correct motor-sensory and inter-sensory (i.e., cutaneous, kinesthetic, and proprioceptive) integration to be successful. However, recent work shows that blindness might impact some of these processes. The current study investigates the effect of movement on tactile perception and the role of vision in this process. We measured the velocity discrimination threshold in 18 sighted and 18 blind individuals by having them perceive a sequence of two movements and discriminate the faster one in passive and active touch conditions. Participants' Just Noticeable Difference (JND) was measured to quantify their precision. Results showed generally worse performance in the active touch condition than in the passive one. In particular, this difference was significant in the blind group, regardless of blindness duration, but not in the sighted group. These findings suggest that the absence of visual calibration impacts the motor-sensory and inter-sensory integration required during movement, diminishing the reliability of tactile signals in blind individuals. Our work spotlights the need for intervention in this population and should be considered in the design of sensory substitution/reinforcement devices.
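A velocity-discrimination JND like the one reported here is typically read off a psychometric function fitted to the proportion of "faster" responses. The sketch below is a generic, stdlib-only illustration with synthetic data and hypothetical velocity values, not the study's analysis pipeline:

```python
import math

def cum_gauss(x, mu, sigma):
    # cumulative Gaussian psychometric function:
    # probability of judging the comparison stimulus as faster
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

# hypothetical comparison velocities (cm/s) and "faster" response proportions
xs = [2.0, 4.0, 6.0, 8.0, 10.0, 12.0, 14.0]
ps = [cum_gauss(x, 8.0, 2.5) for x in xs]    # synthetic observer: PSE 8, sigma 2.5

# least-squares grid search over (mu, sigma); a real analysis would use a
# maximum-likelihood fit, but a grid keeps the sketch dependency-free
best = min(
    ((m / 10.0, s / 10.0) for m in range(40, 121) for s in range(5, 61)),
    key=lambda t: sum((cum_gauss(x, t[0], t[1]) - p) ** 2 for x, p in zip(xs, ps)),
)
mu_hat, sigma_hat = best                     # point of subjective equality, slope
jnd = 0.6745 * sigma_hat                     # 50%-to-75% distance on the fitted curve
```

A larger recovered `jnd` corresponds to worse precision, which is how the active-versus-passive difference above would show up in the data.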


Touch Perception; Touch; Humans; Reproducibility of Results; Blindness; Movement
7.
Front Hum Neurosci ; 17: 1000832, 2023.
Article En | MEDLINE | ID: mdl-37007684

Introduction: Position sense, which belongs to the sensory stream called proprioception, is pivotal for proper movement execution. A comprehensive understanding of it is needed to fill existing knowledge gaps in human physiology, motor control, neurorehabilitation, and prosthetics. Although numerous studies have focused on different aspects of proprioception in humans, the neural correlates of proprioceptive acuity at the joints have not yet been fully investigated. Methods: Here, we implemented a robot-based position sense test to elucidate the correlation between patterns of neural activity and the degree of accuracy and precision exhibited by the subjects. Eighteen healthy participants performed the test, and their electroencephalographic (EEG) activity was analyzed in the µ band (8-12 Hz), the frequency band related to voluntary movement and somatosensory stimulation. Results: We observed a significant positive correlation between the matching error, representing proprioceptive acuity, and the strength of activation in contralateral hand motor and sensorimotor areas (left central and central-parietal areas). In the absence of visual feedback, these same regions of interest (ROIs) presented a higher activation level compared to the association and visual areas. Remarkably, central and central-parietal activation was still observed when visual feedback was added, although consistent activation in association and visual areas also emerged. Conclusion: In summary, this study supports the existence of a specific link between the magnitude of activation of motor and sensorimotor areas related to upper limb proprioceptive processing and proprioceptive acuity at the joints.

8.
PLoS One ; 18(3): e0280987, 2023.
Article En | MEDLINE | ID: mdl-36888612

Our brain constantly combines sensory information into a unitary percept to build coherent representations of the environment. Even though this process may appear smooth, integrating sensory inputs from various sensory modalities must overcome several computational issues, such as recoding and statistical inference problems. Following these assumptions, we developed a neural architecture replicating humans' ability to use audiovisual spatial representations. We considered the well-known ventriloquist illusion as a benchmark to evaluate its phenomenological plausibility. Our model closely replicated human perceptual behavior, providing a faithful approximation of the brain's ability to develop audiovisual spatial representations. Considering its ability to model audiovisual performance in a spatial localization task, we release our model in conjunction with the dataset we recorded for its validation. We believe it will be a powerful tool to model and better understand multisensory integration processes in experimental and rehabilitation environments.
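The abstract does not detail the architecture, but the ventriloquist effect is classically captured by maximum-likelihood (inverse-variance weighted) cue fusion. The following is only that textbook account, with made-up location estimates and reliabilities, not the model released with the paper:

```python
def fuse(x_v, sigma_v, x_a, sigma_a):
    # Inverse-variance weighted combination of a visual and an auditory
    # location estimate (maximum-likelihood cue fusion).
    w_v = (1.0 / sigma_v**2) / (1.0 / sigma_v**2 + 1.0 / sigma_a**2)
    x_hat = w_v * x_v + (1.0 - w_v) * x_a        # fused location estimate
    sigma_hat = (1.0 / (1.0 / sigma_v**2 + 1.0 / sigma_a**2)) ** 0.5
    return x_hat, sigma_hat

# vision (sigma = 1 deg) much more reliable than audition (sigma = 4 deg):
# the fused percept is pulled towards the visual location -> ventriloquism
x_hat, s_hat = fuse(x_v=0.0, sigma_v=1.0, x_a=10.0, sigma_a=4.0)
```

Note that the fused estimate lands close to the visual source and its variance is lower than either single cue's, the usual signature of statistically optimal integration.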


Illusions; Visual Perception; Humans; Auditory Perception; Brain; Computer Simulation; Acoustic Stimulation; Photic Stimulation
9.
Hum Brain Mapp ; 44(2): 656-667, 2023 02 01.
Article En | MEDLINE | ID: mdl-36169038

Clear evidence has demonstrated a supramodal organization of sensory cortices, with multisensory processing occurring even at early stages of information encoding. Within this context, early recruitment of sensory areas is necessary for the development of fine domain-specific (i.e., spatial or temporal) skills regardless of the sensory modality involved, with auditory areas playing a crucial role in temporal processing and visual areas in spatial processing. Given the domain specificity and multisensory nature of sensory areas, in this study we hypothesized that the preferential domains of representation (i.e., space and time) of visual and auditory cortices are also evident in the early processing of multisensory information. Thus, we measured the event-related potential (ERP) responses of 16 participants while they performed multisensory spatial and temporal bisection tasks. Audiovisual stimuli occurred at three different spatial positions and time lags, and participants had to evaluate whether the second stimulus was spatially (spatial bisection task) or temporally (temporal bisection task) farther from the first or the third audiovisual stimulus. As predicted, the second audiovisual stimulus of both tasks elicited an early ERP response (time window 50-90 ms) in visual and auditory regions. However, this early ERP component was more substantial in occipital areas during the spatial bisection task, and in temporal regions during the temporal bisection task. Overall, these results confirm the domain specificity of visual and auditory cortices and reveal that this aspect also selectively modulates cortical activity in response to multisensory stimuli.


Auditory Perception; Visual Perception; Humans; Visual Perception/physiology; Auditory Perception/physiology; Parietal Lobe; Evoked Potentials; Temporal Lobe; Acoustic Stimulation; Photic Stimulation
10.
Sci Rep ; 12(1): 19036, 2022 11 09.
Article En | MEDLINE | ID: mdl-36351944

It is evident that the brain is capable of large-scale reorganization following sensory deprivation, but the extent of such reorganization is, to date, not clear. The auditory modality is the most accurate for representing temporal information, and deafness is an ideal clinical condition to study the reorganization of temporal representation when the audio signal is not available. Here we show that hearing, but not deaf, individuals show a strong ERP response to visual stimuli in temporal areas during a time-bisection task. This ERP response appears 50-90 ms after the flash and recalls some aspects of the N1 ERP component usually elicited by auditory stimuli. The same ERP is not evident for a visual space-bisection task, suggesting that the early recruitment of temporal cortex is specific to building a highly resolved temporal representation within the visual modality. These findings provide evidence that the lack of auditory input can interfere with the typical development of complex visual temporal representations.


Auditory Cortex; Deafness; Humans; Photic Stimulation; Magnetic Resonance Imaging; Hearing; Brain Mapping; Auditory Cortex/physiology
11.
Neuropsychologia ; 176: 108391, 2022 11 05.
Article En | MEDLINE | ID: mdl-36209890

Vision plays a pivotal role in the development of spatial representation. When visual feedback is absent, complex spatial representations are impaired, and temporal properties of auditory information are used by blind people to build spatial maps. Specifically, late blind (LB) adults who have spent more than 20 years without vision (i.e., long-term LB) represent space based on temporal cues. In the present study, we investigated whether audio-motor training based on body feedback modifies the way long-term LB adults create spatial representations of the environment. Three long-term LB adults performed a battery of spatial tasks before and after four weeks of training, while three long-term LB adults performed the same tasks before and after four weeks without attending any training. Tasks included: i) an EEG recording during a spatial bisection task with coherent or conflicting spatiotemporal information; ii) auditory vertical and horizontal localization paradigms in which participants indicated the final position of a moving sound source; iii) proprioceptive-motor paradigms in which participants discriminated the end point of arm movements. The training consisted of specific exercises based on upper-limb movements with auditory feedback from a bracelet device and auditory paths. Our findings suggest that training produces a beneficial effect on some spatial competencies and tends to induce a cortical reorganization of occipital areas toward sensitivity to spatial, instead of temporal, coordinates of sounds.


Blindness; Sound Localization; Adult; Humans; Vision, Ocular; Feedback, Sensory; Cues; Movement
12.
Transl Psychiatry ; 12(1): 331, 2022 08 12.
Article En | MEDLINE | ID: mdl-35961974

It has been widely demonstrated that time processing is altered in patients with schizophrenia. This perspective review delves into this temporal deficit and highlights its link to low-level sensory alterations, which are often overlooked in rehabilitation protocols for psychosis. However, if temporal impairment at the sensory level is inherent to the disease, new interventions should focus on this dimension. Beyond more traditional types of intervention, here we review the most recent digital technologies for rehabilitation and the most promising ones for sensory training. The overall aim is to synthesise the existing literature on time in schizophrenia, linking psychopathology, psychophysics, and technology to help future developments.


Psychotic Disorders; Schizophrenia; Humans; Psychopathology; Psychophysics; Technology
13.
Children (Basel) ; 9(7)2022 Jul 15.
Article En | MEDLINE | ID: mdl-35884039

Spatial representation is a crucial skill for everyday interaction with the environment. Different factors seem to influence spatial perception, such as body movements and vision. However, it is still unknown whether motor impairment affects the building of simple spatial representations. To investigate this point, we tested hemiplegic children with (HV) and without (H) visual field disorders in auditory and visual-spatial localization and pitch discrimination tasks. Fifteen hemiplegic children (nine H and six HV) and twenty children with typical development took part in the experiment. The tasks consisted of listening to a sound coming from a series of speakers positioned at the front or back of the subject. In one condition, subjects were asked to discriminate the pitch; in the other, subjects had to localize the position of the sound. We also replicated the spatial task in the visual modality. Both groups of hemiplegic children performed worse in the auditory spatial localization task than controls, while no difference was found in the pitch discrimination task. In the visual-spatial localization task, only HV children differed from the other two groups. These results suggest that movement is important for the development of auditory spatial representation.

14.
J Clin Sleep Med ; 18(8): 2051-2062, 2022 08 01.
Article En | MEDLINE | ID: mdl-35499135

The mechanisms involved in the origin of dreams remain one of the great unknowns in science. In the 21st century, studies in the field have focused on 3 main topics: functional networks that underlie dreaming, neural correlates of dream contents, and signal propagation. We review neuroscientific studies about dreaming processes, focusing on their cortical correlations. The involvement of frontoparietal regions in the dream-retrieval process allows us to discuss it in light of the Global Workspace theory of consciousness. However, dreaming in distinct sleep stages maintains relevant differences, suggesting that multiple generators are implicated. Then, given the strong influence of light perception on sleep regulation and the mostly visual content of dreams, we investigate the effect of blindness on the organization of dreams. Blind individuals represent a worthwhile population to clarify the role of perceptual systems in dream generation, and to make inferences about their top-down and/or bottom-up origin. Indeed, congenitally blind people maintain the ability to produce visual dreams, suggesting that bottom-up mechanisms could be associated with innate body schemes or multisensory integration processes. Finally, we propose the new dream-engineering technique as a tool to clarify the mechanisms of multisensory integration during sleep and related mental activity, presenting possible implications for rehabilitation in sensory-impaired individuals. The Theory of Proto-consciousness suggests that the interaction of brain states underlying waking and dreaming ensures the optimal functioning of both. Therefore, by understanding the origin of dreams and the capabilities of our brain during a dreamlike state, we could introduce dreaming as a rehabilitative tool. CITATION: Vitali H, Campus C, De Giorgis V, Signorini S, Gori M. The vision of dreams: from ontogeny to dream engineering in blindness. J Clin Sleep Med. 2022;18(8):2051-2062.


Dreams; Sleep; Blindness; Brain; Dreams/physiology; Humans
15.
Hear Res ; 417: 108468, 2022 04.
Article En | MEDLINE | ID: mdl-35220107

The distance of sound sources relative to the body can be estimated using acoustic level and direct-to-reverberant ratio cues. However, the ability to do this may differ for sounds in front of compared to behind the listener. One reason for this is that vision, which plays an important role in calibrating auditory distance cues early in life, is unavailable for rear space. Furthermore, the filtering of sounds by the pinnae differs depending on whether they originate from the front or the back. We investigated auditory distance discrimination in front and rear space by comparing performance on an auditory spatial bisection task in distance and on a minimum audible distance discrimination (MADD) task. In the bisection task, participants heard three successive bursts of noise at three different distances and indicated whether the second sound (probe) was closer in space to the first or the third sound (references). In the MADD task, participants reported which of two successive sounds was closer. An analysis of variance with factors task and region of space showed worse performance for rear than for front space, but no significant interaction between task and region of space. For the bisection task, the point of subjective equality (PSE) was slightly biased towards the body, but the absolute magnitude of the PSE did not differ between front and rear space. These results are consistent with the hypothesis that visual information is important in calibrating the auditory representation of distance in front space early in life.
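Of the two distance cues named above, the level cue follows the inverse-square law for the direct sound: roughly a 6 dB drop per doubling of distance in the free field. A small sketch with illustrative reference values (hypothetical parameters, not taken from the study):

```python
import math

REF_LEVEL_DB = 70.0   # hypothetical source level at the reference distance
REF_DIST_M = 1.0      # hypothetical reference distance (metres)

def level_at_distance(d):
    # inverse-square law for the direct sound: -20*log10(d/d_ref) dB,
    # i.e. about a 6 dB drop per doubling of distance
    return REF_LEVEL_DB - 20.0 * math.log10(d / REF_DIST_M)

def distance_from_level(level):
    # invert the level cue to recover the source distance
    return REF_DIST_M * 10.0 ** ((REF_LEVEL_DB - level) / 20.0)

d = distance_from_level(level_at_distance(2.0))   # round-trip recovers 2.0 m
```

The inversion only works if the reference level is known or calibrated, which is one reason early visual calibration of this cue matters.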


Distance Perception; Sound Localization; Acoustic Stimulation; Auditory Perception; Cues; Humans; Sound; Space Perception
16.
Brain Res ; 1776: 147744, 2022 02 01.
Article En | MEDLINE | ID: mdl-34848173

When a brief flash is presented aligned with a moving target, the flash typically appears to lag behind the moving stimulus. This effect is widely known in the literature as the flash-lag illusion (FLI). The flash-lag is an example of a motion-induced position shift. Since auditory deprivation leads to both enhanced visual skills and impaired temporal abilities, both crucial for perception of the flash-lag effect, here we hypothesized that lack of audition could influence the FLI. Thirteen early deaf and 18 hearing individuals were tested in a visual FLI paradigm to investigate this hypothesis. As expected, results demonstrated a reduction of the flash-lag effect following early deafness, in both the central and peripheral visual fields. Moreover, only in deaf individuals was there a positive correlation between the flash-lag effect in the peripheral and central visual fields, suggesting that the mechanisms underlying the effect in the center of the visual field expand to the periphery following deafness. Overall, these findings reveal that lack of audition early in life profoundly impacts the early visual processing underlying the flash-lag effect.


Deafness/physiopathology; Illusions/physiology; Visual Perception/physiology; Adult; Female; Humans; Male; Middle Aged; Reaction Time/physiology; Visual Fields/physiology; Young Adult
17.
Annu Int Conf IEEE Eng Med Biol Soc ; 2021: 4781-4786, 2021 11.
Article En | MEDLINE | ID: mdl-34892280

The present work introduces a novel robotic platform suitable for investigating perception in multisensory motion tasks in individuals with and without sensory and motor disabilities. The system, called RoMAT, allows the study of how multisensory signals are integrated, taking into account the speed and direction of the stimuli. It is a robotic platform composed of a visual and a tactile wheel mounted on two rotatable plates, moved under the participant's finger and in the participant's view. We validated the system by implementing a rotation discrimination task with three sensory conditions: visual, tactile, and multisensory visual-tactile. Four healthy subjects were asked to report the extent of the rotation after perceiving a moving stimulus generated by visual, tactile, or combined stimulation. Results suggest that precision improves when multiple sensory stimuli are presented. The new system can therefore provide fundamental input for determining the perceptual principles of motion processing, and it can be a potential system for designing screening and rehabilitation protocols, based on neuroscientific findings, for use in individuals with visual and motor impairments. Clinical relevance - This research presents a novel robotic motion simulator to deliver combined or independent stimulation of visual and tactile sensory signals.


Robotics; Touch Perception; Humans; Motion; Touch; Visual Perception
18.
PLoS One ; 16(9): e0257676, 2021.
Article En | MEDLINE | ID: mdl-34551010

Multisensory experience is crucial for developing a coherent perception of the world. In this context, vision and audition are essential tools to scaffold spatial and temporal representations, respectively. Since speed encompasses both space and time, investigating this dimension in blindness allows deepening our understanding of the relationship between sensory modalities and the two representation domains. In the present study, we hypothesized that visual deprivation influences the use of the spatial and temporal cues underlying acoustic speed perception. To this end, ten early blind and ten blindfolded sighted participants performed a speed discrimination task in which spatial, temporal, or both cues were available to infer moving sounds' velocity. The results indicated that both sighted and early blind participants preferentially relied on temporal cues to determine stimulus speed, following an assumption that identifies sounds with a shorter duration as faster. However, in some cases this temporal assumption produced a misperception of stimulus speed that negatively affected participants' performance. Interestingly, early blind participants were more influenced by this misleading temporal assumption than sighted controls, resulting in a stronger impairment in speed discrimination performance. These findings demonstrate that the absence of visual experience in early life increases the auditory system's preference for the time domain and, consequently, affects the perception of speed through audition.


Auditory Perception; Visually Impaired Persons; Adult; Humans; Male; Sound Localization
19.
Curr Biol ; 31(22): 5093-5101.e5, 2021 11 22.
Article En | MEDLINE | ID: mdl-34555348

Congenitally blind infants are not only deprived of visual input but also of visual influences on the intact senses. The important role that vision plays in the early development of multisensory spatial perception1-7 (e.g., in crossmodal calibration8-10 and in the formation of multisensory spatial representations of the body and the world1,2) raises the possibility that impairments in spatial perception are at the heart of the wide range of difficulties that visually impaired infants show across spatial,8-12 motor,13-17 and social domains.8,18,19 But investigations of early development are needed to clarify how visually impaired infants' spatial hearing and touch support their emerging ability to make sense of their body and the outside world. We compared sighted (S) and severely visually impaired (SVI) infants' responses to auditory and tactile stimuli presented on their hands. No statistically reliable differences in the direction or latency of responses to auditory stimuli emerged, but significant group differences were found in responses to tactile and audiotactile stimuli. The visually impaired infants showed attenuated audiotactile spatial integration and interference, weighted tactile cues more than auditory cues when the two were presented in conflict, and showed a more limited influence of representations of the external layout of the body on tactile spatial perception.20 These findings uncover a distinct phenotype of multisensory spatial perception in early postnatal visual deprivation. Importantly, evidence of audiotactile spatial integration in visually impaired infants, albeit to a lesser degree than in sighted infants, signals the potential of multisensory rehabilitation methods in early development. VIDEO ABSTRACT.


Space Perception; Touch Perception; Auditory Perception/physiology; Blindness; Hand/physiology; Humans; Space Perception/physiology; Touch/physiology; Touch Perception/physiology; Visual Perception
20.
Exp Brain Res ; 239(10): 3123-3132, 2021 Oct.
Article En | MEDLINE | ID: mdl-34415367

The human brain creates a representation of the external world based on magnitude judgments, estimating distance, numerosity, or size. Magnitude and spatial representation are hypothesized to rely on common mechanisms shared by different sensory modalities. We explored the relationship between magnitude and spatial representation using two different sensory systems, hypothesizing that the interaction between space and magnitude is combined differently depending on sensory modality. Furthermore, we aimed to understand the role of the spatial reference frame in magnitude representation. We used stimulus-response compatibility (SRC) to investigate these processes, assuming that performance improves if stimulus and response share common features. We designed auditory and tactile SRC tasks with conflicting spatial and magnitude mappings. Our results showed that sensory modality modulates the relationship between space and magnitude. A larger effect of magnitude over spatial congruency occurred in the tactile task, whereas magnitude and space carried similar weight in the auditory task, with neither spatial congruency nor magnitude congruency having a significant effect. Moreover, we observed that the spatial frame activated during the tasks was elicited by the sensory inputs. Participants' performance in the tactile task was reversed between uncrossed and crossed hand postures, suggesting an internal coordinate system, whereas crossing the hands did not alter performance in the auditory task (i.e., consistent with an allocentric frame of reference). Overall, these results suggest that the space-magnitude interaction differs between the auditory and tactile modalities, supporting the idea that these sensory modalities use different magnitude and spatial representation mechanisms.


Touch Perception; Hand; Humans; Judgment; Posture; Space Perception; Touch
...