Results 1 - 20 of 1,043
1.
Article in English | MEDLINE | ID: mdl-39350457

ABSTRACT

INTRODUCTION: Individuals with schizophrenia present anomalies in the extension and plasticity of the peripersonal space (PPS), the section of space surrounding the body, which is shaped through motor experience. Weak multisensory integration in the PPS would contribute to impaired self-embodiment processing, a core feature of the disorder linked to specific subjective experiences. In this exploratory study, we aimed to: (1) test an association between PPS features, psychopathology, and subjective experiences in schizophrenia; and (2) describe the PPS profile of individuals with early-onset schizophrenia. MATERIALS AND METHODS: Twenty-seven individuals with schizophrenia underwent a task measuring PPS size and boundary demarcation before and after motor training with a tool. The Positive And Negative Syndrome Scale (PANSS), the Examination of Anomalous Self Experience scale (EASE), and the Autism Rating Scale (ARS) were used to assess psychopathology. Participants were then divided into two subgroups, early- and adult-onset schizophrenia, and the two groups were compared with regard to their PPS and psychopathological profiles. RESULTS: PPS patterns were associated with psychopathology: positively with the PANSS negative scale score, and negatively with subjective experiences of existential reorientation (EASE Domain 5 scores) and of social encounters (ARS scores). Only PPS parameters and ARS scores differentiated early- from adult-onset participants. CONCLUSIONS: Although preliminary and exploratory, our results suggest a link between PPS patterns, negative symptoms, and disturbances of subjective experience, particularly in the intersubjective domain, in schizophrenia. They also suggest that specific PPS profiles and schizophrenic autism traits could be markers of early-onset schizophrenia.

2.
J Neuroeng Rehabil ; 21(1): 155, 2024 Sep 09.
Article in English | MEDLINE | ID: mdl-39252006

ABSTRACT

BACKGROUND: Planning and executing movements requires the integration of different sensory modalities, such as vision and proprioception. Neurological diseases like stroke, however, can lead to full or partial loss of proprioception, resulting in impaired movements. Recent work has focused on providing additional sensory feedback to patients to compensate for the sensory loss; vibrotactile stimulation has proved a viable option, as it is inexpensive and easy to implement. Here, we test how such vibrotactile information can be integrated with visual signals to estimate the spatial location of a reach target. METHODS: We used a center-out reach paradigm with 31 healthy human participants to investigate how artificial vibrotactile stimulation can be integrated with visual-spatial cues indicating target location. Specifically, we provided multisite vibrotactile stimulation to the moving dominant arm using eccentric rotating mass (ERM) motors. As the integration of inputs across multiple sensory modalities becomes especially relevant when one of them is uncertain, we additionally modulated the reliability of the visual cues. We then compared the weighting of vibrotactile and visual inputs as a function of visual uncertainty to predictions from the maximum likelihood estimation (MLE) framework to decide whether participants achieve quasi-optimal integration. RESULTS: Our results show that participants could estimate target locations based on vibrotactile instructions. After short training, combined visual and vibrotactile cues led to higher hit rates and reduced reach errors when visual cues were uncertain. Additionally, we observed lower reaction times in trials with low visual uncertainty when vibrotactile stimulation was present. Using MLE predictions, we found that integration of vibrotactile and visual cues followed optimal integration when the vibrotactile cues required detecting one or two active motors. However, if estimating the target location required discriminating the intensities of two cues, integration violated MLE predictions. CONCLUSION: We conclude that participants can quickly learn to integrate visual and artificial vibrotactile information. Additional vibrotactile stimulation may therefore serve as a promising way to improve rehabilitation or the control of prosthetic devices for patients suffering loss of proprioception.
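The MLE framework invoked above predicts that each cue is weighted by its reliability (inverse variance), and that the combined estimate is never noisier than the best single cue. A minimal sketch of that prediction, with hypothetical cue values rather than the study's data:

```python
import numpy as np

def mle_combine(mu_v, sigma_v, mu_t, sigma_t):
    """Combine a visual and a vibrotactile location estimate under the
    maximum likelihood estimation (MLE) model: each cue is weighted by
    its inverse variance (its reliability)."""
    w_v = (1 / sigma_v**2) / (1 / sigma_v**2 + 1 / sigma_t**2)
    w_t = 1 - w_v
    mu_combined = w_v * mu_v + w_t * mu_t
    # Predicted variance of the combined estimate is smaller than the
    # variance of either single cue.
    sigma_combined = np.sqrt((sigma_v**2 * sigma_t**2) / (sigma_v**2 + sigma_t**2))
    return mu_combined, sigma_combined

# Example: an uncertain visual cue (sigma = 4) and a sharper
# vibrotactile cue (sigma = 2) indicating slightly different locations.
mu, sigma = mle_combine(mu_v=10.0, sigma_v=4.0, mu_t=12.0, sigma_t=2.0)
```

With these numbers the combined estimate (11.6) lies closer to the more reliable vibrotactile cue, which is the signature of quasi-optimal integration the study tested for.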


Subject(s)
Cues , Psychomotor Performance , Vibration , Visual Perception , Humans , Male , Female , Adult , Visual Perception/physiology , Psychomotor Performance/physiology , Young Adult , Feedback, Sensory/physiology , Proprioception/physiology , Touch Perception/physiology , Uncertainty , Physical Stimulation/methods , Space Perception/physiology , Movement/physiology
3.
Sci Rep ; 14(1): 20923, 2024 09 09.
Article in English | MEDLINE | ID: mdl-39251764

ABSTRACT

Does congruence between auditory and visual modalities affect aesthetic experience? While cross-modal correspondences between vision and hearing are well documented, previous studies show conflicting results regarding whether audiovisual correspondence affects subjective aesthetic experience. Here, in collaboration with the Kentler International Drawing Space (NYC, USA), we depart from previous research by using music specifically composed to pair with visual art in the professionally curated Music as Image and Metaphor exhibition. Our pre-registered online experiment consisted of four conditions: Audio, Visual, Audio-Visual-Intended (artist-intended pairing of art/music), and Audio-Visual-Random (random shuffling). Participants (N = 201) were presented with 16 pieces and could click to proceed to the next piece whenever they liked. We used time spent as an implicit index of aesthetic interest. Additionally, after each piece, participants were asked about their subjective experience (e.g., feeling moved). We found that participants spent significantly more time with Audio pieces, followed by Audio-Visual, followed by Visual; however, they felt most moved in the Audio-Visual (bimodal) conditions. Ratings of audiovisual correspondence were significantly higher for the Audio-Visual-Intended than for the Audio-Visual-Random condition; interestingly, though, there were no significant differences between the intended and random conditions on any other subjective rating scale, or in time spent. Collectively, these results call into question the relationship between cross-modal correspondence and aesthetic appreciation. Additionally, the results complicate the use of time spent as an implicit measure of aesthetic experience.


Subject(s)
Auditory Perception , Esthetics , Music , Visual Perception , Humans , Music/psychology , Female , Esthetics/psychology , Male , Adult , Visual Perception/physiology , Auditory Perception/physiology , Young Adult , Art , Photic Stimulation , Acoustic Stimulation , Adolescent
4.
J Exp Child Psychol ; 247: 106040, 2024 Nov.
Article in English | MEDLINE | ID: mdl-39142077

ABSTRACT

It is well accepted that multisensory integration (MSI) undergoes protracted maturation from childhood to adulthood. However, existing evidence may have been confounded by potential age-related differences in attention. To unveil neurodevelopmental changes in MSI while matching top-down attention between children and adults, we recorded event-related potentials of healthy children aged 7 to 9 years and young adults in a visual-to-auditory attentional spreading paradigm in which attention and MSI could be measured concurrently. The absence of child-versus-adult differences in the visual selection negativity component and in behavioral measures of auditory interference first demonstrates that the child group could maintain top-down visual attention and ignore task-irrelevant auditory information to a similar extent as adults. The stimulus-driven attentional spreading quantified by the auditory negative difference (Nd) component was then found to be absent overall in the child group, revealing a largely immature audiovisual binding process in children. These findings furnish strong evidence for the protracted maturation of MSI per se from childhood to adulthood, providing a new benchmark for characterizing the developmental course of MSI. In addition, we found that the representation-driven attentional spreading measured by another Nd was present but less robust in children, suggesting a substantially, but not fully, developed audiovisual representation coactivation process.


Subject(s)
Attention , Auditory Perception , Evoked Potentials , Visual Perception , Humans , Attention/physiology , Child , Female , Male , Auditory Perception/physiology , Visual Perception/physiology , Young Adult , Evoked Potentials/physiology , Adult , Electroencephalography , Child Development/physiology , Acoustic Stimulation , Age Factors , Photic Stimulation
5.
Res Dev Disabil ; 153: 104813, 2024 Oct.
Article in English | MEDLINE | ID: mdl-39163725

ABSTRACT

Developmental dyslexia is characterized by difficulties in learning to read, affecting cognition and causing failure at school. Interventions for children with developmental dyslexia have focused on improving linguistic capabilities (phonics, orthographic and morphological instruction), but developmental dyslexia is accompanied by a wide variety of sensorimotor impairments. The goal of this study was to examine the effects of a proprioceptive intervention on reading performance and eye movements in children with developmental dyslexia. Nineteen children diagnosed with developmental dyslexia were randomly assigned to regular Speech Therapy (ST) or to a Proprioceptive and Speech Intervention (PSI), in which they received both the usual speech therapy and a proprioceptive intervention aimed at correcting their sensorimotor impairments (prism glasses, oral neurostimulation, insoles, and breathing instructions). Silent reading performance and eye movements were measured pre- and post-intervention (after nine months). In the PSI group, reading performance improved and eye movements became smoother and faster, reaching values similar to those of children with typical reading performance. Recognition of written words also improved, indicating better lexical access. These results show that PSI might constitute a valuable tool for improving reading in children with developmental dyslexia.


Subject(s)
Dyslexia , Eye Movements , Eye-Tracking Technology , Reading , Humans , Dyslexia/rehabilitation , Dyslexia/physiopathology , Dyslexia/therapy , Child , Male , Female , Eye Movements/physiology , Proprioception/physiology , Sensory Aids
6.
J Vestib Res ; 2024 Aug 14.
Article in English | MEDLINE | ID: mdl-39150839

ABSTRACT

BACKGROUND: Flight simulators have an essential role in aircrew training. Occasionally, symptoms of motion sickness, defined as simulator sickness, develop during training sessions. The reported incidence of simulator sickness ranges widely across studies. OBJECTIVE: The aims of this study were to calculate the incidence of, and to define a threshold value for, simulator sickness among rotary-wing pilots using the validated Simulator Sickness Questionnaire (SSQ). METHODS: CH-53 and UH-60 helicopter pilots who trained in helicopter simulators in the Israeli Air Force were asked to complete the SSQ. A score of 20 on the SSQ was defined as the threshold for simulator sickness. Simulator sickness incidence and the average SSQ score were calculated. Correlations of age and simulator training hours with SSQ scores were analyzed. RESULTS: A total of 207 rotary-wing aircrew participated in the study. Simulator sickness was experienced by 51.7% of trainees. The average SSQ score was 32.7. A significant negative correlation was found between age and SSQ score. CONCLUSIONS: Simulator sickness was experienced by more than half of helicopter pilots. A score of 20 on the SSQ was found to be a suitable threshold for this condition.
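The abstract does not spell out how SSQ scores are computed; the conventional scoring (Kennedy et al., 1993) sums 16 symptoms rated 0-3 into three overlapping subscales and applies fixed weights. A sketch under the assumption that this study used the standard weights:

```python
def ssq_scores(nausea_raw, oculomotor_raw, disorientation_raw):
    """Compute Simulator Sickness Questionnaire (SSQ) subscale and total
    scores from raw symptom sums, using the standard weights of
    Kennedy et al. (1993). Raw sums come from 16 symptoms rated 0-3,
    each assigned to one or more of the three overlapping subscales."""
    return {
        "nausea": nausea_raw * 9.54,
        "oculomotor": oculomotor_raw * 7.58,
        "disorientation": disorientation_raw * 13.92,
        "total": (nausea_raw + oculomotor_raw + disorientation_raw) * 3.74,
    }

# A hypothetical trainee with mild symptoms: a total above the study's
# threshold of 20 would count as simulator sickness.
scores = ssq_scores(nausea_raw=2, oculomotor_raw=3, disorientation_raw=1)
```

With these illustrative raw sums the total is 22.44, just over the threshold of 20 used in the study.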

7.
Acta Psychol (Amst) ; 249: 104386, 2024 Sep.
Article in English | MEDLINE | ID: mdl-39174407

ABSTRACT

Virtual Reality has significantly improved the understanding of body experience through techniques such as Body Illusion, which allows individuals to perceive an artificial body as their own, changing perceptual and affective components of body experience. Prior research has predominantly focused on female participants, leaving the impact of Body Illusion on males less well understood. This study seeks to fill this gap by examining the nuanced bodily experiences of men in comparison to women. Forty participants (20 females and 20 males) underwent synchronous and asynchronous visuo-tactile Body Illusion to explore changes in body satisfaction and body size estimation across three critical areas: shoulders, hips, and waist. Results revealed significant initial disparities, with females displaying greater body dissatisfaction and a tendency to overestimate body size. After Body Illusion, females adjusted the perceived size of their hips closer to that of the virtual body and reported increased body satisfaction independent of condition. Conversely, males showed changes in waist size estimation only after synchronous stimulation, without significant shifts in body satisfaction. These results suggest a higher sensitivity of women to embodied experiences, potentially due to societal influences and a greater inclination towards self-objectification. These insights pave the way for more refined and effective interventions for body image issues, highlighting the importance of incorporating gender-specific considerations in VR-based prevention and therapeutic programs.


Subject(s)
Body Image , Illusions , Virtual Reality , Humans , Male , Female , Body Image/psychology , Illusions/physiology , Adult , Young Adult , Sex Factors , Personal Satisfaction , Body Size/physiology , Sex Characteristics , Body Dissatisfaction
8.
Curr Biol ; 34(18): 4091-4103.e4, 2024 Sep 23.
Article in English | MEDLINE | ID: mdl-39216484

ABSTRACT

Male mosquitoes form aerial aggregations, known as swarms, to attract females and maximize their chances of finding a mate. Within these swarms, individuals must be able to recognize potential mates and navigate the social environment to successfully intercept a mating partner. Prior research has almost exclusively focused on the role of acoustic cues in mediating the male mosquito's ability to recognize and pursue females. However, the role of other sensory modalities in this behavior has not been explored. Moreover, how males avoid collisions with one another in the swarm while pursuing females remains poorly understood. In this study, we combined free-flight and tethered-flight simulator experiments to demonstrate that swarming Anopheles coluzzii mosquitoes integrate visual and acoustic information to track conspecifics and avoid collisions. Our tethered experiments revealed that acoustic stimuli gated mosquito steering responses to visual objects simulating nearby mosquitoes, especially in males that exhibited a strong response toward visual objects in the presence of female flight tones. Additionally, we observed that visual cues alone could trigger changes in mosquitoes' wingbeat amplitude and frequency. These findings were corroborated by our free-flight experiments, which revealed that Anopheles coluzzii modulate their thrust-based flight responses to nearby conspecifics in a similar manner to tethered animals, potentially allowing for collision avoidance within swarms. Together, these results demonstrate that both males and females integrate multiple sensory inputs to mediate swarming behavior, and for males, the change in flight kinematics in response to multimodal cues might allow them to simultaneously track females while avoiding collisions.


Subject(s)
Anopheles , Cues , Animals , Male , Anopheles/physiology , Female , Flight, Animal/physiology , Sexual Behavior, Animal/physiology , Auditory Perception/physiology , Visual Perception/physiology
9.
Front Pain Res (Lausanne) ; 5: 1414927, 2024.
Article in English | MEDLINE | ID: mdl-39119526

ABSTRACT

Our mental representation of our body depends on integrating various sensory modalities, such as tactile information. In tactile distance estimation (TDE) tasks, participants must estimate the distance between two tactile tips applied to their skin. This measure of tactile perception has been linked to body representation assessments. Studies in individuals with fibromyalgia (FM), a chronic widespread pain syndrome, suggest the presence of body representation distortions and tactile alterations, but TDE has never been examined in this population. Twenty participants with FM and 24 pain-free controls performed a TDE task on three Body regions (upper limb, trunk, lower limb), in which they manually estimated the interstimuli distance on a tablet. TDE error, the absolute difference between the estimation and the interstimuli distance, was not different between the Groups, on any Body region. Drawings of their body as they felt it revealed clear and frequent distortions of body representation in the group with FM, compared to negligible perturbations in controls. This contrast between distorted body drawings and unaltered TDE suggests a preserved integration of tactile information but an altered integration of this information with other sensory modalities to generate a precise and accurate body representation. Future research should investigate the relative contribution of each sensory information and prior knowledge about the body in body representation in individuals with FM to shed light on the observed distortions.
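The TDE error described above is the absolute difference between each manual estimate and the true interstimuli distance. A minimal sketch of that measure, averaged over trials; the values are hypothetical, not the study's data:

```python
def tde_error(estimates_mm, actual_mm):
    """Tactile distance estimation (TDE) error: mean absolute difference
    between each estimated distance and the true interstimuli distance
    (both in millimetres)."""
    return sum(abs(e, ) if False else abs(e - a) for e, a in zip(estimates_mm, actual_mm)) / len(estimates_mm)

# Hypothetical trials: distances drawn on the tablet vs. true tip spacing.
err = tde_error([42.0, 55.0, 38.0], [40.0, 60.0, 40.0])
```

Lower values indicate better tactile distance estimation; the study found no group difference on this measure despite distorted body drawings in the FM group.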

10.
Front Psychol ; 15: 1396946, 2024.
Article in English | MEDLINE | ID: mdl-39091706

ABSTRACT

Introduction: The prevailing theories of consciousness consider the integration of different sensory stimuli a key component for this phenomenon to arise at the brain level. Although many theories and models have been proposed for multisensory integration between supraliminal stimuli (e.g., the optimal integration model), we do not know whether multisensory integration also occurs for subliminal stimuli, nor what psychophysical mechanisms it follows. Methods: To investigate this, subjects were exposed to visual (Virtual Reality) and/or haptic (Electro-Cutaneous Stimulation) stimuli above or below their perceptual threshold. They had to discriminate, in a two-alternative forced choice task, the intensity of unimodal and/or bimodal stimuli. They were then asked to discriminate the sensory modality while their EEG responses were recorded. Results: We found evidence of multisensory integration in the supraliminal condition, following the classical optimal model. Importantly, even in subliminal trials, participants' performance in the bimodal condition was significantly more accurate when discriminating the intensity of the stimulation. Moreover, significant differences emerged between unimodal and bimodal activity templates in parieto-temporal areas known for their integrative role. Discussion: This converging evidence, although preliminary and in need of confirmation with further data, suggests that subliminal multimodal stimuli can be integrated, thus filling a meaningful gap in the debate about the relationship between consciousness and multisensory integration.

11.
Front Psychol ; 15: 1353490, 2024.
Article in English | MEDLINE | ID: mdl-39156805

ABSTRACT

People can use their sense of hearing to discern thermal properties, though they are for the most part unaware that they can do so. While people unequivocally claim that they cannot perceive the temperature of water through the auditory properties of hearing it being poured, our research further strengthens the evidence that they can. This multimodal ability is implicitly acquired in humans, likely through perceptual learning over a lifetime of exposure to the differences in the physical attributes of pouring water. In this study, we explore people's perception of this intriguing cross-modal correspondence and investigate the psychophysical foundations of this complex ecological mapping by employing machine learning. Our results show that not only can humans in practice classify the auditory properties of pouring water, but the physical characteristics underlying this phenomenon can also be classified by a pre-trained deep neural network.

12.
Hum Brain Mapp ; 45(12): e70009, 2024 Aug 15.
Article in English | MEDLINE | ID: mdl-39185690

ABSTRACT

Attention and crossmodal interactions are closely linked through a complex interplay at different stages of sensory processing. Within the context of motion perception, previous research revealed that attentional demands alter audiovisual interactions in the temporal domain. In the present study, we aimed to understand the neurophysiological correlates of these attentional modulations. We utilized an audiovisual motion paradigm that elicits auditory time interval effects on perceived visual speed. The audiovisual interactions in the temporal domain were quantified by changes in perceived visual speed across different auditory time intervals. We manipulated attentional demands in the visual field by having a secondary task on a stationary object (i.e., single- vs. dual-task conditions). When the attentional demands were high (i.e., dual-task condition), there was a significant decrease in the effects of auditory time interval on perceived visual speed, suggesting a reduction in audiovisual interactions. Moreover, we found significant differences in both early and late neural activities elicited by visual stimuli across task conditions (single vs. dual), reflecting an overall increase in attentional demands in the visual field. Consistent with the changes in perceived visual speed, the audiovisual interactions in neural signals declined in the late positive component range. Compared with the findings from previous studies using different paradigms, our findings support the view that attentional modulations of crossmodal interactions are not unitary and depend on task-specific components. They also have important implications for motion processing and speed estimation in daily life situations where sensory relevance and attentional demands constantly change.


Subject(s)
Attention , Auditory Perception , Electroencephalography , Photic Stimulation , Visual Fields , Humans , Attention/physiology , Male , Female , Young Adult , Adult , Auditory Perception/physiology , Visual Fields/physiology , Photic Stimulation/methods , Motion Perception/physiology , Acoustic Stimulation , Visual Perception/physiology , Brain Mapping , Brain/physiology
13.
Front Neurosci ; 18: 1390696, 2024.
Article in English | MEDLINE | ID: mdl-39161654

ABSTRACT

Background: Deficits in Multisensory Integration (MSI) in ASD have been reported repeatedly and have been suggested to be caused by altered long-range connectivity. Here we investigate behavioral and ERP correlates of MSI in ASD using ecologically valid videos of emotional expressions. Methods: In the present study, we set out to investigate the electrophysiological correlates of audiovisual MSI in young autistic and neurotypical adolescents. We employed dynamic stimuli of high ecological validity (500 ms clips produced by actors) that depicted fear or disgust in unimodal (visual and auditory), and bimodal (audiovisual) conditions. Results: We report robust MSI effects at both the behavioral and electrophysiological levels and pronounced differences between autistic and neurotypical participants. Specifically, neurotypical controls showed robust behavioral MSI for both emotions as seen through a significant speed-up of bimodal response time (RT), confirmed by Miller's Race Model Inequality (RMI), with greater MSI effects for fear than disgust. Adolescents with ASD, by contrast, showed behavioral MSI only for fear. At the electrophysiological level, the bimodal condition as compared to the unimodal conditions reduced the amplitudes of the visual P100 and auditory P200 and increased the amplitude of the visual N170 regardless of group. Furthermore, a cluster-based analysis across all electrodes revealed that adolescents with ASD showed an overall delayed and spatially constrained MSI effect compared to controls. Conclusion: Given that the variables we measured reflect attention, our findings suggest that MSI can be modulated by the differential effects on attention that fear and disgust produce. We also argue that the MSI deficits seen in autistic individuals can be compensated for at later processing stages by (a) the attention-orienting effects of fear, at the behavioral level, and (b) at the electrophysiological level via increased attentional effort.
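Miller's Race Model Inequality, used above to confirm behavioral MSI, states that the bimodal RT distribution can exceed the summed unimodal distributions only if the two channels coactivate rather than merely race. A sketch of the test on empirical RT distributions; the toy RTs and function names are illustrative, not the authors' code:

```python
import numpy as np

def ecdf(rts, t_grid):
    """Empirical cumulative RT distribution P(RT <= t), evaluated on a
    grid of times."""
    rts = np.asarray(rts, dtype=float)
    t_grid = np.asarray(t_grid, dtype=float)
    return np.mean(rts[:, None] <= t_grid[None, :], axis=0)

def race_model_violation(rt_av, rt_a, rt_v, t_grid):
    """Miller's Race Model Inequality: F_AV(t) <= F_A(t) + F_V(t).
    Returns the positive part of F_AV - min(F_A + F_V, 1); any value
    above zero is a violation, i.e. evidence for multisensory
    coactivation rather than a simple race between unimodal channels."""
    bound = np.minimum(ecdf(rt_a, t_grid) + ecdf(rt_v, t_grid), 1.0)
    return np.maximum(ecdf(rt_av, t_grid) - bound, 0.0)

# Toy RTs (ms): bimodal responses faster than either unimodal set, so
# the inequality is violated at early time points.
viol = race_model_violation(
    rt_av=[250, 260, 270], rt_a=[300, 320, 340], rt_v=[310, 330, 350],
    t_grid=[265, 325],
)
```

A violation concentrated at the fast end of the RT distribution is the standard RMI signature of multisensory speed-up.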

14.
eNeuro ; 11(9)2024 Sep.
Article in English | MEDLINE | ID: mdl-39147580

ABSTRACT

The accurate estimation of limb state is necessary for movement planning and execution. While state estimation requires both feedforward and feedback information, we focus here on the latter. Prior literature has shown that integrating visual and proprioceptive feedback improves estimates of static limb position. However, differences in visual and proprioceptive feedback delays suggest that multisensory integration could be disadvantageous when the limb is moving. We formalized this hypothesis by modeling feedback-based state estimation using the long-standing maximum likelihood estimation model of multisensory integration, which we updated to account for sensory delays. Our model predicted that the benefit of multisensory integration was largely lost when the limb was passively moving. We tested this hypothesis in a series of experiments in human subjects that compared the degree of interference created by discrepant visual or proprioceptive feedback when estimating limb position either statically at the end of the movement or dynamically at movement midpoint. In the static case, we observed significant interference: discrepant feedback in one modality systematically biased sensory estimates based on the other modality. However, no interference was seen in the dynamic case: participants could ignore sensory feedback from one modality and accurately reproduce the motion indicated by the other modality. Together, these findings suggest that the sensory feedback used to compute a state estimate differs depending on whether the limb is stationary or moving. While the former may tend toward multimodal integration, the latter is more likely to be based on feedback from a single sensory modality.
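The abstract says the MLE model was updated to account for sensory delays but does not give the equations. One simple way such an adjustment could look, treating delay during movement as extra spatial uncertainty, is sketched below; the variance-inflation rule and all numbers are assumptions for illustration, not the authors' model:

```python
def effective_variance(sigma, delay_s, speed):
    """Inflate a cue's variance for its feedback delay: while the limb
    moves at `speed` (cm/s), a cue delayed by `delay_s` seconds reports
    a position off by roughly speed * delay_s, adding to the cue's
    spatial uncertainty (sigma in cm)."""
    return sigma**2 + (speed * delay_s) ** 2

def mle_weight_vision(sigma_v, sigma_p, delay_v, delay_p, speed):
    """Weight given to vision in the delay-adjusted MLE combination of
    visual and proprioceptive feedback."""
    var_v = effective_variance(sigma_v, delay_v, speed)
    var_p = effective_variance(sigma_p, delay_p, speed)
    return (1 / var_v) / (1 / var_v + 1 / var_p)

# Static limb (speed = 0): weights reflect only intrinsic noise.
w_static = mle_weight_vision(sigma_v=0.5, sigma_p=1.0,
                             delay_v=0.12, delay_p=0.06, speed=0.0)
# Moving limb: the longer visual delay inflates visual variance, so the
# predicted contribution of vision, and the benefit of combining, shrinks.
w_moving = mle_weight_vision(sigma_v=0.5, sigma_p=1.0,
                             delay_v=0.12, delay_p=0.06, speed=30.0)
```

Under these illustrative parameters vision dominates when the limb is still but loses most of its weight once the limb moves, mirroring the paper's prediction that the benefit of multisensory integration is largely lost during passive movement.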


Subject(s)
Feedback, Sensory , Movement , Proprioception , Humans , Male , Feedback, Sensory/physiology , Female , Proprioception/physiology , Young Adult , Adult , Movement/physiology , Visual Perception/physiology , Psychomotor Performance/physiology
15.
Adv Mater ; 36(36): e2407751, 2024 Sep.
Article in English | MEDLINE | ID: mdl-39011791

ABSTRACT

In the pursuit of artificial neural systems, integrating multimodal plasticity, memory retention, and perceptual functions is a paramount objective for achieving brain-inspired neuromorphic perceptual components and for emulating the excitability tuning observed in the collaboration of human vision and respiration. Here, an artificial visual-respiratory synapse based on monolayer oxidized MXene (VRSOM) is presented, exhibiting synergistic light and atmospheric plasticity. The VRSOM enables facile modulation of synaptic behaviors, encompassing postsynaptic current, sustained photoconductivity, stable facilitation/depression properties, and "learning-experience" behavior. These performances rely on the privileged photocarrier-trapping characteristics and the hydroxyl-preferential selectivity inherent in oxidized vacancies. Moreover, environment recognition and multimodal neural-network image identification are achieved through multisensory integration, underscoring the potential of the VRSOM in reproducing human-like perceptual attributes. The VRSOM platform holds significant promise for hardware implementation of human-like mixed-modal interactions and paves the way for realizing multisensory neural behaviors in artificial interactive devices.


Subject(s)
Synapses , Synapses/physiology , Humans , Oxidation-Reduction , Biomimetic Materials/chemistry , Neural Networks, Computer , Respiration
16.
J Neurophysiol ; 132(2): 544-569, 2024 Aug 01.
Article in English | MEDLINE | ID: mdl-38985936

ABSTRACT

Wide-range thermoreceptive neurons (WRT-EN) in monkey cortical area 7b that encoded innocuous and nocuous cutaneous thermal and threatening visuosensory stimulation with high fidelity were studied to identify their multisensory integrative response properties. Emphasis was given to characterizing the spatial and temporal effects of threatening visuosensory input on the thermal stimulus-response properties of these multisensory nociceptive neurons. Threatening visuosensory stimulation was most efficacious in modulating thermal evoked responses when presented as a downward ("looming"), spatially congruent, approaching and closely proximal target in relation to the somatosensory receptive field. Both temporal alignment and misalignment of spatially aligned threatening visual and thermal stimulation significantly increased mean discharge frequencies above those evoked by thermal stimulation alone, particularly at near noxious (43°C) and mildly noxious (45°C) temperatures. The enhanced multisensory discharge frequencies were equivalent to the discharge frequency evoked by overtly noxious thermal stimulation alone at 47°C (monkey pain tolerance threshold). A significant increase in behavioral mean escape frequency with shorter escape latency was evoked by multisensory stimulation at near noxious temperature (43°C), which was equivalent to that evoked by noxious stimulation alone (47°C). 
The remarkable concordance with which multisensory integration of threatening visual input and near-noxious thermal stimulation elevates both neural discharge and escape frequency, from a nonnociceptive, prepain level to a nociceptive, pain level, reflects an elegantly designed defensive neural mechanism that in effect lowers both the nociceptive response and pain thresholds, preemptively engaging nocifensive behavior and thereby averting impending or actual injurious noxious thermal stimulation.

NEW & NOTEWORTHY: Multisensory nociceptive neurons in cortical area 7b are engaged in the integration of threatening visuosensory input with a wide range of innocuous and nocuous somatosensory (thermoreceptive) inputs. The enhancement of neuronal activity and escape behavior in the monkey by multisensory integration is consistent with and supportive of human psychophysical studies. The spatial features of visuosensory stimulation in peripersonal space in relation to somatic stimulation in personal space are critical to multisensory integration, nociception, nocifensive behavior, and pain.


Subject(s)
Macaca mulatta , Nociceptors , Animals , Nociceptors/physiology , Male , Nociception/physiology , Hot Temperature , Visual Perception/physiology , Pain Threshold/physiology , Photic Stimulation , Escape Reaction/physiology , Thermoreceptors/physiology
17.
Neurosci Biobehav Rev ; 164: 105814, 2024 Sep.
Article in English | MEDLINE | ID: mdl-39032842

ABSTRACT

Visuomanual prism adaptation (PA), which consists of pointing to visual targets while wearing prisms that shift the visual field, is one of the oldest experimental paradigms used to investigate sensorimotor plasticity. Since the 2000s, scientific interest has grown in extending PA to cognitive functions across several sensory modalities. The present work focuses on the aftereffects of PA within the auditory modality. Recent studies showed changes in the mental representation of auditory frequencies and a shift of divided auditory attention following PA. Moreover, one study demonstrated benefits of PA in a patient suffering from tinnitus. In light of these results, we address the following question: how is it possible to modulate audition by inducing sensorimotor plasticity with glasses? Based on the literature, we suggest a bottom-up attentional mechanism involving cerebellar, parietal, and temporal structures to explain the crossmodal aftereffects of PA. This review opens promising new avenues of research on the aftereffects of PA in audition and their implications for the treatment of auditory disorders.


Subject(s)
Adaptation, Physiological, Auditory Perception, Humans, Auditory Perception/physiology, Adaptation, Physiological/physiology, Visual Perception/physiology, Attention/physiology, Figural Aftereffect/physiology
18.
Curr Biol ; 34(16): 3616-3631.e5, 2024 Aug 19.
Article in English | MEDLINE | ID: mdl-39019036

ABSTRACT

Effective detection and avoidance of environmental threats are crucial for animals' survival. Integrating threat-related sensory cues across different modalities can significantly enhance animals' detection and behavioral responses. However, the neural circuit-level mechanisms by which simultaneous multimodal sensory inputs modulate defensive behavior or fear responses remain poorly understood. Here, we report in mice that bimodal looming stimuli combining coherent visual and auditory signals elicit more robust defensive/fear reactions than unimodal stimuli, including intensified escape and prolonged hiding, suggesting a heightened defensive/fear state. These responses depend on the activity of the superior colliculus (SC), while its downstream nucleus, the parabigeminal nucleus (PBG), predominantly influences the duration of hiding behavior. The PBG temporally integrates visual and auditory signals and enhances the salience of threat signals by amplifying SC sensory responses through its feedback projection to the visual layer of the SC. Our results suggest an evolutionarily conserved pathway in defense circuits for multisensory integration and cross-modality enhancement.


Subject(s)
Fear, Superior Colliculi, Animals, Superior Colliculi/physiology, Mice, Fear/physiology, Male, Mice, Inbred C57BL, Visual Perception/physiology, Auditory Perception/physiology, Escape Reaction/physiology, Acoustic Stimulation, Photic Stimulation, Female
19.
bioRxiv ; 2024 Sep 14.
Article in English | MEDLINE | ID: mdl-39071445

ABSTRACT

In a real-world environment, the brain must integrate information from multiple sensory modalities, including the auditory and olfactory systems. However, little is known about the neuronal circuits governing how odors influence and modulate sound processing. Here, we investigated the mechanisms underlying auditory-olfactory integration using anatomical, electrophysiological, and optogenetic approaches, focusing on the auditory cortex as a key locus for cross-modal integration. First, retrograde and anterograde viral tracing strategies revealed a direct projection from the piriform cortex to the auditory cortex. Next, using in vivo electrophysiological recordings of neuronal activity in the auditory cortex of awake male and female mice, we found that odors modulate auditory cortical responses to sound. Finally, we used in vivo optogenetic manipulations during electrophysiology to demonstrate that olfactory modulation in the auditory cortex (specifically, odor-driven enhancement of sound responses) depends on direct input from the piriform cortex. Together, our results identify a novel role of piriform-to-auditory cortical circuitry in shaping olfactory modulation in the auditory cortex, shedding new light on the neuronal mechanisms underlying auditory-olfactory integration.

20.
Hum Brain Mapp ; 45(10): e26772, 2024 Jul 15.
Article in English | MEDLINE | ID: mdl-38962966

ABSTRACT

Humans naturally integrate signals from the olfactory and intranasal trigeminal systems. A tight interplay has been demonstrated between these two systems, and yet the neural circuitry mediating olfactory-trigeminal (OT) integration remains poorly understood. Using functional magnetic resonance imaging (fMRI) combined with psychophysics, this study investigated the neural mechanisms underlying OT integration. Fifteen participants with normal olfactory function performed a localization task with air-puff stimuli, phenylethyl alcohol (PEA; rose odor), or a combination thereof while being scanned. The ability to localize PEA to either nostril was at chance. Yet, its presence significantly improved the localization accuracy of weak, but not strong, air-puffs when both stimuli were delivered concurrently to the same nostril, but not when different nostrils received the two stimuli. This enhancement in localization accuracy, exemplifying the principles of spatial coincidence and inverse effectiveness in multisensory integration, was associated with multisensory integrative activity in the primary olfactory (POC), orbitofrontal (OFC), superior temporal (STC), inferior parietal (IPC), and cingulate cortices, and in the cerebellum. Multisensory enhancement in most of these regions correlated with behavioral multisensory enhancement, as did increases in connectivity between some of these regions. We interpret these findings as indicating that the POC is part of a distributed brain network mediating integration between the olfactory and trigeminal systems.

PRACTITIONER POINTS: Psychophysical and neuroimaging study of olfactory-trigeminal (OT) integration. Behavior, cortical activity, and network connectivity show OT integration. OT integration obeys the principles of inverse effectiveness and spatial coincidence. Behavioral and neural measures of OT integration are correlated.


Subject(s)
Brain Mapping, Magnetic Resonance Imaging, Olfactory Cortex, Humans, Male, Female, Adult, Olfactory Cortex/physiology, Olfactory Cortex/diagnostic imaging, Young Adult, Olfactory Perception/physiology, Phenylethyl Alcohol, Psychophysics, Trigeminal Nerve/physiology, Trigeminal Nerve/diagnostic imaging, Odorants