1.
Curr Biol ; 34(10): R494-R496, 2024 05 20.
Article in English | MEDLINE | ID: mdl-38772335

ABSTRACT

Humans show perceptual biases that suggest distorted internal representations of their own body. New research reveals that these perceptual biases can reflect the integration of prior assumptions about body posture rather than a misshapen representation of the body's geometry.


Subject(s)
Body Image , Humans , Body Image/psychology , Posture
2.
Psychon Bull Rev ; 2024 Feb 22.
Article in English | MEDLINE | ID: mdl-38388825

ABSTRACT

The ability to judge the temporal alignment of visual and auditory information is a prerequisite for multisensory integration and segregation. However, each temporal measurement is subject to error. Thus, when judging whether a visual and an auditory stimulus were presented simultaneously, observers must rely on a subjective decision boundary to distinguish between measurement error and truly misaligned audiovisual signals. Here, we tested whether these decision boundaries are relaxed with increasing temporal sensory uncertainty, i.e., whether participants make the same type of adjustment an ideal observer would make. Participants judged the simultaneity of audiovisual stimulus pairs with varying temporal offset while being immersed in different virtual environments. To obtain estimates of participants' temporal sensory uncertainty and simultaneity criteria in each environment, an independent-channels model was fitted to their simultaneity judgments. In two experiments, participants' simultaneity decision boundaries were predicted by their temporal uncertainty, which varied with the environment. Hence, observers used a flexibly updated estimate of their own audiovisual temporal uncertainty to establish subjective criteria of simultaneity. This finding implies that, under typical circumstances, audiovisual simultaneity windows reflect an observer's cross-modal temporal uncertainty.
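
For readers unfamiliar with the independent-channels model referenced in this abstract, the sketch below illustrates one standard formulation (a minimal sketch, not necessarily the exact variant fitted in the study): the internal audiovisual asynchrony measurement is Gaussian with bias mu and spread sigma, "simultaneous" is reported whenever the measurement falls between two decision criteria, and the parameters are estimated by maximum likelihood. The SOAs, response counts, and starting values are illustrative.

import numpy as np
from scipy.stats import norm, binom
from scipy.optimize import minimize

def p_simultaneous(soa, mu, sigma, c_lo, c_hi):
    """Probability of a 'simultaneous' response at a given SOA (ms).

    The internal asynchrony measurement is Gaussian with mean soa + mu and SD sigma;
    'simultaneous' is reported when it falls between the criteria c_lo and c_hi.
    """
    return norm.cdf(c_hi, loc=soa + mu, scale=sigma) - norm.cdf(c_lo, loc=soa + mu, scale=sigma)

def neg_log_likelihood(params, soas, n_simultaneous, n_trials):
    mu, log_sigma, c_lo, c_hi = params
    p = np.clip(p_simultaneous(soas, mu, np.exp(log_sigma), c_lo, c_hi), 1e-6, 1 - 1e-6)
    return -np.sum(binom.logpmf(n_simultaneous, n_trials, p))

# Illustrative synthetic data: SOAs in ms, 'simultaneous' counts out of 20 trials each.
soas = np.array([-400, -300, -200, -100, 0, 100, 200, 300, 400])
n_sim = np.array([1, 3, 9, 16, 19, 18, 12, 5, 2])
n_trials = np.full_like(n_sim, 20)

fit = minimize(neg_log_likelihood, x0=[0.0, np.log(80.0), -150.0, 150.0],
               args=(soas, n_sim, n_trials), method="Nelder-Mead")
mu_hat, sigma_hat, c_lo_hat, c_hi_hat = fit.x[0], np.exp(fit.x[1]), fit.x[2], fit.x[3]
print(f"bias={mu_hat:.1f} ms, temporal uncertainty={sigma_hat:.1f} ms, "
      f"criteria=[{c_lo_hat:.1f}, {c_hi_hat:.1f}] ms")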

3.
Philos Trans R Soc Lond B Biol Sci ; 378(1886): 20220345, 2023 09 25.
Article in English | MEDLINE | ID: mdl-37545302

ABSTRACT

Multisensory integration depends on causal inference about the sensory signals. We tested whether implicit causal-inference judgements pertain to entire objects or focus on task-relevant object features. Participants in our study judged virtual visual, haptic and visual-haptic surfaces with respect to two features, slant and roughness, against an internal standard in a two-alternative forced-choice task. Modelling of participants' responses revealed that the degree to which their perceptual judgements were based on integrated visual-haptic information varied independently across features. For example, a perceived mismatch between visual and haptic roughness would not deter the observer from integrating visual and haptic slant. These results indicate that participants based their perceptual judgements on a feature-specific selection of information, suggesting that multisensory causal inference proceeds not at the object level but at the level of single object features. This article is part of the theme issue 'Decision and control processes in multisensory perception'.


Subject(s)
Touch Perception , Humans , Touch Perception/physiology , Visual Perception/physiology , Judgment
4.
Cognition ; 238: 105528, 2023 09.
Article in English | MEDLINE | ID: mdl-37354787

ABSTRACT

Combining information from multiple senses enhances our perception of the world. Whether we need to be aware of all stimuli to benefit from multisensory integration, however, is still under investigation. Here, we tested whether tactile frequency perception benefits from the presence of congruent visual flicker even if the flicker is so rapid that it is perceptually fused into a steady light and therefore invisible. Our participants completed a tactile frequency discrimination task given either unisensory tactile or congruent tactile-visual stimulation. Tactile and tactile-visual test frequencies ranged from far below to far above participants' flicker fusion threshold (determined separately). For frequencies distinctly below their flicker fusion threshold, participants performed significantly better given tactile-visual stimulation than when presented with only tactile stimuli. Yet, for frequencies above their flicker fusion threshold, participants' tactile frequency perception did not profit from the presence of congruent but likely fused and thus invisible visual flicker. The results matched the predictions of an ideal-observer model in which tactile-visual integration is conditional on awareness of both stimuli. In contrast, it was impossible to reproduce the observed results with a model that assumed tactile-visual integration proceeds irrespective of stimulus awareness. In sum, we revealed that the benefits of congruent visual stimulation for tactile flutter frequency perception depend on the visibility of the visual flicker, suggesting that multisensory integration requires awareness.


Subject(s)
Touch Perception , Touch , Humans , Touch/physiology , Touch Perception/physiology , Photic Stimulation/methods , Visual Perception/physiology
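
A minimal sketch of the kind of awareness-gated ideal-observer account described in this abstract, under assumed parameter values (fusion threshold, noise SDs); it is not the model fitted in the paper. Reliability-weighted integration is applied only when the visual flicker is below the fusion threshold and therefore visible; otherwise only the tactile measurement is used.

import numpy as np

def frequency_estimate(freq, sigma_t, sigma_v, fusion_threshold, rng):
    """Single-trial frequency estimate under an awareness-gated integration scheme.

    Below the flicker-fusion threshold the visual flicker is visible, and the tactile
    and visual measurements are combined with reliability weights; above it, the
    (fused, invisible) flicker is ignored and only touch is used. All parameter
    values here are illustrative assumptions.
    """
    m_t = rng.normal(freq, sigma_t)          # noisy tactile measurement
    m_v = rng.normal(freq, sigma_v)          # noisy visual measurement
    if freq < fusion_threshold:              # flicker visible -> integrate
        w_t = sigma_v**2 / (sigma_t**2 + sigma_v**2)
        return w_t * m_t + (1 - w_t) * m_v
    return m_t                               # flicker fused/invisible -> touch only

rng = np.random.default_rng(0)
for f in (10, 25, 60):                       # Hz; fusion threshold assumed at ~40 Hz
    estimates = [frequency_estimate(f, sigma_t=4.0, sigma_v=3.0,
                                    fusion_threshold=40.0, rng=rng) for _ in range(5000)]
    print(f"{f:>3} Hz: estimate SD = {np.std(estimates):.2f}")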
5.
Proc Natl Acad Sci U S A ; 120(15): e2209680120, 2023 04 11.
Article in English | MEDLINE | ID: mdl-37014855

ABSTRACT

Our skin is a two-dimensional sheet that can be folded into a multitude of configurations due to the mobility of our body parts. Parts of the human tactile system might account for this flexibility by being tuned to locations in the world rather than on the skin. Using adaptation, we scrutinized the spatial selectivity of two tactile perceptual mechanisms for which the visual equivalents have been reported to be selective in world coordinates: tactile motion and the duration of tactile events. Participants' hand position (uncrossed or crossed) as well as the stimulated hand varied independently across adaptation and test phases. This design not only distinguished between somatotopic selectivity for locations on the skin and spatiotopic selectivity for locations in the environment, but also tested spatial selectivity that fits neither of these classical reference frames and is instead based on the default position of the hands. For both features, adaptation consistently affected subsequent tactile perception at the adapted hand, reflecting skin-bound spatial selectivity. Yet, tactile motion and temporal adaptation also transferred across hands, but only if the hands were crossed during the adaptation phase, that is, when one hand was placed at the other hand's typical location. Thus, selectivity for locations in the world was based on default rather than online sensory information about the location of the hands. These results challenge the prevalent dichotomy of somatotopic and spatiotopic selectivity and suggest that prior information about the hands' default position (right hand at the right side) is embedded deep in the tactile sensory system.


Subject(s)
Space Perception , Touch Perception , Humans , Hand , Touch , Posture
6.
Eur J Radiol ; 160: 110708, 2023 Mar.
Article in English | MEDLINE | ID: mdl-36724687

ABSTRACT

PURPOSE: Hepatic steatosis is often diagnosed non-invasively. Various measures and accompanying diagnostic thresholds based on contrast-enhanced CT and virtual non-contrast images have been proposed. We compare these established criteria to novel, fully automated measures. METHOD: CT data sets of 197 patients were analyzed. Regions of interest (ROIs) were manually drawn for the liver, spleen, portal vein, and aorta to calculate four established measures of liver fat. Two novel measures, capturing the deviation between the empirical distributions of HU measurements across all voxels within the liver and the spleen, were calculated. These measures were computed both with manual ROIs and with fully automated organ segmentations. Agreement between the different measures was evaluated using correlational analyses, as was their ability to discriminate between fatty and healthy livers. RESULTS: Established and novel measures of fatty liver showed a high level of agreement. Novel methods were statistically indistinguishable from the established ones when taking established diagnostic thresholds or physicians' diagnoses as ground truth, and this high performance level persisted for automatically selected ROIs. CONCLUSION: Automatically generated organ segmentations led to results comparable to those of manual ROIs, suggesting that automated methods can prove a valuable tool for incidental diagnosis. Differences in the distribution of HU measurements across voxels between liver and spleen can serve as surrogate markers for liver fat content. In a population with a low incidence of fatty liver, the novel measures exhibited no measurable disadvantage compared with established methods based on simpler statistics such as across-voxel averages.


Subject(s)
Fatty Liver , Humans , Fatty Liver/diagnostic imaging , Tomography, X-Ray Computed/methods , Portal Vein , Computers
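
To make the distinction between mean-based and distribution-based liver-fat measures concrete, the sketch below contrasts a classical liver-minus-spleen mean-attenuation difference with a two-sample Kolmogorov-Smirnov statistic over voxel-wise HU distributions. The KS statistic and the synthetic voxel values are illustrative stand-ins, not necessarily the measures or data used in the study.

import numpy as np
from scipy.stats import ks_2samp

def liver_fat_measures(liver_hu, spleen_hu):
    """Two surrogate markers of hepatic steatosis from ROI voxel values (HU).

    - 'mean_diff': classical liver-minus-spleen difference of mean attenuation;
      clearly negative values on contrast-enhanced CT suggest steatosis.
    - 'ks_stat': deviation between the empirical HU distributions of liver and
      spleen, quantified here with a two-sample Kolmogorov-Smirnov statistic
      (an illustrative distributional measure, not the study's definition).
    """
    return {
        "mean_diff": float(np.mean(liver_hu) - np.mean(spleen_hu)),
        "ks_stat": float(ks_2samp(liver_hu, spleen_hu).statistic),
    }

# Synthetic voxel samples standing in for segmented organ ROIs.
rng = np.random.default_rng(1)
healthy_liver = rng.normal(105, 12, 20000)   # HU, contrast-enhanced
fatty_liver   = rng.normal(70, 15, 20000)
spleen        = rng.normal(100, 10, 20000)

print("healthy:", liver_fat_measures(healthy_liver, spleen))
print("fatty:  ", liver_fat_measures(fatty_liver, spleen))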
7.
Sci Rep ; 12(1): 15532, 2022 09 15.
Article in English | MEDLINE | ID: mdl-36109544

ABSTRACT

To estimate an environmental property such as object location from multiple sensory signals, the brain must infer their causal relationship. Only information originating from the same source should be integrated. This inference relies on the characteristics of the measurements (the information the sensory modalities provide on a given trial) as well as on a cross-modal common-cause prior: accumulated knowledge about the probability that cross-modal measurements originate from the same source. We examined the plasticity of this cross-modal common-cause prior. In a learning phase, participants were exposed to a series of audiovisual stimuli that were either consistently spatiotemporally congruent or consistently incongruent; participants' audiovisual spatial integration was measured before and after this exposure. We fitted several Bayesian causal-inference models to the data; the models differed in the plasticity of the common-cause prior. Model comparison revealed that, for the majority of the participants, the common-cause prior changed during the learning phase. Our findings reveal that short periods of exposure to audiovisual stimuli with a consistent causal relationship can modify the common-cause prior. In accordance with previous studies, both exposure conditions could either strengthen or weaken the common-cause prior at the participant level. Simulations imply that the direction of the prior update might be mediated by the degree of sensory noise (the trial-to-trial variability of the measurements of the same signal) during the learning phase.


Subject(s)
Learning , Visual Perception , Bayes Theorem , Humans , Noise , Probability
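
The role of the common-cause prior can be illustrated with the closed-form Gaussian causal-inference model of Körding et al. (2007); this is a generic sketch of how the prior trades off against the measured audiovisual discrepancy, not necessarily the exact model variant compared in this study, and the noise and prior parameters are assumptions.

import numpy as np

def posterior_common_cause(x_v, x_a, sigma_v, sigma_a, sigma_p, p_common):
    """Posterior probability that visual and auditory measurements share one source.

    Gaussian likelihoods and a zero-mean Gaussian spatial prior (SD sigma_p) give
    closed-form marginal likelihoods for one common source (C=1) versus two
    independent sources (C=2); p_common is the common-cause prior.
    """
    var_v, var_a, var_p = sigma_v**2, sigma_a**2, sigma_p**2
    # p(x_v, x_a | C = 1): both measurements generated by a single source.
    denom1 = var_v * var_a + var_v * var_p + var_a * var_p
    like_c1 = np.exp(-0.5 * ((x_v - x_a)**2 * var_p + x_v**2 * var_a + x_a**2 * var_v) / denom1) \
              / (2 * np.pi * np.sqrt(denom1))
    # p(x_v, x_a | C = 2): each measurement generated by its own source.
    like_c2 = np.exp(-0.5 * (x_v**2 / (var_v + var_p) + x_a**2 / (var_a + var_p))) \
              / (2 * np.pi * np.sqrt((var_v + var_p) * (var_a + var_p)))
    return like_c1 * p_common / (like_c1 * p_common + like_c2 * (1 - p_common))

# A stronger common-cause prior tolerates a larger audiovisual discrepancy.
for prior in (0.3, 0.5, 0.8):
    p = posterior_common_cause(x_v=10.0, x_a=-5.0, sigma_v=2.0, sigma_a=8.0,
                               sigma_p=15.0, p_common=prior)
    print(f"p_common prior = {prior:.1f} -> posterior p(C=1 | x) = {p:.2f}")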
8.
PLoS Comput Biol ; 17(11): e1008877, 2021 11.
Article in English | MEDLINE | ID: mdl-34780469

ABSTRACT

To obtain a coherent perception of the world, our senses need to be in alignment. When we encounter misaligned cues from two sensory modalities, the brain must infer which cue is faulty and recalibrate the corresponding sense. We examined whether and how the brain uses cue reliability to identify the miscalibrated sense by measuring the audiovisual ventriloquism aftereffect for stimuli of varying visual reliability. To adjust for modality-specific biases, visual stimulus locations were chosen based on perceived alignment with auditory stimulus locations for each participant. During an audiovisual recalibration phase, participants were presented with bimodal stimuli with a fixed perceptual spatial discrepancy; they localized one modality, cued after stimulus presentation. Unimodal auditory and visual localization was measured before and after the audiovisual recalibration phase. We compared participants' behavior to the predictions of three models of recalibration: (a) Reliability-based: each modality is recalibrated based on its relative reliability, with less reliable cues recalibrated more; (b) Fixed-ratio: the degree of recalibration for each modality is fixed; (c) Causal-inference: recalibration is directly determined by the discrepancy between a cue and its estimate, which in turn depends on the reliability of both cues and on the inference about how likely it is that the two cues derive from a common source. Vision was hardly recalibrated by audition. Auditory recalibration by vision changed idiosyncratically as visual reliability decreased: the extent of auditory recalibration either decreased monotonically, peaked at medium visual reliability, or increased monotonically. The latter two patterns cannot be explained by either the reliability-based or fixed-ratio models. Only the causal-inference model of recalibration captures the idiosyncratic influences of cue reliability on recalibration. We conclude that cue reliability, causal inference, and modality-specific biases guide cross-modal recalibration indirectly by determining the perception of audiovisual stimuli.


Subject(s)
Auditory Perception/physiology , Spatial Processing/physiology , Visual Perception/physiology , Acoustic Stimulation , Adult , Attentional Bias/physiology , Brain/physiology , Causality , Computational Biology , Cues , Female , Humans , Male , Models, Neurological , Models, Psychological , Photic Stimulation , Reproducibility of Results , Sound Localization/physiology , Space Perception/physiology , Young Adult
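
The diverging predictions of the reliability-based and fixed-ratio accounts can be written in a few lines (a simplified sketch with assumed parameters; the causal-inference model needs the full machinery sketched under entry 7 and is omitted here):

import numpy as np

def reliability_based_shifts(discrepancy, sigma_v, sigma_a):
    """Split a cross-modal discrepancy according to relative reliability (1/variance):
    the less reliable modality is recalibrated more."""
    r_v, r_a = 1.0 / sigma_v**2, 1.0 / sigma_a**2
    shift_a = discrepancy * r_v / (r_v + r_a)   # auditory estimate pulled toward vision
    shift_v = -discrepancy * r_a / (r_v + r_a)  # visual estimate pulled toward audition
    return shift_a, shift_v

def fixed_ratio_shifts(discrepancy, ratio_a=0.9):
    """Recalibrate each modality by a fixed fraction of the discrepancy,
    independent of trial-by-trial reliability."""
    return discrepancy * ratio_a, -discrepancy * (1.0 - ratio_a)

for sigma_v in (1.0, 4.0, 12.0):   # degrading visual reliability
    rel = reliability_based_shifts(discrepancy=10.0, sigma_v=sigma_v, sigma_a=6.0)
    fix = fixed_ratio_shifts(discrepancy=10.0)
    print(f"sigma_v={sigma_v:>4}: reliability-based shift_a={rel[0]:5.2f}, "
          f"fixed-ratio shift_a={fix[0]:5.2f}")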
9.
Nat Commun ; 12(1): 5987, 2021 10 13.
Article in English | MEDLINE | ID: mdl-34645793

ABSTRACT

Following prolonged exposure to hypoxic conditions, for example, due to ascent to high altitude, stroke, or traumatic brain injury, cerebral edema can develop. The exact nature and genesis of hypoxia-induced edema in healthy individuals remain unresolved. We examined the effects of prolonged, normobaric hypoxia, induced by 16 h of exposure to simulated high altitude, on healthy brains using proton, dynamic contrast-enhanced, and sodium MRI. This combined approach allowed us to directly measure key factors in the development of hypoxia-induced brain edema: (1) sodium signals as a surrogate of the distribution of electrolytes within the cerebral tissue and (2) Ktrans as a marker of blood-brain barrier integrity. The measurements point toward an accumulation of sodium ions in extra- but not in intracellular space in combination with an intact endothelium. Both findings in combination are indicative of ionic extracellular edema, a subtype of cerebral edema that was only recently specified as an intermediate, yet distinct stage between cytotoxic and vasogenic edema. In sum, this combination of imaging techniques demonstrates the development of ionic edema following prolonged normobaric hypoxia, in agreement with cascade models of edema formation.


Subject(s)
Altitude Sickness/pathology , Brain Edema/pathology , Brain/pathology , Hypoxia/pathology , Adult , Altitude Sickness/diagnostic imaging , Altitude Sickness/metabolism , Blood-Brain Barrier/metabolism , Brain/diagnostic imaging , Brain/metabolism , Brain Edema/diagnostic imaging , Brain Edema/metabolism , Cohort Studies , Female , Humans , Hypoxia/diagnostic imaging , Hypoxia/metabolism , Magnetic Resonance Imaging , Male , Organ Size , Sodium/metabolism
11.
Elife ; 9, 2020 08 25.
Article in English | MEDLINE | ID: mdl-32840213

ABSTRACT

Typical human perception features stable biases such as perceiving visual events as later than synchronous auditory events. The origin of such perceptual biases is unknown. To investigate the role of early sensory experience, we tested whether a congenital, transient loss of pattern vision, caused by bilateral dense cataracts, has sustained effects on audio-visual and tactile-visual temporal biases and resolution. Participants judged the temporal order of successively presented, spatially separated events within and across modalities. Individuals with reversed congenital cataracts showed a bias towards perceiving visual stimuli as occurring earlier than auditory (Expt. 1) and tactile (Expt. 2) stimuli. This finding stood in stark contrast to normally sighted controls and sight-recovery individuals who had developed cataracts later in childhood: both groups exhibited the typical bias of perceiving vision as delayed compared to audition. These findings provide strong evidence that cross-modal temporal biases depend on sensory experience during an early sensitive period.


Subject(s)
Auditory Perception , Touch Perception , Vision, Ocular , Visual Perception , Adolescent , Adult , Child , Child, Preschool , Female , Humans , Infant , Male , Young Adult
12.
Nat Commun ; 11(1): 3341, 2020 07 03.
Article in English | MEDLINE | ID: mdl-32620746

ABSTRACT

The oculomotor system keeps the eyes steady in expectation of visual events. Here, recording microsaccades while people performed a tactile frequency-discrimination task enabled us to test whether the oculomotor system shows an analogous preparatory response for unrelated tactile events. We manipulated the temporal predictability of tactile targets using tactile cues, which preceded the target by either constant (high predictability) or variable (low predictability) time intervals. We find that microsaccades are inhibited prior to tactile targets, and more so for constant than for variable intervals, revealing a tight crossmodal link between tactile temporal expectation and oculomotor action. These findings portray oculomotor freezing as a marker of crossmodal temporal expectation. Moreover, microsaccades occurring around the tactile target presentation are associated with reduced task performance, suggesting that oculomotor freezing mitigates potentially detrimental, concomitant effects of microsaccades, and revealing a crossmodal coupling between tactile perception and oculomotor action.


Subject(s)
Eye Movements/physiology , Psychomotor Performance/physiology , Time Perception/physiology , Touch Perception/physiology , Touch/physiology , Visual Perception/physiology , Adult , Cues , Female , Humans , Male , Oculomotor Muscles/innervation , Oculomotor Muscles/physiology , Photic Stimulation , Reaction Time/physiology , Saccades/physiology , Young Adult
13.
Cognition ; 197: 104170, 2020 04.
Article in English | MEDLINE | ID: mdl-32036027

ABSTRACT

At any moment in time, streams of information reach the brain through the different senses. Given this wealth of noisy information, it is essential that we select information of relevance - a function fulfilled by attention - and infer its causal structure to eventually take advantage of redundancies across the senses. Yet, the role of selective attention during causal inference in cross-modal perception is unknown. We tested experimentally whether the distribution of attention across vision and touch enhances cross-modal spatial integration (visual-tactile ventriloquism effect, Expt. 1) and recalibration (visual-tactile ventriloquism aftereffect, Expt. 2) compared to modality-specific attention, and then used causal-inference modeling to isolate the mechanisms behind the attentional modulation. In both experiments, we found stronger effects of vision on touch under distributed than under modality-specific attention. Model comparison confirmed that participants used Bayes-optimal causal inference to localize visual and tactile stimuli presented as part of a visual-tactile stimulus pair, whereas simultaneously collected unity judgments - indicating whether the visual-tactile pair was perceived as spatially-aligned - relied on a sub-optimal heuristic. The best-fitting model revealed that attention modulated sensory and cognitive components of causal inference. First, distributed attention led to an increase of sensory noise compared to selective attention toward one modality. Second, attending to both modalities strengthened the stimulus-independent expectation that the two signals belong together, the prior probability of a common source for vision and touch. Yet, only the increase in the expectation of vision and touch sharing a common source was able to explain the observed enhancement of visual-tactile integration and recalibration effects with distributed attention. In contrast, the change in sensory noise explained only a fraction of the observed enhancements, as its consequences vary with the overall level of noise and stimulus congruency. Increased sensory noise leads to enhanced integration effects for visual-tactile pairs with a large spatial discrepancy, but reduced integration effects for stimuli with a small or no cross-modal discrepancy. In sum, our study indicates a weak a priori association between visual and tactile spatial signals that can be strengthened by distributing attention across both modalities.


Subject(s)
Motivation , Touch , Attention , Bayes Theorem , Humans , Photic Stimulation , Visual Perception
14.
Autism Res ; 12(12): 1745-1757, 2019 12.
Article in English | MEDLINE | ID: mdl-31507084

ABSTRACT

Children with autism spectrum disorders (ASDs) often exhibit altered representations of the external world. Consistently, when localizing touch, children with ASDs were less influenced than their peers by changes of the stimulated limb's location in external space [Wada et al., Scientific Reports 2015, 4(1), 5985]. However, given the protracted development of an external-spatial dominance in tactile processing in typically developing children, this difference might reflect a developmental delay rather than a fixed suppression of external space in ASDs. Here, adults with ASDs and matched control participants completed (a) the tactile temporal order judgment (TOJ) task previously used to test external-spatial representation of touch in children with ASDs and (b) a tactile-visual cross-modal congruency (CC) task which assesses benefits of task-irrelevant visual stimuli on tactile localization in external space. In both experiments, participants localized tactile stimuli to the fingers of each hand, while holding their hands either crossed or uncrossed. Performance differences between hand postures reflect the influence of external-spatial codes. In both groups, tactile TOJ performance markedly decreased when participants crossed their hands, and CC effects were especially large if the visual stimulus was presented at the same side of external space as the task-relevant touch. The absence of group differences was statistically confirmed using Bayesian statistical modeling: adults with ASDs weighted external-spatial codes comparably to typically developed adults during tactile and visual-tactile spatio-temporal tasks. Thus, atypicalities in the spatial coding of touch for children with ASDs appear to reflect a developmental delay rather than a stable characteristic of ASD. Autism Res 2019, 12: 1745-1757. © 2019 International Society for Autism Research, Wiley Periodicals, Inc. LAY SUMMARY: A touched limb's location can be described twofold, with respect to the body (right hand) or the external world (right side). Children and adolescents with autism spectrum disorder (ASD) reportedly rely less than their peers on the external world. Here, adults with and without ASDs completed two tactile localization tasks. Both groups relied to the same degree on external world locations. This opens the possibility that the tendency to relate touch to the external world is typical in individuals with ASDs but emerges with a delay.


Subject(s)
Autism Spectrum Disorder/physiopathology , Photic Stimulation/methods , Proprioception/physiology , Spatial Processing/physiology , Touch Perception/physiology , Adult , Bayes Theorem , Female , Humans , Judgment/physiology , Male
15.
Curr Biol ; 29(9): 1491-1497.e4, 2019 05 06.
Article in English | MEDLINE | ID: mdl-30955931

ABSTRACT

Where we perceive a touch putatively depends on topographic maps that code the touch's location on the skin [1] as well as its position in external space [2-5]. However, neither somatotopic nor external-spatial representations can account for atypical tactile percepts in some neurological patients and amputees; referral of touch to an absent or anaesthetized hand after stimulation of a foot [6, 7] or the contralateral hand [8-10] challenges the role of topographic representations when attributing touch to the limbs. Here, we show that even healthy adults systematically misattribute touch to other limbs. Participants received two tactile stimuli, each to a different limb (hand or foot), and reported which of all four limbs had been stimulated first. Hands and feet were either uncrossed or crossed to dissociate body-based and external-spatial representations [11-14]. Remarkably, participants regularly attributed the first touch to a limb that had received neither of the two stimuli. The erroneously reported, non-stimulated limb typically matched the correct limb with respect to limb type or body side. Touch was misattributed to non-stimulated limbs of the other limb type and body side only if they were placed at the correct limb's canonical (default) side of space. The touch's actual location in external space was irrelevant. These errors replicated across several contexts, and modeling linked them to incoming sensory evidence rather than to decision strategies. The results highlight the importance of the touched body part's identity and canonical location but challenge the role of external-spatial tactile representations when attributing touch to a limb.


Subject(s)
Foot/physiology , Hand/physiology , Touch Perception/physiology , Adult , Female , Humans , Male , Young Adult
16.
Atten Percept Psychophys ; 81(5): 1715-1724, 2019 Jul.
Article in English | MEDLINE | ID: mdl-30815794

ABSTRACT

There is an ongoing debate about whether multisensory interactions require awareness of the sensory signals. Static visual and tactile stimuli have been shown to influence each other even in the absence of visual awareness. However, it is unclear whether this finding generalizes to dynamic contexts. In the present study, we presented visual and tactile motion stimuli and induced fluctuations of visual awareness by means of binocular rivalry: two gratings that drifted in opposite directions were displayed, one to each eye. One visual motion stimulus dominated and reached awareness while the other visual stimulus was suppressed from awareness. Tactile motion stimuli were presented at random time points during the visual stimulation. The motion direction of a tactile stimulus always matched the direction of one of the concurrently presented visual stimuli. The visual gratings were differently tinted, and participants reported the color of the currently seen stimulus. Tactile motion delayed perceptual switches that ended dominance periods of congruently moving visual stimuli compared to switches during visual-only stimulation. In addition, tactile motion fostered the return to dominance of suppressed, congruently moving visual stimuli, but only if the tactile motion started at a late stage of the ongoing visual suppression period, when perceptual suppression is typically already waning. These results suggest that visual awareness facilitates but does not gate multisensory interactions between visual and tactile motion signals.


Subject(s)
Motion Perception/physiology , Touch Perception/physiology , Vision, Binocular/physiology , Visual Perception/physiology , Awareness/physiology , Bias , Female , Humans , Male , Photic Stimulation/methods , Young Adult
17.
Elife ; 8, 2019 02 08.
Article in English | MEDLINE | ID: mdl-30735123

ABSTRACT

The ability to respond quickly to a threat is a key skill for survival. When perceived with awareness, threat-related emotional information, such as an angry or fearful face, not only confers perceptual advantages but also guides rapid actions such as eye movements. Emotional information that is suppressed from awareness still confers perceptual and attentional benefits. However, it is unknown whether suppressed emotional information can directly guide actions, or whether emotional information has to enter awareness to do so. We suppressed emotional faces from awareness using continuous flash suppression and tracked eye gaze position. Under successful suppression, as indicated by objective and subjective measures, gaze moved towards fearful faces, but away from angry faces. Our findings reveal that: (1) threat-related emotional stimuli can guide eye movements in the absence of visual awareness; (2) threat-related emotional face information guides distinct oculomotor actions depending on the type of threat conveyed by the emotional expression.


Subject(s)
Attention/physiology , Awareness/physiology , Emotions/physiology , Eye Movements/physiology , Adult , Face/physiology , Fear/physiology , Female , Humans , Male
18.
Atten Percept Psychophys ; 80(3): 773-783, 2018 Apr.
Article in English | MEDLINE | ID: mdl-29282652

ABSTRACT

It has been suggested that judgments about the temporal-spatial order of successive tactile stimuli depend on the perceived direction of apparent motion between them. Here we manipulated tactile apparent-motion percepts by presenting a brief, task-irrelevant auditory stimulus temporally in between pairs of tactile stimuli. The tactile stimuli were applied one to each hand, with varying stimulus onset asynchronies (SOAs). Participants reported the location of the first stimulus (temporal order judgments: TOJs) while adopting both crossed and uncrossed hand postures, so that we could scrutinize anatomical (skin-based) and external reference frames. With crossed hands, the sound improved TOJ performance at short (≤300 ms) and at long (>300 ms) SOAs. When the hands were uncrossed, the sound induced a decrease in TOJ performance, but only at short SOAs. A second experiment confirmed that the auditory stimulus indeed modulated tactile apparent motion perception under these conditions. Perceived apparent motion directions were more ambiguous with crossed than with uncrossed hands, probably indicating competing spatial codes in the crossed posture. However, irrespective of posture, the additional sound tended to impair potentially anatomically coded motion direction discrimination at a short SOA of 80 ms, but it significantly enhanced externally coded apparent motion perception at a long SOA of 500 ms. Anatomically coded motion signals imply incorrect TOJ responses with crossed hands, but correct responses when the hands are uncrossed; externally coded motion signals always point toward the correct TOJ response. Taken together, these results suggest that apparent-motion signals are likely taken into account when tactile temporal-spatial information is reconstructed.


Subject(s)
Hand , Judgment/physiology , Motion Perception/physiology , Posture , Touch Perception/physiology , Adult , Female , Humans , Male , Sound , Task Performance and Analysis , Young Adult
19.
PLoS One ; 12(12): e0189067, 2017.
Article in English | MEDLINE | ID: mdl-29228023

ABSTRACT

Task demands modulate tactile localization in sighted humans, presumably through weight adjustments in the spatial integration of anatomical, skin-based, and external, posture-based information. In contrast, previous studies have suggested that congenitally blind humans, by default, refrain from automatic spatial integration and localize touch using only skin-based information. Here, sighted and congenitally blind participants localized tactile targets on the palm or back of one hand, while ignoring simultaneous tactile distractors at congruent or incongruent locations on the other hand. We probed the interplay of anatomical and external location codes for spatial congruency effects by varying hand posture: the palms either both faced down, or one faced down and one up. In the latter posture, externally congruent target and distractor locations were anatomically incongruent and vice versa. Target locations had to be reported either anatomically ("palm" or "back" of the hand), or externally ("up" or "down" in space). Under anatomical instructions, performance was more accurate for anatomically congruent than incongruent target-distractor pairs. In contrast, under external instructions, performance was more accurate for externally congruent than incongruent pairs. These modulations were evident in sighted and blind individuals. Notably, distractor effects were overall far smaller in blind than in sighted participants, despite comparable target-distractor identification performance. Thus, the absence of developmental vision seems to be associated with an increased ability to focus tactile attention towards a non-spatially defined target. Nevertheless, the fact that blind individuals' congruency effects were modulated by hand posture and task instructions suggests that, like the sighted, they automatically integrate anatomical and external information during tactile localization. Moreover, spatial integration in tactile processing is flexibly adapted by top-down information (here, task instruction), even in the absence of developmental vision.


Subject(s)
Blindness/physiopathology , Task Performance and Analysis , Touch , Adult , Blindness/congenital , Female , Humans , Male , Middle Aged , Young Adult
20.
Sci Rep ; 6: 31269, 2016 08 10.
Article in English | MEDLINE | ID: mdl-27507776

ABSTRACT

The cascade of inflammatory pathogenetic mechanisms in multiple sclerosis (MS) has no specific conventional MRI correlates. Clinicians therefore call for improved imaging specificity to define the pathological substrates of MS in vivo, including the mapping of intracellular sodium accumulation. Based upon preclinical findings and results of previous sodium MRI studies in MS patients, we hypothesized that the fluid-attenuated sodium signal differs between acute and chronic lesions. We acquired brain sodium and proton MRI data from N = 29 MS patients; lesion type was defined by the presence or absence of contrast enhancement. N = 302 MS brain lesions were detected, and generalized linear mixed models were applied to predict lesion type based on sodium signals, thereby controlling for varying numbers of lesions among patients and for confounding variables such as age and medication. Hierarchical model comparisons revealed that both sodium signals, average tissue (χ²(1) = 27.89, p < 0.001) and fluid-attenuated (χ²(1) = 5.76, p = 0.016), improved lesion-type classification. Sodium MRI signals were significantly elevated in acute compared to chronic lesions, compatible with intracellular sodium accumulation in acute MS lesions. If confirmed in further studies, sodium MRI could serve as a biomarker for the diagnostic assessment of MS, and as a readout parameter in clinical trials of treatments aimed at attenuating chronic inflammation.


Subject(s)
Brain/diagnostic imaging , Inflammation/diagnostic imaging , Magnetic Resonance Imaging , Multiple Sclerosis/diagnostic imaging , Sodium/chemistry , Acute Disease , Adolescent , Adult , Chronic Disease , Female , Humans , Linear Models , Male , Multiple Sclerosis/pathology , Young Adult
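
The hierarchical model comparison reported here can be approximated by a likelihood-ratio test between nested logistic models of lesion type with and without a sodium predictor. The sketch below uses fixed-effects logistic regression on synthetic data and omits the per-patient random effects of the generalized linear mixed models actually used in the study; all variable names and values are illustrative.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from scipy.stats import chi2

# Synthetic, illustrative data: one row per lesion.
rng = np.random.default_rng(2)
n = 302
acute = rng.binomial(1, 0.25, n)                 # 1 = contrast-enhancing (acute) lesion
age = rng.normal(38, 10, n)
sodium_tsc = rng.normal(40 + 8 * acute, 6, n)    # average-tissue sodium signal (arbitrary units)
df = pd.DataFrame({"acute": acute, "age": age, "sodium_tsc": sodium_tsc})

# Nested logistic models: covariates only vs. covariates plus sodium signal.
null_model = smf.logit("acute ~ age", data=df).fit(disp=False)
full_model = smf.logit("acute ~ age + sodium_tsc", data=df).fit(disp=False)

# Likelihood-ratio test for the added sodium predictor (1 degree of freedom).
lr_stat = 2 * (full_model.llf - null_model.llf)
p_value = chi2.sf(lr_stat, df=1)
print(f"chi2(1) = {lr_stat:.2f}, p = {p_value:.4f}")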