Results 1 - 20 of 2,985
1.
PLoS One ; 19(9): e0306536, 2024.
Article in English | MEDLINE | ID: mdl-39250483

ABSTRACT

People naturally seek an interpersonal distance that feels comfortable, striking a balance between being too close and too far from others until reaching a state of equilibrium. Previous studies on interpersonal distance preferences among autistic individuals have yielded inconsistent results: some show a preference for greater distance, others indicate a preference for shorter distances, and still others reveal higher variance in preferences among autistic individuals. In a related vein, previous studies have also investigated how accurately autistic individuals judge distance, and these studies have likewise yielded inconsistent results, with some showing superior spatial abilities and others indicating biases in distance estimation. However, the link between distance estimation and distance preference has never been examined. To address this gap, our study measured interpersonal distance preferences and estimations and tested the correlation between the two. The results indicate greater variance among autistic people in both preferred distance and the ability to estimate distance accurately, suggesting that the inconsistencies in previous studies originate from greater individual differences among autistic individuals. Furthermore, only among autistic individuals were interpersonal distance preference and estimation bias associated in a manner that violated equilibrium: an underestimation bias (judging others as closer than they are) was linked to a preference for closer proximity, while an overestimation bias (judging others as further away) was associated with a preference for maintaining a greater distance. This connection suggests that biases in the estimation of interpersonal distance contribute to extreme preferences (being too close or too far away). Taken together, the findings suggest that biases in the estimation of interpersonal distance are associated with socially inappropriate distance preferences among autistic individuals.


Subject(s)
Autistic Disorder , Interpersonal Relations , Humans , Autistic Disorder/psychology , Male , Female , Adult , Young Adult , Distance Perception , Adolescent
2.
PLoS One ; 19(9): e0305661, 2024.
Article in English | MEDLINE | ID: mdl-39321156

ABSTRACT

Although estimating travel distance is essential to our ability to move through the world, our distance estimates can be inaccurate. These odometric errors occur because people tend to perceive that they have moved farther than they actually have. Many of the studies investigating the perception of travel distance have primarily used forward translational movements, and postulate that perceived travel distance results from integration over distance and is independent of travel speed. Speed effects would imply integration over time as well as space. To examine travel distance perception with different directions and speeds, we used virtual reality (VR) to elicit visually induced self-motion. Participants (n = 15) were physically stationary while being visually "moved" through a virtual corridor, either judging distances by stopping at a previously seen target (Move-To-Target task) or adjusting a target to match the movement they had just made (Adjust-Target task). We measured participants' perceived travel distance over a range of speeds (1-5 m/s) and distances in four directions (up, down, forward, backward). We show that the simulated speed and direction of motion differentially affect the gain (perceived travel distance / actual travel distance). For the Adjust-Target task, forward motion was associated with smaller gains than backward, up, or down motion. For the Move-To-Target task, backward motion was associated with smaller gains than forward, up, or down motion. For both tasks, motion at the slower speed was associated with higher gains than motion at the faster speeds. These results show that the transformation of visual motion into travel distance differs depending on the speed and direction of the perceived optic flow. We also found that a common model used to study the perception of travel distance fit the forward direction better than the other directions. This implies that the model should be modified for non-forward motion directions.
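
For reference, the gain measure defined in this abstract can be written out explicitly (the symbols below are ours, not the article's):

    G = \frac{\hat{d}}{d}

where \hat{d} is the perceived (reproduced) travel distance and d the actual simulated travel distance, so the reported gain differences across directions and speeds correspond to differences in this ratio.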


Subject(s)
Distance Perception , Motion Perception , Humans , Male , Female , Distance Perception/physiology , Adult , Motion Perception/physiology , Young Adult , Virtual Reality , Motion , Movement/physiology
3.
Traffic Inj Prev ; 25(7): 919-924, 2024.
Article in English | MEDLINE | ID: mdl-39088758

ABSTRACT

OBJECTIVES: Child pedestrian injuries represent a significant public health challenge. Understanding the complex cognitive skills required to cross streets helps us understand and improve children's behavior in traffic and protect them, as underdeveloped cognitive skills likely compromise children's pedestrian safety. One complex component of street-crossing is the cognitive-perceptual task of judging the time-to-arrival of oncoming traffic. We examined the capacity of 7- and 8-year-olds to judge time-to-arrival for vehicles approaching from varying distances and at varying speeds, as well as improvement in those judgments following intensive street-crossing training in a virtual reality (VR) pedestrian simulator. METHODS: 500 seven- and eight-year-olds participated in a randomized trial evaluating the use of a large kiosk VR versus a smartphone-based VR headset to teach street-crossing skills. Prior to randomization into a VR training condition, and prior to the initiation of any training, children completed a video-based vehicle approach estimation task to assess their ability to judge traffic time-to-arrival. They then engaged in multiple VR-based pedestrian safety training sessions in their randomly assigned condition until achieving adult-level functioning. Soon after training, and again 6 months later, children repeated the vehicle estimation task. RESULTS: Prior to randomization or training, children were more accurate in judging time-to-arrival for closer versus farther traffic and for rapidly moving versus slower-moving traffic, but these effects were subsumed by a speed × distance interaction. The interaction suggested that distance cues were used more prominently than speed cues and that speed had varying effects at different distances. Training group had minimal effect on learning, and all children became significantly better at judging vehicle arrival times following training. CONCLUSIONS: Children tend to underestimate vehicle arrival times. Distance cues have more impact on time-to-arrival judgments than speed cues, but children's estimations improved post-training for manipulations of both vehicle speed and vehicle distance. Improvements were retained six months later. This finding is consistent with psychophysics research suggesting that vehicle approach judgments rely on optical size and looming, which are affected by both vehicle speed and distance. Implementation of VR-based training for child pedestrian safety is recommended, as it may improve children's judgment of vehicle time-to-arrival, but it must be conducted cautiously to avoid iatrogenic effects.
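
The looming account mentioned in the conclusions is commonly formalized with the optical variable tau; the relation below is the standard textbook form, not quoted from the article, and the symbols are assumptions:

    \theta(t) \approx \frac{S}{Z(t)}, \qquad \tau(t) = \frac{\theta(t)}{\dot{\theta}(t)} = \frac{Z(t)}{v}

where S is the physical width of the vehicle, Z(t) its current distance, v its constant approach speed, and \theta(t) the optical angle it subtends; \tau(t) then approximates the time remaining until arrival. Because \tau depends on both Z and v, manipulations of either vehicle distance or speed change the optical information available to the child.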


Subject(s)
Accidents, Traffic , Pedestrians , Virtual Reality , Humans , Child , Female , Male , Accidents, Traffic/prevention & control , Walking/injuries , Safety , Judgment , Distance Perception
4.
J Exp Psychol Hum Percept Perform ; 50(9): 918-933, 2024 Sep.
Article in English | MEDLINE | ID: mdl-39101929

ABSTRACT

Affective stimuli in our environment indicate reward or threat and thereby relate to approach and avoidance behavior. Previous findings suggest that affective stimuli may bias visual perception, but it remains unclear whether similar biases exist in the auditory domain. Therefore, we asked whether affective auditory voices (angry vs. neutral) influence sound distance perception. Two virtual reality (VR) experiments (data collection 2021-2022) were conducted in which auditory stimuli were presented via loudspeakers located at positions unknown to the participants. In the first experiment (N = 44), participants actively placed a visually presented virtual agent or virtual loudspeaker in an empty room at the perceived sound source location. In the second experiment (N = 32), participants stood in front of several virtual agents or virtual loudspeakers and had to indicate the sound source by directing their gaze toward the perceived sound location. Results in both preregistered experiments consistently showed that participants estimated the location of angry voice stimuli at greater distances than the location of neutral voice stimuli. We argue that neither emotional nor motivational biases can account for these results. Instead, distance estimates seem to rely on listeners' representations regarding the relationship between vocal affect and acoustic characteristics.


Subject(s)
Affect , Humans , Adult , Female , Male , Young Adult , Affect/physiology , Distance Perception/physiology , Sound Localization/physiology , Voice/physiology , Virtual Reality , Anger/physiology , Auditory Perception/physiology
5.
Vision Res ; 223: 108462, 2024 Oct.
Article in English | MEDLINE | ID: mdl-39111102

ABSTRACT

When observers perceive 3D relations, they represent depth and spatial locations with the ground as a reference. This frame of reference could be egocentric, that is, moving with the observer, or allocentric, that is, remaining stationary and independent of the moving observer. We tested whether the representation of relative depth and of spatial location takes an egocentric or an allocentric frame of reference in three experiments using a blind walking task. In Experiments 1 and 2, participants observed a target in depth and then either immediately blind walked the previously seen distance between the target and themselves, or first walked sideways or along an oblique path for 3 m before blind walking the previously seen distance. The difference between the conditions was whether blind walking started from the observation point. Results showed that blind walking distance varied with the starting location. Thus, the represented distance did not appear to undergo spatial updating with the moving observer, and the frame of reference was likely allocentric. In Experiment 3, participants observed a target in space and then either immediately blind walked to the target, or first blind walked to another starting point and then blind walked to the target. Results showed that the end location of blind walking differed across starting points, which suggests that the representation of spatial location also takes an allocentric frame of reference. Taken together, these experiments convergently suggest that observers use an allocentric frame of reference to construct their mental representation of space.


Subject(s)
Depth Perception , Space Perception , Walking , Humans , Male , Space Perception/physiology , Female , Depth Perception/physiology , Adult , Young Adult , Walking/physiology , Analysis of Variance , Distance Perception/physiology
6.
Optom Vis Sci ; 101(6): 393-398, 2024 Jun 01.
Article in English | MEDLINE | ID: mdl-38990237

ABSTRACT

SIGNIFICANCE: It is important to know whether early-onset vision loss and late-onset vision loss are associated with differences in the estimation of distances of sound sources within the environment. People with vision loss rely heavily on auditory cues for path planning, safe navigation, avoiding collisions, and activities of daily living. PURPOSE: Loss of vision can lead to substantial changes in auditory abilities. It is unclear whether differences in sound distance estimation exist in people with early-onset partial vision loss, late-onset partial vision loss, and normal vision. We investigated distance estimates for a range of sound sources and auditory environments in groups of participants with early- or late-onset partial visual loss and sighted controls. METHODS: Fifty-two participants heard static sounds with virtual distances ranging from 1.2 to 13.8 m within a simulated room. The room simulated either anechoic (no echoes) or reverberant environments. Stimuli were speech, music, or noise. Single sounds were presented, and participants reported the estimated distance of the sound source. Each participant took part in 480 trials. RESULTS: Analysis of variance showed significant main effects of visual status (p<0.05), environment (reverberant vs. anechoic; p<0.05), and stimulus type (p<0.05). Significant differences (p<0.05) in the estimation of sound source distances between early-onset visually impaired participants and sighted controls were shown at closer distances for all conditions except the anechoic speech condition, and at middle distances for all conditions except the reverberant speech and music conditions. Late-onset visually impaired participants and sighted controls showed similar performance (p>0.05). CONCLUSIONS: The findings suggest that early-onset partial vision loss results in significant changes in judged auditory distance in different environments, especially for close and middle distances. Late-onset partial visual loss has less of an impact on the ability to estimate the distance of sound sources. The findings are consistent with a theoretical framework, the perceptual restructuring hypothesis, which was recently proposed to account for the effects of vision loss on audition.


Subject(s)
Sound Localization , Humans , Male , Female , Middle Aged , Aged , Adult , Sound Localization/physiology , Judgment , Auditory Perception/physiology , Distance Perception/physiology , Acoustic Stimulation/methods , Young Adult , Visual Acuity/physiology , Age of Onset , Aged, 80 and over , Cues
7.
Exp Brain Res ; 242(8): 2023-2031, 2024 Aug.
Article in English | MEDLINE | ID: mdl-38953973

ABSTRACT

The influence of travel time on perceived traveled distance has often been studied, but the results are inconsistent regarding the relationship between the two magnitudes. We argue that this is due to differences in the lengths of the investigated travel distances and hypothesize that the influence of travel time differs for relatively short compared to relatively long traveled distances. We tested this hypothesis in a virtual environment presented both on a desktop and through a head-mounted display (HMD). Our results show that, for longer distances, more travel time leads to longer perceived distance, while we found no influence of travel time on shorter distances. Presentation through an HMD versus a desktop only influenced distance judgments in the short-distance condition. These results are in line with the idea that the influence of travel time varies with the length of the traveled distance, and they provide insight into how distance perception in path integration studies is affected by travel time, thereby resolving inconsistencies reported in previous studies.


Subject(s)
Distance Perception , Humans , Distance Perception/physiology , Female , Male , Young Adult , Adult , Time Factors , Space Perception/physiology , Virtual Reality , Judgment/physiology
8.
Elife ; 12, 2024 Jul 18.
Article in English | MEDLINE | ID: mdl-39023517

ABSTRACT

We reliably judge the locations of static objects when we walk, despite the retinal images of these objects moving with every step we take. Here, we show that our brains solve this optical illusion by adopting an allocentric spatial reference frame. We measured perceived target location after the observer walked a short distance from the home base. Supporting the allocentric coding scheme, we found that the intrinsic bias, which acts as a spatial reference frame for perceiving the location of a dimly lit target in the dark, remained grounded at the home base rather than traveling along with the observer. The path-integration mechanism responsible for this can utilize both active and passive (vestibular) translational motion signals, but only along the horizontal direction. This asymmetric path-integration finding in human visual space perception is reminiscent of the asymmetric spatial memory finding in desert ants, pointing to nature's wondrous and logically simple design for terrestrial creatures.


Subject(s)
Distance Perception , Humans , Distance Perception/physiology , Male , Female , Space Perception/physiology , Adult , Young Adult , Optical Illusions/physiology , Visual Perception/physiology
9.
eNeuro ; 11(6), 2024 Jun.
Article in English | MEDLINE | ID: mdl-38844346

ABSTRACT

In measurement, a reference frame is needed to compare the measured object to something already known. This raises the neuroscientific question of which reference frame humans use when exploring the environment. Previous studies suggested that, in touch, the body employed as a measuring tool also serves as the reference frame. Indeed, an artificial modification of the perceived dimensions of the body changes the tactile perception of external object dimensions. However, it is unknown whether such a change in tactile perception would occur when the body schema is modified through the illusion of owning a limb altered in size. Therefore, employing a virtual hand illusion paradigm with an elongated forearm of different lengths, we systematically tested the subjective perception of the distance between two points [tactile distance perception (TDP) task] on the corresponding real forearm following the illusion. The TDP task thus served as a proxy to gauge changes in the body schema. Embodiment of the virtual arm was significantly greater after the synchronous visuotactile stimulation condition than after the asynchronous one, and forearm elongation significantly increased TDP. However, we did not find any link between visuotactile-induced ownership over the elongated arm and TDP variation, suggesting that vision plays the main role in the modification of the body schema. Additionally, the significant effect of elongation on TDP, but not on proprioception, suggests that these are affected differently by body schema modifications. These findings confirm the malleability of the body schema and its role as a reference frame in touch.


Subject(s)
Distance Perception , Illusions , Touch Perception , Virtual Reality , Humans , Female , Male , Touch Perception/physiology , Young Adult , Adult , Illusions/physiology , Distance Perception/physiology , Proprioception/physiology , Body Image , Forearm/physiology
10.
Autism Res ; 17(7): 1464-1474, 2024 Jul.
Article in English | MEDLINE | ID: mdl-38828663

ABSTRACT

The visual processing differences seen in autism often impede individuals' visual perception of the social world. In particular, many autistic people exhibit poor face recognition. Here, we sought to determine whether autistic adults also show impaired perception of dyadic social interactions-a class of stimulus thought to engage face-like visual processing. Our focus was the perception of interpersonal distance. Participants completed distance change detection tasks, in which they had to make perceptual decisions about the distance between two actors. On half of the trials, participants judged whether the actors moved closer together; on the other half, whether they moved further apart. In a nonsocial control task, participants made similar judgments about two grandfather clocks. We also assessed participants' face recognition ability using standardized measures. The autistic and nonautistic observers showed similar levels of perceptual sensitivity to changes in interpersonal distance when viewing social interactions. As expected, however, the autistic observers showed clear signs of impaired face recognition. Despite putative similarities between the visual processing of faces and dyadic social interactions, our results suggest that these two facets of social vision may dissociate.


Subject(s)
Autistic Disorder , Facial Recognition , Humans , Male , Female , Adult , Autistic Disorder/physiopathology , Autistic Disorder/psychology , Young Adult , Facial Recognition/physiology , Interpersonal Relations , Social Interaction , Social Perception , Visual Perception/physiology , Distance Perception/physiology , Recognition, Psychology/physiology , Adolescent
11.
J Exp Psychol Gen ; 153(8): 2160-2173, 2024 Aug.
Article in English | MEDLINE | ID: mdl-38934948

ABSTRACT

Fitts' Law is one of a small number of psychophysical laws. However, a fundamental variable in Fitts' Law, the movement distance D, confounds two quantities: the physical distance the effector has to move to reach a goal, and the visually perceived distance to that goal. While these two quantities are functionally equivalent in everyday motor behavior, decoupling them might improve our understanding of the factors that shape speed-accuracy tradeoffs. Here, we leveraged the phenomenon of visuomotor gain adaptation to de-confound movement and visual distance during goal-directed reaching. We found that both movement distance and visual distance can influence movement times, supporting a variant of Fitts' Law that considers both. The weighting of movement versus visual distance was modified by restricting movement range and degrading visual feedback. These results may reflect the role of sensory context in early stages of motor planning.
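
For context, the Shannon formulation of Fitts' Law expresses movement time MT as a function of distance D and target width W; the second line is only an illustrative sketch of how a variant might weight movement and visual distance separately, not the authors' fitted model (λ, D_move, and D_visual are assumed notation):

    MT = a + b \log_2\!\left(\frac{D}{W} + 1\right)

    MT = a + b \left[\lambda \log_2\!\left(\frac{D_{\mathrm{move}}}{W} + 1\right) + (1 - \lambda) \log_2\!\left(\frac{D_{\mathrm{visual}}}{W} + 1\right)\right]

where a and b are empirically fitted constants and 0 ≤ λ ≤ 1 captures the relative weighting of movement versus visual distance.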


Subject(s)
Psychomotor Performance , Humans , Male , Adult , Psychomotor Performance/physiology , Female , Young Adult , Distance Perception/physiology , Movement/physiology , Visual Perception/physiology , Feedback, Sensory/physiology
12.
PLoS One ; 19(5): e0297442, 2024.
Article in English | MEDLINE | ID: mdl-38728324

ABSTRACT

In the post-epidemic era, the restart of China's inbound tourism is imminent. However, there are gaps in our current understanding of how distance perception dynamically affects inbound tourism in China. To understand past patterns of inbound tourism in China, we fitted data from 61 countries of origin covering 2004 to 2018 to a dynamic expanding gravity model, examined the effects of cultural distance, institutional distance, geographical distance, and economic distance on inbound tourism in China, and revealed the dynamic interaction mechanism through which non-economic distance perception affects China's inbound tourism. Our results show that cultural distance has a positive impact on China's inbound tourism, while institutional distance has a negative impact. A notable finding is that the dynamic interaction of these two kinds of perceived distance still has a positive impact on China's inbound tourism. The practical implication is that strengthening cultural distance can counteract the influence of institutional distance. Overall, geographical distance and institutional distance restrict China's inbound tourism flow, while cultural distance, economic distance, and interactive perceptual distance promote it.
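
As a rough illustration of the kind of specification the abstract describes, a log-linearized gravity model for tourism flows with a cultural × institutional interaction term could take the following generic form; this is an assumption for exposition, not the article's estimated equation:

    \ln T_{ijt} = \beta_0 + \beta_1 \ln(\mathrm{GDP}_{it}\,\mathrm{GDP}_{jt}) + \beta_2 \ln D^{\mathrm{geo}}_{ij} + \beta_3 D^{\mathrm{cult}}_{ij} + \beta_4 D^{\mathrm{inst}}_{ij} + \beta_5 D^{\mathrm{econ}}_{ijt} + \beta_6\, D^{\mathrm{cult}}_{ij} D^{\mathrm{inst}}_{ij} + \varepsilon_{ijt}

where T_{ijt} denotes tourist arrivals from origin country i to destination j in year t and the D terms denote the geographical, cultural, institutional, and economic distance measures; a positive \beta_6 would correspond to the reported positive effect of the cultural-institutional interaction.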


Subject(s)
Tourism , China , Humans , Models, Theoretical , Distance Perception , Travel/economics , Gravitation
13.
J AAPOS ; 28(3): 103904, 2024 Jun.
Article in English | MEDLINE | ID: mdl-38552945

ABSTRACT

Distance stereoacuity measurement enables the evaluation and management of binocular vision disorders. Here, we compare the results obtained using standard tests for distance stereoacuity measurement with those obtained using the novel STab test. We tested 87 children (4-17 years of age) using different tests for the quantification of stereopsis at distance: the Distance Randot Stereotest (DRS), M&S random dots (M&S), and the STab. Strong correlations were demonstrated between M&S and DRS (0.8), M&S and STab (0.81), and DRS and STab (0.85) (all P < 0.0001). The limit of agreement between M&S and DRS was 0.45; between M&S and STab, 0.47; and between DRS and STab, 0.38. Our results suggest that all three methods can be used interchangeably.
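
The abstract reports correlations and limits of agreement between pairs of tests; below is a minimal sketch of how such paired-test agreement is typically computed (Bland-Altman style). The data values and the use of Bland-Altman 95% limits are assumptions for illustration, not the study's actual data or necessarily its exact procedure.

    import numpy as np

    # Hypothetical paired stereoacuity scores (log10 arcsec) from two distance
    # stereotests; illustrative values only, not data from the study.
    test_a = np.array([1.6, 2.0, 2.3, 1.9, 2.6, 1.7])
    test_b = np.array([1.7, 2.1, 2.2, 2.0, 2.5, 1.9])

    # Pearson correlation between the two tests.
    r = np.corrcoef(test_a, test_b)[0, 1]

    # Bland-Altman 95% limits of agreement: mean difference +/- 1.96 SD of differences.
    diff = test_a - test_b
    loa = (diff.mean() - 1.96 * diff.std(ddof=1),
           diff.mean() + 1.96 * diff.std(ddof=1))

    print(f"r = {r:.2f}, limits of agreement = {loa[0]:.2f} to {loa[1]:.2f}")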


Subject(s)
Depth Perception , Vision Tests , Vision, Binocular , Visual Acuity , Humans , Child , Child, Preschool , Depth Perception/physiology , Adolescent , Visual Acuity/physiology , Female , Male , Vision Tests/methods , Vision, Binocular/physiology , Distance Perception/physiology
14.
Behav Res Methods ; 56(6): 6198-6222, 2024 Sep.
Article in English | MEDLINE | ID: mdl-38504080

ABSTRACT

An important aspect of perceptual learning involves understanding how well individuals can perceive distances, sizes, and time-to-contact. Oftentimes, the primary goal in these experiments is to assess participants' errors (i.e., how accurately participants perform these tasks). However, the manner in which researchers have quantified error, or task accuracy, has varied. The use of different measures of task accuracy, including error scores, ratios, and raw estimates, means that the interpretation of findings depends on the measure of task accuracy utilized. In an effort to better understand this issue, we used a Monte Carlo simulation to evaluate five dependent measures of accuracy: raw distance judgments, a ratio of true to estimated distance, relative error, signed error, and absolute error. We simulated data consistent with prior findings in the distance perception literature and evaluated how findings and interpretations vary as a function of the measure of accuracy used. We found differences in both statistical findings (e.g., overall model fit, mean square error, Type I error rate) and the interpretations of those findings. The costs and benefits of utilizing each accuracy measure for quantifying accuracy in distance estimation studies are discussed.
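
To make the five dependent measures concrete, here is a minimal sketch of how they might be computed for a set of distance judgments; the sample values and variable names are ours, not the article's.

    import numpy as np

    # Hypothetical true and judged distances (metres); illustrative only.
    true_d = np.array([3.0, 5.0, 7.5, 10.0])
    judged = np.array([2.6, 4.8, 8.1, 9.2])

    raw            = judged                  # raw distance judgments
    ratio          = true_d / judged         # ratio of true to estimated distance
    signed_error   = judged - true_d         # signed error (under/overestimation)
    relative_error = signed_error / true_d   # signed error scaled by true distance
    absolute_error = np.abs(signed_error)    # error magnitude, sign ignored

    print(ratio, signed_error, relative_error, absolute_error)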


Subject(s)
Distance Perception , Monte Carlo Method , Humans , Distance Perception/physiology , Judgment/physiology , Computer Simulation
15.
Cogn Process ; 25(3): 477-490, 2024 Aug.
Article in English | MEDLINE | ID: mdl-38492094

ABSTRACT

With the easing of the pandemic, public policies no longer mandate mask wearing. People can choose not to wear a mask, or to wear different types of masks, based on personal preferences and safety perceptions during daily interactions. Information about how face mask type influences interpersonal distance (IPD) across different age populations is still lacking. This study therefore investigated the effects of face mask type (no mask, cloth, medical, and N95) and avatar age group (children, adults, and older adults) on IPD perception, subjective threat feeling, and physiological skin conductance responses under active and passive approach. One hundred participants aged 20 to 35 years were recruited. Twelve avatars (three age groups × four face mask conditions) were created and presented in a virtual reality environment. The results showed that age group, mask type, and approach mode had significant effects on IPD and subjective threat feeling; no significant effect was found on skin conductance responses. Participants maintained the longest IPD when facing older adult avatars, followed by adults and then children. In the passive approach condition, people tended to maintain a significantly greater comfort distance than during active approach. Regarding mask type, people kept the largest IPD when facing an avatar with no mask and the shortest when facing an avatar with an N95 mask; the IPD difference between the N95 and medical masks was not significant. Additionally, facing an avatar wearing a medical mask generated the lowest subjective threat feeling compared with the other conditions. The findings indicate that wearing a medical mask helps bring people closer for interaction in specific situations. Understanding that mask wearing, especially of a medical mask, leads to a shorter IPD than the unmasked condition can be utilized to enhance safety measures in crowded public spaces and health-care settings. This information could guide the development of physical distancing recommendations that take into account both the type of mask and the age groups involved, to ensure the maintenance of appropriate distances.


Subject(s)
Distance Perception , Masks , Social Interaction , Humans , Adult , Male , Female , Young Adult , Distance Perception/physiology , Galvanic Skin Response/physiology , COVID-19 , Age Factors
16.
Mem Cognit ; 52(6): 1439-1450, 2024 Aug.
Article in English | MEDLINE | ID: mdl-38519780

ABSTRACT

The prevailing model of landmark integration in location memory is maximum likelihood estimation (MLE), which assumes that each landmark implies a target location distribution that is narrower for more reliable landmarks. This model assumes weighted linear combination of landmarks and predicts that, given optimal integration, the reliability with multiple landmarks is the sum of the reliabilities with the individual landmarks. Super-optimality is reliability with multiple landmarks that exceeds the optimal reliability predicted from the reliability with each landmark alone; it is demonstrated when performance exceeds the predicted optimal performance obtained by aggregating reliability values from the single-landmark conditions. Past studies claiming super-optimality have provided arguably impure measures of performance with single landmarks, because multiple landmarks were presented at study in conditions with a single landmark at test; this disrupts encoding specificity and thereby leads to underestimation of predicted optimal performance. The present study, unlike those prior studies, presented only a single landmark at study and the same landmark at test in single-landmark trials, showing super-optimality conclusively. Given that super-optimal information integration occurs, emergent information, that is, information only available with multiple landmarks, must be used. With the target and landmarks all in a line, as throughout this study, relative distance is the only emergent information available. Use of relative distance was confirmed here by the finding that, when both landmarks were left of the target at study, the target was remembered further right of its true location the further left the left landmark was moved from study to test.
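
The sum-of-reliabilities prediction described above is the standard maximum likelihood cue-combination result; writing it out (notation assumed here) makes the super-optimality criterion explicit:

    \hat{x}_{\mathrm{comb}} = \sum_i w_i\,\hat{x}_i, \qquad w_i = \frac{r_i}{\sum_j r_j}, \qquad r_{\mathrm{comb}} = \sum_i r_i, \qquad r_i = \frac{1}{\sigma_i^2}

where \hat{x}_i is the location implied by landmark i, \sigma_i^2 the variance of responses with that landmark alone, and r_i its reliability. Super-optimality corresponds to an observed multi-landmark reliability exceeding \sum_i r_i.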


Subject(s)
Spatial Memory , Humans , Spatial Memory/physiology , Young Adult , Adult , Space Perception/physiology , Distance Perception/physiology , Male , Female
17.
Neuropsychologia ; 196: 108838, 2024 Apr 15.
Article in English | MEDLINE | ID: mdl-38401629

ABSTRACT

To achieve a stable perception of object size in spite of variations in viewing distance, our visual system needs to combine retinal image information and distance cues. Previous research has shown that, not only retinal cues, but also extraretinal sensory signals can provide reliable information about depth and that different neural networks (perception versus action) can exhibit preferences in the use of these different sources of information during size-distance computations. Semantic knowledge of distance, a purely cognitive signal, can also provide distance information. Do the perception and action systems show differences in their ability to use this information in calculating object size and distance? To address this question, we presented 'glow-in-the-dark' objects of different physical sizes at different real distances in a completely dark room. Participants viewed the objects monocularly through a 1-mm pinhole. They either estimated the size and distance of the objects or attempted to grasp them. Semantic knowledge was manipulated by providing an auditory cue about the actual distance of the object: "20 cm", "30 cm", and "40 cm". We found that semantic knowledge of distance contributed to some extent to size constancy operations during perceptual estimation and grasping, but size constancy was never fully restored. Importantly, the contribution of knowledge about distance to size constancy was equivalent between perception and action. Overall, our study reveals similarities and differences between the perception and action systems in the use of semantic distance knowledge and suggests that this cognitive signal is useful but not a reliable depth cue for size constancy under restricted viewing conditions.
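
The size constancy computation discussed here is usually summarized by the size-distance relation; the formula below is the standard geometric statement and is not quoted from the article:

    S \approx 2 D \tan\!\left(\frac{\theta}{2}\right) \approx \theta D \quad (\theta \text{ in radians, small angles})

where \theta is the visual angle subtended by the object, D the viewing distance, and S the inferred physical size. Any cue to D, including the purely semantic auditory cues of "20 cm", "30 cm", and "40 cm" used in this study, can in principle feed this computation.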


Subject(s)
Abnormalities, Multiple , Distance Perception , Humans , Cues , Semantics , Hand Strength , Size Perception , Depth Perception
18.
Hear Res ; 444: 108968, 2024 Mar 15.
Article in English | MEDLINE | ID: mdl-38350176

ABSTRACT

The perception of the distance to a sound source is relevant in many everyday situations, not only in real spaces, but also in virtual reality (VR) environments. Where real rooms often reach their limits, VR offers far-reaching possibilities to simulate a wide range of acoustic scenarios. However, in virtual room acoustics a plausible reproduction of distance-related cues can be challenging. In the present study, we compared the detection of changes of the distance to a sound source and its neurocognitive correlates in a real and a virtual reverberant environment, using an active auditory oddball paradigm and EEG measures. The main goal was to test whether the experiments in the virtual and real environments produced equivalent behavioral and EEG results. Three loudspeakers were placed at ego-centric distances of 2 m (near), 4 m (center), and 8 m (far) in front of the participants (N = 20), each 66 cm below their ear level. Sequences of 500 ms noise stimuli were presented either from the center position (standards, 80 % of trials) or from the near or far position (targets, 10 % each). The participants had to indicate a target position via a joystick response ("near" or "far"). Sounds were emitted either by real loudspeakers in the real environment or rendered and played back for the corresponding positions via headphones in the virtual environment. In addition, within both environments, loudness of the auditory stimuli was either unaltered (natural loudness) or the loudness cue was manipulated, so that all three loudspeakers were perceived equally loud at the listener's position (matched loudness). The EEG analysis focused on the mismatch negativity (MMN), P3a, and P3b as correlates of deviance detection, attentional orientation, and context-updating/stimulus evaluation, respectively. Overall, behavioral data showed that detection of the target positions was reduced within the virtual environment, and especially when loudness was matched. Except for slight latency shifts in the virtual environment, EEG analysis indicated comparable patterns within both environments and independent of loudness settings. Thus, while the neurocognitive processing of changes in distance appears to be similar in virtual and real spaces, a proper representation of loudness appears to be crucial to achieve a good task performance in virtual acoustic environments.
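
A minimal sketch of the trial-sequence proportions described above (80% standards from the center position, 10% near targets, 10% far targets) is given below; the variable names and random seed are ours, purely for illustration.

    import numpy as np

    rng = np.random.default_rng(0)

    # Illustrative oddball sequence following the proportions in the abstract:
    # 80% standards at 4 m (center), 10% near targets at 2 m, 10% far targets at 8 m.
    n_trials = 200
    positions = rng.choice(["center_4m", "near_2m", "far_8m"],
                           size=n_trials, p=[0.8, 0.1, 0.1])

    # Each trial would present a 500 ms noise burst from the sampled position,
    # and the participant responds "near" or "far" only on target trials.
    print(dict(zip(*np.unique(positions, return_counts=True))))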


Subject(s)
Cues , Distance Perception , Humans , Auditory Perception/physiology , Evoked Potentials/physiology , Sound , Acoustic Stimulation , Loudness Perception
19.
Sci Rep ; 14(1): 2656, 2024 Feb 01.
Article in English | MEDLINE | ID: mdl-38302577

ABSTRACT

Goal-directed approaches to perception usually consider that distance perception is shaped by the body and its potential for interaction. Although this phenomenon has been extensively investigated in the field of perception, little is known about the effect of motor interactions on memory and how they shape the global representation of large-scale spaces. To investigate this question, we designed an immersive virtual reality environment in which participants had to learn the positions of several items. Half of the participants had to physically (but virtually) grab the items with their hand and drop them at specified locations (active condition). The other half of the participants were simply shown the items, which appeared at the specified positions, without interacting with them (passive condition). Half of the items used during learning were images of manipulable objects, and the other half were non-manipulable objects. Participants were subsequently asked to draw a map of the virtual environment from memory and to position all the items in it. Results show that active participants recalled the global shape of the spatial layout less precisely and made more absolute distance errors than passive participants. Moreover, the global scaling compression bias was higher for active participants than for passive participants. Interestingly, manipulable items showed a greater compression bias than non-manipulable items, yet they had no effect on correlation scores or absolute non-directional distance errors. These results are discussed in light of grounded approaches to spatial cognition, emphasizing motor simulation as a possible mechanism for position retrieval from memory.


Subject(s)
Learning , Virtual Reality , Humans , Cognition , Mental Recall , Distance Perception , Spatial Memory , Space Perception
20.
Exp Brain Res ; 242(4): 797-808, 2024 Apr.
Article in English | MEDLINE | ID: mdl-38319398

ABSTRACT

The space immediately around the body, referred to as the peripersonal space (PPS), plays a crucial role in interactions with external objects and in avoiding unsafe situations. This study aimed to investigate whether the size of the PPS changes depending on direction, with a particular focus on the disparity between the front and rear spaces. A vibrotactile stimulus was presented to measure PPS while a task-irrelevant auditory stimulus (probe) approached the participant. In addition, to evaluate the effect of the probe, a baseline condition was used in which only tactile stimuli were presented. The results showed that the auditory facilitation effect of the tactile stimulus was greater in the rear condition than in the front condition. Conversely, the performance on tasks related to auditory distance perception and sound speed estimation did not differ between the two directions, indicating that the difference in the auditory facilitation effect between directions cannot be explained by these factors. These findings indicate that the strength of audio-tactile integration is greater in the rear space compared to the front space, suggesting that the representation of the PPS differed between the front and rear spaces.


Subject(s)
Personal Space , Space Perception , Humans , Auditory Perception , Touch , Distance Perception