Results 1 - 16 of 16
1.
Optom Vis Sci ; 101(6): 393-398, 2024 Jun 01.
Article in English | MEDLINE | ID: mdl-38990237

ABSTRACT

SIGNIFICANCE: It is important to know whether early-onset vision loss and late-onset vision loss are associated with differences in the estimation of distances of sound sources within the environment. People with vision loss rely heavily on auditory cues for path planning, safe navigation, avoiding collisions, and activities of daily living. PURPOSE: Loss of vision can lead to substantial changes in auditory abilities. It is unclear whether differences in sound distance estimation exist in people with early-onset partial vision loss, late-onset partial vision loss, and normal vision. We investigated distance estimates for a range of sound sources and auditory environments in groups of participants with early- or late-onset partial visual loss and sighted controls. METHODS: Fifty-two participants heard static sounds with virtual distances ranging from 1.2 to 13.8 m within a simulated room. The room simulated either anechoic (no echoes) or reverberant environments. Stimuli were speech, music, or noise. Single sounds were presented, and participants reported the estimated distance of the sound source. Each participant took part in 480 trials. RESULTS: Analysis of variance showed significant main effects of visual status (p<0.05), environment (reverberant vs. anechoic, p<0.05), and stimulus type (p<0.05). Estimates of sound-source distance differed significantly (p<0.05) between early-onset visually impaired participants and sighted controls at closer distances for all conditions except anechoic speech, and at middle distances for all conditions except reverberant speech and music. Late-onset visually impaired participants and sighted controls showed similar performance (p>0.05). CONCLUSIONS: The findings suggest that early-onset partial vision loss results in significant changes in judged auditory distance in different environments, especially for close and middle distances. Late-onset partial visual loss has less of an impact on the ability to estimate the distance of sound sources. The findings are consistent with a theoretical framework, the perceptual restructuring hypothesis, which was recently proposed to account for the effects of vision loss on audition.
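As an illustration of the kind of group comparison reported in RESULTS, the minimal Python sketch below runs a one-way test and a pairwise follow-up on hypothetical per-participant error scores. It is not the authors' analysis code, and it collapses the full mixed design (which also treated environment and stimulus as within-subject factors) into simple between-group comparisons.

```python
# Minimal sketch (hypothetical data, not the study's analysis): main-effect
# and pairwise comparisons of mean distance-estimation error by visual status.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical mean distance-estimation errors (m) per participant group.
early_onset = rng.normal(loc=1.5, scale=0.6, size=18)
late_onset = rng.normal(loc=0.9, scale=0.6, size=17)
sighted = rng.normal(loc=0.8, scale=0.6, size=17)

# Main effect of visual status (one factor of the full design).
f_stat, p_val = stats.f_oneway(early_onset, late_onset, sighted)
print(f"Visual status: F = {f_stat:.2f}, p = {p_val:.3f}")

# Pairwise follow-up, e.g. early-onset vs. sighted controls.
t_stat, p_pair = stats.ttest_ind(early_onset, sighted)
print(f"Early-onset vs. sighted: t = {t_stat:.2f}, p = {p_pair:.3f}")
```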


Subject(s)
Sound Localization, Humans, Male, Female, Middle Aged, Aged, Adult, Sound Localization/physiology, Judgment, Auditory Perception/physiology, Distance Perception/physiology, Acoustic Stimulation/methods, Young Adult, Visual Acuity/physiology, Age of Onset, Aged 80 and over, Cues
2.
Exp Brain Res ; 240(1): 81-96, 2022 Jan.
Article in English | MEDLINE | ID: mdl-34623459

ABSTRACT

Visual spatial information plays an important role in calibrating auditory space. Blindness results in deficits in a number of auditory abilities, which have been explained in terms of the hypothesis that visual information is needed to calibrate audition. When judging the size of a novel room when only auditory cues are available, normally sighted participants may use the location of the farthest sound source to infer the nearest possible distance of the far wall. However, for people with partial visual loss (distinct from blindness in that some vision is present), such a strategy may not be reliable if vision is needed to calibrate auditory cues for distance. In the current study, participants were presented with sounds at different distances (ranging from 1.2 to 13.8 m) in a simulated reverberant (T60 = 700 ms) or anechoic room. Farthest distance judgments and room size judgments (volume and area) were obtained from blindfolded participants (18 normally sighted, 38 partially sighted) for speech, music, and noise stimuli. With sighted participants, the judged room volume and farthest sound source distance estimates were positively correlated (p < 0.05) for all conditions. Participants with visual losses showed no significant correlations for any of the conditions tested. A similar pattern of results was observed for the correlations between farthest distance and room floor area estimates. Results demonstrate that partial visual loss disrupts the relationship between judged room size and sound source distance that is shown by sighted participants.
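The correlation analysis described above can be sketched as follows using made-up judgment data; the positive slope built into the synthetic values simply mimics the pattern reported for sighted participants, and the sample size and noise level are arbitrary assumptions.

```python
# Minimal sketch (hypothetical data): correlating judged farthest source
# distance with judged room volume, as reported for the sighted group.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

farthest_distance_m = rng.uniform(1.2, 13.8, size=18)   # judged farthest distance
room_volume_m3 = (20 + 15 * farthest_distance_m
                  + rng.normal(0, 30, size=18))          # judged room volume

r, p = stats.pearsonr(farthest_distance_m, room_volume_m3)
print(f"r = {r:.2f}, p = {p:.3f}")  # positive correlation expected for sighted listeners
```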


Subject(s)
Blindness, Sound Localization, Acoustic Stimulation, Auditory Perception, Cues, Humans, Sound
3.
Perception ; 50(7): 646-663, 2021 Jul.
Article in English | MEDLINE | ID: mdl-34053354

ABSTRACT

When vision is unavailable, auditory level and reverberation cues provide important spatial information regarding the environment, such as the size of a room. We investigated how room-size estimates were affected by stimulus type, level, and reverberation. In Experiment 1, 15 blindfolded participants estimated room size after performing a distance bisection task in virtual rooms that were either anechoic (with level cues only) or reverberant (with level and reverberation cues) with a relatively short reverberation time of T60 = 400 milliseconds. Speech, noise, or clicks were presented at distances between 1.9 and 7.1 m. The reverberant room was judged to be significantly larger than the anechoic room (p < .05) for all stimuli. In Experiment 2, only the reverberant room was used and the overall level of all sounds was equalized, so only reverberation cues were available. Ten blindfolded participants took part. Room-size estimates were significantly larger for speech than for clicks or noise. The results show that when level and reverberation cues are present, reverberation increases judged room size. Even relatively weak reverberation cues provide room-size information, which could potentially be used by blind or visually impaired individuals encountering novel rooms.
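The level cue referred to above follows, to a first approximation, the free-field inverse-square law (roughly 6 dB of attenuation per doubling of source distance). The snippet below is a generic worked example of that relationship for the virtual distances used in the experiment; it is not a description of the study's stimulus generation, and the reference level is an arbitrary assumption.

```python
# Worked example of the level cue alone (free-field inverse-square law).
import numpy as np

reference_distance_m = 1.0
reference_level_db = 70.0          # hypothetical level at 1 m

distances_m = np.array([1.9, 3.0, 4.5, 7.1])
levels_db = reference_level_db - 20.0 * np.log10(distances_m / reference_distance_m)

for d, level in zip(distances_m, levels_db):
    print(f"{d:4.1f} m -> {level:5.1f} dB")
```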


Subject(s)
Sound Localization, Acoustic Stimulation, Cues, Humans, Noise, Sound
4.
Exp Brain Res ; 235(2): 597-606, 2017 Feb.
Article in English | MEDLINE | ID: mdl-27837259

ABSTRACT

Compared to sighted listeners, blind listeners often display enhanced auditory spatial abilities such as localization in azimuth. However, less is known about whether blind humans can accurately judge distance in extrapersonal space using auditory cues alone. Using virtualization techniques, we show that auditory spatial representations of the world beyond the peripersonal space of blind listeners are compressed compared to those for normally sighted controls. Blind participants overestimated the distance to nearby sources and underestimated the distance to remote sound sources, in both reverberant and anechoic environments, and for speech, music, and noise signals. Functions relating judged and actual virtual distance were well fitted by compressive power functions, indicating that the absence of visual information regarding the distance of sound sources may prevent accurate calibration of the distance information provided by auditory signals.
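The compressive power-function fit mentioned above can be illustrated with a short sketch on synthetic judgments. The exponent and scale factor used to generate the data are arbitrary; the fit is performed in log-log coordinates, where a power law becomes a straight line.

```python
# Minimal sketch (hypothetical data): fitting a compressive power function
# d_judged = k * d_actual**a to distance judgments.
import numpy as np

rng = np.random.default_rng(2)

actual_m = np.linspace(1.2, 13.8, 10)
# Hypothetical compressive responses: near sources overestimated,
# far sources underestimated (exponent < 1).
judged_m = 2.0 * actual_m**0.6 * rng.lognormal(0.0, 0.05, size=actual_m.size)

# Power-law fit is linear in log-log coordinates: log d' = log k + a * log d.
a, log_k = np.polyfit(np.log(actual_m), np.log(judged_m), 1)
print(f"exponent a = {a:.2f} (a < 1 indicates compression), k = {np.exp(log_k):.2f}")
```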


Subject(s)
Blindness/physiopathology, Discrimination (Psychology), Sound Localization/physiology, Space Perception/physiology, Acoustic Stimulation, Adult, Aged, Cues, Female, Functional Laterality, Humans, Judgment, Male, Middle Aged, Young Adult
5.
Exp Brain Res ; 232(3): 975-84, 2014 Mar.
Article in English | MEDLINE | ID: mdl-24370580

ABSTRACT

The study assessed the ability of the central nervous system (CNS) to use echoic information from sensory substitution devices (SSDs) to rotate the shoulders and safely pass through apertures of different widths. Ten visually normal participants performed this task with full vision, or blindfolded while using an SSD to obtain information regarding the width of an aperture created by two parallel panels. Two SSDs were tested. Participants passed through apertures of +0, +18, +35, and +70% of measured body width. Kinematic measures included movement time, shoulder rotation, average walking velocity across the trial, and peak walking velocities before crossing, after crossing, and across the whole trial. Analyses showed that participants used SSD information to regulate shoulder rotation, with greater rotation associated with narrower apertures. Rotations made using an SSD were greater than those made using vision, movement times were longer, average walking velocity was lower, and peak velocities before crossing, after crossing, and across the whole trial were smaller, suggesting greater caution. Collisions sometimes occurred when using an SSD but not when using vision, indicating that substituted information did not always result in accurate shoulder rotation judgements. No differences were found between the two SSDs. The data suggest that spatial information, provided by sensory substitution, allows the relative position of the aperture panels to be internally represented, enabling the CNS to modify shoulder rotation according to aperture width. The increased buffer space indicated by greater rotations (up to approximately 35% for apertures of +18% of body width) suggests that these spatial representations are not as accurate as those offered by full vision.
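A simple geometric way to think about the shoulder-rotation requirement described above is sketched below: the projected shoulder width shrinks with the cosine of the rotation angle, so narrower apertures demand larger rotations. The shoulder width and safety buffer are hypothetical parameters, and this is an illustrative calculation rather than the study's analysis.

```python
# Geometric sketch (not the study's analysis code): minimum shoulder rotation
# so that projected shoulder width plus a safety buffer fits the aperture.
import numpy as np

def min_rotation_deg(shoulder_width_m, aperture_width_m, buffer_m=0.10):
    """Rotation (degrees) needed so shoulder_width * cos(theta) + buffer <= aperture."""
    target = aperture_width_m - buffer_m
    if target >= shoulder_width_m:
        return 0.0                        # fits walking straight on
    return float(np.degrees(np.arccos(max(target, 0.0) / shoulder_width_m)))

shoulder = 0.45                           # hypothetical body width (m)
for scale in (1.00, 1.18, 1.35, 1.70):    # apertures of +0, +18, +35, +70 % body width
    aperture = shoulder * scale
    print(f"aperture {aperture:.2f} m -> rotate {min_rotation_deg(shoulder, aperture):.1f} deg")
```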


Subject(s)
Movement/physiology, Psychomotor Performance/physiology, Sound Localization, Space Perception/physiology, Walking/physiology, Acoustic Stimulation, Adolescent, Adult, Biomechanical Phenomena, Female, Humans, Male, Rotation, Shoulder/innervation, Time Factors, Young Adult
6.
Forensic Sci Int ; 361: 112103, 2024 Aug.
Article in English | MEDLINE | ID: mdl-38901059

ABSTRACT

In the forensic science context, petrol is considered the most common fire accelerant. However, the identification and classification of petrol sources has, over the years, proven to be a challenge in the investigation of fire-related incidents. This research explored the possibility of identifying and classifying petrol sources using high-field NMR spectroscopy. In this study, 1H NMR profiling, using specific pulse sequences to analyse neat aliquots of petrol samples of different brands collected at different times across the UK and Ireland, is shown for the first time to provide a diagnostic 'fingerprint' of specific chemical compounds that can be used for the identification and classification of petrol samples. This enables linkage of unknown petrol samples to a source and, in addition, provides a tool that allows exclusion of potential petrol sources. A new, innovative method using 1H selTOCSY is described for the individualization and classification of petrol samples through the identification of olefinic markers in the samples. These markers were identified as (i) 3-methyl-1-butene, (ii) a mixture of 1-pentene and 3-methyl-1-butene, (iii) 2-methyl-2-butene, and (iv) a mixture of cis- and trans-2-pentene.

7.
Exp Brain Res ; 224(4): 623-33, 2013 Feb.
Article in English | MEDLINE | ID: mdl-23178908

ABSTRACT

Totally blind listeners often demonstrate better than normal capabilities when performing spatial hearing tasks. Accurate representation of three-dimensional auditory space requires the processing of available distance information between the listener and the sound source; however, auditory distance cues vary greatly depending upon the acoustic properties of the environment, and it is not known which distance cues are important to totally blind listeners. Our data show that totally blind listeners display better performance compared to sighted age-matched controls for distance discrimination tasks in anechoic and reverberant virtual rooms simulated using a room-image procedure. Totally blind listeners use two major auditory distance cues to stationary sound sources, level and direct-to-reverberant ratio, more effectively than sighted controls for many of the virtual distances tested. These results show that significant compensation among totally blind listeners for virtual auditory spatial distance leads to benefits across a range of simulated acoustic environments. No significant differences in performance were observed between listeners with partial non-correctable visual losses and sighted controls, suggesting that sensory compensation for virtual distance does not occur for listeners with partial vision loss.


Subject(s)
Auditory Perception/physiology, Blindness/psychology, Cues, Discrimination (Psychology), Distance Perception/physiology, Acoustic Stimulation, Acoustics, Adult, Aged, Female, Humans, Male, Middle Aged, Sound, User-Computer Interface, Young Adult
8.
J Acoust Soc Am ; 134(5): 3395-8, 2013 Nov.
Article in English | MEDLINE | ID: mdl-24180748

ABSTRACT

The study investigated how listeners used level and direct-to-reverberant ratio (D/R) cues to discriminate distances to virtual sound sources. Sentence pairs were presented at virtual distances in simulated rooms that were either reverberant or anechoic. Performance on the basis of level was generally better than performance based on D/R. Increasing room reverberation time improved performance based on the D/R cue such that the two cues provided equally effective information at further virtual source distances in highly reverberant environments. Orientation of the listener within the virtual room did not affect performance.
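The direct-to-reverberant ratio (D/R) cue examined above can be illustrated with a toy computation on a synthetic room impulse response: the energy of the direct sound is compared with the energy of the reverberant tail. The split point, decay constant, and noise level below are arbitrary assumptions, not values from the study.

```python
# Minimal sketch (synthetic impulse response): direct-to-reverberant energy
# ratio in dB, splitting the response a few milliseconds after the direct sound.
import numpy as np

fs = 48_000                                   # sample rate (Hz)
rng = np.random.default_rng(3)

# Synthetic impulse response: a direct impulse followed by an exponentially
# decaying noise tail (a crude stand-in for reverberation).
ir = np.zeros(fs // 2)
direct_idx = 200                              # direct-sound arrival sample
ir[direct_idx] = 1.0
tail = rng.normal(0, 0.05, ir.size) * np.exp(-np.arange(ir.size) / (0.1 * fs))
ir[direct_idx + 120:] += tail[direct_idx + 120:]

split = direct_idx + int(0.0025 * fs)         # direct window: first ~2.5 ms
direct_energy = np.sum(ir[:split] ** 2)
reverb_energy = np.sum(ir[split:] ** 2)
dr_ratio_db = 10.0 * np.log10(direct_energy / reverb_energy)
print(f"D/R = {dr_ratio_db:.1f} dB")
```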


Asunto(s)
Señales (Psicología) , Discriminación en Psicología , Localización de Sonidos , Percepción del Habla , Estimulación Acústica , Adulto , Audiometría del Habla , Arquitectura y Construcción de Instituciones de Salud , Femenino , Humanos , Masculino , Orientación , Psicoacústica , Percepción Espacial , Vibración , Adulto Joven
9.
J Neurol ; 269(12): 6678-6684, 2022 Dec.
Article in English | MEDLINE | ID: mdl-35907045

ABSTRACT

Clinical neurophysiology generates a wealth of dynamic information pertaining to the integrity and function of both the central and peripheral nervous systems. As in many technological fields, there has been an explosion of data in neurophysiology over recent years, and this requires considerable analysis by experts. Computational algorithms, and especially advances in machine learning (ML), can assist with this task and potentially reveal hidden insights. In this update article, we provide a brief overview of where such technology is being applied in clinical neurophysiology and of possible future directions.


Asunto(s)
Aprendizaje Automático , Neurofisiología , Humanos , Algoritmos
10.
Sci Rep ; 10(1): 7169, 2020 Apr 28.
Article in English | MEDLINE | ID: mdl-32346036

ABSTRACT

Blindness leads to substantial enhancements in many auditory abilities, and deficits in others. It is unknown how severe visual losses need to be before changes in auditory abilities occur, or whether the relationship between severity of visual loss and changes in auditory abilities is proportional and systematic. Here we show that greater severity of visual loss is associated with increased auditory judgments of distance and room size. On average, participants with severe visual losses perceived sounds to be twice as far away, and rooms to be three times larger, than sighted controls did. Distance estimates for sighted controls were most accurate for closer sounds and least accurate for farther sounds. As the severity of visual impairment increased, accuracy decreased for closer sounds and increased for farther sounds. However, it is for closer sounds that accurate judgments are needed to guide rapid motor responses to auditory events, for example when planning a safe path through a busy street to avoid collisions with other people and to prevent falls. Interestingly, greater severity of visual impairment was associated with more accurate room-size estimates. The results support a new hypothesis that crossmodal calibration of audition by vision depends on the severity of visual loss.


Subject(s)
Sound Localization, Space Perception, Low Vision/physiopathology, Adolescent, Adult, Female, Humans, Judgment, Male
11.
Sci Rep ; 10(1): 6279, 2020 Apr 14.
Article in English | MEDLINE | ID: mdl-32286362

ABSTRACT

Although vision is important for calibrating auditory spatial perception, it only provides information about frontal sound sources. Previous studies of blind and sighted people support the idea that azimuthal spatial bisection in frontal space requires visual calibration, while detection of a change in azimuth (minimum audible angle, MAA) does not. The influence of vision on the ability to map frontal, lateral and back space has not been investigated. Performance in spatial bisection and MAA tasks was assessed for normally sighted blindfolded subjects using bursts of white noise presented frontally, laterally, or from the back relative to the subjects. Thresholds for both tasks were similar in frontal space, lower for the MAA task than for the bisection task in back space, and higher for the MAA task in lateral space. Two interpretations of the results are discussed, one in terms of visual calibration and the use of internal representations of source location and the other based on comparison of the magnitude or direction of change of the available binaural cues. That bisection thresholds were increased in back space relative to front space, where visual calibration information is unavailable, suggests that an internal representation of source location was used for the bisection task.


Subject(s)
Sound Localization, Space Perception, Visual Perception, Adult, Female, Healthy Volunteers, Humans, Male, Middle Aged
12.
Heliyon ; 4(2): e00526, 2018 Feb.
Article in English | MEDLINE | ID: mdl-29560446

ABSTRACT

This paper proposes the use of a non-immersive virtual reality rehabilitation system, the "ReHabgame", developed using the Microsoft Kinect™ and the Thalmic™ Labs Myo gesture control armband. The ReHabgame was developed from two third-person video games that offer a feasible means of assessing postural control and functional reach. It accurately quantifies specific postural control mechanisms, including timed standing balance and functional reach tests, using real-time anatomical landmark orientation, joint velocity, and acceleration, while end trajectories were calculated using an inverse kinematics algorithm. The game was designed to help patients with neurological impairment undertake physiotherapy activities and practice postures of daily activities. The subjective experience of the ReHabgame was studied through the development of an Engagement Questionnaire (EQ) for qualitative, quantitative, and Rasch model analyses. Monte Carlo Tree Search (MCTS) and Random Object Generator (ROG) algorithms were used to adapt the physical and gameplay intensity of the ReHabgame based on the Motor Assessment Scale (MAS) and a Hierarchical Scoring System (HSS). Rasch analysis was conducted to assess the psychometric characteristics of the ReHabgame and to identify whether there were any misfitting items in the game. A Rasch rating scale model (RSM) was used to assess the engagement of players in the ReHabgame and to evaluate the effectiveness and attractiveness of the game. The results showed that the scales assessing the rehabilitation process met Rasch expectations of reliability and unidimensionality. Infit and outfit mean square values were in the range 0.68-1.52 for all 16 items considered. The root mean square residual (RMSR) and the person separation reliability were acceptable. The item/person map showed that persons and items were clustered symmetrically.

13.
Front Psychol ; 8: 561, 2017.
Article in English | MEDLINE | ID: mdl-28446890

ABSTRACT

We assessed how visually impaired (VI) people perceived their own auditory abilities using an established hearing questionnaire, the Speech, Spatial, and Qualities of Hearing Scale (SSQ), that was adapted to make it relevant and applicable to VI individuals by removing references to visual aspects while retaining the meaning of the original questions. The resulting questionnaire, the SSQvi, assessed perceived hearing ability in diverse situations including the ability to follow conversations with multiple speakers, assessing how far away a vehicle is, and the ability to perceptually segregate simultaneous sounds. The SSQvi was administered to 33 VI and 33 normally sighted participants. All participants had normal hearing or mild hearing loss, and all VI participants had some residual visual ability. VI participants gave significantly higher (better) scores than sighted participants for: (i) one speech question, indicating less difficulty in following a conversation that switches from one person to another, (ii) one spatial question, indicating less difficulty in localizing several talkers, (iii) three qualities questions, indicating less difficulty with segregating speech from music, hearing music more clearly, and better speech intelligibility in a car. These findings are consistent with the perceptual enhancement hypothesis, that certain auditory abilities are improved to help compensate for loss of vision, and show that full visual loss is not necessary for perceived changes in auditory ability to occur for a range of auditory situations. For all other questions, scores were not significantly different between the two groups. Questions related to effort, concentration, and ignoring distracting sounds were rated as most difficult for VI participants, as were situations involving divided-attention contexts with multiple streams of speech, following conversations in noise and in echoic environments, judging elevation or distance, and externalizing sounds. The questionnaire has potential clinical applications in assessing the success of clinical interventions and setting more realistic goals for intervention for those with auditory and/or visual losses. The results contribute toward providing benchmark scores for VI individuals.

14.
Atten Percept Psychophys ; 78(2): 373-95, 2016 Feb.
Article in English | MEDLINE | ID: mdl-26590050

ABSTRACT

Auditory distance perception plays a major role in spatial awareness, enabling location of objects and avoidance of obstacles in the environment. However, it remains under-researched relative to studies of the directional aspect of sound localization. This review focuses on the following four aspects of auditory distance perception: cue processing, development, consequences of visual and auditory loss, and neurological bases. The several auditory distance cues vary in their effective ranges in peripersonal and extrapersonal space. The primary cues are sound level, reverberation, and frequency. Nonperceptual factors, including the importance of the auditory event to the listener, also can affect perceived distance. Basic internal representations of auditory distance emerge at approximately 6 months of age in humans. Although visual information plays an important role in calibrating auditory space, sensorimotor contingencies can be used for calibration when vision is unavailable. Blind individuals often manifest supranormal abilities to judge relative distance but show a deficit in absolute distance judgments. Following hearing loss, the use of auditory level as a distance cue remains robust, while the reverberation cue becomes less effective. Previous studies have not found evidence that hearing-aid processing affects perceived auditory distance. Studies investigating the brain areas involved in processing different acoustic distance cues are described. Finally, suggestions are given for further research on auditory distance perception, including broader investigation of how background noise and multiple sound sources affect perceived auditory distance for those with sensory loss.


Subject(s)
Central Auditory Diseases/physiopathology, Auditory Pathways/physiology, Auditory Perception/physiology, Blindness/physiopathology, Cues, Distance Perception/physiology, Hearing Loss/physiopathology, Acoustic Stimulation, Hearing Aids, Humans
15.
Hear Res ; 310: 60-8, 2014 Apr.
Article in English | MEDLINE | ID: mdl-24524865

ABSTRACT

There is currently considerable interest in the consequences of loss in one sensory modality on the remaining senses. Much of this work has focused on the development of enhanced auditory abilities among blind individuals, who are often able to use sound to navigate through space. It has now been established that many blind individuals produce sound emissions and use the returning echoes to provide them with information about objects in their surroundings, in a similar manner to bats navigating in the dark. In this review, we summarize current knowledge regarding human echolocation. Some blind individuals develop remarkable echolocation abilities, and are able to assess the position, size, distance, shape, and material of objects using reflected sound waves. After training, normally sighted people are also able to use echolocation to perceive objects, and can develop abilities comparable to, but typically somewhat poorer than, those of blind people. The underlying cues and mechanisms, operable range, spatial acuity and neurological underpinnings of echolocation are described. Echolocation can result in functional real life benefits. It is possible that these benefits can be optimized via suitable training, especially among those with recently acquired blindness, but this requires further study. Areas for further research are identified.
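The relationship between echo delay and object distance that underlies echolocation is a standard acoustic calculation, sketched below as a generic worked example; the speed of sound and the chosen delays are illustrative values, not data from the review.

```python
# Worked example (standard acoustics): converting an echo delay into object
# distance, distance = speed_of_sound * delay / 2 (round trip).
speed_of_sound_m_s = 343.0            # approximate value for air at ~20 degrees C

def echo_distance_m(delay_s: float) -> float:
    """Distance to a reflecting object from the round-trip echo delay."""
    return speed_of_sound_m_s * delay_s / 2.0

for delay_ms in (5, 10, 20, 40):
    print(f"{delay_ms:3d} ms echo delay -> {echo_distance_m(delay_ms / 1000.0):.2f} m")
```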


Subject(s)
Blindness/physiopathology, Echolocation/physiology, Physiological Adaptation, Animals, Blindness/psychology, Cerebral Cortex/blood supply, Cerebral Cortex/physiopathology, Chiroptera/physiology, Functional Neuroimaging, Humans, Magnetic Resonance Imaging, Oxygen/blood, Psychoacoustics, Spatial Behavior/physiology
16.
Perception ; 42(9): 985-90, 2013.
Article in English | MEDLINE | ID: mdl-24386717

ABSTRACT

Blind participants greatly rely on sound for spatial information regarding the surrounding environment. It is not yet established whether lack of vision to calibrate audition in far space affects blind participants' internal spatial representation of acoustic room size. Furthermore, blind participants may rely more on farthest distance estimates to sound sources compared with sighted participants when perceiving room size. Here we show that judgments of apparent room size and sound distance are correlated, more so for blind than for sighted participants. Sighted participants judged a reverberant virtual room to be larger for speech than for music or noise stimuli, whereas blind participants did not. The results suggest that blindness affects the use of room reverberation for distance and room-size judgments.


Subject(s)
Acoustic Stimulation/methods, Acoustics, Auditory Perception/physiology, Blindness/psychology, Space Perception/physiology, Adult, Aged, Female, Humans, Judgment/physiology, Male, Middle Aged, Noise, Sound Localization/physiology, Time Factors, Young Adult