Results 1 - 20 of 39
1.
Cereb Cortex ; 34(6)2024 Jun 04.
Article in English | MEDLINE | ID: mdl-38897817

ABSTRACT

Recent work suggests that the adult human brain is very adaptable when it comes to sensory processing. In this context, it has also been suggested that structural "blueprints" may fundamentally constrain neuroplastic change, e.g. in response to sensory deprivation. Here, we trained 12 blind participants and 14 sighted participants in echolocation over a 10-week period, and used MRI in a pre-post design to measure functional and structural brain changes. We found that blind participants and sighted participants together showed a training-induced increase in activation in left and right V1 in response to echoes, a finding difficult to reconcile with the view that sensory cortex is strictly organized by modality. Further, blind participants and sighted participants showed a training-induced increase in activation in right A1 in response to sounds per se (i.e. not echo-specific), and this was accompanied by an increase in gray matter density in right A1 in blind participants and in adjacent acoustic areas in sighted participants. The similarity in functional results between sighted participants and blind participants is consistent with the idea that reorganization may be governed by similar principles in the two groups, yet our structural analyses also showed differences between the groups, suggesting that a more nuanced view may be required.


Subject(s)
Auditory Cortex, Blindness, Magnetic Resonance Imaging, Visual Cortex, Humans, Blindness/physiopathology, Blindness/diagnostic imaging, Male, Adult, Female, Auditory Cortex/diagnostic imaging, Auditory Cortex/physiology, Auditory Cortex/physiopathology, Visual Cortex/diagnostic imaging, Visual Cortex/physiology, Young Adult, Neuronal Plasticity/physiology, Acoustic Stimulation, Brain Mapping, Middle Aged, Auditory Perception/physiology, Echolocation/physiology
2.
Front Rehabil Sci ; 4: 1098624, 2023.
Article in English | MEDLINE | ID: mdl-37284336

ABSTRACT

Click-based echolocation can support mobility and orientation in people with vision impairments (VI) when used alongside other mobility methods. Only a small number of people with VI use click-based echolocation. Previous research about echolocation addresses the skill of echolocation per se to understand how echolocation works, and its brain basis. Our report is the first to address the question of professional practice for people with VI, i.e., a very different focus. VI professionals are well placed to affect how a person with VI might learn about, experience or use click-based echolocation. Thus, we here investigated if training in click-based echolocation for VI professionals might lead to a change in their professional practice. The training was delivered via 6-h workshops throughout the UK. It was free to attend, and people signed up via a publicly available website. We received follow-up feedback in the form of yes/no answers and free text comments. Yes/no answers showed that 98% of participants had changed their professional practice as a consequence of the training. Free text responses were analysed using content analysis, and we found that 32%, 11.7% and 46.6% of responses indicated a change in information processing, verbal influencing or instruction and practice, respectively. This attests to the potential of VI professionals to act as multipliers of training in click-based echolocation with the potential to improve the lives of people with VI. The training we evaluated here could feasibly be integrated into VI Rehabilitation or VI Habilitation training as implemented at higher education institutions (HEIs) or continuing professional development (CPD).

3.
J Exp Psychol Hum Percept Perform ; 49(5): 600-622, 2023 May.
Article in English | MEDLINE | ID: mdl-37261769

ABSTRACT

It is clear that people can learn a new sensory skill-a new way of mapping sensory inputs onto world states. It remains unclear how flexibly a new sensory skill can become embedded in multisensory perception and decision-making. To address this, we trained typically sighted participants (N = 14) to use a new echo-like auditory cue to distance in a virtual world, together with a noisy visual cue. Using model-based analyses, we tested for key markers of efficient multisensory perception and decision-making with the new skill. We found that 12 of 14 participants learned to judge distance using the novel auditory cue. Their use of this new sensory skill showed three key features: (a) It enhanced the speed of timed decisions; (b) it largely resisted interference from a simultaneous digit span task; and (c) it integrated with vision in a Bayes-like manner to improve precision. We also show some limits following this relatively short training: Precision benefits were lower than the Bayes-optimal prediction, and there was no forced fusion of signals. We conclude that people already embed new sensory skills in flexible multisensory perception and decision-making after a short training period. A key application of these insights is to the development of sensory augmentation systems that can enhance human perceptual abilities in novel ways. The limitations we reveal (sub-optimality, lack of fusion) provide a foundation for further investigations of the limits of these abilities and their brain basis. (PsycInfo Database Record (c) 2023 APA, all rights reserved).


Subject(s)
Learning, Visual Perception, Humans, Bayes Theorem, Auditory Perception, Photic Stimulation
4.
J Neurosci ; 43(24): 4470-4486, 2023 06 14.
Article in English | MEDLINE | ID: mdl-37127360

ABSTRACT

In the investigation of the brain areas involved in human spatial navigation, the traditional focus has been on visually guided navigation in sighted people. Consequently, it is unclear whether the involved areas also support navigational abilities in other modalities. We explored this possibility by testing whether the occipital place area (OPA), a region associated with visual boundary-based navigation in sighted people, has a similar role in echo-acoustically guided navigation in blind human echolocators. We used fMRI to measure brain activity in 6 blind echolocation experts (EEs; five males, one female), 12 blind controls (BCs; six males, six females), and 14 sighted controls (SCs; eight males, six females) as they listened to prerecorded echolocation sounds that conveyed either a route taken through one of three maze environments, a scrambled (i.e., spatiotemporally incoherent) control sound, or a no-echo control sound. We found significantly greater activity in the OPA of EEs, but not the control groups, when they listened to the coherent route sounds relative to the scrambled sounds. This provides evidence that the OPA of the human navigation brain network is not strictly tied to the visual modality but can be recruited for nonvisual navigation. We also found that EEs, but not BCs or SCs, recruited early visual cortex for processing of echo acoustic information. This is consistent with the recent notion that the human brain is organized flexibly by task rather than by specific modalities. SIGNIFICANCE STATEMENT: There has been much research on the brain areas involved in visually guided navigation, but we do not know whether the same or different brain regions are involved when blind people use a sense other than vision to navigate.
In this study, we show that one part of the brain (occipital place area) known to play a specific role in visually guided navigation is also active in blind human echolocators when they use reflected sound to navigate their environment. This finding opens up new ways of understanding how people navigate, and informs our ability to provide rehabilitative support to people with vision loss.


Subject(s)
Blindness, Echolocation, Male, Animals, Humans, Female, Vision, Ocular, Auditory Perception, Occipital Lobe, Magnetic Resonance Imaging
5.
Psychol Sci ; 33(7): 1143-1153, 2022 07.
Article in English | MEDLINE | ID: mdl-35699555

ABSTRACT

Here, we report novel empirical results from a psychophysical experiment in which we tested the echolocation abilities of nine blind adult human experts in click-based echolocation. We found that they had better acuity in localizing a target and used lower intensity emissions (i.e., mouth clicks) when a target was placed 45° off to the side compared with when it was placed at 0° (straight ahead). We provide a possible explanation of the behavioral result in terms of binaural-intensity signals, which appear to change more rapidly around 45°. The finding that echolocators have better echo-localization off axis is surprising, because for human source localization (i.e., regular spatial hearing), it is well known that performance is best when targets are straight ahead (0°) and decreases as targets move farther to the side. This may suggest that human echolocation and source hearing rely on different acoustic cues and that human spatial hearing has more facets than previously thought.


Subject(s)
Echolocation, Sound Localization, Adult, Animals, Cues, Hearing, Humans, Mouth
6.
Exp Brain Res ; 239(12): 3625-3633, 2021 Dec.
Article in English | MEDLINE | ID: mdl-34609546

ABSTRACT

What factors are important in the calibration of mental representations of auditory space? A substantial body of research investigating the audiospatial abilities of people who are blind has shown that visual experience might be an important factor for accurate performance in some audiospatial tasks. Yet, it has also been shown that long-term experience using click-based echolocation might play a similar role, with blind expert echolocators demonstrating auditory localization abilities that are superior to those of people who are blind and who do not use click-based echolocation (Vercillo et al., Neuropsychologia 67: 35-40, 2015). Based on this hypothesis, we might predict that training in click-based echolocation may lead to improvement in performance in auditory localization tasks in people who are blind. Here we investigated this hypothesis in a sample of 12 adult people who have been blind from birth. We did not find evidence for an improvement in performance in auditory localization after 10 weeks of training, despite significant improvement in echolocation ability. It is possible that longer-term experience with click-based echolocation is required for effects to develop, or that other factors can explain the association between echolocation expertise and superior auditory localization. Considering the practical relevance of click-based echolocation for people who are visually impaired, future research should address these questions.


Subject(s)
Echolocation, Sound Localization, Adult, Animals, Blindness, Humans
7.
PLoS One ; 16(6): e0252330, 2021.
Article in English | MEDLINE | ID: mdl-34077457

ABSTRACT

Understanding the factors that determine if a person can successfully learn a novel sensory skill is essential for understanding how the brain adapts to change, and for providing rehabilitative support for people with sensory loss. We report a training study investigating the effects of blindness and age on the learning of a complex auditory skill: click-based echolocation. Blind and sighted participants of various ages (21-79 yrs; median blind: 45 yrs; median sighted: 26 yrs) trained in 20 sessions over the course of 10 weeks in various practical and virtual navigation tasks. Blind participants also took part in a 3-month follow-up survey assessing the effects of the training on their daily life. We found that both sighted and blind people improved considerably on all measures, and in some cases performed comparably to expert echolocators at the end of training. Somewhat surprisingly, sighted people performed better than those who were blind in some cases, although our analyses suggest that this might be better explained by the younger age (or superior binaural hearing) of the sighted group. Importantly, however, neither age nor blindness was a limiting factor in participants' rate of learning (i.e. their difference in performance from the first to the final session) or in their ability to apply their echolocation skills to novel, untrained tasks. Furthermore, in the follow-up survey, all participants who were blind reported improved mobility, and 83% reported better independence and wellbeing. Overall, our results suggest that the ability to learn click-based echolocation is not strongly limited by age or level of vision. This has positive implications for the rehabilitation of people with vision loss or in the early stages of progressive vision loss.


Subject(s)
Acoustic Stimulation, Adaptation, Physiological, Blindness/physiopathology, Learning, Sound Localization/physiology, Visually Impaired Persons/psychology, Adult, Age Factors, Aged, Animals, Biomechanical Phenomena, Blindness/psychology, Female, Humans, Male, Middle Aged, Time Factors, Young Adult
8.
J Exp Psychol Hum Percept Perform ; 47(2): 269-281, 2021 Feb.
Article in English | MEDLINE | ID: mdl-33271045

ABSTRACT

Making sense of the world requires perceptual constancy-the stable perception of an object across changes in one's sensation of it. To investigate whether constancy is intrinsic to perception, we tested whether humans can learn a form of constancy that is unique to a novel sensory skill (here, the perception of objects through click-based echolocation). Participants judged whether two echoes were different either because: (a) the clicks were different, or (b) the objects were different. For differences carried through spectral changes (but not level changes), blind expert echolocators spontaneously showed a high constancy ability (mean d' = 1.91) compared to sighted and blind people new to echolocation (mean d' = 0.69). Crucially, sighted controls improved rapidly in this ability through training, suggesting that constancy emerges in a domain with which the perceiver has no prior experience. This provides strong evidence that constancy is intrinsic to human perception. (PsycInfo Database Record (c) 2021 APA, all rights reserved).
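The d' values reported above are signal-detection sensitivity indices, computed from hit and false-alarm rates. A minimal sketch of that computation (the rates below are invented for illustration, not the study's data):

```python
from statistics import NormalDist

def d_prime(hit_rate: float, fa_rate: float) -> float:
    """Sensitivity index: z(hit rate) - z(false-alarm rate)."""
    z = NormalDist().inv_cdf
    return z(hit_rate) - z(fa_rate)

# Illustrative rates only: greater separation between hits and
# false alarms yields a larger d'.
expert_like = d_prime(0.84, 0.16)   # roughly 2.0
novice_like = d_prime(0.63, 0.37)   # roughly 0.66
```

A d' near 0 means responses barely track the true difference; values near 2 indicate reliable discrimination, which is why the expert-novice gap above (1.91 vs. 0.69) is striking.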


Subject(s)
Echolocation, Sound Localization, Animals, Blindness, Humans, Perception, Sensation
9.
J Exp Psychol Gen ; 149(12): 2314-2331, 2020 Dec.
Article in English | MEDLINE | ID: mdl-32324025

ABSTRACT

The human brain may use recent sensory experience to create sensory templates that are then compared to incoming sensory input, that is, "knowing what to listen for." This can lead to greater perceptual sensitivity, as long as the relevant properties of the target stimulus can be reliably estimated from past sensory experiences. Echolocation is an auditory skill probably best understood in bats, but humans can also echolocate. Here we investigated for the first time whether echolocation in humans involves the use of sensory templates derived from recent sensory experiences. Our results showed that when there was certainty in the acoustic properties of the echo relative to the emission, either in temporal onset, spectral content or level, people detected the echo more accurately than when there was uncertainty. In addition, we found that people were more accurate when the emission's spectral content was certain but, surprisingly, not when either its level or temporal onset was certain. Importantly, the lack of an effect of temporal onset of the emission is counter to that found previously for tasks using nonecholocation sounds, suggesting that the underlying mechanisms might be different for echolocation and nonecholocation sounds. Importantly, the effects of stimulus certainty were no different for people with and without experience in echolocation, suggesting that stimulus-specific sensory templates can be used in a skill that people have never used before. From an applied perspective our results suggest that echolocation instruction should encourage users to make clicks that are similar to one another in their spectral content. (PsycInfo Database Record (c) 2020 APA, all rights reserved).
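The "sensory template" account (comparing incoming sound against an expected signal) is closely related to matched filtering in signal processing. A toy sketch of that analogy, not the study's analysis; the click template, noise level, and embedding position are all invented for illustration:

```python
import math, random

def correlate_at_lags(signal, template):
    """Slide the template over the signal; return the correlation at each lag."""
    n, m = len(signal), len(template)
    return [sum(signal[lag + i] * template[i] for i in range(m))
            for lag in range(n - m + 1)]

random.seed(0)
# Hypothetical "template": 50 samples of a 3 kHz tone at 44.1 kHz.
template = [math.sin(2 * math.pi * 3000 * t / 44100) for t in range(50)]

# Bury the expected echo in noise at sample 120.
signal = [random.gauss(0, 0.2) for _ in range(300)]
for i, s in enumerate(template):
    signal[120 + i] += s

scores = correlate_at_lags(signal, template)
best_lag = max(range(len(scores)), key=scores.__getitem__)
```

When the listener "knows what to listen for", the correlation peaks sharply at the echo's true position; with an uncertain template, the peak weakens, mirroring the detection advantage for certain stimuli described above.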


Subject(s)
Acoustic Stimulation/methods, Auditory Perception/physiology, Echolocation/physiology, Uncertainty, Adult, Aged, Animals, Female, Humans, Male, Middle Aged, Young Adult
10.
J Exp Psychol Hum Percept Perform ; 46(1): 21-35, 2020 Jan.
Article in English | MEDLINE | ID: mdl-31556685

ABSTRACT

People use sensory, in particular visual, information to guide actions such as walking around obstacles, grasping or reaching. However, it is presently unclear how malleable the sensorimotor system is. The present study investigated this by measuring how click-based echolocation may be used to avoid obstacles while walking. We tested 7 blind echolocation experts, 14 sighted, and 10 blind echolocation beginners. For comparison, we also tested 10 sighted participants, who used vision. To maximize the relevance of our research for people with vision impairments, we also included a condition where the long cane was used and considered obstacles at different elevations. Motion capture and sound data were acquired simultaneously. We found that echolocation experts walked just as fast as sighted participants using vision, and faster than either sighted or blind echolocation beginners. Walking paths of echolocation experts indicated early and smooth adjustments, similar to those shown by sighted people using vision and different from later and more abrupt adjustments of beginners. Further, for all participants, the use of echolocation significantly decreased collision frequency with obstacles at head, but not ground level. Further analyses showed that participants who made clicks with higher spectral frequency content walked faster, and that for experts higher clicking rates were associated with faster walking. The results highlight that people can use novel sensory information (here, echolocation) to guide actions, demonstrating the action system's ability to adapt to changes in sensory input. They also highlight that regular use of echolocation enhances sensory-motor coordination for walking in blind people. (PsycINFO Database Record (c) 2019 APA, all rights reserved).


Subject(s)
Blindness/physiopathology, Blindness/psychology, Sound Localization, Vision Disorders/psychology, Walking, Adolescent, Adult, Biomechanical Phenomena, Canes, Female, Humans, Male, Vision Disorders/physiopathology, Walking/physiology, Walking/psychology, Young Adult
11.
Proc Biol Sci ; 286(1912): 20191910, 2019 10 09.
Article in English | MEDLINE | ID: mdl-31575359

ABSTRACT

The functional specializations of cortical sensory areas were traditionally viewed as being tied to specific modalities. A radically different emerging view is that the brain is organized by task rather than sensory modality, but it has not yet been shown that this applies to primary sensory cortices. Here, we report such evidence by showing that primary 'visual' cortex can be adapted to map spatial locations of sound in blind humans who regularly perceive space through sound echoes. Specifically, we objectively quantify the similarity between measured stimulus maps for sound eccentricity and predicted stimulus maps for visual eccentricity in primary 'visual' cortex (using a probabilistic atlas based on cortical anatomy) to find that stimulus maps for sound in expert echolocators are directly comparable to those for vision in sighted people. Furthermore, the degree of this similarity is positively related with echolocation ability. We also rule out explanations based on top-down modulation of brain activity-e.g. through imagery. This result is clear evidence that task-specific organization can extend even to primary sensory cortices, and in this way is pivotal in our reinterpretation of the functional organization of the human brain.


Subject(s)
Blindness, Brain Mapping, Sound Localization, Animals, Echolocation, Humans, Parietal Lobe, Sound, Vision, Ocular, Visual Cortex, Visually Impaired Persons
12.
Cognition ; 193: 104014, 2019 12.
Article in English | MEDLINE | ID: mdl-31302529

ABSTRACT

Cue combination occurs when two independent noisy perceptual estimates are merged together as a weighted average, creating a unified estimate that is more precise than either single estimate alone. Surprisingly, this effect has not been demonstrated compellingly in children under the age of 10 years, in contrast with the array of other multisensory skills that children show even in infancy. Instead, across a wide variety of studies, precision with both cues is no better than the best single cue - and sometimes worse. Here we provide the first consistent evidence of cue combination in children from 7 to 10 years old. Across three experiments, participants showed evidence of a bimodal precision advantage (Experiments 1a and 1b) and the majority were best-fit by a combining model (Experiment 2). The task was to localize a target horizontally with a binaural audio cue and a noisy visual cue in immersive virtual reality. Feedback was given as well, which could both (a) help participants judge how reliable each cue is and (b) help correct between-cue biases that might prevent cue combination. Crucially, our results show cue combination when feedback is only given on single cues - therefore, combination itself was not a strategy learned via feedback. We suggest that children at 7-10 years old are capable of cue combination in principle, but must have sufficient representations of reliabilities and biases in their own perceptual estimates as relevant to the task, which can be facilitated through task-specific feedback.
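The weighted-average rule described above can be written down directly. A minimal sketch of reliability-weighted (inverse-variance) cue combination, with hypothetical numbers:

```python
def combine_cues(mu_a, sigma_a, mu_b, sigma_b):
    """Reliability-weighted average of two noisy estimates.

    Each cue is weighted by its reliability (inverse variance); the
    combined estimate is more precise than either single cue alone.
    """
    r_a, r_b = 1 / sigma_a**2, 1 / sigma_b**2
    mu = (r_a * mu_a + r_b * mu_b) / (r_a + r_b)
    sigma = (r_a + r_b) ** -0.5
    return mu, sigma

# Hypothetical numbers: audio says 10 cm with sd 2, vision says 12 cm with sd 2.
mu, sigma = combine_cues(10.0, 2.0, 12.0, 2.0)
# mu = 11.0; sigma = 2 / sqrt(2), about 1.41
```

The combined standard deviation of roughly 1.41 is below either single-cue value of 2.0; that reduction is the "bimodal precision advantage" the experiments test for.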


Subject(s)
Adaptation, Psychological/physiology, Auditory Perception/physiology, Cues, Feedback, Psychological/physiology, Visual Perception/physiology, Child, Female, Humans, Male, Virtual Reality
13.
J Assoc Res Otolaryngol ; 20(5): 499-510, 2019 10.
Article in English | MEDLINE | ID: mdl-31286299

ABSTRACT

Some people who are blind have trained themselves in echolocation using mouth clicks. Here, we provide the first report of psychophysical and clicking data during echolocation of distance from a group of 8 blind people with experience in mouth click-based echolocation (daily use for > 3 years). We found that experienced echolocators can detect changes in distance of 3 cm at a reference distance of 50 cm, and a change of 7 cm at a reference distance of 150 cm, regardless of object size (i.e. 28.5 cm vs. 80 cm diameter disk). Participants made mouth clicks that were more intense and they made more clicks for weaker reflectors (i.e. same object at farther distance, or smaller object at same distance), but number and intensity of clicks were adjusted independently from one another. The acuity we found is better than previous estimates based on samples of sighted participants without experience in echolocation or individual experienced participants (i.e. single blind echolocators tested) and highlights adaptation of the perceptual system in blind human echolocators. Further, the dynamic adaptive clicking behaviour we observed suggests that number and intensity of emissions serve separate functions to increase SNR. The data may serve as an inspiration for low-cost (i.e. non-array based) artificial 'cognitive' sonar and radar systems, i.e. signal design, adaptive pulse repetition rate and intensity. It will also be useful for instruction and guidance for new users of echolocation.
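The distance thresholds above translate into very small differences in echo arrival time. A sketch of the arithmetic, assuming a speed of sound of 343 m/s:

```python
SPEED_OF_SOUND = 343.0  # m/s, dry air at about 20 C

def echo_delay(distance_m: float) -> float:
    """Round-trip travel time of an echo from an object at the given distance."""
    return 2 * distance_m / SPEED_OF_SOUND

# The 3 cm threshold at the 50 cm reference distance corresponds to a
# round-trip delay difference of only about 0.17 ms.
base = echo_delay(0.50)             # about 2.92 ms
just_detectable = echo_delay(0.53)  # about 3.09 ms
delta_ms = (just_detectable - base) * 1000
```

Whether listeners use this delay cue, the accompanying level change, or both is exactly the kind of question the psychophysical and clicking data can help answer.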


Subject(s)
Blindness/psychology, Sound Localization/physiology, Adult, Animals, Auditory Threshold, Female, Humans, Male, Middle Aged, Psychophysics
14.
Neuropsychologia ; 128: 150-165, 2019 05.
Article in English | MEDLINE | ID: mdl-29753019

ABSTRACT

Patients with injury to early visual cortex or its inputs can display the Riddoch phenomenon: preserved awareness for moving but not stationary stimuli. We provide a detailed case report of a patient with the Riddoch phenomenon, MC. MC has extensive bilateral lesions to occipitotemporal cortex that include most early visual cortex and complete blindness in visual field perimetry testing with static targets. Nevertheless, she shows a remarkably robust preserved ability to perceive motion, enabling her to navigate through cluttered environments and perform actions like catching moving balls. Comparisons of MC's structural magnetic resonance imaging (MRI) data to a probabilistic atlas based on controls reveals that MC's lesions encompass the posterior, lateral, and ventral early visual cortex bilaterally (V1, V2, V3A/B, LO1/2, TO1/2, hV4 and VO1 in both hemispheres) as well as more extensive damage to right parietal (inferior parietal lobule) and left ventral occipitotemporal cortex (VO1, PHC1/2). She shows some sparing of anterior occipital cortex, which may account for her ability to see moving targets beyond ~15 degrees eccentricity during perimetry. Most strikingly, functional and structural MRI revealed robust and reliable spared functionality of the middle temporal motion complex (MT+) bilaterally. Moreover, consistent with her preserved ability to discriminate motion direction in psychophysical testing, MC also shows direction-selective adaptation in MT+. A variety of tests did not enable us to discern whether input to MT+ was driven by her spared anterior occipital cortex or subcortical inputs. Nevertheless, MC shows rich motion perception despite profoundly impaired static and form vision, combined with clear preservation of activation in MT+, thus supporting the role of MT+ in the Riddoch phenomenon.


Subject(s)
Blindness, Cortical/diagnostic imaging, Blindness, Cortical/psychology, Motion Perception, Visual Cortex/pathology, Brain Mapping, Cerebral Infarction/pathology, Cerebral Infarction/psychology, Contrast Sensitivity, Discrimination, Psychology, Female, Humans, Magnetic Resonance Imaging, Middle Aged, Neuroimaging, Psychophysics, Visual Perception
15.
Sci Rep ; 8(1): 16880, 2018 11 15.
Article in English | MEDLINE | ID: mdl-30442895

ABSTRACT

Humans are effective at dealing with noisy, probabilistic information in familiar settings. One hallmark of this is Bayesian Cue Combination: combining multiple noisy estimates to increase precision beyond the best single estimate, taking into account their reliabilities. Here we show that adults also combine a novel audio cue to distance, akin to human echolocation, with a visual cue. Following two hours of training, subjects were more precise given both cues together versus the best single cue. This persisted when we changed the novel cue's auditory frequency. Reliability changes also led to a re-weighting of cues without feedback, showing that they learned something more flexible than a rote decision rule for specific stimuli. The main findings replicated with a vibrotactile cue. These results show that the mature sensory apparatus can learn to flexibly integrate new sensory skills. The findings are unexpected considering previous empirical results and current models of multisensory learning.
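The re-weighting result implies that listeners track each cue's reliability: in the Bayesian scheme, the normalized inverse-variance weights shift as soon as one cue becomes noisier, without any change to the rule itself. A sketch with invented numbers:

```python
def cue_weights(sigma_a, sigma_b):
    """Normalized reliability (inverse-variance) weights for two cues."""
    r_a, r_b = 1 / sigma_a**2, 1 / sigma_b**2
    return r_a / (r_a + r_b), r_b / (r_a + r_b)

# Equally reliable cues are weighted equally...
w_audio, w_visual = cue_weights(2.0, 2.0)   # (0.5, 0.5)

# ...but if the visual cue becomes twice as noisy, weight shifts to
# audio: this is the re-weighting-without-feedback behavior described above.
w_audio, w_visual = cue_weights(2.0, 4.0)   # (0.8, 0.2)
```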


Subject(s)
Vision, Ocular/physiology, Visual Perception/physiology, Adult, Auditory Perception, Bayes Theorem, Bias, Cues, Female, Humans, Male, Photic Stimulation, Touch/physiology, Uncertainty, Vibration, Young Adult
16.
PLoS Comput Biol ; 13(8): e1005670, 2017 Aug.
Article in English | MEDLINE | ID: mdl-28859082

ABSTRACT

Echolocation is the ability to use sound-echoes to infer spatial information about the environment. Some blind people have developed extraordinary proficiency in echolocation using mouth-clicks. The first step of human biosonar is the transmission (mouth click) and subsequent reception of the resultant sound through the ear. Existing head-related transfer function (HRTF) databases provide descriptions of reception of the resultant sound. For the current report, we collected a large database of click emissions with three blind people expertly trained in echolocation, which allowed us to perform unprecedented analyses. Specifically, the current report provides the first ever description of the spatial distribution (i.e. beam pattern) of human expert echolocation transmissions, as well as spectro-temporal descriptions at a level of detail not available before. Our data show that transmission levels are fairly constant within a 60° cone emanating from the mouth, but levels drop gradually at further angles, more than for speech. In terms of spectro-temporal features, our data show that emissions are consistently very brief (~3 ms duration) with peak frequencies 2-4 kHz, but with energy also at 10 kHz. This differs from previous reports of durations 3-15 ms and peak frequencies 2-8 kHz, which were based on less detailed measurements. Based on our measurements we propose to model transmissions as sum of monotones modulated by a decaying exponential, with angular attenuation by a modified cardioid. We provide model parameters for each echolocator. These results are a step towards developing computational models of human biosonar. For example, in bats, spatial and spectro-temporal features of emissions have been used to derive and test model based hypotheses about behaviour. The data we present here suggest similar research opportunities within the context of human echolocation.
Relatedly, the data are a basis to develop synthetic models of human echolocation that could be virtual (i.e. simulated) or real (i.e. loudspeaker, microphones), and which will help understanding the link between physical principles and human behaviour.
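The proposed emission model, a sum of monotones under a decaying-exponential envelope with cardioid-like angular attenuation, can be sketched as follows. The duration and frequencies track the abstract's figures (~3 ms, peaks at 2-4 kHz, energy near 10 kHz), but the amplitudes, decay constant, and cardioid mix are illustrative, not the per-echolocator parameters from the paper:

```python
import math

def click_waveform(fs=44100, duration=0.003,
                   freqs=(2500.0, 3500.0, 10000.0),
                   amps=(1.0, 0.8, 0.3), tau=0.0007):
    """Sum of monotones (pure tones) modulated by a decaying exponential.

    freqs/duration follow the abstract; amps and tau are made-up values.
    """
    n = int(duration * fs)
    return [math.exp(-(i / fs) / tau) *
            sum(a * math.sin(2 * math.pi * f * i / fs)
                for f, a in zip(freqs, amps))
            for i in range(n)]

def angular_gain(theta_deg, mix=0.5):
    """Modified cardioid: blend of omnidirectional and cosine lobes, so
    level is near-constant close to the axis and falls off gradually."""
    return (1 - mix) + mix * math.cos(math.radians(theta_deg))

wave = click_waveform()       # 132 samples at 44.1 kHz, about 3 ms
on_axis = angular_gain(0.0)   # 1.0
off_axis = angular_gain(60.0) # 0.75
```

A synthetic click like this, scaled by the angular gain, is the kind of building block a virtual (simulated) model of human echolocation could start from.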


Subject(s)
Blindness/rehabilitation, Echolocation/physiology, Models, Biological, Sound Localization/physiology, Adult, Animals, Databases, Factual, Humans, Male, Middle Aged, Mouth/physiology, Signal Processing, Computer-Assisted, Sound Spectrography
17.
Wiley Interdiscip Rev Cogn Sci ; 7(6): 382-393, 2016 Nov.
Article in English | MEDLINE | ID: mdl-27538733

ABSTRACT

Bats and dolphins are known for their ability to use echolocation. They emit bursts of sounds and listen to the echoes that bounce back to detect the objects in their environment. What is not as well-known is that some blind people have learned to do the same thing, making mouth clicks, for example, and using the returning echoes from those clicks to sense obstacles and objects of interest in their surroundings. The current review explores some of the research that has examined human echolocation and the changes that have been observed in the brains of echolocation experts. We also discuss potential applications and assistive technology based on echolocation. Blind echolocation experts can sense small differences in the location of objects, differentiate between objects of various sizes and shapes, and even between objects made of different materials, just by listening to the reflected echoes from mouth clicks. It is clear that echolocation may enable some blind people to do things that are otherwise thought to be impossible without vision, potentially providing them with a high degree of independence in their daily lives and demonstrating that echolocation can serve as an effective mobility strategy in the blind. Neuroimaging has shown that the processing of echoes activates brain regions in blind echolocators that would normally support vision in the sighted brain, and that the patterns of these activations are modulated by the information carried by the echoes. This work is shedding new light on just how plastic the human brain is. WIREs Cogn Sci 2016, 7:382-393. doi: 10.1002/wcs.1408 For further resources related to this article, please visit the WIREs website.


Subject(s)
Blindness/physiopathology , Blindness/psychology , Brain/physiology , Echolocation/physiology , Sound Localization/physiology , Animals , Chiroptera , Humans , Neuroimaging , Neuronal Plasticity , Space Perception/physiology
18.
PLoS One ; 11(5): e0154868, 2016.
Article in English | MEDLINE | ID: mdl-27135407

ABSTRACT

Echolocation is the ability to use reflected sound to obtain information about the spatial environment. Echolocation is an active process that requires both the production of the emission and the sensory processing of the resultant sound. Appreciating the general usefulness of echo-acoustic cues for people, in particular those with vision impairments, various devices have been built that exploit the principle of echolocation to obtain and provide information about the environment. Common to all these devices is that they do not require the person to make a sound. Instead, the device produces the emission autonomously and feeds a resultant sound back to the user. Here we tested whether echolocation performance in a simple object-detection task was affected by the use of a head-mounted loudspeaker as compared to active clicking. We found that 27 sighted participants new to echolocation did generally better when they used a loudspeaker as compared to mouth clicks, and that two blind participants with experience in echolocation did equally well with mouth clicks and the speaker. Importantly, performance of sighted participants was not statistically different from performance of blind experts when they used the speaker. Acoustic click data collected from a subset of our participants showed that those whose mouth clicks were more similar to the speaker clicks, and thus had higher peak frequencies and greater sound intensity, did better. We conclude that our results are encouraging for the consideration and development of assistive devices that exploit the principle of echolocation.


Subject(s)
Echolocation/physiology , Acoustic Stimulation , Adult , Animals , Blindness , Female , Humans , Male , Psychoacoustics , Sound Localization/physiology , Visually Impaired Persons , Young Adult
19.
Neuropsychologia ; 80: 79-89, 2016 Jan 08.
Article in English | MEDLINE | ID: mdl-26586155

ABSTRACT

It is still an open question whether the auditory system, like the visual system, processes auditory motion independently of other aspects of spatial hearing, such as static location. Here, we report psychophysical data from a patient (female, 42 and 44 years old at the time of two testing sessions), who suffered a bilateral occipital infarction over 12 years earlier, and who has extensive bilateral damage in the occipital lobe, extending bilaterally into inferior posterior temporal cortex and into right parietal cortex. We measured the patient's spatial hearing ability to discriminate static location, detect motion, and perceive motion direction in both central (straight ahead) and right and left peripheral auditory space (50° to the left and right of straight ahead). Compared to control subjects, the patient was impaired in her perception of the direction of auditory motion in peripheral auditory space, and the deficit was more pronounced on the right side. However, there was no impairment in her perception of the direction of auditory motion in central space. Furthermore, detection of motion and discrimination of static location were normal in both central and peripheral space. The patient also performed normally in a wide battery of non-spatial audiological tests. Our data are consistent with previous neuropsychological and neuroimaging results that link posterior temporal cortex and parietal cortex with the processing of auditory motion. Most importantly, however, our data break new ground by suggesting a division of auditory motion processing in terms of speed and direction and in terms of central and peripheral space.


Subject(s)
Auditory Perceptual Disorders/physiopathology , Brain/pathology , Motion Perception/physiology , Space Perception/physiology , Acoustic Stimulation , Adult , Brain/blood supply , Brain Mapping , Female , Humans , Image Processing, Computer-Assisted , Magnetic Resonance Imaging , Oxygen/blood , Photic Stimulation , Psychophysics , Visual Fields
20.
Multisens Res ; 28(1-2): 195-226, 2015.
Article in English | MEDLINE | ID: mdl-26152058

ABSTRACT

Echolocation can be used by blind and sighted humans to navigate their environment. The current study investigated the neural activity underlying processing of path direction during walking. Brain activity was measured with fMRI in three blind echolocation experts, and three blind and three sighted novices. During scanning, participants listened to binaural recordings that had been made prior to scanning, while echolocation experts echolocated during walking along a corridor that could continue to the left, right, or straight ahead. Participants also listened to control sounds that contained ambient sounds and clicks, but no echoes. The task was to decide whether the corridor in the recording continued to the left, right, or straight ahead, or whether they were listening to a control sound. All participants successfully dissociated echo from no-echo sounds; however, echolocation experts were superior at direction detection. We found brain activations associated with processing of path direction (contrast: echo vs. no echo) in superior parietal lobule (SPL) and inferior frontal cortex (IFC) in each group. In sighted novices, additional activation occurred in the inferior parietal lobule (IPL) and middle and superior frontal areas. Within the framework of the dorso-dorsal and ventro-dorsal pathways proposed by Rizzolatti and Matelli (2003), our results suggest that blind participants may automatically assign directional meaning to the echoes, while sighted participants may apply more conscious, high-level spatial processes. The high similarity of SPL and IFC activations across all three groups, in combination with previous research, also suggests that all participants recruited a multimodal spatial processing system for action (here: locomotion).


Subject(s)
Auditory Perception/physiology , Blindness/physiopathology , Sound Localization/physiology , Space Perception/physiology , Walking , Acoustic Stimulation/methods , Adolescent , Adult , Blindness/diagnosis , Blindness/rehabilitation , Humans , Magnetic Resonance Imaging , Male , Young Adult