Results 1 - 20 of 27
1.
Multisens Res ; 37(1): 75-88, 2023 Dec 20.
Article in English | MEDLINE | ID: mdl-38118461

ABSTRACT

While navigating through the surroundings, we constantly rely on inertial vestibular signals for self-motion along with visual and acoustic spatial references from the environment. However, the interaction between inertial cues and environmental spatial references is not yet fully understood. Here we investigated whether vestibular self-motion sensitivity is influenced by sensory spatial references. Healthy participants were administered a Vestibular Self-Motion Detection Task in which they were asked to detect vestibular self-motion sensations induced by low-intensity Galvanic Vestibular Stimulation. Participants performed this detection task with or without an external visual or acoustic spatial reference placed directly in front of them. We computed d-prime (d′) as a measure of participants' vestibular sensitivity and the criterion as an index of their response bias. Results showed that the visual spatial reference increased sensitivity to detect vestibular self-motion. Conversely, the acoustic spatial reference did not influence self-motion sensitivity. Neither the visual nor the auditory spatial reference changed response bias. Environmental visual spatial references thus provide relevant information that enhances our ability to perceive inertial self-motion cues, suggesting a specific interaction between the visual and vestibular systems in self-motion perception.
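
For readers unfamiliar with these signal-detection measures, the sketch below shows how d′ and the criterion are conventionally computed from hit and false-alarm counts in a yes/no detection task; the function name and the trial counts are illustrative assumptions, not values taken from the study.

    # Conventional signal-detection computation of d' and criterion (illustrative only).
    from scipy.stats import norm

    def dprime_criterion(hits, misses, false_alarms, correct_rejections):
        """Sensitivity (d') and response bias (criterion) from yes/no trial counts."""
        # Log-linear correction avoids infinite z-scores when a rate is 0 or 1.
        hit_rate = (hits + 0.5) / (hits + misses + 1.0)
        fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
        z_hit, z_fa = norm.ppf(hit_rate), norm.ppf(fa_rate)
        d_prime = z_hit - z_fa               # sensitivity
        criterion = -0.5 * (z_hit + z_fa)    # response bias
        return d_prime, criterion

    # Invented counts: 40 stimulation trials and 40 catch trials.
    print(dprime_criterion(hits=30, misses=10, false_alarms=8, correct_rejections=32))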


Subject(s)
Motion Perception; Space Perception; Vestibule, Labyrinth; Humans; Motion Perception/physiology; Male; Vestibule, Labyrinth/physiology; Female; Adult; Young Adult; Space Perception/physiology; Cues; Visual Perception/physiology; Acoustic Stimulation; Auditory Perception/physiology
2.
Eur J Neurosci ; 58(9): 4034-4042, 2023 11.
Article in English | MEDLINE | ID: mdl-37688501

ABSTRACT

Determining the spatial relation between objects and our location in the surroundings is essential for survival. Vestibular inputs provide key information about the position and movement of our head in three-dimensional space, contributing to spatial navigation. Yet, their role in encoding the spatial location of environmental targets remains to be fully understood. We probed the accuracy and precision of healthy participants' representations of environmental space by measuring their ability to encode the spatial location of visual targets (Experiment 1). Participants were asked to detect a visual target light and then walk towards it. Vestibular signalling was artificially disrupted using stochastic galvanic vestibular stimulation (sGVS) applied selectively while participants encoded the target's location. sGVS impaired both the accuracy and the precision of locating the environmental visual targets. Importantly, this effect was specific to the visual modality: the localisation of acoustic targets was not influenced by vestibular alterations (Experiment 2). Our findings indicate that the vestibular system plays a role in localising visual targets in the surrounding environment, suggesting a crucial functional interaction between vestibular and visual signals for encoding the spatial relationship between our body position and surrounding objects.
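
As a reminder of how accuracy and precision are typically separated in this kind of walking task, here is a minimal sketch computing constant error (accuracy) and variable error (precision) from 2D walked endpoints; the target and endpoint coordinates are invented for illustration.

    # Constant vs. variable error of walked endpoints (hypothetical data).
    import numpy as np

    target = np.array([3.0, 0.0])                    # target location (m)
    endpoints = np.array([[3.4, 0.2], [3.1, -0.3],   # where the participant stopped on each trial
                          [3.6, 0.1], [3.3, -0.2]])

    errors = endpoints - target
    constant_error = np.linalg.norm(errors.mean(axis=0))   # systematic offset (accuracy)
    variable_error = np.linalg.norm(errors.std(axis=0))    # scatter around the mean endpoint (precision)
    print(f"constant error = {constant_error:.2f} m, variable error = {variable_error:.2f} m")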


Subject(s)
Space Perception; Vestibule, Labyrinth; Humans; Space Perception/physiology; Vestibule, Labyrinth/physiology; Sensation; Movement
3.
Cyberpsychol Behav Soc Netw ; 26(8): 648-656, 2023 Aug.
Article in English | MEDLINE | ID: mdl-37389855

ABSTRACT

The Cartesian coordinate system is a fundamental concept for mathematics and science and poses a teaching challenge at primary school level. Learning the Cartesian coordinate system has the potential to promote numerical cognition through number-space associations, as well as core geometric concepts, including isometric transformations, symmetry, and shape perception. Immersive virtual reality (VR) facilitates embodied forms of teaching and learning mathematics through whole-body sensorimotor interaction and offers benefits as a platform for learning the Cartesian coordinate system compared with "real world" classroom activities. Our goal was to validate the Cartesian-Garden, a serious game designed to provide an educationally robust yet engaging vehicle for teaching these concepts in primary-level mathematics in a multisensory VR environment. In the game, the child explores the Cartesian-Garden, that is, a field of flowers in which each flower corresponds to a pair of x and y coordinates. Specifically, we tested whether exploring spatially represented numbers improved spatial and numerical skills independently of the use of VR. Children (n = 49; aged 7-11 years) were divided into an experimental and an age-matched control group. The experimental group explored the Cartesian-Garden and picked flowers corresponding to target coordinates; the control group played a VR game unrelated to Cartesian coordinates. To quantify potential improvements, children were tested before and after training with perceptual tests investigating number-line and spatial thinking. The results point toward differential age-related improvements depending on the tested concept, especially for the number line. This study provides guidelines for the successful use of the Cartesian-Garden game, which is beneficial for specific age groups.


Asunto(s)
Aprendizaje , Realidad Virtual , Niño , Humanos , Cognición , Motivación , Matemática
4.
Atten Percept Psychophys ; 84(8): 2670-2683, 2022 Nov.
Article in English | MEDLINE | ID: mdl-36261764

ABSTRACT

Vestibular cues are crucial for sensing the linear and angular acceleration of our head in three-dimensional space. Previous literature showed that vestibular information combines early with other sensory modalities, such as proprioception and vision, to facilitate spatial navigation. Recent studies suggest that auditory cues may improve self-motion perception as well. The present study investigated the ability to estimate passive rotational displacements with and without virtual acoustic landmarks, to determine how vestibular and auditory information interact in processing self-motion. We performed two experiments. In both, healthy participants sat on a Rotational-Translational Chair. They experienced yaw rotations about the earth-vertical axis and performed a self-motion discrimination task. Their goal was to estimate the amplitude of both clockwise and counterclockwise rotations, with no visual information available, reporting whether they felt they had been rotated by more or less than 45°. Depending on the condition, vestibular-only or audio-vestibular information was available. Between the two experiments, we manipulated how the auditory cues were presented (passive vs. active production of sounds). We computed the point of subjective equality (PSE) as a measure of accuracy and the just-noticeable difference (JND) as a measure of precision for each condition and direction of rotation. Results in both experiments show a strong overestimation bias for the rotations, regardless of condition, direction, and mode of sound generation. Similar to previously reported heading biases, this bias in rotation estimation may facilitate the perception of substantial deviations from the most relevant directions in daily navigation activities.
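
A minimal sketch of how a PSE and JND are commonly extracted by fitting a cumulative Gaussian to "rotated more than 45°" responses is shown below; the data points, starting values, and proportions are invented, and this is not the authors' analysis code.

    # Psychometric-function fit for PSE and JND (illustrative, invented data).
    import numpy as np
    from scipy.optimize import curve_fit
    from scipy.stats import norm

    def psychometric(x, pse, sigma):
        # Probability of responding "rotated more than 45 degrees" at amplitude x.
        return norm.cdf(x, loc=pse, scale=sigma)

    rotation_deg = np.array([25, 32, 39, 45, 52, 59, 66], dtype=float)
    p_more = np.array([0.05, 0.15, 0.40, 0.65, 0.85, 0.95, 1.00])  # invented proportions

    (pse, sigma), _ = curve_fit(psychometric, rotation_deg, p_more, p0=[45.0, 10.0])
    jnd = sigma  # 50%-to-84% spread of the fitted curve
    # A PSE below 45 deg means smaller rotations already feel like 45 deg, i.e., overestimation.
    print(f"PSE = {pse:.1f} deg, JND = {jnd:.1f} deg")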


Subject(s)
Motion Perception; Vestibule, Labyrinth; Humans; Proprioception; Motion (Physics); Bias; Space Perception
5.
Vision (Basel) ; 6(3)2022 Aug 26.
Article in English | MEDLINE | ID: mdl-36136746

ABSTRACT

Perceptual biases can be interpreted as adverse consequences of optimal processes that otherwise improve system performance. This review examines inaccuracies in multisensory perception, concentrating on the perception of verticality and self-motion, where the vestibular modality plays a prominent role. Perception of verticality indicates how the system processes gravity and thus represents an indirect measurement of vestibular perception. Head tilts can lead to biases in perceived verticality, interpreted as the influence of a vestibular prior set at the most common orientation relative to gravity (i.e., upright), useful for improving precision when upright (e.g., for fall avoidance). Studies on the perception of verticality across development and in the presence of blindness show that acquisition of this prior is mediated by visual experience, unveiling the fundamental role of visuo-vestibular interconnections across development. Such multisensory interactions can be tested behaviorally with cross-modal aftereffect paradigms, which assess whether adaptation in one sensory modality induces biases in another, eventually revealing an interconnection between the tested sensory modalities. Such phenomena indicate the presence of multisensory neural mechanisms that constantly work to calibrate the self-motion-dedicated sensory modalities with each other as well as with the environment. Thus, biases in vestibular perception reveal how the brain optimally adapts to environmental demands, such as spatial navigation and steady changes in the surroundings.

6.
Front Psychol ; 13: 784188, 2022.
Article in English | MEDLINE | ID: mdl-35686077

ABSTRACT

Spatial memory relies on encoding, storing, and retrieving knowledge about the positions of objects in the surrounding environment. Blind people have to rely on sensory modalities other than vision to memorize items that are displaced in space; however, to date, very little is known about the influence of early visual deprivation on a person's ability to remember and process sound locations. To fill this gap, we tested sighted and congenitally blind adults and adolescents in an audio-spatial memory task inspired by the classic card game "Memory." In this task, subjects (blind, n = 12; sighted, n = 12) had to find pairs among sounds (i.e., animal calls) arranged on an audio-tactile device composed of loudspeakers covered by tactile sensors. To accomplish the task, participants had to remember the positions of the spatialized sounds and develop a proper mental spatial representation of their locations. The test comprised two experimental conditions of increasing difficulty, depending on the number of sounds to be remembered (8 vs. 24). Results showed that sighted participants outperformed blind participants in both conditions. The findings are discussed in light of the crucial role of visual experience in properly manipulating auditory spatial representations, particularly in relation to the ability to explore complex acoustic configurations.

8.
J Exp Psychol Hum Percept Perform ; 48(2): 174-189, 2022 Feb.
Article in English | MEDLINE | ID: mdl-35225632

ABSTRACT

When moving through space, we encode multiple sensory cues that guide our orientation through the environment. The integration of visual and self-motion cues is known to improve navigation. However, spatial navigation may also benefit from multisensory external signals. The present study aimed to investigate whether humans combine auditory and visual landmarks to improve their navigation abilities. Two experiments with different cue reliability were conducted. In both, participants' task was to return an object to its original location using landmarks, which could be visual-only, auditory-only, or audiovisual. We took the error and variability of the object relocation distance as measures of accuracy and precision. To quantify interference between cues and assess their weights, we ran a conflict condition with a spatial discrepancy between visual and auditory landmarks. Results showed comparable accuracy and precision when navigating with visual-only and audiovisual landmarks, but greater error and variability with auditory-only landmarks. Splitting participants into two groups based on their unimodal weights revealed that only subjects who assigned similar weights to auditory and visual cues showed a precision benefit in the audiovisual condition. These findings suggest that multisensory integration occurs depending on idiosyncratic cue weighting. Future multisensory procedures to aid mobility must consider individual differences in encoding landmarks. (PsycInfo Database Record (c) 2022 APA, all rights reserved).
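
The weighting analysis alludes to the standard maximum-likelihood cue-combination account; the sketch below shows the textbook formulas for optimal weights and the predicted bimodal variance, using hypothetical landmark reliabilities rather than values reported in the paper.

    # Maximum-likelihood cue combination: optimal weights and predicted bimodal noise.
    def mle_combination(sigma_visual, sigma_auditory):
        """Optimal weights and predicted bimodal standard deviation for two independent cues."""
        w_v = sigma_auditory**2 / (sigma_visual**2 + sigma_auditory**2)   # weight on the more reliable cue grows
        w_a = 1.0 - w_v
        sigma_av_sq = (sigma_visual**2 * sigma_auditory**2) / (sigma_visual**2 + sigma_auditory**2)
        return w_v, w_a, sigma_av_sq**0.5

    # Hypothetical example: a reliable visual landmark (0.2 m) and a noisier auditory one (0.6 m).
    w_v, w_a, sigma_av = mle_combination(0.2, 0.6)
    print(f"visual weight = {w_v:.2f}, auditory weight = {w_a:.2f}, predicted bimodal sigma = {sigma_av:.2f} m")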


Subject(s)
Spatial Navigation; Auditory Perception; Cues; Humans; Reproducibility of Results; Visual Perception
9.
Neuropsychology ; 36(1): 55-63, 2022 Jan.
Article in English | MEDLINE | ID: mdl-34647755

ABSTRACT

OBJECTIVE: This study investigates how spatial working memory skills and the processing and retrieval of distal auditory spatial information are influenced by visual experience. METHOD: We developed an experimental paradigm using an acoustic simulation. The performance of congenitally blind and sighted participants (n = 9 per group) was compared when recalling sequences of spatialized auditory items in the same or reverse order of presentation. Two experimental conditions based on stimulus features were tested: non-semantic and semantic. RESULTS: Blind participants had a shorter memory span in the backward than in the forward order of presentation, whereas sighted participants showed no such difference, revealing that blindness affects spatial information processing when greater executive resources are involved. Furthermore, we found that blind subjects performed worse overall than the sighted group and that semantic information significantly improved performance, regardless of the experimental group and the order of presentation of the sequences. CONCLUSIONS: Lack of early visual experience affects the ability to encode the surrounding space. Congenital blindness influences the processing and retrieval of spatial auditory items, suggesting that visual experience plays a pivotal role in calibrating spatial memory abilities that use the remaining sensory modalities. (PsycInfo Database Record (c) 2022 APA, all rights reserved).


Asunto(s)
Memoria a Corto Plazo , Memoria Espacial , Estimulación Acústica , Ceguera , Humanos , Recuerdo Mental , Visión Ocular
10.
PLoS One ; 16(12): e0260700, 2021.
Article in English | MEDLINE | ID: mdl-34905544

ABSTRACT

Working memory is a cognitive system devoted to the storage and retrieval of information. Numerous studies on the development of working memory have investigated the processing of visuo-spatial and verbal non-spatialized information; however, little is known about the refinement of acoustic spatial and memory abilities across development. Here, we hypothesize that audio-spatial memory skills improve over development owing to the strengthening of spatial and cognitive skills such as semantic elaboration. We asked children aged 6 to 11 years (n = 55) to pair spatialized animal calls with the corresponding spoken animal name. Spatialized sounds were emitted from an audio-haptic device, which children explored haptically with the index finger of their dominant hand. Children younger than 8 anchored their exploration strategy on previously discovered sounds instead of holding this information in working memory, and they performed worse than older peers when asked to pair the spoken word with the corresponding animal call. In line with our hypothesis, these findings demonstrate that age-related improvements in spatial exploration and verbal-coding memorization strategies affect how children learn and memorize items belonging to a complex acoustic spatial layout. As in vision, audio-spatial memory abilities strongly depend on cognitive development in the early years of life.


Subject(s)
Cognition/physiology; Memory, Short-Term/physiology; Pattern Recognition, Physiological/physiology; Spatial Memory/physiology; Age Factors; Animals; Child; Dogs; Female; Haptic Interfaces; Humans; Male; Semantics; Vocalization, Animal/physiology
11.
Sci Rep ; 11(1): 17959, 2021 09 27.
Article in English | MEDLINE | ID: mdl-34580325

ABSTRACT

The acquisition of postural control is an elaborate process that relies on the balanced integration of multisensory inputs. Current models suggest that young children rely on an 'en bloc' control of their upper body before sequentially acquiring segmental control around the age of 7, and that they revert to the former strategy under challenging conditions. While recent work suggests that a virtual sensory environment alters visuomotor integration in healthy adults, little is known about the effects on younger individuals. Here we show that this default coordination pattern is disrupted by an immersive virtual reality framework in which a steering role is assigned to the trunk, causing 6- to 8-year-olds to employ an ill-adapted segmental strategy. These results reveal an alternative trajectory of motor development and emphasize the immaturity of postural control at these ages.


Subject(s)
Head/physiology; Postural Balance/physiology; Psychomotor Performance/physiology; Virtual Reality; Child; Female; Humans; Male; Torso/physiology
13.
Neuropsychologia ; 154: 107774, 2021 04 16.
Article in English | MEDLINE | ID: mdl-33600832

ABSTRACT

Sensory cues enable navigation through space, as they inform us about movement properties such as the distance travelled and the heading direction. In this study, we focused on the ability to spatially update one's position when only proprioceptive and vestibular information is available. We aimed to investigate the effect of yaw rotation on path integration across development in the absence of visual feedback. To this end, we used the triangle completion task: participants were guided along two legs of a triangle and asked to close the shape by walking along its third, imagined leg. To test the influence of yaw rotation across development, we tested children between 6 and 11 years of age and adults on turns of different angular amplitudes. Our results demonstrate that the amount of turn executed at the corner influences performance at all ages and, in some respects, also interacts with age. Indeed, whilst adults seemed to adjust their heading towards the end of their walked path, younger children took less advantage of this strategy. The amount of disorientation induced by the path also affected the full maturation of the ability to navigate without visual feedback: the greater the induced disorientation, the older children had to be to reach adult-level performance. Overall, these results provide novel insights into the maturation of spatial navigation-related processes.
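
The geometry behind the triangle completion task can be made explicit with a short worked example: given the two guided legs and the turn between them, the ideal homing distance and turn follow from basic trigonometry. The function and the numbers below are illustrative assumptions, not part of the study.

    # Ideal homing response for a triangle completion trial (hypothetical values).
    import math

    def homing_response(leg1, leg2, turn_deg):
        """Return the ideal homing distance and the turn (deg) needed at the end of leg 2."""
        turn = math.radians(turn_deg)
        # Start at the origin, walk leg1 along +x, then turn and walk leg2.
        x = leg1 + leg2 * math.cos(turn)
        y = leg2 * math.sin(turn)
        home_distance = math.hypot(x, y)
        # Heading at the end of leg 2 is 'turn'; the homing direction points back to the origin.
        homing_bearing = math.atan2(-y, -x)
        turn_to_home = math.degrees(homing_bearing - turn)
        return home_distance, (turn_to_home + 180) % 360 - 180  # wrap to [-180, 180] deg

    # Example: 2 m leg, 90 deg turn, 3 m leg -> ~3.61 m home distance, ~146 deg turn.
    print(homing_response(leg1=2.0, leg2=3.0, turn_deg=90))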


Subject(s)
Spatial Navigation; Adult; Child; Cues; Humans; Proprioception; Rotation; Space Perception; Walking
14.
Annu Int Conf IEEE Eng Med Biol Soc ; 2020: 3318-3322, 2020 07.
Article in English | MEDLINE | ID: mdl-33018714

ABSTRACT

Vestibular perception is useful for maintaining heading direction and successful spatial navigation. In this study, we present a novel apparatus capable of delivering both rotational and translational movements, the RT-Chair. The system comprises two motors and is controlled by the user via MATLAB. To validate the RT-Chair as a tool for measuring vestibular perception, we ran a threshold-measurement experiment with healthy participants. Our results show thresholds comparable to those reported in the previous literature, confirming the validity of the system for measuring vestibular perception.
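
Threshold measurements of this kind are often run with an adaptive staircase; the sketch below shows a generic 2-down/1-up procedure driven by a simulated observer. It is an assumption about the general approach, not the RT-Chair control code or the authors' protocol, and all parameter values are invented.

    # Generic 2-down/1-up staircase converging on the ~70.7%-correct point (illustrative).
    import numpy as np

    rng = np.random.default_rng(0)
    true_threshold = 1.5          # deg/s, hypothetical sensitivity of the simulated observer
    amplitude, step = 8.0, 0.5    # starting stimulus intensity and step size (deg/s)
    consecutive_correct, reversals, last_direction = 0, [], None

    while len(reversals) < 8:
        p_correct = 1 / (1 + np.exp(-(amplitude - true_threshold) / 0.5))  # simulated observer
        correct = rng.random() < p_correct
        consecutive_correct = consecutive_correct + 1 if correct else 0
        direction = None
        if not correct:
            direction = "up"; amplitude += step                    # harder -> make stimulus stronger
        elif consecutive_correct == 2:
            direction = "down"; amplitude = max(amplitude - step, 0.1); consecutive_correct = 0
        if direction and last_direction and direction != last_direction:
            reversals.append(amplitude)                            # record staircase reversals
        if direction:
            last_direction = direction

    print(f"estimated threshold ~ {np.mean(reversals[-6:]):.2f} deg/s")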


Subject(s)
Motion Perception; Spatial Navigation; Vestibule, Labyrinth; Head; Humans; Motion (Physics)
15.
Front Psychol ; 10: 2068, 2019.
Article in English | MEDLINE | ID: mdl-31572264

ABSTRACT

Developmental studies have shown that children can associate visual size with non-visual and apparently unrelated stimuli, such as pure tone frequencies. Most research to date has focused on audio-visual size associations, showing that children can associate low pure tone frequencies with large objects and high pure tone frequencies with small objects. Researchers relate these findings to coarse associations, i.e., less precise associations based on binary stimulus categories, such as low versus high frequencies and large versus small visual stimuli. This study investigates how finer, more precise crossmodal audio-visual associations develop during primary school age (from 6 to 11 years old). To unveil such patterns, we took advantage of a range of auditory pure tones and tested how primary school children match sounds with visually presented shapes. We tested 66 children (6-11 years old) in an audio-visual matching task involving a range of pure tone frequencies. Visual stimuli were circles or angles of different sizes, and we asked participants to indicate the shape matching the sound. All children associated large objects/angles with low-pitch sounds and small objects/angles with high-pitch sounds. Interestingly, older children made greater use of intermediate visual sizes in their responses. Audio-visual associations for finer differences between stimulus features, such as size and pure tone frequency, may thus develop later, depending on the maturation of supramodal size-perception processes. Considering our results, we suggest that audio-visual size correspondences can be used for educational purposes by aiding the discrimination of sizes, including angles of different aperture, and that their use should be shaped according to children's specific developmental stage.

16.
Front Neurol ; 10: 321, 2019.
Article in English | MEDLINE | ID: mdl-31024422

ABSTRACT

Dynamic visual acuity (DVA) provides an overall functional measure of visual stabilization performance that depends on the vestibulo-ocular reflex (VOR), but also on other processes, including catch-up saccades and likely visual motion processing. Because it captures the efficiency of gaze stabilization against head movement as a whole, it is potentially valuable in the clinical context, where assessment of overall performance provides an important indication of factors impacting patient participation and quality of life. DVA during head rotation (rDVA) has been assessed previously but, to our knowledge, DVA during horizontal translation (tDVA) has not been measured. tDVA can provide a valuable measure of how otolith, rather than canal, function impacts visual acuity. In addition, comparison of DVA during rotation and translation can shed light on whether common factors limit DVA performance in both cases. We therefore measured and compared DVA during both passive head rotations (head impulse test) and translations in the same set of healthy subjects (n = 7). In addition to DVA, we computed the average VOR gain and retinal slip within and across subjects. We observed that during translation, VOR gain was reduced (VOR during rotation, mean ± SD: position gain = 1.05 ± 0.04, velocity gain = 0.97 ± 0.07; VOR during translation, mean ± SD: position gain = 0.21 ± 0.08, velocity gain = 0.51 ± 0.16), retinal slip was increased, and tDVA was worse than during rotation (average rDVA = 0.32 ± 0.15 logMAR; average tDVA = 0.56 ± 0.09 logMAR, p = 0.02). This suggests that reduced VOR gain leads to worse tDVA, as expected. We conclude with speculation about non-oculomotor factors that could vary across individuals and affect performance similarly during both rotation and translation.
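
For reference, the sketch below shows one conventional way to compute a VOR velocity gain and retinal slip from eye- and head-velocity traces; the synthetic traces and the regression-based gain are illustrative assumptions, not the study's analysis pipeline.

    # Illustrative VOR velocity gain and retinal slip from synthetic velocity traces.
    import numpy as np

    def vor_velocity_gain(eye_velocity, head_velocity):
        """Velocity gain as the slope of a zero-intercept regression of eye on head velocity."""
        # Ideal compensation is eye velocity = -head velocity, hence the sign inversion.
        return -np.dot(eye_velocity, head_velocity) / np.dot(head_velocity, head_velocity)

    t = np.linspace(0, 0.3, 300)                      # 300 ms head impulse, ~1 kHz sampling
    head_vel = 150 * np.sin(2 * np.pi * t / 0.6)      # deg/s, half-sine head velocity profile
    eye_vel = -0.95 * head_vel + np.random.normal(0, 3, t.size)  # imperfect compensation + noise

    gain = vor_velocity_gain(eye_vel, head_vel)
    retinal_slip = head_vel + eye_vel                 # residual image motion on the retina (deg/s)
    print(f"velocity gain = {gain:.2f}, mean |retinal slip| = {np.abs(retinal_slip).mean():.1f} deg/s")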

17.
Sci Rep ; 8(1): 13393, 2018 09 06.
Article in English | MEDLINE | ID: mdl-30190584

ABSTRACT

Spatial memory is a multimodal representation of the environment that can be mediated by different sensory signals. Here we investigate how the auditory modality influences memorization, contributing to the mental representation of a scene. We designed an audio test for blind individuals inspired by a validated spatial memory test, the Corsi block-tapping test. The test was carried out in two conditions, with non-semantic and semantic stimuli, presented in different sessions and arranged on an audio-tactile device. Furthermore, the semantic sounds were spatially arranged to reproduce an auditory scene, which participants explored during the test. We thus examined whether semantic sounds are better recalled than non-semantic ones and whether exposure to an auditory scene can enhance memorization skills. Our results show that sighted subjects performed better than blind participants after exploration of the semantic scene. This suggests that blind participants focus on the perceived sound positions and do not use the item locations learned during exploration. We discuss these results in terms of the role of visual experience in spatial memorization skills and the ability to take advantage of semantic information stored in memory.


Subject(s)
Auditory Perception; Blindness/physiopathology; Mental Recall; Spatial Memory; Adult; Female; Humans; Male; Middle Aged; Semantics
18.
Front Neurol ; 9: 1151, 2018.
Article in English | MEDLINE | ID: mdl-30666230

ABSTRACT

Investigating the perception of verticality makes it possible to reveal the perceptual mechanisms that underlie balance control and spatial navigation. Estimating verticality in unusual body orientations with respect to gravity (e.g., when laterally tilted in the roll plane) leads to biases that change depending on the encoding sensory modality and the amount of tilt. A well-known phenomenon is the A-effect, a bias toward the body tilt, often interpreted in a Bayesian framework as the byproduct of a prior peaked at the most common head and body orientation, i.e., upright. In this study, we took advantage of this phenomenon to study the interaction of visual and haptic sensory information with vestibular/proprioceptive priors across development. We tested children (5-13 years old) and adults (>22 years old) in an orientation discrimination task while they were laterally tilted 90° toward their left ear. Experimental conditions differed in the tested sensory modality: visual-only, haptic-only, or both modalities. Accuracy depended on developmental stage and encoding modality, with A-effects in vision across all ages and in the haptic modality only for the youngest children, whereas bimodal judgments showed a lack of multisensory integration in children. A Bayesian prior model nicely predicts the behavioral data when the peak of the prior distribution shifts across age groups. Our results suggest that vision is pivotal for acquiring an idiotropic vector that is useful for improving precision when upright. The acquisition of such a prior might be related to the development of head and trunk coordination, a process that is fundamental for successful spatial navigation.
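
The Bayesian prior account mentioned here can be illustrated with a simple Gaussian model: a likelihood centred on the true gravity direction (in head coordinates) is combined with a prior peaked at upright, pulling the estimate toward the body axis, i.e., an A-effect. All numerical values below are assumptions chosen for illustration only.

    # Gaussian prior-likelihood combination producing an A-effect-like bias (illustrative).
    def posterior_vertical(true_vertical_deg, sigma_likelihood, prior_mean_deg=0.0, sigma_prior=40.0):
        """Posterior mean of perceived vertical (head-centred degrees) for Gaussian prior and likelihood."""
        w_like = sigma_prior**2 / (sigma_prior**2 + sigma_likelihood**2)
        return w_like * true_vertical_deg + (1 - w_like) * prior_mean_deg

    # Body tilted 90 deg: gravity lies at 90 deg in head coordinates. A noisy tilted
    # estimate (sigma = 20 deg) combined with an upright prior yields ~72 deg,
    # i.e., an ~18 deg bias toward the body axis.
    print(posterior_vertical(true_vertical_deg=90.0, sigma_likelihood=20.0))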

19.
Front Neurosci ; 11: 687, 2017.
Article in English | MEDLINE | ID: mdl-29270109

ABSTRACT

The orientation of the body in space can influence the perception of verticality, sometimes leading to biases consistent with priors peaked at the most common head and body orientation, that is, upright. In this study, we investigate the haptic perception of verticality in sighted individuals and in early and late blind adults when tilted counterclockwise in the roll plane. Participants performed a stimulus orientation discrimination task with their body tilted 90° relative to gravity toward their left ear. Stimuli were presented using a motorized haptic bar. To test whether different reference frames relative to the head influence the perception of verticality, we varied the position of the stimulus along the body's longitudinal axis. Depending on stimulus position, sighted participants tended to show biases away from or toward their body tilt. Visually impaired individuals instead showed a different pattern of verticality estimates: a bias toward the head and body tilt (i.e., the Aubert effect) was observed in late blind individuals, whereas, interestingly, no strong biases were observed in early blind individuals. Overall, these results indicate that visual information is fundamental in shaping the haptic readout of proprioceptive and vestibular information about body orientation relative to gravity. The acquisition of an idiotropic vector signaling the upright might take place through vision during development. In early blind individuals, independent spatial navigation experience, likely enhanced by echolocation behavior, might play a role in such acquisition. In participants with late-onset blindness, early visual experience might lead them to anchor their visually acquired priors to the haptic modality, with no disambiguation between head and body references, as observed in sighted individuals (Fraser et al., 2015). Our study investigates the haptic perception of gravity direction during unusual body tilts when vision is absent due to visual impairment. In this respect, our findings shed light on the influence of proprioceptive/vestibular information on haptically perceived verticality in blind individuals, showing how this phenomenon is affected by visual experience.

20.
Front Neurosci ; 11: 594, 2017.
Article in English | MEDLINE | ID: mdl-29114201

ABSTRACT

Size perception can be influenced by several visual cues, such as spatial cues (e.g., depth or vergence) and temporal contextual cues (e.g., adaptation to steady visual stimulation). Nevertheless, perception is generally multisensory, and other sensory modalities, such as audition, can contribute to the functional estimation of object size. In this study, we investigate whether auditory stimuli of different pitch can influence visual size perception after visual adaptation. To this aim, we used an adaptation paradigm (Pooresmaeili et al., 2013) in three experimental conditions: visual-only, visual-sound at 100 Hz, and visual-sound at 9,000 Hz. We asked participants to judge the size of a test stimulus in a size discrimination task. First, we obtained a baseline for all conditions; in the visual-sound conditions, the auditory stimulus was concurrent with the test stimulus. Second, we repeated the task, presenting an adapter (twice as large as the reference stimulus) before the test stimulus. We replicated the size aftereffect in the visual-only condition: the test stimulus was perceived as smaller than its physical size. The new finding is that the auditory stimuli affected the perceived size of the test stimulus after visual adaptation: the low-frequency sound reduced the effect of visual adaptation, making the stimulus appear larger than in the visual-only condition, whereas the high-frequency sound had the opposite effect, making the test stimulus appear even smaller.
