1.
Emotion ; 2024 Feb 26.
Article in English | MEDLINE | ID: mdl-38407120

ABSTRACT

The ability to reliably discriminate vocal expressions of emotion is crucial for successful social interaction. This process is arguably even more crucial for blind individuals, who cannot extract social information from faces and bodies and therefore rely chiefly on voices to infer the emotional state of their interlocutors. Blind individuals have demonstrated superior abilities in several aspects of auditory perception, but research on their ability to discriminate vocal features remains scarce and has yielded unclear results. Here, we used a gating psychophysical paradigm to test whether early blind people differ from individually matched sighted controls in the recognition of emotional expressions. We presented segments of nonlinguistic emotional vocalizations of increasing duration (100-400 ms), portraying five basic emotions (fear, happiness, sadness, disgust, and anger), and asked our participants to perform an explicit emotion categorization task. We then calculated sensitivity indices and confusion patterns for their performance. Surprisingly, blind people showed lower performance than controls in discriminating specific vocal emotions: the sighted group was better at discriminating angry and fearful expressions, with no between-group differences for the other emotions. This result supports the view that vision plays a calibrating role specifically for threat-related emotions. (PsycInfo Database Record (c) 2024 APA, all rights reserved).

2.
J Neurosci ; 42(23): 4652-4668, 2022 06 08.
Article in English | MEDLINE | ID: mdl-35501150

ABSTRACT

hMT+/V5 is a region in the middle occipitotemporal cortex that responds preferentially to visual motion in sighted people. In cases of early visual deprivation, hMT+/V5 enhances its response to moving sounds. Whether hMT+/V5 contains information about motion direction, and whether the functional enhancement observed in the blind is motion specific or also involves sound-source location, remains unresolved. Moreover, the impact of this cross-modal reorganization of hMT+/V5 on the regions that typically support auditory motion processing, such as the human planum temporale (hPT), remains equivocal. We used a combined functional and diffusion-weighted MRI approach together with individual in-ear recordings to study the impact of early blindness on the brain networks supporting spatial hearing in male and female humans. Whole-brain univariate analysis revealed that the anterior portion of hMT+/V5 responded to moving sounds in both sighted and blind people, while the posterior portion was selective for moving sounds only in blind participants. Multivariate decoding analysis revealed that motion-direction and sound-position information was higher in hMT+/V5 and lower in hPT in the blind group. While both groups showed an axis-of-motion organization in hMT+/V5 and hPT, this organization was reduced in the hPT of blind people. Diffusion-weighted MRI revealed that the strength of hMT+/V5-hPT connectivity did not differ between groups, whereas the microstructure of the connections was altered by blindness. Our results suggest that the axis-of-motion organization of hMT+/V5 does not depend on visual experience, but that congenital blindness alters the response properties of the occipitotemporal networks that support spatial hearing in the sighted.

SIGNIFICANCE STATEMENT Spatial hearing helps living organisms navigate their environment, and this is arguably even more true for people born blind. How does blindness affect the brain network supporting auditory motion and sound-source location? Our results show that motion-direction and sound-position information was higher in hMT+/V5 and lower in the human planum temporale in blind relative to sighted people, and that this functional reorganization is accompanied by microstructural (but not macrostructural) alterations in their connections. These findings suggest that blindness alters cross-modal responses between connected areas that share the same computational goals.


Subject(s)
Brain Mapping, Motion Perception, Auditory Perception/physiology, Blindness, Female, Humans, Magnetic Resonance Imaging/methods, Male, Motion Perception/physiology
3.
Psychol Sci ; 31(9): 1129-1139, 2020 09.
Article in English | MEDLINE | ID: mdl-32846109

ABSTRACT

Vision is thought to support the development of spatial abilities in the other senses. If this is true, how does spatial hearing develop in people lacking visual experience? We comprehensively addressed this question by investigating auditory-localization abilities in 17 congenitally blind and 17 sighted individuals using a psychophysical minimum-audible-angle task that lacked sensorimotor confounds. Participants were asked to compare the relative position of two sound sources located in central and peripheral, horizontal and vertical, or frontal and rear spaces. We observed unequivocal enhancement of spatial-hearing abilities in congenitally blind people, irrespective of the field of space that was assessed. Our results conclusively demonstrate that visual experience is not a prerequisite for developing optimal spatial-hearing abilities and that, in striking contrast, the lack of vision leads to a general enhancement of auditory-spatial skills.


Subject(s)
Sound Localization, Visually Impaired Persons, Blindness, Hearing, Humans, Space Perception, Vision, Ocular
4.
Curr Biol ; 30(12): 2289-2299.e8, 2020 06 22.
Article in English | MEDLINE | ID: mdl-32442465

ABSTRACT

The human occipitotemporal region hMT+/V5 is well known for processing visual motion direction. Here, we demonstrate that hMT+/V5 also represents the direction of auditory motion, in a format partially aligned with the one used to code visual motion. We show that auditory and visual motion directions can be reliably decoded in individually localized hMT+/V5 and that motion directions in one modality can be predicted from the activity patterns elicited by the other modality. Despite this shared motion-direction information across the senses, however, vision and audition overall produce opposite voxel-wise responses in hMT+/V5. Our results reveal a multifaceted representation of multisensory motion signals in hMT+/V5 and have broader implications for how we conceive of the division of sensory labor between brain regions dedicated to specific perceptual functions.


Subject(s)
Auditory Perception/physiology, Motion Perception/physiology, Temporal Lobe/physiology, Visual Perception/physiology, Adult, Female, Humans, Male, Young Adult
5.
Perception ; 46(12): 1356-1370, 2017 Dec.
Article in English | MEDLINE | ID: mdl-28718747

ABSTRACT

Recent findings have shown that sounds improve visual detection in low vision individuals when the pairs of audiovisual stimuli are presented simultaneously and from the same spatial position. The present study aimed to investigate the temporal aspects of this previously reported audiovisual enhancement effect. Low vision participants were asked to detect the presence of a visual stimulus (yes/no task) presented either alone or together with an auditory stimulus at different stimulus onset asynchronies (SOAs). In the first experiment, the sound was presented either simultaneously with or before the visual stimulus (i.e., SOAs 0, 100, 250, 400 ms). The results show that the presence of a task-irrelevant auditory stimulus produced a significant visual detection enhancement in all conditions. In the second experiment, the sound was either synchronized with, or randomly preceded/lagged behind, the visual stimulus (i.e., SOAs 0, ± 250, ± 400 ms). The visual detection enhancement was reduced in magnitude and limited to the synchronous condition and to the condition in which the sound was presented 250 ms before the visual stimulus. Taken together, the evidence from the present study suggests that audiovisual interaction in low vision individuals is highly modulated by top-down mechanisms.


Subject(s)
Sound Localization/physiology, Vision Disparity/physiology, Vision, Low/physiopathology, Visual Perception/physiology, Acoustic Stimulation/methods, Adult, Aged, Female, Humans, Male, Middle Aged, Photic Stimulation/methods, Reaction Time, Young Adult
6.
Exp Brain Res ; 235(6): 1709-1718, 2017 06.
Article in English | MEDLINE | ID: mdl-28280879

ABSTRACT

Numerous studies have found that congenitally blind individuals have better verbal memory than their normally sighted counterparts. However, it is not known whether this reflects superiority of verbal or memory abilities. In order to distinguish between these possibilities, we tested congenitally blind participants and normally sighted control participants, matched for age and education, on a range of verbal and spatial tasks. Congenitally blind participants were significantly better than sighted controls on all the verbal tasks but the groups did not differ significantly on the spatial tasks. Thus, the congenitally blind appear to have superior verbal, but not spatial, abilities. This may reflect greater reliance on verbal information and the involvement of visual cortex in language processing in the congenitally blind.


Subject(s)
Blindness/congenital, Blindness/physiopathology, Imagination/physiology, Language, Memory, Short-Term/physiology, Mental Recall/physiology, Space Perception/physiology, Spatial Memory/physiology, Verbal Learning/physiology, Adult, Aged, Female, Humans, Male, Middle Aged, Speech Perception/physiology, Young Adult
7.
Perception ; 45(3): 337-45, 2016 Mar.
Article in English | MEDLINE | ID: mdl-26562881

ABSTRACT

Object recognition, whether visual or haptic, is impaired in sighted people when objects are rotated between learning and test, relative to an unrotated condition; that is, recognition is view-dependent. Loss of vision early in life results in greater reliance on haptic perception for object identification compared with the sighted. We therefore hypothesized that early blind people may be more adept at recognizing objects despite spatial transformations. To test this hypothesis, we compared early blind and sighted control participants on a haptic object recognition task. Participants studied pairs of unfamiliar three-dimensional objects and performed a two-alternative forced-choice identification task, with the learned objects presented both unrotated and rotated 180° about the y-axis. Rotation impaired the recognition accuracy of sighted, but not blind, participants. We propose that, consistent with our hypothesis, haptic view-independence in the early blind reflects their greater experience with haptic object perception.


Subject(s)
Form Perception/physiology, Pattern Recognition, Physiological/physiology, Recognition, Psychology, Rotation, Touch Perception/physiology, Adult, Blindness, Case-Control Studies, Female, Humans, Male, Middle Aged
8.
Hum Brain Mapp ; 36(9): 3486-98, 2015 Sep.
Article in English | MEDLINE | ID: mdl-26109518

ABSTRACT

To efficiently perceive and respond to the external environment, our brain has to perceptually integrate or segregate stimuli of different modalities. The temporal relationship between the different sensory modalities is therefore essential for the formation of different multisensory percepts. In this magnetoencephalography study, we created a paradigm in which an auditory and a tactile stimulus were presented with an ambiguous temporal relationship, so that the perception of physically identical audiotactile stimuli could vary between integrated (emanating from the same source) and segregated. This bistable paradigm allowed us to compare identical bimodal stimuli that elicited different percepts, providing a way to directly infer multisensory interaction effects. Local differences in alpha power over bilateral inferior parietal lobules (IPLs) and superior parietal lobules (SPLs) preceded integrated versus segregated percepts of the two stimuli (auditory and tactile). Furthermore, differences in long-range cortical functional connectivity seeded in rIPL (the region of maximum difference) revealed differential patterns that predisposed integrated or segregated percepts, encompassing secondary areas of all modalities and prefrontal cortex. We showed that prestimulus brain states predispose the perception of the audiotactile stimulus in both a global and a local manner. Our findings are in line with a recent, consistent body of findings on the importance of prestimulus brain states for the perception of an upcoming stimulus. This new perspective on how stimuli originating from different modalities are integrated suggests a non-modality-specific network predisposing multisensory perception.


Subject(s)
Alpha Rhythm, Auditory Perception, Parietal Lobe/physiology, Touch Perception, Adult, Evoked Potentials, Female, Humans, Magnetoencephalography, Male, Neural Pathways/physiology, Physical Stimulation, Signal Processing, Computer-Assisted
9.
Front Hum Neurosci ; 8: 159, 2014.
Article in English | MEDLINE | ID: mdl-24678294

ABSTRACT

Previous studies have reported inconsistent results when comparing spatial imagery performance in the blind and the sighted, with some, but not all, studies demonstrating deficits in the blind. Here, we investigated the effect of visual status and individual preferences ("cognitive style") on performance of a spatial imagery task. Participants with blindness resulting in the loss of form vision at or after age 6, and age- and gender-matched sighted participants, performed a spatial imagery task requiring memorization of a 4 × 4 lettered matrix and subsequent mental construction of shapes within the matrix from four-letter auditory cues. They also completed the Santa Barbara Sense of Direction Scale (SBSoDS) and a self-evaluation of cognitive style. The sighted participants also completed the Object-Spatial Imagery and Verbal Questionnaire (OSIVQ). Visual status affected performance on the spatial imagery task: the blind performed significantly worse than the sighted, independently of the age at which form vision was completely lost. Visual status did not affect the distribution of preferences based on self-reported cognitive style. Across all participants, self-reported verbalizer scores were significantly negatively correlated with accuracy on the spatial imagery task. There was a positive correlation between the SBSoDS score and accuracy on the spatial imagery task, across all participants, indicating that a better sense of direction is related to a more proficient spatial representation and that the imagery task indexes ecologically relevant spatial abilities. Moreover, the older the participants were, the worse their performance was, indicating a detrimental effect of age on spatial imagery performance. Thus, spatial skills represent an important target for rehabilitative approaches to visual impairment, and individual differences, which can modulate performance, should be taken into account in such approaches.

10.
Perception ; 42(2): 233-41, 2013.
Article in English | MEDLINE | ID: mdl-23700961

ABSTRACT

It has been reported that people tend to preferentially associate phonemes like /m/, /l/, /n/ with curvilinear shapes and phonemes like /t/, /z/, /r/, /k/ with rectilinear shapes. Here we evaluated the performance of children/adolescents with autism spectrum disorders (ASD) and neurotypical controls on this audiovisual congruency phenomenon. Pairs of visual patterns (curvilinear vs rectilinear) were presented to a group of ASD participants (low- or high-functioning) and a group of age-matched neurotypical controls. Participants were asked to associate each item with non-meaningful phoneme clusters. ASD participants showed a lower proportion of expected association responses than the controls. Within the ASD group, performance varied as a function of the severity of the symptomatology. These data suggest that children/adolescents with ASD show, albeit to different degrees depending on the severity of the ASD, lower phonetic-iconic congruency response patterns than neurotypical controls, pointing to poorer multisensory integration capabilities.


Subject(s)
Association, Child Development Disorders, Pervasive/physiopathology, Pattern Recognition, Visual/physiology, Speech Perception/physiology, Adolescent, Child, Child Development Disorders, Pervasive/psychology, Female, Humans, Male, Neuropsychological Tests, Psycholinguistics/methods, Severity of Illness Index
11.
Psychol Bull ; 139(1): 189-212, 2013 Jan.
Article in English | MEDLINE | ID: mdl-22612281

ABSTRACT

We highlight the results of those studies that have investigated the plastic reorganization processes that occur within the human brain as a consequence of visual deprivation, as well as how these processes give rise to behaviorally observable changes in the perceptual processing of auditory and tactile information. We review the evidence showing that visual deprivation affects the establishment of the spatial coordinate systems involved in the processing of auditory and tactile inputs within the peripersonal space around an individual. In blind individuals, the absence of a conjoint activation of external coordinate systems across modalities co-occurs with a higher capacity to direct auditory and tactile attentional resources to different spatial locations and to ignore irrelevant distractors. Both processes could thus contribute to the reduced spatial multisensory binding that has been observed in those who are blind. The interplay between auditory and tactile information in visually deprived individuals is modulated by attentional factors. Blind individuals typically outperform sighted people in those tasks where the target is presented in one sensory modality (and the other modality acts as a distractor). By contrast, they are less efficient in tasks explicitly requiring the combination of information across sensory modalities. The review highlights how these behavioral effects are subserved by extensive plastic changes at the neural level, with brain areas traditionally involved in visual functioning switching and being recruited for the processing of stimuli within the intact residual senses. We also discuss the roles played by other intervening factors with regard to compensatory mechanisms, such as previous visual experience, age at onset of blindness, and learning effects.


Subject(s)
Auditory Perception/physiology, Blindness/psychology, Mental Processes/physiology, Touch Perception/physiology, Humans
12.
Neuropsychologia ; 50(5): 576-82, 2012 Apr.
Article in English | MEDLINE | ID: mdl-22056506

ABSTRACT

Behavioral and neurophysiological studies have shown an enhancement of visual perception under crossmodal audiovisual stimulation, in both sensitivity and reaction times, when the stimulation in the two sensory modalities is spatially and temporally congruent. The purpose of the present work was to verify whether congruent visual and acoustic stimulation can improve the detection of visual stimuli in people affected by low vision. Participants were asked to detect the presence of a visual stimulus (yes/no task) either presented in isolation (i.e., unimodal visual stimulation) or simultaneously with auditory stimuli, which could be placed in the same spatial position (i.e., crossmodal congruent conditions) or in different spatial positions (i.e., crossmodal incongruent conditions). The results show, for the first time, audiovisual integration effects in low vision individuals. In particular, a significant visual detection benefit was observed in the crossmodal congruent condition as compared to the unimodal visual condition. This effect is selective for visual stimulation occurring in the impaired portion of the visual field, and disappears in the region of space in which vision is spared. Surprisingly, there is a marginal crossmodal benefit when the sound is presented 16 degrees away from the visual stimulus. The observed crossmodal effect seems to be determined by the contribution of both senses under a model of optimal combination, in which the more reliable sense provides the higher contribution. These results, indicating a significant beneficial effect of synchronous and spatially congruent sounds in a visual detection task, seem very promising for the development of a rehabilitation approach for low vision based on the principles of multisensory integration.


Subject(s)
Sound Localization/physiology, Vision Disparity/physiology, Vision, Low/physiopathology, Acoustic Stimulation, Adult, Aged, Aged, 80 and over, Analysis of Variance, Female, Humans, Male, Middle Aged, Photic Stimulation, Reaction Time, Young Adult
13.
Neuropsychologia ; 50(1): 36-43, 2012 Jan.
Article in English | MEDLINE | ID: mdl-22051726

ABSTRACT

In the ventriloquism effect, the presentation of spatially discrepant visual information biases the localization of simultaneously presented sounds. Recently, an analogous spatial influence of touch on audition has been observed. By manipulating hand posture, it has been demonstrated that this audiotactile ventriloquist effect predominantly operates in an external frame of reference. In the present study, we examined the contribution of developmental vision to audiotactile interactions as indicated by the ventriloquism effect. Congenitally blind, late blind and sighted adults were asked to report the perceived location of sounds presented from a left, a central or a right location. Auditory stimuli were either delivered alone or concurrently with touches at the left or the right hand. The hands were located to the right and to the left of the lateral speakers and participants either adopted an uncrossed or a crossed hand posture. While sighted controls and late blind participants similarly mislocalized auditory stimuli toward the concurrent tactile stimuli in bimodal trials, the congenitally blind showed a reduced ventriloquism effect. All groups showed a reduced audiotactile ventriloquism effect in the crossed hand condition. However, the magnitude of the reduction was significantly larger in the group of congenitally blind than in the group of sighted controls. These results suggest reduced audio-tactile interactions in spatial processing following a lack of visual input from birth.


Subject(s)
Auditory Perception/physiology, Blindness/physiopathology, Sound Localization/physiology, Space Perception/physiology, Touch Perception/physiology, Visual Perception/physiology, Adult, Age of Onset, Aged, Aged, 80 and over, Blindness/congenital, Blindness/etiology, Female, Hand/physiology, Humans, Male, Middle Aged, Neuropsychological Tests, Time Factors
14.
Psychon Bull Rev ; 18(3): 429-54, 2011 Jun.
Article in English | MEDLINE | ID: mdl-21400125

ABSTRACT

In the present review, we focus on how commonalities in the ontogenetic development of the auditory and tactile sensory systems may inform the interplay between these signals in the temporal domain. In particular, we describe the results of behavioral studies that have investigated temporal resolution (in temporal order, synchrony/asynchrony, and simultaneity judgment tasks), as well as temporal numerosity perception, and similarities in the perception of frequency across touch and hearing. The evidence reviewed here highlights features of audiotactile temporal perception that are distinctive from those seen for other pairings of sensory modalities. For instance, audiotactile interactions are characterized in certain tasks (e.g., temporal numerosity judgments) by a more balanced reciprocal influence than are other modality pairings. Moreover, relative spatial position plays a different role in the temporal order and temporal recalibration processes for audiotactile stimulus pairings than for other modality pairings. The effect exerted by both the spatial arrangement of stimuli and attention on temporal order judgments is described. Moreover, a number of audiotactile interactions occurring during sensory-motor synchronization are highlighted. We also look at the audiotactile perception of rhythm and how it may be affected by musical training. The differences emerging from this body of research highlight the need for more extensive investigation into audiotactile temporal interactions. We conclude with a brief overview of some of the key issues deserving of further research in this area.


Subject(s)
Auditory Perception, Time Perception, Touch Perception, Attention/physiology, Auditory Perception/physiology, Feedback, Sensory/physiology, Humans, Space Perception/physiology, Time Factors, Time Perception/physiology, Touch Perception/physiology, Visual Perception/physiology
15.
Neurosci Biobehav Rev ; 35(3): 589-98, 2011 Jan.
Article in English | MEDLINE | ID: mdl-20621120

ABSTRACT

The last few years have seen a growing interest in the assessment of audiotactile interactions in information processing in peripersonal space. In particular, these studies have focused on investigating peri-hand space [corrected] and, more recently, on the functional differences that have been demonstrated between the space close to the front and back of the head (i.e., peri-head space). In this review, we describe how audiotactile interactions vary as a function of the region of space in which stimuli are presented (i.e., front vs. rear, peripersonal vs. extrapersonal). We review evidence from both monkey and human studies. This evidence, providing insight into the differential attributes of the frontal and rear regions of space, sheds light on a hitherto neglected research topic and may help contribute to the formulation of new rehabilitative approaches to disorders of spatial representation. A tentative explanation of the evolutionary reasons underlying these particular patterns of results, as well as suggestions for possible future developments, is also provided.


Subject(s)
Auditory Perception/physiology, Space Perception/physiology, Touch/physiology, Animals, Cerebral Cortex/anatomy & histology, Cerebral Cortex/physiology, Humans, Physical Stimulation/methods
16.
Exp Brain Res ; 203(3): 517-32, 2010 Jun.
Article in English | MEDLINE | ID: mdl-20431874

ABSTRACT

The Colavita effect occurs when participants performing a speeded detection/discrimination task preferentially report the visual component of pairs of audiovisual or visuotactile stimuli. To date, however, researchers have failed to demonstrate an analogous effect for audiotactile stimuli (Hecht and Reiner in Exp Brain Res 193:307-314, 2009). Here, we investigate whether an audiotactile Colavita effect can be demonstrated by manipulating either the physical features of the auditory stimuli presented in frontal (Experiment 1) or rear space (Experiment 3), or the relative and absolute position of auditory and tactile stimuli in frontal (Experiment 2) or rear space (Experiment 3). The participants showed no evidence of responding preferentially to one of the sensory components of the bimodal stimuli when they were presented from a single location in frontal space (Experiment 1). However, a significant audiotactile Colavita effect was demonstrated in Experiments 2 and 3, with participants preferentially reporting the auditory (rather than tactile) stimulus on the bimodal target trials. In Experiment 3, an audiotactile Colavita effect was reported for auditory white noise bursts but not for pure tones and selectively for those stimuli presented from the same (rather than from the opposite) side. Taken together, these results therefore suggest that when a tactile and an auditory stimulus are presented from a single frontal location, participants do not preferentially report one of the two sensory components (Experiment 1). In contrast, when the stimuli are presented from different locations, people preferentially report the auditory component, especially when they are spatially coincident (Experiments 2 and 3). 
Moreover, for stimuli presented from rear space, the Colavita effect was only observed for auditory stimuli consisting of white noise bursts (but not for pure tones), suggesting that these stimuli are more likely to be bound together with somatosensory stimuli in rear space.


Subject(s)
Auditory Perception, Discrimination, Psychological, Signal Detection, Psychological, Touch Perception, Acoustic Stimulation, Adolescent, Adult, Analysis of Variance, Female, Humans, Male, Physical Stimulation, Psychoacoustics, Reaction Time, Task Performance and Analysis, Young Adult
17.
Q J Exp Psychol (Hove) ; 63(4): 694-704, 2010 Apr.
Article in English | MEDLINE | ID: mdl-19672794

ABSTRACT

Neurophysiological and behavioural evidence now shows that audiotactile interactions are more pronounced for complex auditory stimuli than for pure tones. In the present study, we examined the effect of varying the complexity of auditory stimuli (i.e., noise vs. pure tone) on participants' performance in the audiotactile cross-modal dynamic capture task. Participants discriminated the direction of a target stream (tactile or auditory) while simultaneously trying to ignore the direction of a distracting apparent motion stream presented in the other sensory modality (i.e., auditory or tactile). The distractor stream could be either spatiotemporally congruent or incongruent with respect to the target stream on each trial. The results showed that sound complexity modulated performance, decreasing the accuracy of tactile direction judgements when presented simultaneously with noise distractors, while facilitating judgements of the direction of the noise bursts (as compared to pure tones). Although auditory direction judgements were overall more accurate for noise (than for pure tone) targets, the complexity of the sound failed to modulate the tactile capture of auditory targets. These results provide the first demonstration of enhanced audiotactile interactions involving complex (vs. pure tone) auditory stimuli in the peripersonal space around the hands (previously these effects had only been reported in the space around the head).


Subject(s)
Auditory Perception, Sound, Touch, Adult, Female, Humans, Male, Middle Aged, Young Adult
18.
Neuroreport ; 20(8): 793-7, 2009 May 27.
Article in English | MEDLINE | ID: mdl-19369906

ABSTRACT

Participants made speeded discrimination responses to unimodal auditory stimuli (low-frequency vs. high-frequency sounds) or vibrotactile stimuli (presented to the index finger, upper location, vs. the thumb, lower location). In the compatible blocks of trials, the implicitly related stimuli (i.e., higher-frequency sounds and upper tactile stimuli, and lower-frequency sounds and lower tactile stimuli) were assigned to the same response key; in the incompatible blocks, the weakly related stimuli (i.e., high-frequency sounds and lower tactile stimuli, and low-frequency sounds and upper tactile stimuli) were assigned to the same response key. Better performance was observed in the compatible (vs. incompatible) blocks, providing empirical support for a cross-modal association between the relative frequency of a sound and the relative elevation of a tactile stimulus.


Subject(s)
Orientation/physiology, Pitch Perception/physiology, Space Perception/physiology, Touch/physiology, Acoustic Stimulation, Adult, Cochlea/physiology, Depth Perception/physiology, Female, Fingers/innervation, Fingers/physiology, Humans, Male, Mechanoreceptors/physiology, Neuropsychological Tests, Physical Stimulation, Young Adult
19.
Exp Brain Res ; 193(3): 409-19, 2009 Mar.
Article in English | MEDLINE | ID: mdl-19011842

ABSTRACT

We investigated the effect of varying sound intensity on the audiotactile crossmodal dynamic capture effect. Participants had to discriminate the direction of a target stream (tactile, Experiment 1; auditory, Experiment 2) while trying to ignore the direction of a distractor stream presented in a different modality (auditory, Experiment 1; tactile, Experiment 2). The distractor streams could be either spatiotemporally congruent or incongruent with respect to the target stream. In half of the trials, the participants were presented with auditory stimuli at 75 dB(A); in the other half, the auditory stimuli were presented at 82 dB(A). Participants' performance on both tasks was significantly affected by the intensity of the sounds. Namely, the crossmodal capture of tactile motion by audition was stronger with the more intense (vs. less intense) auditory distractors (Experiment 1), whereas the capture effect exerted by the tactile distractors was stronger for the less intense (than for the more intense) auditory targets (Experiment 2). The crossmodal dynamic capture was larger in Experiment 1 than in Experiment 2, with a stronger congruency effect when the target streams were presented in the tactile (vs. auditory) modality. Two explanations are put forward to account for these results: an attentional bias toward the more intense auditory stimuli, and a modulation induced by the relative perceptual weights of the auditory and tactile signals.


Subject(s)
Attention, Sound Localization, Sound, Touch Perception, Acoustic Stimulation, Adult, Analysis of Variance, Female, Humans, Male, Physical Stimulation, Space Perception, Task Performance and Analysis, Young Adult
20.
Neuropsychologia ; 46(11): 2845-50, 2008 Sep.
Article in English | MEDLINE | ID: mdl-18603271

ABSTRACT

In the present study, we examined the potential modulatory effect of relative spatial position on audiotactile temporal order judgments (TOJs) in sighted, early, and late blind adults. Pairs of auditory and tactile stimuli were presented from the left and/or right of participants at varying stimulus onset asynchronies (SOAs) using the method of constant stimuli. The participants had to make unspeeded TOJs regarding which sensory modality had been presented first on each trial. Systematic differences between the participants emerged: while the performance of the sighted participants was unaffected by whether the two stimuli were presented from the same or different positions (replicating the results of several recent studies), the blind participants (regardless of the age of onset of blindness) were significantly more accurate when the auditory and tactile stimuli were presented from different positions rather than from the same position. These results provide the first empirical evidence to suggest a spatial modulation of audiotactile interactions in a temporal task performed by visually impaired humans. The fact that the performance of the blind participants was modulated by the relative spatial position of the stimuli is consistent with data showing that visual deprivation results in an improved ability to process spatial cues within the residual tactile and auditory modalities. These results support the hypothesis that the absence of visual cues results in the emergence of more pronounced audiotactile spatial interactions.
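The TOJ procedure summarised in this abstract is conventionally analysed by fitting a cumulative Gaussian psychometric function to the proportion of "audition first" responses across SOAs, from which the point of subjective simultaneity (PSS) and the just noticeable difference (JND) are derived. The sketch below illustrates that analysis; the SOA values and response proportions are entirely hypothetical, not data from the study.

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

# Hypothetical SOAs in ms: negative = touch first, positive = sound first.
soas = np.array([-200.0, -90.0, -55.0, -30.0, -20.0, 20.0, 30.0, 55.0, 90.0, 200.0])
# Hypothetical proportion of "audition first" responses at each SOA.
p_sound_first = np.array([0.05, 0.20, 0.30, 0.40, 0.45, 0.55, 0.60, 0.70, 0.80, 0.95])

def psychometric(soa, pss, sigma):
    """Cumulative Gaussian: P('audition first') as a function of SOA."""
    return norm.cdf(soa, loc=pss, scale=sigma)

# Fit the two free parameters: PSS (50% point) and sigma (slope).
(pss, sigma), _ = curve_fit(psychometric, soas, p_sound_first, p0=(0.0, 50.0))

# JND: SOA change needed to move from the 50% to the 75% point.
jnd = sigma * norm.ppf(0.75)

print(f"PSS = {pss:.1f} ms, JND = {jnd:.1f} ms")
```

A smaller JND indicates finer audiotactile temporal resolution; comparing JNDs fitted separately for same-position and different-position trials is one way to quantify the spatial modulation reported here.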


Subject(s)
Auditory Perception/physiology, Blindness/physiopathology, Judgment/physiology, Touch/physiology, Adult, Analysis of Variance, Female, Humans, Male, Middle Aged, Physical Stimulation/methods, Psychomotor Performance/physiology, Reaction Time/physiology, Sensory Thresholds/physiology, Task Performance and Analysis