Results 1 - 20 of 24
1.
Neuroimage ; 257: 119307, 2022 08 15.
Article in English | MEDLINE | ID: mdl-35577024

ABSTRACT

The combination of signals from different sensory modalities can enhance perception and facilitate behavioral responses. While previous research described crossmodal influences in a wide range of tasks, it remains unclear how such influences drive performance enhancements. In particular, the neural mechanisms underlying performance-relevant crossmodal influences, as well as the latency and spatial profile of such influences are not well understood. Here, we examined data from high-density electroencephalography (N = 30) recordings to characterize the oscillatory signatures of crossmodal facilitation of response speed, as manifested in the speeding of visual responses by concurrent task-irrelevant auditory information. Using a data-driven analysis approach, we found that individual gains in response speed correlated with larger beta power difference (13-25 Hz) between the audiovisual and the visual condition, starting within 80 ms after stimulus onset in the secondary visual cortex and in multisensory association areas in the parietal cortex. In addition, we examined data from electrocorticography (ECoG) recordings in four epileptic patients in a comparable paradigm. These ECoG data revealed reduced beta power in audiovisual compared with visual trials in the superior temporal gyrus (STG). Collectively, our data suggest that the crossmodal facilitation of response speed is associated with reduced early beta power in multisensory association and secondary visual areas. The reduced early beta power may reflect an auditory-driven feedback signal to improve visual processing through attentional gating. These findings improve our understanding of the neural mechanisms underlying crossmodal response speed facilitation and highlight the critical role of beta oscillations in mediating behaviorally relevant multisensory processing.


Subject(s)
Visual Cortex, Visual Perception, Acoustic Stimulation, Attention/physiology, Auditory Perception/physiology, Electroencephalography, Humans, Photic Stimulation, Reaction Time/physiology, Visual Cortex/physiology, Visual Perception/physiology
2.
Neuroimage Clin ; 33: 102909, 2022.
Article in English | MEDLINE | ID: mdl-34915330

ABSTRACT

OBJECTIVES: People with schizophrenia (SZ) show deficits in auditory and audiovisual speech recognition. It is possible that these deficits are related to aberrant early sensory processing, combined with an impaired ability to utilize visual cues to improve speech recognition. In this electroencephalography study, we tested this by having SZ and healthy controls (HC) identify different unisensory auditory and bisensory audiovisual syllables at different auditory noise levels. METHODS: SZ (N = 24) and HC (N = 21) identified one of three different syllables (/da/, /ga/, /ta/) at three different noise levels (no, low, high). Half the trials were unisensory auditory and the other half provided additional visual input of moving lips. Task-evoked mediofrontal N1 and P2 brain potentials time-locked to the onset of the auditory syllables were derived and related to behavioral performance. RESULTS: In comparison to HC, SZ showed speech recognition deficits for unisensory and bisensory stimuli. These deficits were primarily found in the no noise condition. Paralleling these observations, reduced N1 amplitudes to unisensory and bisensory stimuli in SZ were found in the no noise condition. In HC, the N1 amplitudes were positively related to speech recognition performance, whereas no such relationships were found in SZ. Moreover, no group differences in multisensory speech recognition benefits and N1 suppression effects for bisensory stimuli were observed. CONCLUSION: Our study suggests that reduced N1 amplitudes reflect early auditory and audiovisual speech processing deficits in SZ. The finding that the amplitude effects were confined to salient speech stimuli, together with the attenuated relationship between N1 amplitudes and behavioral performance in patients compared to HC, indicates diminished decoding of auditory speech signals in SZ. Our study also revealed relatively intact multisensory benefits in SZ, which implies that the observed auditory and audiovisual speech recognition deficits were primarily related to aberrant processing of the auditory syllables.
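The N1 analysis described in this abstract rests on a standard ERP pipeline: average epoched EEG over trials, then take the mean amplitude in a fixed post-stimulus window. Below is a minimal sketch with simulated data; the sampling rate, window, and array shapes are illustrative assumptions, not the study's parameters.

```python
import numpy as np

# Sketch: extracting an N1 mean amplitude from epoched EEG at one
# (hypothetical) mediofrontal channel. Epochs: (n_trials, n_timepoints),
# fs = 500 Hz, epoch window roughly -100 ms to +500 ms around syllable onset.
rng = np.random.default_rng(0)
fs = 500
times = np.arange(-50, 250) / fs               # -0.1 s to +0.498 s
epochs = rng.normal(size=(80, times.size))     # simulated single trials (µV)

# ERP = trial average; N1 amplitude = mean in an 80-120 ms window
# (the window is an illustrative choice, not taken from the study).
erp = epochs.mean(axis=0)
n1_mask = (times >= 0.08) & (times <= 0.12)
n1_amplitude = erp[n1_mask].mean()
print(erp.shape)  # prints (300,)
```

In a real analysis the per-subject N1 amplitudes would then be correlated with behavioral performance, as the abstract describes.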


Subject(s)
Schizophrenia, Speech Perception, Acoustic Stimulation, Brain, Electroencephalography, Auditory Evoked Potentials, Humans, Schizophrenia/complications, Speech, Speech Perception/physiology, Visual Perception/physiology
3.
Cereb Cortex ; 31(12): 5536-5548, 2021 10 22.
Article in English | MEDLINE | ID: mdl-34274967

ABSTRACT

Studies on schizophrenia (SCZ) and aberrant multisensory integration (MSI) show conflicting results, which are potentially confounded by attention deficits in SCZ. To test this, we examined the interplay between MSI and intersensory attention (IA) in healthy controls (HCs) (N = 27) and in SCZ (N = 27). Evoked brain potentials to unisensory-visual (V), unisensory-tactile (T), or spatiotemporally aligned bisensory VT stimuli were measured with high-density electroencephalography, while participants attended blockwise to either visual or tactile inputs. Behaviorally, IA effects in SCZ, relative to HC, were diminished for unisensory stimuli, but not for bisensory stimuli. At the neural level, we observed reduced IA effects for bisensory stimuli over mediofrontal scalp regions (230-320 ms) in SCZ. The analysis of MSI, using the additive approach, revealed multiple phases of integration over occipital and frontal scalp regions (240-364 ms), which did not differ between HC and SCZ. Furthermore, IA and MSI effects were both positively related to the behavioral performance in SCZ, indicating that IA and MSI mutually facilitate bisensory stimulus processing. Multisensory processing could facilitate stimulus processing and compensate for top-down attention deficits in SCZ. Differences in attentional demands, which may be differentially compensated by multisensory processing, could account for previous conflicting findings on MSI in SCZ.
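The additive approach mentioned in this abstract infers multisensory integration where the response to the bisensory stimulus differs from the sum of the unisensory responses (VT vs. V + T). Below is a minimal sketch with simulated ERP arrays; the shapes and variable names are illustrative assumptions, not the authors' code.

```python
import numpy as np

# Additive model for multisensory integration: compare the bisensory ERP
# with the sum of the unisensory ERPs. Non-zero differences indicate
# super- or sub-additive interactions. Shapes are hypothetical:
# (n_subjects, n_channels, n_timepoints), amplitudes in microvolts.
rng = np.random.default_rng(0)
erp_v = rng.normal(size=(27, 64, 300))   # unisensory visual
erp_t = rng.normal(size=(27, 64, 300))   # unisensory tactile
erp_vt = rng.normal(size=(27, 64, 300))  # bisensory visuotactile

# Additive-model difference wave: VT - (V + T)
diff = erp_vt - (erp_v + erp_t)

# Mean difference across subjects per channel/timepoint; published work
# would test such differences with cluster-based permutation statistics.
mean_diff = diff.mean(axis=0)
print(mean_diff.shape)  # prints (64, 300)
```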


Subject(s)
Schizophrenia, Acoustic Stimulation/methods, Auditory Perception/physiology, Electroencephalography/methods, Humans, Photic Stimulation/methods, Reaction Time/physiology, Visual Perception/physiology
4.
Hum Brain Mapp ; 42(2): 452-466, 2021 02 01.
Article in English | MEDLINE | ID: mdl-33617132

ABSTRACT

In the ventriloquist illusion, spatially disparate visual signals can influence the perceived location of simultaneous sounds. Previous studies have shown asymmetrical responses in auditory cortical regions following perceived peripheral sound shifts. Moreover, higher-order cortical areas perform inferences on the sources of disparate audiovisual signals. Recent studies have also highlighted top-down influence in the ventriloquist illusion and postulated a governing function of neural oscillations for crossmodal processing. In this EEG study, we analyzed source-reconstructed neural oscillations to address the question of whether perceived sound shifts affect the laterality of auditory responses. Moreover, we investigated the modulation of neural oscillations related to the occurrence of the illusion more generally. With respect to the first question, we did not find evidence for significant changes in the laterality of auditory responses due to perceived sound shifts. However, we found a sustained reduction of mediofrontal theta-band power starting prior to stimulus onset when participants perceived the illusion compared to when they did not perceive the illusion. We suggest that this effect reflects a state of diminished cognitive control, leading to reliance on more readily discriminable visual information and increased crossmodal influence. We conclude that mediofrontal theta-band oscillations serve as a neural mechanism underlying top-down modulation of crossmodal processing in the ventriloquist illusion.


Subject(s)
Acoustic Stimulation/methods, Auditory Cortex/physiology, Illusions/physiology, Photic Stimulation/methods, Theta Rhythm/physiology, Adult, Auditory Cortex/diagnostic imaging, Electroencephalography/methods, Female, Humans, Male, Young Adult
5.
J Neurosci ; 41(7): 1505-1515, 2021 02 17.
Article in English | MEDLINE | ID: mdl-33310755

ABSTRACT

Integrating information across different senses is a central feature of human perception. Previous research suggests that multisensory integration is shaped by a context-dependent and largely adaptive interplay between stimulus-driven bottom-up and top-down endogenous influences. One critical question concerns the extent to which this interplay is sensitive to the amount of available cognitive resources. In the present study, we investigated the influence of limited cognitive resources on audiovisual integration by measuring high-density electroencephalography (EEG) in healthy participants performing the sound-induced flash illusion (SIFI) and a verbal n-back task (0-back, low load and 2-back, high load) in a dual-task design. In the SIFI, the integration of a flash with two rapid beeps can induce the illusory perception of two flashes. We found that high compared with low load increased illusion susceptibility and modulated neural oscillations underlying illusion-related crossmodal interactions. Illusion perception under high load was associated with reduced early β power (18-26 Hz, ∼70 ms) in auditory and motor areas, presumably reflecting an early mismatch signal, and subsequent top-down influences including increased frontal θ power (7-9 Hz, ∼120 ms) in mid-anterior cingulate cortex (ACC) and a later β power suppression (13-22 Hz, ∼350 ms) in prefrontal and auditory cortex. Our study demonstrates that integrative crossmodal interactions underlying the SIFI are sensitive to the amount of available cognitive resources and that multisensory integration engages top-down θ and β oscillations when cognitive resources are scarce.

SIGNIFICANCE STATEMENT: The integration of information across multiple senses, a remarkable ability of our perceptual system, is influenced by multiple context-related factors, the role of which is highly debated. It is, for instance, poorly understood how available cognitive resources influence crossmodal interactions during multisensory integration. We addressed this question using the sound-induced flash illusion (SIFI), a phenomenon in which the integration of two rapid beeps together with a flash induces the illusion of a second flash. Replicating our previous work, we demonstrate that depletion of cognitive resources through a working memory (WM) task increases the perception of the illusion. With respect to the underlying neural processes, we show that when available resources are limited, multisensory integration engages top-down θ and β oscillations.


Subject(s)
Memory/physiology, Neurons/physiology, Perception/physiology, Sensation/physiology, Acoustic Stimulation, Adult, Auditory Cortex/physiology, Auditory Perception/physiology, Beta Rhythm/physiology, Electroencephalography, Female, Humans, Illusions, Male, Short-Term Memory/physiology, Photic Stimulation, Prefrontal Cortex/physiology, Reaction Time/physiology, Theta Rhythm/physiology, Visual Perception/physiology, Young Adult
6.
Sci Rep ; 10(1): 11872, 2020 07 17.
Article in English | MEDLINE | ID: mdl-32681138

ABSTRACT

Patients with schizophrenia (ScZ) often show impairments in auditory information processing. These impairments have been related to clinical symptoms, such as auditory hallucinations. Some researchers have hypothesized that aberrant low-frequency oscillations contribute to auditory information processing deficits in ScZ. A paradigm for which modulations in low-frequency oscillations are consistently found in healthy individuals is the auditory continuity illusion (ACI), in which restoration processes lead to a perceptual grouping of tone fragments and a mask, so that a physically interrupted sound is perceived as continuous. We used the ACI paradigm to test the hypothesis that low-frequency oscillations play a role in aberrant auditory information processing in patients with ScZ (N = 23). We found that, compared with healthy control participants, patients with ScZ show elevated continuity illusions of interrupted, partially masked tones. Electroencephalography data demonstrate that this elevated continuity perception is reflected by diminished 3 Hz power. This suggests that reduced low-frequency oscillations relate to elevated restoration processes in ScZ. Our findings support the hypothesis that aberrant low-frequency oscillations contribute to altered perception-related auditory information processing in ScZ.


Subject(s)
Hallucinations, Illusions/psychology, Schizophrenia/diagnosis, Schizophrenic Psychology, Acoustic Stimulation, Data Analysis, Electroencephalography, Auditory Evoked Potentials, Female, Humans, Male
7.
Eur J Neurosci ; 48(8): 2849-2856, 2018 10.
Article in English | MEDLINE | ID: mdl-29430753

ABSTRACT

Interruptions in auditory input can be perceptually restored if they coincide with a masking sound, resulting in a continuity illusion. Previous studies have shown that this continuity illusion is associated with reduced low-frequency neural oscillations in the auditory cortex. However, the precise contribution of oscillatory amplitude changes and phase alignment to auditory restoration remains unclear. Using electroencephalography, we investigated induced power changes and phase locking in response to 3 Hz amplitude-modulated tones during the interval of an interrupting noise. We experimentally manipulated both the physical continuity of the tone (continuous vs. interrupted) and the masking potential of the noise (notched vs. full). We observed an attenuation of 3 Hz power during continuity illusions in comparison with both continuous tones and veridically perceived interrupted tones. This illusion-related suppression of low-frequency oscillations likely reflects a blurring of auditory object boundaries that supports continuity perception. We further observed increased 3 Hz phase locking during fully masked continuous tones compared with the other conditions. This low-frequency phase alignment may reflect the neural registration of the interrupting noise as a newly appearing object, whereas during continuity illusions, a spectral portion of this noise is delegated to filling the interruption. Taken together, our findings suggest that the suppression of slow cortical oscillations in both the power and phase domains supports perceptual restoration of interruptions in auditory input.
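The 3 Hz phase locking analysed in this abstract is commonly quantified as inter-trial phase coherence: the magnitude of the trial-averaged unit phasor at the frequency of interest (1 = perfectly aligned phases across trials, near 0 = random phases). Below is a minimal sketch with simulated trials; the sampling rate, trial count, and jitter level are illustrative assumptions, not the study's parameters.

```python
import numpy as np

# Inter-trial phase coherence (ITC) at 3 Hz. Simulated data: 100 trials
# of a 3 Hz sinusoid with small random phase jitter, sampled at 250 Hz
# for 1 s (all parameters are illustrative).
rng = np.random.default_rng(1)
fs, f0, n_trials = 250, 3.0, 100
t = np.arange(fs) / fs                                  # 1 s of samples
jitter = rng.normal(scale=0.2, size=n_trials)           # phase jitter (rad)
trials = np.sin(2 * np.pi * f0 * t + jitter[:, None])   # (trials, time)

# Phase at the 3 Hz FFT bin for each trial, then the magnitude of the
# trial-averaged unit phasor: 1 = perfect locking, near 0 = none.
spec = np.fft.rfft(trials, axis=1)
freqs = np.fft.rfftfreq(trials.shape[1], d=1 / fs)
phases = np.angle(spec[:, np.argmin(np.abs(freqs - f0))])
itc = np.abs(np.mean(np.exp(1j * phases)))

print(itc > 0.9)  # prints True: small jitter leaves phases tightly locked
```

Larger jitter (or phase-randomized conditions) would drive the ITC toward zero, which is how condition differences in phase locking, as in the abstract, become measurable.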


Subject(s)
Acoustic Stimulation/methods, Auditory Perception/physiology, Electroencephalography/methods, Illusions/physiology, Perceptual Masking/physiology, Adult, Female, Humans, Male, Middle Aged
8.
Neuroimage ; 148: 230-239, 2017 03 01.
Article in English | MEDLINE | ID: mdl-28108395

ABSTRACT

Intersensory attention (IA) describes the process of directing attention to a specific modality. Temporal orienting (TO) characterizes directing attention to a specific moment in time. Previously, studies indicated that these two processes could have opposite effects on early evoked brain activity. The exact time-course and processing stages of both processes are still unknown. In this human electroencephalography study, we investigated the effects of IA and TO on visuo-tactile stimulus processing within one paradigm. IA was manipulated by presenting auditory cues to indicate whether participants should detect visual or tactile targets in visuo-tactile stimuli. TO was manipulated by presenting stimuli block-wise at fixed or variable inter-stimulus intervals. We observed that TO affects evoked activity to visuo-tactile stimuli prior to IA. Moreover, we found that TO reduces the amplitude of early evoked brain activity, whereas IA enhances it. Using beamformer source-localization, we observed that IA increases neural responses in sensory areas of the attended modality whereas TO reduces brain activity in widespread cortical areas. Based on these findings we derive an updated working model for the effects of temporal and intersensory attention on early evoked brain activity.


Subject(s)
Attention/physiology, Brain/physiology, Time Perception/physiology, Acoustic Stimulation, Brain Mapping, Cues (Psychology), Electroencephalography, Evoked Potentials/physiology, Female, Humans, Male, Orientation/physiology, Photic Stimulation, Reaction Time/physiology, Touch/physiology, Young Adult
9.
Cortex ; 74: 277-88, 2016 Jan.
Article in English | MEDLINE | ID: mdl-26716405

ABSTRACT

Intersensory attention (IA) describes our ability to attend to stimuli of one sensory modality, while disregarding other modalities. Temporal prediction (TP) describes the process of directing attention to specific moments in time. Both attention mechanisms facilitate sensory stimulus processing, yet it is not understood whether they rely on common or distinct network patterns. In this electroencephalography (EEG) study, we presented auditory cues followed by visuo-tactile stimuli. The cues indicated whether participants should detect visual or tactile targets in the visuo-tactile stimuli. TP was manipulated by presenting stimuli block-wise at fixed or variable inter-stimulus intervals. We analysed power and functional connectivity of source-projected oscillations. We computed graph theoretical measures to identify networks underlying IA and TP. Participants responded faster when stimuli were presented with fixed compared to variable inter-stimulus intervals, demonstrating a facilitating effect of TP. Distinct patterns of local delta-, alpha-, and beta-band power modulations and differential functional connectivity in the alpha- and beta-bands reflected the influence of IA and TP. An interaction between IA and TP was found in theta-band connectivity in a network comprising frontal, somatosensory and parietal areas. Our study provides insights into how IA and TP dynamically shape oscillatory power and functional connectivity to facilitate stimulus processing.


Subject(s)
Attention/physiology, Auditory Perception/physiology, Brain/physiology, Nerve Net/physiology, Touch Perception/physiology, Visual Perception/physiology, Acoustic Stimulation, Alpha Rhythm/physiology, Cues (Psychology), Female, Humans, Male, Photic Stimulation, Physical Stimulation, Theta Rhythm/physiology, Young Adult
10.
Neuroimage ; 125: 724-730, 2016 Jan 15.
Article in English | MEDLINE | ID: mdl-26546865

ABSTRACT

In everyday life we are confronted with inputs of multisensory stimuli that need to be integrated across our senses. Individuals vary considerably in how they integrate multisensory information, yet the neurochemical foundations underlying this variability are not well understood. Neural oscillations, especially in the gamma band (>30 Hz), play an important role in multisensory processing. Furthermore, gamma-aminobutyric acid (GABA) neurotransmission contributes to the generation of gamma band oscillations (GBO), which can be sustained by activation of metabotropic glutamate receptors. Hence, differences in the GABA and glutamate systems might contribute to individual differences in multisensory processing. In this combined magnetic resonance spectroscopy and electroencephalography study, we examined the relationships between GABA and glutamate concentrations in the superior temporal sulcus (STS), source localized GBO, and illusion rate in the sound-induced flash illusion (SIFI). In 39 human volunteers we found robust relationships between GABA concentration, GBO power, and the SIFI perception rate (r values = 0.44 to 0.53). The correlation between GBO power and SIFI perception rate was about twofold higher when the modulating influence of the GABA level was included in the analysis as compared to when it was excluded. No significant effects were obtained for glutamate concentration. Our study suggests that the GABA level shapes individual differences in audiovisual perception through its modulating influence on GBO. GABA neurotransmission could be a promising target for treatment interventions of multisensory processing deficits in clinical populations, such as schizophrenia or autism.


Subject(s)
Optical Illusions/physiology, Temporal Lobe/physiology, Visual Perception/physiology, gamma-Aminobutyric Acid/biosynthesis, Acoustic Stimulation, Adolescent, Adult, Auditory Perception/physiology, Electroencephalography, Female, Humans, Magnetic Resonance Imaging, Magnetic Resonance Spectroscopy, Male, Middle Aged, Photic Stimulation, Young Adult, gamma-Aminobutyric Acid/analysis
11.
PLoS One ; 9(8): e103238, 2014.
Article in English | MEDLINE | ID: mdl-25116195

ABSTRACT

Many electronic devices that we use in our daily lives provide inputs that need to be processed and integrated by our senses. For instance, ringing, vibrating, and flashing indicate incoming calls and messages in smartphones. Whether the presentation of multiple smartphone stimuli simultaneously provides an advantage over the processing of the same stimuli presented in isolation has not yet been investigated. In this behavioral study we examined multisensory processing between visual (V), tactile (T), and auditory (A) stimuli produced by a smartphone. Unisensory V, T, and A stimuli as well as VA, AT, VT, and trisensory VAT stimuli were presented in random order. Participants responded to any stimulus appearance by touching the smartphone screen using the stimulated hand (Experiment 1), or the non-stimulated hand (Experiment 2). We examined violations of the race model to test whether shorter response times to multisensory stimuli exceed probability summations of unisensory stimuli. Significant violations of the race model, indicative of multisensory processing, were found for VA stimuli in both experiments and for VT stimuli in Experiment 1. Across participants, the strength of this effect was not associated with prior learning experience and daily use of smartphones. This indicates that this integration effect, similar to what has been previously reported for the integration of semantically meaningless stimuli, could involve bottom-up driven multisensory processes. Our study demonstrates for the first time that multisensory processing of smartphone stimuli facilitates taking a call. Thus, research on multisensory integration should be taken into consideration when designing electronic devices such as smartphones.
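The race-model test used in this abstract compares the cumulative RT distribution for bisensory stimuli against the bound formed by summing the unisensory cumulative distributions; where the bisensory distribution exceeds that bound, probability summation alone cannot explain the speed-up. Below is a minimal sketch with simulated RTs; all distributions and parameters are hypothetical, not the study's data.

```python
import numpy as np

# Race model inequality (Miller's bound): under a pure race between
# modalities, P(RT_va <= t) cannot exceed P(RT_v <= t) + P(RT_a <= t);
# violations suggest genuine multisensory integration.
rng = np.random.default_rng(2)
rt_v = rng.normal(360, 40, 500)    # simulated visual RTs (ms)
rt_a = rng.normal(350, 40, 500)    # simulated auditory RTs (ms)
rt_va = rng.normal(300, 35, 500)   # simulated visual-auditory RTs (ms)

def ecdf(samples, t):
    """Empirical CDF of `samples` evaluated at each time in `t`."""
    return (samples[:, None] <= t[None, :]).mean(axis=0)

t = np.linspace(200, 500, 61)
bound = np.minimum(ecdf(rt_v, t) + ecdf(rt_a, t), 1.0)  # race-model bound
violation = ecdf(rt_va, t) - bound

# Positive values mark latencies at which the race model is violated.
print(bool(violation.max() > 0))  # prints True for these simulated data
```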


Subject(s)
Cell Phone, Sensation, Acoustic Stimulation, Adult, Female, Humans, Male, Photic Stimulation, Reaction Time, Surveys and Questionnaires, Young Adult
12.
Hum Brain Mapp ; 35(7): 3107-21, 2014 Jul.
Article in English | MEDLINE | ID: mdl-24123535

ABSTRACT

In normal-hearing listeners, localization of auditory speech involves stimulus processing in the postero-dorsal pathway of the auditory system. In quiet environments, bilateral cochlear implant (CI) users show high speech recognition performance, but localization of auditory speech is poor, especially when discriminating stimuli from the same hemifield. Whether this difficulty relates to the inability of the auditory system to translate binaural electrical cues into neural signals, or to a functional reorganization of auditory cortical pathways following long periods of binaural deprivation is unknown. In this electroencephalography study, we examined the processing of auditory syllables in postlingually deaf adults with bilateral CIs and in normal-hearing adults. Participants were instructed to either recognize ("recognition" task) or localize ("localization" task) the syllables. The analysis focused on event-related potentials and oscillatory brain responses. N1 amplitudes in CI users were larger in the localization compared with recognition task, suggesting an enhanced stimulus processing effort in the localization task. Linear beamforming of oscillatory activity in CI users revealed stronger suppression of beta-band activity after 200 ms in the postero-dorsal auditory pathway for the localization compared with the recognition task. In normal-hearing adults, effects for longer latency event-related potentials were found, but no effects were observed for N1 amplitudes or beta-band responses. Our study suggests that difficulties in speech localization in bilateral CI users are not reflected in a functional reorganization of cortical auditory pathways. New signal processing strategies of cochlear devices preserving unambiguous binaural cues may improve auditory localization performance in bilateral CI users.


Subject(s)
Auditory Pathways/physiopathology, Beta Rhythm/physiology, Deafness/physiopathology, Sound Localization/physiology, Speech Perception/physiology, Speech, Acoustic Stimulation, Adult, Analysis of Variance, Brain Mapping, Cochlear Implantation/methods, Cochlear Implants, Deafness/therapy, Electroencephalography, Female, Fourier Analysis, Humans, Male, Middle Aged
13.
Neuroimage ; 56(4): 2200-8, 2011 Jun 15.
Article in English | MEDLINE | ID: mdl-21497200

ABSTRACT

A major determinant of multisensory integration, derived from single-neuron studies in animals, is the principle of inverse effectiveness (IE), which describes the phenomenon whereby maximal multisensory response enhancements occur when the constituent unisensory stimuli are minimally effective in evoking responses. Human behavioral studies, which have shown that multisensory interactions are strongest when stimuli are low in intensity, are in agreement with the IE principle, but the neurophysiologic basis for this finding is unknown. In this high-density electroencephalography (EEG) study, we examined effects of stimulus intensity on multisensory audiovisual processing in event-related potentials (ERPs) and response time (RT) facilitation in the bisensory redundant target effect (RTE). The RTE refers to the finding that RTs are faster for bisensory redundant targets than for the respective unisensory targets. Participants were presented with semantically meaningless unisensory auditory, unisensory visual, and bisensory audiovisual stimuli of low, middle, and high intensity, and were instructed to make a speeded button response when a stimulus in either modality was presented. Behavioral data showed that the RTE exceeded predictions on the basis of probability summations of unisensory RTs, indicative of integrative multisensory processing, but only for low intensity stimuli. Paralleling this finding, multisensory interactions in short latency (40-60 ms) ERPs with a left posterior and right anterior topography were found particularly for stimuli with low intensity. Our findings demonstrate that the IE principle is applicable to early multisensory processing in humans.


Subject(s)
Auditory Perception/physiology, Brain/physiology, Evoked Potentials/physiology, Visual Perception/physiology, Acoustic Stimulation, Adult, Electroencephalography, Humans, Middle Aged, Photic Stimulation, Reaction Time/physiology, Computer-Assisted Signal Processing, Young Adult
14.
J Neurosci ; 31(7): 2502-10, 2011 Feb 16.
Article in English | MEDLINE | ID: mdl-21325518

ABSTRACT

When visual sensory information is restricted, we often rely on haptic and auditory information to recognize objects. Here we examined how haptic exploration of familiar objects affects neural processing of subsequently presented sounds of objects. Recent studies indicated that oscillatory responses, in particular in the gamma band (30-100 Hz), reflect cross-modal processing, but it is not clear which cortical networks are involved. In this high-density EEG study, we measured gamma-band activity (GBA) in humans performing a haptic-to-auditory priming paradigm. Haptic stimuli served as primes, and sounds of objects as targets. Haptic and auditory stimuli were either semantically congruent or incongruent, and participants were asked to categorize the objects represented by the sounds. Response times were shorter for semantically congruent compared with semantically incongruent inputs. This haptic-to-auditory priming effect was associated with enhanced total power GBA (250-350 ms) for semantically congruent inputs and additional effects of semantic congruency on evoked GBA (50-100 ms). Source reconstruction of total GBA using linear beamforming revealed effects of semantic congruency in the left lateral temporal lobe, possibly reflecting matching of information across modalities. For semantically incongruent inputs, total GBA was enhanced in middle frontal cortices, possibly indicating the processing or detection of conflicting information. Our findings demonstrate that semantic priming by haptic object exploration affects processing of auditory inputs in the lateral temporal lobe and suggest an important role of oscillatory activity for multisensory processing.


Subject(s)
Auditory Perception/physiology, Brain Mapping, Evoked Potentials/physiology, Recognition (Psychology)/physiology, Touch/physiology, Acoustic Stimulation/methods, Adult, Female, Humans, Male, Reaction Time/physiology, Semantics, Spectrum Analysis, Time Factors, Young Adult
15.
Exp Brain Res ; 198(2-3): 313-28, 2009 Sep.
Article in English | MEDLINE | ID: mdl-19495733

ABSTRACT

The temporal asynchrony between inputs to different sensory modalities has been shown to be a critical factor influencing the interaction between such inputs. We used scalp-recorded event-related potentials (ERPs) to investigate the effects of attention on the processing of audiovisual multisensory stimuli as the temporal asynchrony between the auditory and visual inputs varied across the audiovisual integration window (i.e., up to 125 ms). Randomized streams of unisensory auditory stimuli, unisensory visual stimuli, and audiovisual stimuli (consisting of the temporally proximal presentation of the visual and auditory stimulus components) were presented centrally while participants attended to either the auditory or the visual modality to detect occasional target stimuli in that modality. ERPs elicited by each of the contributing sensory modalities were extracted by signal processing techniques from the combined ERP waveforms elicited by the multisensory stimuli. This was done for each of the five different 50-ms subranges of stimulus onset asynchrony (SOA: e.g., V precedes A by 125-75 ms, by 75-25 ms, etc.). The extracted ERPs for the visual inputs of the multisensory stimuli were compared among each other and with the ERPs to the unisensory visual control stimuli, separately when attention was directed to the visual or to the auditory modality. The results showed that the attention effect on the right-hemisphere visual P1 was largest when auditory and visual stimuli were temporally aligned. In contrast, the N1 attention effect was smallest at this latency, suggesting that attention may play a role in the processing of the relative temporal alignment of the constituent parts of multisensory stimuli. At longer latencies, an occipital selection negativity for the attended versus unattended visual stimuli was also observed, but this effect did not vary as a function of SOA, suggesting that by that latency a stable representation of the auditory and visual stimulus components had been established.


Subject(s)
Attention/physiology, Auditory Perception/physiology, Brain/physiology, Visual Perception/physiology, Acoustic Stimulation, Adult, Electroencephalography, Evoked Potentials, Female, Humans, Male, Multivariate Analysis, Neuropsychological Tests, Photic Stimulation, Reaction Time, Computer-Assisted Signal Processing, Task Performance and Analysis, Time Factors
16.
Exp Brain Res ; 198(2-3): 363-72, 2009 Sep.
Article in English | MEDLINE | ID: mdl-19458939

ABSTRACT

In real-world situations, the integration of sensory information in working memory (WM) is an important mechanism for the recognition of objects. Studies in single sensory modalities show that object recognition is facilitated if bottom-up inputs match a template held in WM, and that this effect may be linked to enhanced synchronization of neurons in the gamma-band (>30 Hz). Natural objects, however, frequently provide inputs to multiple sensory modalities. In this EEG study, we examined the integration of semantically matching or non-matching visual and auditory inputs using a delayed visual-to-auditory object-matching paradigm. In the event-related potentials (ERPs) triggered by auditory inputs, effects of semantic matching were observed after 120-170 ms at frontal and posterior regions, indicating WM-specific processing across modalities, and after 250-400 ms over medial-central regions, possibly reflecting the contextual integration of sensory inputs. Additionally, total gamma-band activity (GBA) with medial-central topography after 120-180 ms was larger for matching compared to non-matching trials. This demonstrates that multisensory matching in WM is reflected by GBA and that dynamic coupling of neural populations in this frequency range might be a crucial mechanism for integrative multisensory processes.


Subject(s)
Auditory Perception/physiology, Brain/physiology, Memory, Short-Term/physiology, Visual Perception/physiology, Acoustic Stimulation, Adult, Analysis of Variance, Brain Mapping, Electroencephalography, Evoked Potentials, Female, Humans, Male, Neuropsychological Tests, Photic Stimulation, Reaction Time, Time Factors, Young Adult
17.
Neuroimage ; 36(3): 877-88, 2007 Jul 01.
Article in English | MEDLINE | ID: mdl-17481922

ABSTRACT

In everyday life, we continuously and effortlessly integrate the multiple sensory inputs from objects in motion. For instance, the sound and the visual percept of vehicles in traffic provide us with complementary information about the location and motion of vehicles. Here, we used high-density electrical mapping and local auto-regressive average (LAURA) source estimation to study the integration of multisensory objects in motion as reflected in event-related potentials (ERPs). A randomized stream of naturalistic multisensory-audiovisual (AV), unisensory-auditory (A), and unisensory-visual (V) "splash" clips (i.e., a drop falling and hitting a water surface) was presented among non-naturalistic abstract motion stimuli. The visual clip onset preceded the "splash" onset by 100 ms for multisensory stimuli. For naturalistic objects, early multisensory integration effects beginning 120-140 ms after sound onset were observed over posterior scalp, with distributed sources localized to occipital cortex, temporal lobe, insular cortex, and medial frontal gyrus (MFG). These effects, together with longer latency interactions (210-250 and 300-350 ms) found in a widespread network of occipital, temporal, and frontal areas, suggest that naturalistic objects in motion are processed at multiple stages of multisensory integration. The pattern of integration effects differed considerably for non-naturalistic stimuli. Unlike naturalistic objects, no early interactions were found for non-naturalistic objects. The earliest integration effects for non-naturalistic stimuli were observed 210-250 ms after sound onset including large portions of the inferior parietal cortex (IPC). As such, there were clear differences in the cortical networks activated by multisensory motion stimuli as a consequence of the semantic relatedness (or lack thereof) of the constituent sensory elements.


Subject(s)
Brain/physiology, Motion Perception/physiology, Acoustic Stimulation, Adult, Brain Mapping, Data Interpretation, Statistical, Electroencephalography, Evoked Potentials/physiology, Female, Humans, Linear Models, Male, Models, Neurological, Parietal Lobe/physiology, Photic Stimulation, Prefrontal Cortex/physiology
18.
Neuropsychologia ; 45(3): 561-71, 2007 Feb 01.
Article in English | MEDLINE | ID: mdl-16542688

ABSTRACT

The synchronous occurrence of the unisensory components of a multisensory stimulus contributes to their successful merging into a coherent perceptual representation. Oscillatory gamma-band responses (GBRs, 30-80 Hz) have been linked to feature integration mechanisms and to multisensory processing, suggesting they may also be sensitive to the temporal alignment of multisensory stimulus components. Here we examined the effects on early oscillatory GBR brain activity of varying the precision of the temporal synchrony of the unisensory components of an audio-visual stimulus. Audio-visual stimuli were presented with stimulus onset asynchronies ranging from -125 to +125 ms. Randomized streams of auditory (A), visual (V), and audio-visual (AV) stimuli were presented centrally while subjects attended to either the auditory or visual modality to detect occasional targets. GBRs to auditory and visual components of multisensory AV stimuli were extracted for five subranges of asynchrony (e.g., A preceded by V by 100 ± 25 ms, by 50 ± 25 ms, etc.) and compared with GBRs to unisensory control stimuli. Robust multisensory interactions were observed in the early GBRs when the auditory and visual stimuli were presented with the closest synchrony. These effects were found over medial-frontal brain areas after 30-80 ms and over occipital brain areas after 60-120 ms. A second integration effect, possibly reflecting the perceptual separation of the two sensory inputs, was found over occipital areas when auditory inputs preceded visual by 100 ± 25 ms. No significant interactions were observed for the other subranges of asynchrony. These results show that the precision of temporal synchrony can have an impact on early cross-modal interactions in human cortex.


Subject(s)
Auditory Perception/physiology, Brain Mapping, Cerebral Cortex/physiology, Evoked Potentials/physiology, Visual Perception/physiology, Acoustic Stimulation/methods, Adult, Analysis of Variance, Biological Clocks, Electroencephalography, Female, Functional Laterality, Humans, Male, Photic Stimulation/methods, Reaction Time/physiology
19.
Cereb Cortex ; 16(11): 1556-65, 2006 Nov.
Article in English | MEDLINE | ID: mdl-16357336

ABSTRACT

Bisensory redundant targets are processed faster than the respective unisensory target stimuli alone as evidenced by substantially faster reaction times (RTs). This multisensory RT facilitation has been interpreted as an expression of integrative processing between the different sensory modalities. However, the neuronal mechanisms underlying the RT facilitation effect are not well understood. Oscillatory responses in the beta frequency range (13-30 Hz) have been related to sensory-motor processing. Here, we investigated whether modulation of beta responses might also underlie the faster RTs seen for multisensory stimuli. Using high-density electrical mapping, we explored the association between early (50-170 ms) multisensory processing in the evoked beta response and RTs recorded during a simple RT task. Subjects were instructed to indicate the appearance of any stimulus in a stream of auditory-alone (A), visual-alone (V), and multisensory (AV) stimuli by a button press. Beta responses were analyzed using Morlet wavelet transformations. Multisensory interactions were found over frontal, occipital, central, and sensory-motor regions. Critically, beta activity correlated with mean RTs over all stimulus types. Significant negative correlations were found for frontal, occipital, and sensory-motor scalp regions. We conclude that the association between oscillatory beta activity and integrative multisensory processing is directly linked to multisensory RT facilitation effects.


Subject(s)
Beta Rhythm, Reaction Time/physiology, Acoustic Stimulation, Adult, Brain Mapping, Electroencephalography/statistics & numerical data, Electrophysiology, Female, Humans, Male, Models, Neurological, Models, Statistical, Photic Stimulation
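The analysis named in the abstract above (a Morlet wavelet transform of the evoked response, mean beta power in an early window, correlated across subjects with mean RTs) can be sketched with plain NumPy/SciPy. This is a minimal illustration under stated assumptions, not the authors' pipeline: the data are synthetic, and every function and variable name is hypothetical.

```python
import numpy as np
from scipy.stats import pearsonr

def morlet_power(signal, fs, freq, n_cycles=7):
    """Power of a 1-D signal at one frequency via a complex Morlet wavelet."""
    sigma_t = n_cycles / (2 * np.pi * freq)          # wavelet time spread
    t = np.arange(-4 * sigma_t, 4 * sigma_t, 1 / fs)
    wavelet = np.exp(2j * np.pi * freq * t) * np.exp(-t**2 / (2 * sigma_t**2))
    wavelet /= np.sqrt((np.abs(wavelet) ** 2).sum())  # unit energy
    return np.abs(np.convolve(signal, wavelet, mode="same")) ** 2

def beta_power(erp, fs, band=(13, 30)):
    """Mean power across the beta band for one evoked (averaged) waveform."""
    freqs = np.arange(band[0], band[1] + 1)
    return np.mean([morlet_power(erp, fs, f) for f in freqs], axis=0)

# Hypothetical per-subject data: one evoked waveform and one mean RT each
fs = 500.0
rng = np.random.default_rng(1)
n_subjects = 12
rts = rng.normal(350, 40, n_subjects)              # mean RTs, in ms
erps = rng.standard_normal((n_subjects, int(fs)))  # 1-s evoked waveforms
# Mean beta power in an early 50-170 ms window, per subject
win = slice(int(0.05 * fs), int(0.17 * fs))
power = np.array([beta_power(e, fs)[win].mean() for e in erps])
r, p = pearsonr(power, rts)  # across-subject beta-power/RT correlation
```

With real data, `power` would be computed per scalp region and stimulus type; a negative `r`, as reported for frontal, occipital, and sensory-motor regions, would indicate that larger early beta responses go with faster responses.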
20.
Exp Brain Res ; 166(3-4): 411-26, 2005 Oct.
Article in English | MEDLINE | ID: mdl-16151775

ABSTRACT

Here we describe an EEG study investigating the interactions between multisensory (audio-visual) integration and spatial attention, using oscillatory gamma-band responses (GBRs). The results include a comparison with previously reported event-related potential (ERP) findings from the same paradigm. Unisensory-auditory (A), unisensory-visual (V), and multisensory (AV) stimuli were presented to the left and right hemispaces while subjects attended to a designated side to detect deviant target stimuli in either sensory modality. For attended multisensory stimuli we observed larger evoked GBRs approximately 40-50 ms post-stimulus over medial-frontal brain areas compared with those same multisensory stimuli when unattended. Further analysis indicated that the integration effect and its attentional enhancement may be caused in part by a stimulus-triggered phase resetting of ongoing gamma-band responses. Interestingly, no such early interaction effects (<90 ms) could be found in the ERP waveforms, suggesting that oscillatory GBRs may be more sensitive than ERPs to these early latency attention effects. Moreover, no GBR attention effects could be found for the unisensory auditory or unisensory visual stimuli, suggesting that attention particularly affects the integrative processing of audiovisual stimuli at these early latencies.


Subject(s)
Attention/physiology, Electroencephalography, Mental Processes/physiology, Space Perception/physiology, Acoustic Stimulation, Adult, Brain Mapping, Electrophysiology, Evoked Potentials/physiology, Female, Functional Laterality/physiology, Humans, Male, Photic Stimulation, Reaction Time/physiology
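Stimulus-triggered phase resetting of ongoing oscillations, as invoked in the abstract above, is commonly assessed with inter-trial phase coherence (ITC): if the stimulus resets the phase, single-trial phases align across trials and the mean unit phase vector approaches length 1. A self-contained sketch on synthetic data follows; all names are hypothetical and this stands in for, rather than reproduces, the study's analysis.

```python
import numpy as np

def itc(trials, fs, freq, n_cycles=7):
    """Inter-trial phase coherence of single-trial EEG at one frequency.

    trials : array (n_trials, n_samples)
    Returns an array (n_samples,) with values in [0, 1]; values near 1
    indicate consistent phase alignment across trials at that time point.
    """
    sigma_t = n_cycles / (2 * np.pi * freq)
    t = np.arange(-4 * sigma_t, 4 * sigma_t, 1 / fs)
    wavelet = np.exp(2j * np.pi * freq * t) * np.exp(-t**2 / (2 * sigma_t**2))
    analytic = np.array([np.convolve(tr, wavelet, mode="same") for tr in trials])
    phases = analytic / np.abs(analytic)   # unit phase vector per trial/sample
    return np.abs(phases.mean(axis=0))     # length of the mean phase vector

# Synthetic check: phase-locked 40-Hz activity vs. pure noise
fs = 500.0
t = np.arange(0, 0.5, 1 / fs)
rng = np.random.default_rng(2)
locked = np.array([np.sin(2 * np.pi * 40 * t) + 0.5 * rng.standard_normal(t.size)
                   for _ in range(30)])
noise = rng.standard_normal((30, t.size))
print(itc(locked, fs, 40).mean() > itc(noise, fs, 40).mean())  # True
```

Because ITC ignores amplitude, a post-stimulus ITC increase without a matching increase in total power is the usual signature of phase resetting of ongoing activity, as opposed to an additive evoked response.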