Results 1 - 8 of 8
1.
Neuroimage; 122: 203-13, 2015 Nov 15.
Article in English | MEDLINE | ID: mdl-26244276

ABSTRACT

In everyday life, our auditory system is bombarded with many signals in complex auditory scenes. Limited processing capacities allow only a fraction of these signals to enter perceptual awareness. This magnetoencephalography (MEG) study used informational masking to identify the neural mechanisms that enable auditory awareness. On each trial, participants indicated whether they detected a pair of sequentially presented tones (i.e., the target) that were embedded within a multi-tone background. We analysed MEG activity for 'hits' and 'misses', separately for the first and second tones within a target pair. Comparing physically identical stimuli that were detected or missed provided insights into the neural processes underlying auditory awareness. While the first tone within a target elicited a stronger early P50m on hit trials, only the second tone evoked a negativity at 150 ms, which may index segregation of the tone pair from the multi-tone background. Notably, a later sustained deflection peaking between 300 and 500 ms (P300m) was the only component that was significantly amplified for both tones when they were detected, pointing towards its key role in perceptual awareness. Additional Dynamic Causal Modelling analyses indicated that the negativity at 150 ms underlying auditory stream segregation is mediated predominantly via changes in intrinsic connectivity within auditory cortices. By contrast, the later P300m response, as a signature of perceptual awareness, relies on interactions between parietal and auditory cortices. In conclusion, our results suggest that successful detection, and hence auditory awareness, of a two-tone pair within complex auditory scenes relies on recurrent processing between auditory and higher-order parietal cortices.


Subjects
Auditory Perception/physiology, Cerebral Cortex/physiology, Acoustic Stimulation, Adult, Awareness/physiology, Brain Mapping, Event-Related Potentials, P300, Evoked Potentials, Auditory, Female, Humans, Magnetic Resonance Imaging, Magnetoencephalography, Male, Young Adult
2.
Hum Brain Mapp; 34(10): 2511-23, 2013 Oct.
Article in English | MEDLINE | ID: mdl-22505330

ABSTRACT

Joint attention behaviors include initiating one's own and responding to another's bid for joint attention to an object, person, or topic. Joint attention abilities in autism are pervasively atypical, correlate with development of language and social abilities, and discriminate children with autism from other developmental disorders. Despite the importance of these behaviors, the neural correlates of joint attention in individuals with autism remain unclear. This paucity of data is likely due to the inherent challenge of acquiring data during a real-time social interaction. We used a novel experimental set-up in which participants engaged with an experimenter in an interactive face-to-face joint attention game during fMRI data acquisition. Both initiating joint attention (IJA) and responding to joint attention (RJA) behaviors were examined, as well as a solo attention (SA) control condition. Participants included adults with autism spectrum disorder (ASD) (n = 13), a mean age- and sex-matched neurotypical group (n = 14), and a separate group of neurotypical adults (n = 22). Significant differences were found between groups within social-cognitive brain regions, including the dorsal medial prefrontal cortex (dMPFC) and right posterior superior temporal sulcus (pSTS), during the RJA as compared to SA conditions. Region-of-interest analyses revealed a lack of signal differentiation between joint attention and control conditions within the left pSTS and dMPFC in individuals with ASD. Within the pSTS, this lack of differentiation was characterized by reduced activation during joint attention and relative hyper-activation during SA. These findings suggest a possible failure of developmental neural specialization within the STS and dMPFC to joint attention in ASD.


Subjects
Attention/physiology, Brain Mapping, Brain/physiopathology, Child Development Disorders, Pervasive/physiopathology, Games, Experimental, Interpersonal Relations, Magnetic Resonance Imaging, Adolescent, Adult, Communication, Female, Humans, Intelligence, Male, Prefrontal Cortex/physiopathology, Psychomotor Performance, Temporal Lobe/physiopathology, Video Recording, Young Adult
3.
Neuroimage; 60(2): 1478-89, 2012 Apr 02.
Article in English | MEDLINE | ID: mdl-22305992

ABSTRACT

To form a unified percept of our environment, the human brain integrates information within and across the senses. This MEG study investigated interactions within and between sensory modalities using a frequency analysis of steady-state responses (SSRs) that are elicited time-locked to periodically modulated stimuli. Critically, in the frequency domain, interactions between sensory signals are indexed by crossmodulation terms (i.e. the sums and differences of the fundamental frequencies). The 3 × 2 factorial design manipulated (1) modality: auditory, visual, or audiovisual, and (2) steady-state modulation: the auditory and visual signals were modulated in only one sensory feature (e.g. visual gratings modulated in luminance at 6 Hz) or in two features (e.g. tones modulated in frequency at 40 Hz & amplitude at 0.2 Hz). This design enabled us to investigate crossmodulation frequencies that are elicited when two stimulus features are modulated concurrently (i) in one sensory modality or (ii) in auditory and visual modalities. In support of within-modality integration, we reliably identified crossmodulation frequencies when two stimulus features in one sensory modality were modulated at different frequencies. In contrast, no crossmodulation frequencies were identified when information needed to be combined from auditory and visual modalities. The absence of audiovisual crossmodulation frequencies suggests that the previously reported audiovisual interactions in primary sensory areas may mediate low-level spatiotemporal coincidence detection that is prominent for stimulus transients but less relevant for sustained SSRs. In conclusion, our results indicate that information in SSRs is integrated over multiple time scales within, but not across, sensory modalities at the primary cortical level.
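The crossmodulation logic can be sketched numerically: when two features modulated at f1 and f2 interact nonlinearly (here, multiplicatively), spectral energy appears at the sum and difference frequencies rather than at the fundamentals. This is an illustrative sketch only, not the study's analysis pipeline; the signal model, sampling rate, and duration are assumptions (the frequencies follow the paper's 6 Hz / 40 Hz example).

```python
import cmath
import math

# Illustrative sketch (not the study's pipeline): a multiplicative
# interaction of two modulations at f1 and f2 puts spectral energy at
# f2 - f1 and f2 + f1, not at the fundamentals.
fs = 1000           # sampling rate (Hz), assumed
dur = 2.0           # duration (s), assumed
f1, f2 = 6.0, 40.0  # modulation frequencies from the paper's example
n = int(fs * dur)

# sin(a) * sin(b) = 0.5*cos(a - b) - 0.5*cos(a + b)
signal = [math.sin(2 * math.pi * f1 * k / fs) *
          math.sin(2 * math.pi * f2 * k / fs) for k in range(n)]

def amplitude(sig, f, fs):
    """Single-bin DFT amplitude of `sig` at frequency f."""
    z = sum(s * cmath.exp(-2j * math.pi * f * k / fs)
            for k, s in enumerate(sig))
    return 2 * abs(z) / len(sig)

for f in (f1, f2, f2 - f1, f2 + f1):
    print(f"{f:5.1f} Hz: {amplitude(signal, f, fs):.3f}")
# Energy appears only at the crossmodulation frequencies 34 Hz and 46 Hz.
```

In the frequency analysis described above, the presence of such sum/difference peaks is what indexes a nonlinear interaction between the two modulated features.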


Subjects
Auditory Cortex/physiology, Auditory Perception/physiology, Magnetoencephalography, Visual Cortex/physiology, Visual Perception/physiology, Adult, Female, Humans, Male, Young Adult
4.
Neuroimage; 50(4): 1639-47, 2010 May 01.
Article in English | MEDLINE | ID: mdl-20096792

ABSTRACT

Cooperative social interaction is critical for human social development and learning. Despite the importance of social interaction, previous neuroimaging studies lack two fundamental components of everyday face-to-face interactions: contingent responding and joint attention. In the current studies, functional MRI data were collected while participants interacted with a human experimenter face-to-face via live video feed as they engaged in simple cooperative games. In Experiment 1, participants engaged in a live interaction with the experimenter ("Live") or watched a video of the same interaction ("Recorded"). During the "Live" interaction, as compared to the "Recorded" condition, greater activation was seen in brain regions involved in social cognition and reward, including the right temporoparietal junction (rTPJ), anterior cingulate cortex (ACC), right superior temporal sulcus (rSTS), ventral striatum, and amygdala. Experiment 2 isolated joint attention, a critical component of social interaction. Participants either followed the gaze of the live experimenter to a shared target of attention ("Joint Attention") or found the target of attention alone while the experimenter was visible but not sharing attention ("Solo Attention"). The right temporoparietal junction and right posterior STS were differentially recruited during Joint, as compared to Solo, attention. These findings suggest the rpSTS and rTPJ are key regions for both social interaction and joint attention. This method of allowing online, contingent social interactions in the scanner could open up new avenues of research in social cognitive neuroscience, in both typical and atypical populations.


Subjects
Attention/physiology, Brain Mapping/methods, Brain/physiology, Interpersonal Relations, Magnetic Resonance Imaging/methods, Adolescent, Adult, Female, Games, Experimental, Humans, Male, Neuropsychological Tests, Photic Stimulation, Video Recording, Young Adult
5.
J Vis; 10(10): 27, 2010 Aug 26.
Article in English | MEDLINE | ID: mdl-20884492

ABSTRACT

When the two eyes are presented with dissimilar images, human observers report alternating percepts, a phenomenon termed binocular rivalry. These perceptual fluctuations reflect competition between the two visual inputs at both monocular and binocular processing stages. Here we investigated the influence of auditory stimulation on the temporal dynamics of binocular rivalry. In three psychophysics experiments, we investigated whether sounds that provide directionally congruent, incongruent, or non-motion information modulate the dominance periods of rivaling visual motion percepts. Visual stimuli were dichoptically presented random-dot kinematograms (RDKs) at different levels of motion coherence. The results show that directional motion sounds, rather than auditory input per se, influenced the temporal dynamics of binocular rivalry. In all experiments, motion sounds prolonged the dominance periods of the directionally congruent visual motion percept. In contrast, motion sounds abbreviated the suppression periods of the directionally congruent visual motion percepts only when they competed with directionally incongruent percepts. Therefore, analogous to visual contextual effects, auditory motion interacted primarily with consciously perceived visual input rather than visual input suppressed from awareness. Our findings suggest that auditory modulation of perceptual dominance times might be established in a top-down fashion by means of feedback mechanisms.


Subjects
Auditory Perception/physiology, Dominance, Ocular/physiology, Vision Disparity/physiology, Vision, Binocular/physiology, Visual Perception/physiology, Adult, Female, Humans, Male, Photic Stimulation/methods, Visual Cortex/physiology, Young Adult
6.
Behav Res Methods; 42(1): 212-25, 2010 Feb.
Article in English | MEDLINE | ID: mdl-20160301

ABSTRACT

Although computer keyboards and mice are frequently used in measuring response times (RTs), the accuracy of these measurements is quite low. Specialized RT collection devices must be used to obtain more accurate measurements. However, all the existing devices have some shortcomings. We have developed and implemented a new, commercially available device, the RTbox, for highly accurate RT measurements. The RTbox has its own microprocessor and high-resolution clock. It can record the identities and timing of button events with high accuracy, unaffected by potential timing uncertainty or biases during data transmission and processing in the host computer. It stores button events until the host computer chooses to retrieve them. The asynchronous storage greatly simplifies the design of user programs. The RTbox can also receive and record external signals as triggers and can measure RTs with respect to external events. The internal clock of the RTbox can be synchronized with the computer clock, so the device can be used without external triggers. A simple USB connection is sufficient to integrate the RTbox with any standard computer and operating system.
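The clock synchronization the abstract mentions can be sketched as a linear mapping from device time to host time, estimating offset and drift from paired clock readings. This is a generic illustration under assumed names and numbers, not the RTbox API or its actual synchronization algorithm.

```python
# Generic sketch of device-to-host clock synchronization (illustrative
# only; names and values are hypothetical, not the RTbox API).
# Fit host = a * device + b by least squares over paired clock readings,
# so later button timestamps can be expressed on the host clock.
def fit_clock_map(device_ts, host_ts):
    n = len(device_ts)
    mx = sum(device_ts) / n
    my = sum(host_ts) / n
    sxx = sum((x - mx) ** 2 for x in device_ts)
    sxy = sum((x - mx) * (y - my) for x, y in zip(device_ts, host_ts))
    a = sxy / sxx       # drift (clock-rate ratio)
    b = my - a * mx     # constant offset
    return a, b

# Simulated readings: device clock runs 100 ppm fast, host started 5 s later.
device = [i * 0.1 for i in range(100)]
host = [5.0 + t / 1.0001 for t in device]
a, b = fit_clock_map(device, host)

def to_host(t_device):
    """Map a device timestamp onto the host clock."""
    return a * t_device + b

print(to_host(12.34))  # a device timestamp expressed in host time
```

Fitting over many readings averages out jitter in the individual clock queries, which is why a regression is preferable to a single offset measurement.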


Subjects
Reaction Time/physiology, Time Perception/physiology, Humans, Models, Psychological
7.
PLoS One; 8(8): e70710, 2013.
Article in English | MEDLINE | ID: mdl-24015176

ABSTRACT

Rapid integration of biologically relevant information is crucial for the survival of an organism. Most prominently, humans should be biased to attend and respond to looming stimuli that signal approaching danger (e.g. a predator) and hence require rapid action. This psychophysics study used binocular rivalry to investigate the perceptual advantage of looming (relative to receding) visual signals (i.e. the looming bias) and how this bias can be influenced by concurrent auditory looming/receding stimuli and the statistical structure of the auditory and visual signals. Subjects were dichoptically presented with looming/receding visual stimuli that were paired with looming or receding sounds. The visual signals conformed to two different statistical structures: (1) a 'simple' random-dot kinematogram showing a starfield and (2) a 'naturalistic' visual Shepard stimulus. Likewise, the looming/receding sound was (1) a simple amplitude- and frequency-modulated (AM-FM) tone or (2) a complex Shepard tone. Our results show that the perceptual looming bias (i.e. the increase in dominance times for looming versus receding percepts) is amplified by looming sounds, yet reduced and even converted into a receding bias by receding sounds. Moreover, the influence of looming/receding sounds on the visual looming bias depends on the statistical structure of both the visual and auditory signals: it is enhanced when the audiovisual signals are Shepard stimuli. In conclusion, visual perception prioritizes processing of biologically significant looming stimuli, especially when paired with looming auditory signals. Critically, these audiovisual interactions are amplified for statistically complex signals that are more naturalistic and known to engage neural processing at multiple levels of the cortical hierarchy.


Subjects
Auditory Perception, Vision, Binocular, Acoustic Stimulation, Adult, Distance Perception, Female, Humans, Male, Photic Stimulation, Psychophysics, Reaction Time, Young Adult
8.
Front Hum Neurosci; 6: 169, 2012.
Article in English | MEDLINE | ID: mdl-22737112

ABSTRACT

When engaging in joint attention, one person directs another person's attention to an object (Initiating Joint Attention, IJA), and the second person's attention follows (Responding to Joint Attention, RJA). As such, joint attention must occur within the context of a social interaction. This ability is critical to language and social development; yet the neural bases for this pivotal skill remain understudied. This paucity of research is likely due to the challenge of acquiring functional MRI data during a naturalistic, contingent social interaction. To examine the neural bases of both IJA and RJA, we implemented a dual-video set-up that allowed for a face-to-face interaction between subject and experimenter via video during fMRI data collection. In each trial, participants either followed the experimenter's gaze to a target (RJA) or cued the experimenter to look at the target (IJA). A control condition, solo attention (SA), was included in which the subject shifted gaze to a target while the experimenter closed her eyes. Block and event-related analyses were conducted and revealed common and distinct regions for IJA and RJA. Distinct regions included the ventromedial prefrontal cortex for RJA and the intraparietal sulcus and middle frontal gyrus for IJA (as compared to SA). Conjunction analyses revealed overlap in the dorsal medial prefrontal cortex (dMPFC) and right posterior superior temporal sulcus (pSTS) for IJA and RJA (as compared to SA) in the event-related analyses. Functional connectivity analyses during a resting baseline suggest that joint attention processes recruit distinct but interacting networks, including social-cognitive, voluntary attention orienting, and visual networks. This novel experimental set-up allowed for the identification of the neural bases of joint attention during a real-time interaction, and the findings suggest that, whether one is the initiator or responder, the dMPFC and right pSTS are selectively recruited during periods of joint attention.
