1.
Clin Neurophysiol ; 131(5): 1102-1118, 2020 May.
Article En | MEDLINE | ID: mdl-32200092

OBJECTIVE: Stroke lesions in non-auditory areas may affect higher-order central auditory processing. We sought to characterize auditory functions in chronic stroke survivors with unilateral arm/hand impairment using auditory evoked responses (AERs) together with lesion and perception metrics. METHODS: AERs were recorded in 29 stroke survivors and 14 controls with single tones, active and passive frequency oddballs, and a dual oddball with pitch-contour and time-interval deviants. Performance in speech-in-noise perception, mistuning detection, and moving-sound detection was assessed. Relationships between AERs, behaviour, and lesion overlap with functional networks were examined. RESULTS: Despite normal hearing, eight patients showed a unilateral AER in the hemisphere ipsilateral to the affected hand, with reduced amplitude compared to patients with bilateral AERs. Both groups showed increasing attenuation of later components. Hemispheric asymmetry of AER sources was reduced in the bilateral-AER patients. The N1 wave (100 ms latency) and the P2 (200 ms) were delayed in individuals with lesions in the basal ganglia and white matter, while lesions in the attention network reduced the frequency-MMN (mismatch negativity) response and increased the pitch-contour P3a response. Patients' impaired speech-in-noise perception was explained by AER measures and frequency-deviant detection performance in a multiple regression. CONCLUSION: AERs reflect disruption of auditory functions caused by damage outside the temporal lobe and further illuminate the complexity of the neural mechanisms underlying higher-order auditory perception. SIGNIFICANCE: Stroke survivors without obvious hearing problems may benefit from rehabilitation targeting central auditory processing.
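As an illustration of the final analysis step described above, the sketch below fits a multiple regression relating simulated per-subject AER measures and frequency-deviant detection performance to a speech-in-noise score. The predictor names, the simulated values, and the use of scikit-learn are assumptions for illustration only; the paper's actual variables and statistical pipeline are not reproduced here.

```python
# Hedged sketch: multiple regression of speech-in-noise scores on AER metrics.
# All variable names and values are hypothetical placeholders.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n = 29  # number of stroke survivors in the study

# Hypothetical per-subject measures (placeholders for real data).
X = np.column_stack([
    rng.normal(100, 10, n),    # N1 latency (ms)
    rng.normal(2.0, 0.5, n),   # MMN amplitude (arbitrary units)
    rng.uniform(0.5, 1.0, n),  # frequency-deviant hit rate
])
y = rng.normal(50, 10, n)      # speech-in-noise score (placeholder)

model = LinearRegression().fit(X, y)
r2 = model.score(X, y)         # variance explained by the AER/behavioural predictors
print(model.coef_, model.intercept_, r2)
```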


Auditory Perception/physiology; Evoked Potentials, Auditory/physiology; Hearing Loss; Magnetoencephalography/methods; Stroke/diagnostic imaging; Stroke/physiopathology; Acoustic Stimulation/methods; Adult; Aged; Brain/diagnostic imaging; Brain/physiopathology; Chronic Disease; Female; Humans; Magnetic Resonance Imaging/methods; Male; Middle Aged; Reaction Time/physiology
3.
Int J Pediatr Otorhinolaryngol ; 123: 26-32, 2019 Aug.
Article En | MEDLINE | ID: mdl-31055204

OBJECTIVE: To assess the capacity of two parental report questionnaires, the OMQ-14 and the ECLiPS, to support clinical decision-making in children affected by otitis media with effusion (OME). DESIGN: The OMQ-14 and ECLiPS were administered twice to 90 children aged 2-12 years, either three months apart or three months after surgery to insert ventilation tubes (VT). Children were subdivided according to clinical diagnosis into VT (n = 25) and active observation (AO; n = 20) groups and compared with healthy control children (n = 45). Data were analyzed at the group level using repeated-measures ANOVA, and at the individual level using receiver operating characteristic (ROC) curves and confusion matrices. RESULTS: Both the OMQ-14 and the ECLiPS were sensitive to the presence of OME and to improvements in hearing post-surgery. Both also classified children well into their clinically established diagnostic groups based on score cut-offs determined from the ROC curves. However, confusion matrices suggest that only around 50% of children would be indistinguishable from controls following VT surgery. The two questionnaires differed in which children they identified as still having problems. The OMQ-14 is more sensitive to disease-related hearing loss, while the ECLiPS is more sensitive to developmental difficulties. CONCLUSIONS: Despite being developed with different aims in mind, the OMQ-14 and ECLiPS were similarly sensitive both to symptoms of disease-related hearing difficulty and to treatment-related improvements in hearing. A significant number of VT children continue to have poor OMQ-14 and ECLiPS scores relative to control children. ECLiPS scores do not always change in the way that hearing improvements would predict, suggesting that the ECLiPS is sensitive to wider developmental difficulties. Parental report in the form of narrow or broad-based questionnaires may complement history-taking and audiometry and enhance the quality of discussion between carers and clinicians about OME management.
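For readers unfamiliar with the classification analysis mentioned above, here is a minimal sketch of deriving a score cut-off from an ROC curve (using Youden's J statistic, one common rule) and summarising the resulting classification in a confusion matrix. The simulated scores, group sizes, and the choice of Youden's J are illustrative assumptions; the study's actual OMQ-14/ECLiPS data and cut-off rule are not reproduced.

```python
# Hedged sketch: ROC-based cut-off selection and a confusion matrix summary.
import numpy as np
from sklearn.metrics import roc_curve, confusion_matrix

rng = np.random.default_rng(1)
# 1 = clinical (OME) group, 0 = healthy control (simulated labels and scores).
labels = np.r_[np.ones(45), np.zeros(45)].astype(int)
scores = np.r_[rng.normal(60, 10, 45), rng.normal(40, 10, 45)]  # higher = more problems

fpr, tpr, thresholds = roc_curve(labels, scores)
best = np.argmax(tpr - fpr)                 # Youden's J: one common cut-off rule
cutoff = thresholds[best]

predicted = (scores >= cutoff).astype(int)
print(confusion_matrix(labels, predicted))  # rows: true class, columns: predicted class
```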


Developmental Disabilities/diagnosis; Hearing Loss/diagnosis; Otitis Media with Effusion/diagnosis; Otitis Media with Effusion/therapy; Surveys and Questionnaires; Audiometry; Child; Child, Preschool; Clinical Decision-Making; Decision Support Techniques; Developmental Disabilities/etiology; Female; Hearing Loss/etiology; Humans; Male; Middle Ear Ventilation; Otitis Media with Effusion/complications; Parents; ROC Curve; Watchful Waiting
4.
Acta Psychol (Amst) ; 191: 69-75, 2018 Nov.
Article En | MEDLINE | ID: mdl-30223147

A growing number of studies indicates that increased task demands, triggered for example by conflicting stimulus features or low perceptual fluency, lead to processing adjustments. While such demand-triggered processing adjustments have been shown for different paradigms (e.g., response-conflict tasks, perceptual disfluency, task switching, dual tasking), most of these findings are restricted to the visual modality. The present study investigated whether the challenge of understanding speech in normal-hearing subjects also leads to sequential processing adjustments when the processing fluency of the auditory signals changes from trial to trial. To that end, we used spoken number words (one to nine) that were presented with either high perceptual fluency (clean speech) or low perceptual fluency (vocoded speech as used in cochlear implants, Experiment 1; speech embedded in multi-speaker babble noise as typically found in bars, Experiment 2). Participants had to judge the spoken number words as smaller or larger than five. Results show that in both experiments the fluency effect (the performance difference between high and low perceptual fluency) was smaller following disfluent words. Thus, if it is hard to understand, you try harder.
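The sequential analysis described above (the fluency effect computed separately for trials preceded by fluent versus disfluent words) can be sketched as follows. The column names, reaction-time values, and data layout are hypothetical placeholders, not the authors' actual data or analysis code.

```python
# Hedged sketch: sequential modulation of the fluency effect, i.e. the RT
# difference between disfluent and fluent trials split by the fluency of the
# preceding trial. All values are simulated.
import numpy as np
import pandas as pd

rng = np.random.default_rng(2)
n_trials = 400
fluency = rng.choice(["clean", "degraded"], n_trials)
rt = np.where(fluency == "degraded",
              rng.normal(650, 60, n_trials),   # slower responses to degraded speech
              rng.normal(580, 60, n_trials))

df = pd.DataFrame({"fluency": fluency, "rt": rt})
df["prev_fluency"] = df["fluency"].shift(1)    # fluency of the preceding trial
df = df.dropna()

# Mean RT for each (previous, current) fluency combination.
cell_means = df.groupby(["prev_fluency", "fluency"])["rt"].mean().unstack()
# Fluency effect (degraded minus clean) after clean vs. after degraded trials.
fluency_effect = cell_means["degraded"] - cell_means["clean"]
print(fluency_effect)
```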


Speech Perception/physiology; Adult; Female; Humans; Language; Language Tests; Male; Memory, Short-Term; Neuropsychological Tests; Speech; Speech Discrimination Tests; Task Performance and Analysis; Young Adult
5.
J Speech Lang Hear Res ; 59(3): 501-10, 2016 Jun 01.
Article En | MEDLINE | ID: mdl-27124083

PURPOSE: Continuous performance tasks (CPTs) are used to measure individual differences in sustained attention. Many different stimuli have been used as response targets without consideration of their impact on task performance. Here, we compared CPT performance in typically developing adults and children to assess the role of stimulus processing on error rates and reaction times. METHOD: Participants completed a CPT that required responses to infrequent targets while monitoring and withholding responses to regular nontargets. Performance on 3 stimulus conditions was compared: visual letters (X and O), their auditory analogs, and auditory pure tones. RESULTS: Adults showed no difference in error propensity across the 3 conditions but had slower reaction times for auditory stimuli. Children had slower overall reaction times. They responded most quickly to the visual target and most slowly to the tone target. They also made more errors in the tone condition than in either the visual or the auditory spoken CPT conditions. CONCLUSIONS: The results suggest that error propensity and reaction-time variations on CPTs cannot be interpreted solely as evidence of inattention. They also reflect stimulus-specific influences that must be considered when testing hypotheses about modality-specific deficits in sustained attention in populations with different developmental disorders.


Attention; Auditory Perception; Neuropsychological Tests; Visual Perception; Acoustic Stimulation; Adolescent; Adult; Aging/psychology; Child; Female; Humans; Male; Memory, Short-Term; Middle Aged; Photic Stimulation; Psychology, Child; Reaction Time; Young Adult
6.
Cell Tissue Res ; 361(1): 371-86, 2015 Jul.
Article En | MEDLINE | ID: mdl-26077928

Auditory spatial processing is an important ability in everyday life and allows the processing of omnidirectional information. In this review, we report and compare data from psychoacoustic and electrophysiological experiments on sound localisation accuracy and auditory spatial discrimination in infants, children, and young and older adults. The ability to process auditory spatial information changes over the lifespan: the perception of acoustic space develops from an initially imprecise representation in infants and young children to a concise representation of spatial positions in young adults, and the respective performance declines again in older adults. Localisation accuracy shows a strong deterioration in older adults, presumably due to declined processing of binaural temporal and monaural spectro-temporal cues. When compared to young adults, the thresholds for spatial discrimination were strongly elevated both in young children and in older adults. Despite the consistency of the measured values, the underlying causes of the impaired performance might differ: (1) the effect is due to reduced cognitive processing ability and is thus task-related; (2) the effect is due to reduced information about the auditory space and caused by declined processing in auditory brain stem circuits; and (3) the auditory space processing regime in young children is still undergoing developmental changes and the interrelation with spatial visual processing is not yet established. In conclusion, we argue that for studying auditory space processing over the life course, it is beneficial to investigate spatial discrimination ability rather than localisation accuracy because it more reliably indicates changes in processing ability.


Aging/physiology; Auditory Perception/physiology; Sound Localization/physiology; Humans
7.
Front Neurosci ; 8: 146, 2014.
Article En | MEDLINE | ID: mdl-24982611

From behavioral studies it is known that auditory spatial resolution of azimuthal space declines with age. To date, it is not clear how age affects the respective sensory auditory processing at the pre-attentive level. Here we tested the hypothesis that pre-attentive processing of behaviorally perceptible spatial changes is preserved in older adults. An EEG study was performed in older adults (65-82 years of age) using a mismatch negativity (MMN) paradigm. Sequences of frequent standard stimuli at defined azimuthal positions were presented together with rarely occurring deviants shifted by 10° or 20° to the left or to the right of the standard. Standard positions were at +5° from the midsagittal plane (central condition) and at 65° in both lateral hemifields (±65°; lateral condition). The results suggest an effect of laterality on the pre-attentive processing of spatial deviations in older adults: while for the central condition deviants close to the minimum audible angle (MAA) threshold (i.e., 10°) yielded discernible MMNs, for the lateral positions MMN responses were only elicited by spatial deviations of 20° toward the midline (i.e., ±45°). Furthermore, MMN amplitudes were found to be insensitive to the magnitude of deviation (10°, 20°), which is contrary to recent studies with young adults (Bennemann et al., 2013) and hints at deteriorated pre-attentive encoding of sound sources in older adults. The discrepancy between behavioral MAA data and the present results is discussed with respect to the possibility that, under conditions of active stimulus processing, older adults might benefit from recruiting additional attentional top-down processes to detect small spatial deviations even within the lateral acoustic field.
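As a rough illustration of how an MMN is typically quantified in such an oddball paradigm, the sketch below averages simulated deviant and standard epochs, forms the difference wave, and takes the mean amplitude in a conventional latency window. The sampling rate, window limits, and simulated data are assumptions; they do not reflect this study's recording or analysis parameters.

```python
# Hedged sketch: a minimal mismatch-negativity (MMN) difference-wave computation
# on simulated single-trial epochs for one channel.
import numpy as np

fs = 500                                 # Hz, assumed sampling rate
t = np.arange(-0.1, 0.4, 1 / fs)         # epoch from -100 to 400 ms
rng = np.random.default_rng(3)

standards = rng.normal(0, 1, (400, t.size))   # trials x time (simulated noise)
deviants = rng.normal(0, 1, (80, t.size))
# Add a small negativity around 150-250 ms to the deviants for illustration.
deviants -= 1.5 * np.exp(-((t - 0.2) ** 2) / (2 * 0.03 ** 2))

difference_wave = deviants.mean(axis=0) - standards.mean(axis=0)
window = (t >= 0.15) & (t <= 0.25)            # typical MMN latency range
mmn_amplitude = difference_wave[window].mean()
print(f"MMN amplitude in 150-250 ms window: {mmn_amplitude:.2f} (a.u.)")
```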

8.
Neuroreport ; 25(11): 833-837, 2014 Aug 06.
Article En | MEDLINE | ID: mdl-24893202

It has been repeatedly shown that a unimodal stimulus can modulate the oscillatory activity of multiple cortical areas even at early stages of sensory processing. In this way, an influence can be exerted on the response to a subsequent sensory input. Even though this fact is now well established, it is still not clear whether cortical sensory areas are informed about the spatial positions of objects of a modality other than their preferred one. Here, we test whether oscillatory activity of the human visual cortex depends on the position of a unimodal auditory object. We recorded the electroencephalogram while presenting sounds in an acoustic free field either at the center of the visual field or at lateral positions. Using independent component analysis, we identified three cortical sources located in the visual cortex that showed stimulus-position-specific oscillatory responses. The most pronounced effect was an immediate α (8-12 Hz) power decrease over the entire occipital lobe when the stimulus originated from the center of the binocular visual field. Following lateral stimulation, the amplitude of α activity decreased slightly over contralateral visual areas, while at the same time a weak α synchronization was observed in the corresponding ipsilateral areas. Thus, even in the absence of visual stimuli, the visual cortex is differentially activated depending on the position of an acoustic sound source. Our results show that the visual cortex receives information about the position of auditory stimuli within the visual field.
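The α-power decrease reported above is an instance of event-related desynchronization; one minimal way to quantify it for a single channel is sketched below, using a band-pass filter and the Hilbert envelope and expressing post-stimulus power as percent change from baseline. The simulated signal, filter order, and time windows are illustrative assumptions and do not reproduce the authors' ICA-based source analysis.

```python
# Hedged sketch: event-related alpha (8-12 Hz) power change relative to a
# pre-stimulus baseline, for one simulated occipital channel.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 250
t = np.arange(-0.5, 1.0, 1 / fs)
rng = np.random.default_rng(4)
# Simulated signal: ongoing alpha that weakens after stimulus onset (t >= 0).
alpha = np.sin(2 * np.pi * 10 * t) * np.where(t < 0, 1.0, 0.5)
signal = alpha + rng.normal(0, 0.3, t.size)

b, a = butter(4, [8, 12], btype="bandpass", fs=fs)
envelope = np.abs(hilbert(filtfilt(b, a, signal)))   # instantaneous alpha amplitude
power = envelope ** 2

baseline = power[t < 0].mean()
post = power[(t >= 0.1) & (t <= 0.5)].mean()
print(f"alpha power change: {100 * (post - baseline) / baseline:.1f} %")
```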

9.
Exp Brain Res ; 232(4): 1157-72, 2014 Apr.
Article En | MEDLINE | ID: mdl-24449009

Localization accuracy and acuity for low- (0.375-0.75 kHz; LN) and high-frequency (2.25-4.5 kHz; HN) noise bands were examined in young (20-29 years) and older adults (65-83 years) in the acoustic free field. A pointing task was used to quantify accuracy, while acuity was inferred from minimum audible angle (MAA) thresholds measured with an adaptive 3-alternative forced-choice procedure. Accuracy decreased with laterality and age. From young to older adults, accuracy declined by up to 23% for the low-frequency noise band across all lateralities. The mean age effect was even more pronounced on MAA thresholds; thus, age was a strong predictor of MAA thresholds for both LN and HN bands. There was no significant correlation between hearing status and localization performance. These results suggest that central auditory processing of space declines with age and is mainly driven by age-related changes in the processing of binaural cues (interaural time difference and interaural intensity difference) rather than directly induced by peripheral hearing loss. We conclude that the representation of the location of sound sources becomes blurred with age as a consequence of declined temporal processing, the effect of which becomes particularly evident for MAA thresholds, where two closely adjoining sound sources have to be separated. Localization accuracy and MAA were not correlated in older adults, and only a weak correlation was found in young adults. These results point to the employment of different processing strategies for localization accuracy and acuity.
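To clarify what an adaptive 3-alternative forced-choice MAA measurement involves, here is a toy simulation of an adaptive track with a 2-down/1-up rule and a simulated listener. The rule, step size, starting angle, and psychometric function are all assumptions chosen for illustration; the study's exact adaptive procedure is not specified here.

```python
# Hedged sketch: a 2-down/1-up adaptive track converging on a simulated
# listener's minimum audible angle (MAA).
import numpy as np

rng = np.random.default_rng(5)
true_maa = 4.0  # degrees; threshold of the simulated listener

def responds_correctly(separation):
    # Simulated 3-AFC listener: chance level is 1/3, performance approaches 1
    # for large angular separations.
    p = 1 / 3 + (2 / 3) * (1 - np.exp(-separation / true_maa))
    return rng.random() < p

angle = 20.0        # current angular separation (degrees)
step = 4.0          # fixed step size (degrees)
n_correct = 0
direction = 0       # +1 = last change made the task easier, -1 = harder
reversals = []

while len(reversals) < 8:
    if responds_correctly(angle):
        n_correct += 1
        if n_correct == 2:          # 2-down rule: two correct in a row -> harder
            n_correct = 0
            if direction == +1:
                reversals.append(angle)
            direction = -1
            angle = max(angle - step, 0.5)
    else:                           # 1-up rule: one error -> easier
        n_correct = 0
        if direction == -1:
            reversals.append(angle)
        direction = +1
        angle += step

# Threshold estimate: mean of the later reversal points.
print(f"estimated MAA ~ {np.mean(reversals[2:]):.1f} degrees")
```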


Acoustic Stimulation/methods; Audiometry/methods; Auditory Perception/physiology; Discrimination, Psychological/physiology; Sound Localization/physiology; Adult; Aged; Aged, 80 and over; Female; Humans; Male; Young Adult
10.
Atten Percept Psychophys ; 75(7): 1507-19, 2013 Oct.
Article En | MEDLINE | ID: mdl-23864263

Similarities have been observed in the localization of the final position of moving visual and moving auditory stimuli: perceived endpoints that are judged to be farther in the direction of motion in both modalities likely reflect extrapolation of the trajectory, mediated by predictive mechanisms at higher cognitive levels. However, direct comparisons of the magnitudes of displacement between visual and auditory tasks using the same experimental setup are rare. The purpose of the present free-field study was therefore to investigate the influences of the spatial location of motion offset, stimulus velocity, and motion direction on the localization of the final positions of moving auditory stimuli (Experiments 1 and 2) and moving visual stimuli (Experiment 3). To assess whether auditory performance is affected by the dynamically changing binaural cues that are used for the localization of moving auditory stimuli (interaural time differences for low-frequency sounds and interaural intensity differences for high-frequency sounds), two distinct noise bands were employed in Experiments 1 and 2. In all three experiments, less precise encoding of spatial coordinates in paralateral space resulted in larger forward displacements, but this effect was overridden by the underestimation of target eccentricity in the extreme periphery. Furthermore, our results revealed clear differences between the visual and auditory tasks. Displacements in the visual task depended on velocity and the spatial location of the final position, whereas an additional influence of motion direction was observed in the auditory tasks. Together, these findings indicate that modality-specific processing of motion parameters affects the extrapolation of the trajectory.


Cues; Motion Perception/physiology; Sound Localization/physiology; Space Perception/physiology; Acoustic Stimulation/methods; Adult; Female; Fixation, Ocular/physiology; Humans; Male; Motion; Photic Stimulation/methods; Young Adult
11.
Front Psychol ; 4: 338, 2013.
Article En | MEDLINE | ID: mdl-23781211

The encoding of auditory spatial acuity (measured as the precision with which two spatially distinct stimuli can be distinguished) by neural circuits in both auditory cortices is a matter of ongoing research. Here, the event-related potential (ERP) mismatch negativity (MMN), a sensitive indicator of preattentive auditory change detection, was used to tap into the underlying mechanism of cortical representation of auditory spatial information. We characterized the MMN response as a function of the degree of spatial deviance in lateral acoustic space using a passive oddball paradigm. Two stimulation conditions (SCs), specifically targeting the mid- and far-lateral acoustic space, were considered: (1) a 65° left standard position with deviant positions at 70, 75, and 80°; and (2) a 95° left standard position with deviant positions at 90, 85, and 80°. Additionally, behavioral data on the minimum audible angle (MAA) were acquired for the respective standard positions (65 and 95° left) to quantify spatial discrimination in separating distinct sound sources. The two measurements revealed the link between the (preattentive) MMN response and the (attentive) behavioral threshold. At 65°, spatial deviations as small as 5° reliably elicited MMNs, and the MMN amplitudes increased monotonically as a function of spatial deviation. At 95°, spatial deviations of 15° were necessary to elicit a valid MMN. The behavioral data, however, yielded no difference in mean MAA thresholds between the 65° and 95° positions. The different effects of laterality on MMN responses and MAA thresholds suggest a role of spatial selective attention mechanisms that is particularly relevant in the active discrimination of neighboring sound sources, especially in the lateral acoustic space.

12.
Brain Res ; 1466: 99-111, 2012 Jul 23.
Article En | MEDLINE | ID: mdl-22617375

Motion perception can be altered by information received through multiple senses. So far, the interplay between the visual and the auditory modality in peripheral motion perception has scarcely been described. The present free-field study investigated audio-visual motion perception for different azimuthal trajectories in space. To disentangle effects related to crossmodal interactions (the influence of one modality on signal processing in another modality) and multisensory integration (the binding of bimodal streams), we manipulated the subjects' attention in two experiments on a single set of moving audio-visual stimuli. Acoustic and visual signals were either congruent or spatially and temporally disparate at motion offset. (i) Crossmodal interactions were studied in a selective attention task: subjects were instructed to attend to either the acoustic or the visual stream and to indicate the perceived final position of the motion. (ii) Multisensory integration was studied in a divided attention task in which subjects were asked to report whether they perceived unified or separated audio-visual motion offsets. The results indicate that crossmodal interactions in motion perception do not depend on the integration of the audio-visual stream. Furthermore, in the crossmodal task, both visual and auditory motion perception were susceptible to modulation by irrelevant streams, provided that temporal disparities did not exceed a critical range. Concurrent visual streams modulated auditory motion perception in the central field, whereas concurrent acoustic streams attracted visual motion information in the periphery. Differences between the visual and the auditory system in the ability to accurately track positional information along different trajectories account for the observed biasing effects.


Attention/physiology; Auditory Perception/physiology; Motion Perception/physiology; Visual Perception/physiology; Acoustic Stimulation; Adult; Discrimination, Psychological/physiology; Female; Humans; Male; Photic Stimulation
13.
Article En | MEDLINE | ID: mdl-21577251

The present study focuses on auditory discrimination abilities in older adults aged 65-89 years. We applied the "Leipzig inventory for patient psychoacoustic" (LIPP), a psychoacoustic test battery specifically designed to identify deficits in central auditory processing. These tests quantify the just noticeable differences (JNDs) for the three basic acoustic parameters (i.e., frequency, intensity, and signal duration). Three different test modes [monaural, dichotic signal/noise (s/n), and interaural] were used; the stimulus level was 35 dB sensation level. The tests are designed as a three-alternative forced-choice procedure, with a maximum-likelihood procedure estimating the p = 0.5 correct-response value. These procedures have proven to be highly efficient and provide a reliable outcome. The measurements yielded significant age-dependent deteriorations in the ability to discriminate single acoustic features, pointing to progressive impairments in central auditory processing. The degree of deterioration varied across the different acoustic features and test modes. Most prominently, interaural frequency and signal-duration discrimination thresholds at low test frequencies were elevated, which indicates a deterioration of time- and phase-dependent processing at brain-stem and cortical levels. The LIPP proves to be an effective tool for identifying basic pathophysiological mechanisms and the source of a specific impairment in auditory processing in the elderly.
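A maximum-likelihood estimate of a discrimination threshold from three-alternative forced-choice data can be sketched as below, fitting a logistic psychometric function with a fixed 1/3 guess rate to simulated trials and then reading off the stimulus difference that yields 50% correct responses. The stimulus dimension, slope, and simulated responses are assumptions for illustration; LIPP's actual adaptive procedure is not reproduced.

```python
# Hedged sketch (not LIPP itself): maximum-likelihood fit of a logistic
# psychometric function with a 1/3 guess rate to simulated 3-AFC trials.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(6)
deltas = rng.uniform(0.5, 10.0, 120)          # hypothetical stimulus differences
true_threshold, true_slope = 4.0, 1.5
p_true = 1/3 + (2/3) / (1 + np.exp(-true_slope * (deltas - true_threshold)))
correct = rng.random(deltas.size) < p_true    # simulated trial outcomes

def neg_log_likelihood(params):
    threshold, slope = params
    p = 1/3 + (2/3) / (1 + np.exp(-slope * (deltas - threshold)))
    p = np.clip(p, 1e-6, 1 - 1e-6)
    return -np.sum(np.where(correct, np.log(p), np.log(1 - p)))

fit = minimize(neg_log_likelihood, x0=[5.0, 1.0], method="Nelder-Mead")
threshold_hat, slope_hat = fit.x
# Stimulus difference at which the fitted function predicts 50% correct responses.
delta_at_p05 = threshold_hat - np.log(3) / slope_hat
print(f"ML midpoint estimate: {threshold_hat:.2f}, p = 0.5 point: {delta_at_p05:.2f}")
```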

...