1.
Behav Res Methods ; 56(1): 290-300, 2024 Jan.
Article in English | MEDLINE | ID: mdl-36595180

ABSTRACT

Interval timing refers to the ability to perceive and remember intervals in the seconds-to-minutes range. Our contemporary understanding of interval timing is derived from relatively small-scale, isolated studies that investigate a limited range of intervals with a small sample size, usually based on a single task. Consequently, the conclusions drawn from individual studies are not readily generalizable to other tasks, conditions, and task parameters. The current paper presents a live database of raw data from interval timing studies (currently composed of 68 datasets from eight different tasks incorporating various interval and temporal order judgments) with an online graphical user interface to easily select, compile, and download the data organized in a standard format. The Timing Database aims to promote and cultivate key and novel analyses of our timing ability by making published and future datasets accessible as open-source resources for the entire research community. In the current paper, we showcase the use of the database by testing various core ideas based on data compiled across studies (i.e., temporal accuracy, the scalar property, the location of the point of subjective equality, and the malleability of timing precision). The Timing Database will serve as the repository for interval timing studies through the submission of new datasets.
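As an illustration of the kind of cross-study analysis such a compiled database enables, the sketch below checks the scalar property on a hypothetical exported dataset by computing the coefficient of variation (SD/mean of responses) for each target interval; under scalar timing the CV should stay roughly constant across intervals. The file name and column names (`target_s`, `response_s`) are assumptions, not the database's actual schema.

```python
# Minimal sketch: testing the scalar property on a compiled timing dataset.
# Assumes a CSV with columns 'target_s' (target interval, s) and 'response_s'
# (reproduced/estimated interval, s); these names are hypothetical.
import pandas as pd

data = pd.read_csv("timing_data.csv")  # hypothetical export from the database

cv_by_target = (
    data.groupby("target_s")["response_s"]
        .agg(mean="mean", sd="std")
        .assign(cv=lambda d: d["sd"] / d["mean"])  # coefficient of variation
)

# Under the scalar property, 'cv' should be approximately flat across targets.
print(cv_by_target)
```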


Subject(s)
Time Perception , Humans , Databases, Factual , Time Factors
2.
Psychophysiology ; 60(7): e14251, 2023 07.
Article in English | MEDLINE | ID: mdl-36700294

ABSTRACT

Several studies have described, often separately, the relaxing effects of music or odor on autonomic nervous system (ANS) activity. Only a few studies have compared the presentation of these stimuli and their interaction within the same experimental protocol. Here, we examined whether relaxing music (slow-paced classical pieces) and odor (lavender essential oil), presented either in isolation or in combination, would facilitate physiological recovery after cognitive stress. We continuously recorded the electrocardiogram to assess the high-frequency component of heart rate variability (HF-HRV), an index of parasympathetic activity, and electrodermal activity (EDA), an index of sympathetic activity, 10 min before, during, and 30 min after a cognitive stressor (i.e., completing time-constrained, cognitively demanding tasks) in 99 participants allocated to four recovery conditions (control N = 26, music N = 23, odor N = 24, music+odor N = 26). The stressor triggered both a significant increase in EDA and a decrease in HF-HRV (compared to baseline). During the recovery period, the odor elicited a greater decrease in EDA compared to an odorless, silent control, whereas no difference in HRV was observed. Conversely, during this period, music elicited a greater increase in HF-HRV compared to control, whereas no difference in EDA was observed. Strikingly, in the multimodal music+odor condition, no beneficial effect was observed on ANS indices 30 min after stress. Overall, our study confirms that olfactory and musical stimuli have relaxing effects on the ANS after stress only when presented separately, which might rely on distinct neural mechanisms and autonomic pathways.
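For readers unfamiliar with the index, the sketch below shows one conventional way to estimate HF-HRV from a series of R-R intervals: resample the interval series to an evenly spaced signal, estimate its power spectrum with Welch's method, and integrate power in the standard 0.15-0.40 Hz high-frequency band. The file name, variable names, and 4 Hz resampling rate are illustrative choices, not the authors' processing pipeline.

```python
# Minimal sketch of HF-HRV estimation from R-R intervals (in seconds).
# This is a generic approach, not the authors' exact pipeline.
import numpy as np
from scipy.signal import welch

rr = np.loadtxt("rr_intervals.txt")        # hypothetical R-R series, in seconds
t_beats = np.cumsum(rr)                    # time of each beat

# Resample the irregularly sampled R-R series to 4 Hz for spectral analysis.
fs = 4.0
t_even = np.arange(t_beats[0], t_beats[-1], 1.0 / fs)
rr_even = np.interp(t_even, t_beats, rr)

f, pxx = welch(rr_even - rr_even.mean(), fs=fs, nperseg=256)

hf_band = (f >= 0.15) & (f <= 0.40)        # standard HF band
hf_power = np.trapz(pxx[hf_band], f[hf_band])
print(f"HF-HRV power: {hf_power:.6f} s^2")
```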


Subject(s)
Music , Humans , Odorants , Autonomic Nervous System/physiology , Heart Rate/physiology , Cognition
3.
Cognition ; 222: 105016, 2022 05.
Article in English | MEDLINE | ID: mdl-35030358

ABSTRACT

The human brain rapidly and automatically categorizes faces vs. other visual objects. However, whether face-selective neural activity predicts the subjective experience of a face - perceptual awareness - is debated. To clarify this issue, here we use face pareidolia, i.e., the illusory perception of a face, as a proxy to relate the neural categorization of a variety of facelike objects to conscious face perception. In Experiment 1, scalp electroencephalogram (EEG) is recorded while pictures of human faces or facelike objects - in different stimulation sequences - are interleaved every second (i.e., at 1 Hz) in a rapid 6-Hz train of natural images of nonface objects. Participants do not perform any explicit face categorization task during stimulation, and report whether they perceived illusory faces post-stimulation. A robust categorization response to facelike objects is identified at 1 Hz and its harmonics in the EEG frequency spectrum, with a facelike occipito-temporal topography. Across all individuals, the facelike categorization response is about 20% of the response to human faces, but more strongly right-lateralized. Critically, its amplitude is much larger in participants who report having perceived illusory faces. In Experiment 2, facelike or matched nonface objects from the same categories appear at 1 Hz in sequences of nonface objects presented at variable stimulation rates (60 Hz to 12 Hz), and participants explicitly report after each sequence whether they perceived illusory faces. The facelike categorization response already emerges at the shortest stimulus duration (i.e., 17 ms at 60 Hz) and predicts the behavioral report of conscious perception. Strikingly, neural facelike selectivity emerges exclusively when participants report illusory faces. Collectively, these experiments characterize a neural signature of face pareidolia in the context of rapid categorization, supporting the view that face-selective brain activity reliably predicts the subjective experience of a face from a single glance at a variety of stimuli.
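The frequency-tagging logic described here can be illustrated with a short sketch: take the amplitude spectrum of the EEG, subtract the local noise level around each bin of interest, and sum the baseline-corrected amplitude at 1 Hz and its harmonics, excluding harmonics that coincide with the 6 Hz base rate. The sampling rate, recording length, number of harmonics, and neighbor range are assumptions for illustration, not the study's analysis parameters.

```python
# Minimal sketch of a frequency-tagged categorization response:
# baseline-corrected amplitude at the 1 Hz oddball frequency and harmonics.
# Parameters (sampling rate, harmonics, neighbor range) are illustrative.
import numpy as np

fs = 512                         # assumed sampling rate (Hz)
eeg = np.random.randn(fs * 60)   # placeholder for one 60-s channel of EEG

amp = np.abs(np.fft.rfft(eeg)) / len(eeg)
freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)

def corrected_amplitude(target_hz, n_neighbors=10):
    """Amplitude at target_hz minus the mean of surrounding noise bins
    (immediately adjacent bins excluded)."""
    idx = np.argmin(np.abs(freqs - target_hz))
    neighbors = np.r_[idx - n_neighbors:idx - 1, idx + 2:idx + n_neighbors + 1]
    return amp[idx] - amp[neighbors].mean()

# Sum over 1 Hz harmonics, skipping those shared with the 6 Hz base rate.
harmonics = [h for h in range(1, 13) if h % 6 != 0]
oddball_response = sum(corrected_amplitude(h) for h in harmonics)
print(f"Facelike categorization response: {oddball_response:.4f} (a.u.)")
```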


Subject(s)
Facial Recognition , Illusions , Brain/physiology , Brain Mapping , Electroencephalography , Facial Recognition/physiology , Humans , Photic Stimulation/methods
4.
Article in English | MEDLINE | ID: mdl-32787505

ABSTRACT

Age-related changes in the hedonic judgment of odors have never been addressed within the empirical frameworks of age-related changes in emotion, which state that advancing age is associated with a reduced negativity bias and a less pronounced differentiation between hedonic valence and emotional intensity judgments. Our aim was to examine and extend these age-related effects in the domain of odors. Thirty-eight younger adults and 40 older adults were asked to evaluate the hedonic valence, emotional intensity, and familiarity of 50 odors controlled for their pleasantness. Compared to younger adults, older adults rated unpleasant odorants as less unpleasant and showed a stronger relationship between hedonic valence and emotional intensity ratings. This provides evidence of a reduced negativity bias and of emotional dedifferentiation in response to odors. These data suggest that, when faced with odors, older people exhibit a reduction in emotional dimensionality, leading them to bias emotional processing in a less negative direction.
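The dedifferentiation measure described here (the strength of the relationship between valence and intensity ratings) can be sketched as a per-participant correlation that is then compared between age groups. The file name, column names, group labels, and the choice of Spearman correlation and a Mann-Whitney comparison are illustrative assumptions, not the authors' analysis.

```python
# Minimal sketch: per-participant valence-intensity correlation, compared
# between age groups. File and column names are hypothetical.
import pandas as pd
from scipy.stats import spearmanr, mannwhitneyu

ratings = pd.read_csv("odor_ratings.csv")  # columns: participant, group, valence, intensity

def valence_intensity_r(df):
    return spearmanr(df["valence"], df["intensity"])[0]

r_per_subject = ratings.groupby(["group", "participant"]).apply(valence_intensity_r)

young = r_per_subject.loc["young"]
old = r_per_subject.loc["old"]

# A tighter coupling of valence and intensity in older adults would indicate
# dedifferentiation of the two judgments.
stat, p = mannwhitneyu(old, young)
print(old.mean(), young.mean(), p)
```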


Subject(s)
Aging/physiology , Emotions/physiology , Olfactory Perception/physiology , Recognition, Psychology/physiology , Adult , Age Factors , Aged , Female , Humans , Male , Middle Aged , Pleasure/physiology , Young Adult
5.
Atten Percept Psychophys ; 83(1): 448-462, 2021 Jan.
Article in English | MEDLINE | ID: mdl-33159286

ABSTRACT

Although several studies have reported relaxing and stimulating effects of odors on physiology and behavior, little is known about their underlying mechanisms. It has been proposed that participant expectancy could explain these activation effects. Since emotional stimuli are known to modulate time perception, here we used the temporal bisection task to determine whether odors have objective relaxing and stimulating effects, by respectively slowing down or speeding up the internal clock, and whether prior expectancy could alter these effects. In Experiment 1, 118 participants were presented either with a strawberry odor or an odorless blank. In Experiment 2, 132 participants were presented either with a lemon odor or an odorless blank. In both experiments, expectancy was manipulated using suggestion (verbal instructions): the stimulus was either described as relaxing or stimulating, or was not described. In the absence of prior suggestion, findings showed that, compared to participants presented with an odorless blank, participants presented with the strawberry odor underestimated sound durations (i.e., a relaxing effect), whereas participants presented with the lemon odor overestimated them (i.e., a stimulating effect). These results confirm that pleasant odors can have objective relaxing and stimulating effects by themselves, which are better explained by arousal-based mechanisms than by attentional distraction. Furthermore, in both experiments, incongruent suggestions undid the effects of both odors without reversing them completely (i.e., strawberry did not become stimulating even when participants were told it was). Both these bottom-up and top-down influences should be considered when investigating the emotional impact of odors on human behavior.
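The temporal bisection logic can be made concrete with a short fit: the proportion of "long" responses as a function of probe duration is fitted with a logistic function, and the bisection point (the duration judged "short" and "long" equally often) is read off at p = 0.5. A larger bisection point indicates underestimation (consistent with a slowed internal clock), a smaller one indicates overestimation. The durations and response proportions below are invented for illustration.

```python
# Minimal sketch: estimating the bisection point (point of subjective
# equality) in a temporal bisection task. Data values are invented.
import numpy as np
from scipy.optimize import curve_fit

durations = np.array([0.4, 0.55, 0.7, 0.85, 1.0, 1.15, 1.3])   # probe durations (s)
p_long = np.array([0.05, 0.15, 0.35, 0.55, 0.75, 0.90, 0.97])  # proportion "long"

def logistic(d, pse, slope):
    return 1.0 / (1.0 + np.exp(-(d - pse) / slope))

(pse, slope), _ = curve_fit(logistic, durations, p_long, p0=[0.85, 0.1])

# A larger PSE means durations feel shorter (a "relaxing" effect);
# a smaller PSE means durations feel longer (a "stimulating" effect).
print(f"Bisection point: {pse:.3f} s, slope: {slope:.3f}")
```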


Subject(s)
Odorants , Time Perception , Arousal , Attention , Emotions , Humans , Smell
6.
Child Dev ; 91(3): 694-704, 2020 05.
Article in English | MEDLINE | ID: mdl-31900935

ABSTRACT

Self-biases are well described in adults but remain poorly understood in children. Here, we investigated in 6- to 10-year-old children (N = 132) the self-prioritization effect (SPE), a self-bias which, in adults, reflects the perceptual advantage for stimuli arbitrarily associated with the self over those associated with other persons. We designed a child-friendly adaptation of a paradigm originally introduced in adults by Sui, He, and Humphreys (2012) to test whether the SPE also occurs in children and, if so, to determine how it evolves with age. A robust SPE was obtained from the age of 6, and this effect was similar in size across our four age groups. These findings are discussed with reference to the development of the self during childhood.
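As a purely illustrative sketch of how the SPE is usually quantified in matching paradigms of this kind, the snippet below computes the per-participant reaction-time advantage for self-associated over other-associated matched trials. The file name and column names (`participant`, `association`, `match`, `rt`) are hypothetical, not the study's data format.

```python
# Minimal sketch: self-prioritization effect as the per-participant RT
# advantage for self-associated vs. other-associated matching trials.
# Column names are hypothetical; 'match' is assumed to be a boolean column.
import pandas as pd

trials = pd.read_csv("matching_task.csv")
matched = trials[trials["match"]]

mean_rt = matched.groupby(["participant", "association"])["rt"].mean().unstack()
spe = mean_rt["other"] - mean_rt["self"]   # positive values = self advantage
print(spe.describe())
```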


Subject(s)
Child Development/physiology , Self Concept , Child , Female , Humans , Male
7.
Neuroimage ; 179: 235-251, 2018 10 01.
Article in English | MEDLINE | ID: mdl-29913283

ABSTRACT

Efficient decoding of even brief and slight changes in facial expression is important for social interactions. However, robust evidence for the human brain's ability to automatically detect brief and subtle changes of facial expression remains limited. Here we built on a recently developed paradigm in human electrophysiology with full-blown expressions (Dzhelyova et al., 2017) to isolate and quantify a neural marker for the detection of brief and subtle changes of facial expression. Scalp electroencephalogram (EEG) was recorded from 18 participants during stimulation with a neutral face changing randomly in size at a rapid rate of 6 Hz. Brief changes of expression appeared every fifth stimulation cycle (i.e., at 1.2 Hz), and expression intensity increased parametrically every 20 s in 20% steps during sweep sequences of 100 s. A significant 1.2 Hz response emerged in the EEG spectrum already at 40% of expression-change intensity for most of the five emotions tested (anger, disgust, fear, happiness, or sadness in different sequences) and increased with intensity steps, predominantly over right occipito-temporal regions. Given the high signal-to-noise ratio of the approach, thresholds for the automatic detection of brief changes of facial expression could be determined for every single individual brain. A time-domain analysis revealed three components, the first two increasing linearly with increasing intensity as early as 100 ms after a change of expression, suggesting gradual low-level image-change detection prior to visual coding of facial movements. In contrast, the third component showed an abrupt sensitivity to increasing expression intensity beyond 300 ms post expression-change, suggesting categorical emotion perception. Overall, this characterization of the detection of subtle changes of facial expression and of its temporal dynamics opens promising avenues for the precise assessment of social perception ability during development and in clinical populations.
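The individual-threshold idea described here, finding the lowest expression-change intensity at which the 1.2 Hz response exceeds the noise level, can be sketched as a z-score criterion applied step by step across the sweep. The z > 1.64 criterion, the number of noise bins, and the array values are illustrative assumptions rather than the study's exact procedure.

```python
# Minimal sketch: per-participant detection threshold as the lowest intensity
# step whose 1.2 Hz amplitude exceeds the surrounding noise (z > 1.64).
# Values, shapes, and the criterion are illustrative.
import numpy as np

# amp_1p2hz[step]: 1.2 Hz amplitude at each of 5 intensity steps
# noise[step, bin]: amplitudes of neighbouring frequency bins at each step
amp_1p2hz = np.array([0.02, 0.05, 0.12, 0.20, 0.31])
noise = np.abs(np.random.randn(5, 20)) * 0.03

z = (amp_1p2hz - noise.mean(axis=1)) / noise.std(axis=1)
significant = z > 1.64                      # one-tailed criterion

steps = np.arange(20, 120, 20)              # 20%..100% expression intensity
threshold = steps[np.argmax(significant)] if significant.any() else None
print(f"Detection threshold: {threshold}% expression intensity")
```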


Subject(s)
Brain/physiology , Facial Expression , Pattern Recognition, Visual/physiology , Social Perception , Adult , Electroencephalography , Female , Humans , Male , Photic Stimulation , Young Adult
8.
Conscious Cogn ; 35: 16-29, 2015 Sep.
Article in English | MEDLINE | ID: mdl-25965942

ABSTRACT

Here we examine the mechanisms underlying the emergence of the feeling of control, which can be modulated even when the feeling of being the author of one's own action is intact. Using a haptic robot, participants made series of vertical pointing actions onto a virtual surface, contact with which was sometimes postponed by a small temporal delay (15 or 65 ms). Subjects then evaluated their subjective feeling of control. Results showed that after temporal distortions, hand trajectories were adapted effectively but the feeling of control decreased significantly. This was observed even for subliminal distortions, for which subjects did not consciously detect the presence of a distortion. Our findings suggest that both supraliminal and subliminal temporal distortions occurring within a healthy perceptual-motor system impact the conscious experience of control over self-initiated motor actions.


Subject(s)
Consciousness , Emotions , Perceptual Distortion , Subliminal Stimulation , Touch Perception , Adult , Female , Hand , Humans , Male , Robotics , Young Adult
9.
Cognition ; 127(2): 214-9, 2013 May.
Article in English | MEDLINE | ID: mdl-23454794

ABSTRACT

The present research investigated whether, as previously observed with pictures, a background auditory rhythm would also influence visual word recognition. In a lexical decision task, participants were presented with bisyllabic visual words, segmented into two successive groups of letters, while an irrelevant, strongly metrical auditory sequence was played in a loop. The first group of letters could either be congruent with the syllabic division of the word (e.g., val in val/se) or not (e.g., va in va/lse). In agreement with the Dynamic Attending Theory (DAT), our results confirmed that presenting the correct first syllable on-beat (i.e., in synchrony with a peak of covert attention) facilitated visual word recognition compared to presenting it off-beat. However, when an incongruent first group of letters was displayed on-beat, recognition was impaired even further. Thus, our results suggest that oscillatory attention tapped into cognitive processes rather than perceptual, decisional, or motor stages. We like to think of our paradigm, which combines a background auditory rhythm with segmented visual stimuli, as a sort of temporal magnifying glass that enlarges the reaction-time differences between beneficial and detrimental processing conditions in human cognition.


Subject(s)
Auditory Perception/physiology , Reading , Recognition, Psychology/physiology , Attention/physiology , Female , Humans , Male , Music/psychology , Photic Stimulation , Psychomotor Performance/physiology , Reaction Time/physiology , Speech Perception/physiology , Visual Perception/physiology , Young Adult
10.
J Exp Psychol Learn Mem Cogn ; 37(4): 888-98, 2011 Jul.
Article in English | MEDLINE | ID: mdl-21480752

ABSTRACT

The role of gender categories in prototype formation during face recognition was investigated in two experiments. Participants were asked to learn individual faces and then to recognize them. During recognition, the individual faces were mixed with blended faces created from faces of the same or of different genders. The results of the two experiments showed that blended faces made from learned individual faces were recognized, even though they had never been seen before. In Experiment 1, this effect was stronger when the component faces belonged to the same gender category (same-sex blended faces), but it also emerged across gender categories (cross-sex blended faces). Experiment 2 further showed that this prototype effect was not affected by presentation order for same-sex blended faces: the effect was equally strong when the component faces were presented one after the other during learning or alternated with faces of the opposite gender. By contrast, the prototype effect across gender categories was highly sensitive to the temporal proximity of the faces entering the blends and almost disappeared when other faces were intermixed. These results indicate that distinct neural populations code for female and male faces, but that the formation of a facial representation can also be mediated by both neural populations. The implications for face-space properties and face-encoding processes are discussed.
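For readers unfamiliar with the stimuli, a blended face can be approximated, in its simplest form, as a pixel-wise average of two aligned face images; published morphing pipelines additionally warp facial landmarks, which is omitted here. The file names are placeholders, and this is not the stimulus-generation method used in the study.

```python
# Minimal sketch: a crude blended face as the pixel-wise average of two
# aligned, same-sized face images (real morphing also warps landmark geometry).
import numpy as np
from PIL import Image

face_a = np.asarray(Image.open("face_a.png").convert("L"), dtype=float)
face_b = np.asarray(Image.open("face_b.png").convert("L"), dtype=float)

blend = (face_a + face_b) / 2.0             # 50/50 blend of the learned faces
Image.fromarray(blend.astype(np.uint8)).save("blend_ab.png")
```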


Subject(s)
Discrimination Learning/physiology , Face , Pattern Recognition, Visual/physiology , Recognition, Psychology/physiology , Sex Characteristics , Adolescent , Adult , Analysis of Variance , Female , Humans , Male , Neuropsychological Tests , Photic Stimulation/methods , Reaction Time/physiology , Young Adult
11.
Ann N Y Acad Sci ; 1169: 74-8, 2009 Jul.
Article in English | MEDLINE | ID: mdl-19673756

ABSTRACT

The perception of meter, or the alternation of strong and weak beats, was assessed in musically trained listeners through magnetoencephalography. Metrical accents were examined with no temporal disruption of the serial grouping of tones. Results showed an effect of metrical processing among identical standard tones in the left hemisphere, with larger responses on strong than on weak beats. Moreover, processing of occasional increases in intensity (phenomenal accents) varied as a function of metrical position in the left hemisphere, but not in the right. Our findings support the view of a relatively early, left-hemispheric effect of metrical processing in musicians.


Subject(s)
Auditory Perception/physiology , Brain/physiology , Functional Laterality/physiology , Music , Acoustic Stimulation , Adult , Brain Mapping , Female , Humans , Magnetoencephalography , Male , Periodicity , Young Adult
12.
Cortex ; 45(1): 103-9, 2009 Jan.
Article in English | MEDLINE | ID: mdl-19027894

ABSTRACT

Previous research suggests that our past experience of rhythmic structure in music results in a tendency for Western listeners to subjectively accent equitonal isochronous sequences. We have shown in an earlier study that the occurrence of a slightly softer tone in the 8th to 11th position of such a sequence evokes a P300 event-related potential (ERP) response of different amplitudes depending on whether the tone occurs in putatively subjectively accented or unaccented sequence positions (Brochard et al., 2003). One current theory of rhythm processing postulates that subjective accenting results from predictive modulation of perceptual processes by the attention system. If this is the case, then ERP modulations should be observed at an earlier latency than the P300, and they should appear in ERPs to both standard and softer tones. Such effects were not observed in our previous study, possibly because the linked-mastoid reference used there obscured lateralized differences. The aim of the present study was to replicate the previous auditory P300 subjective-accenting findings and to investigate whether these effects are preceded by ERP changes indicative of rhythmic modulation of perceptual processing. The previous auditory P300 findings were replicated. In addition, and consistent with current theories of rhythm processing, early ERP differences were observed for both standard and deviant tones from stimulus onset. These left-lateralized differences are consistent with a rhythmic, endogenously driven modulation of perception that influences the conscious experience of equitonal isochronous sequences.


Subject(s)
Auditory Perception/physiology , Event-Related Potentials, P300/physiology , Evoked Potentials, Auditory/physiology , Music/psychology , Acoustic Stimulation , Adult , Education , Electroencephalography , Female , Humans , Male , Middle Aged , Young Adult
13.
Brain Res ; 1223: 59-64, 2008 Aug 05.
Article in English | MEDLINE | ID: mdl-18590909

ABSTRACT

Humans can easily tap in synchrony with an auditory beat but not with an equivalent visual rhythmic sequence, suggesting that the sensation of meter (i.e. of an underlying regular pulse) may be inherently auditory. We assessed whether the perception of meter could also be felt with tactile sensory inputs. We found that, when participants were presented with identical rhythmic sequences filled with either short tones or hand stimulations, they could more efficiently tap in synchrony with strongly rather than weakly metric sequences. These observations suggest that non-musician adults can extract the metric structure of purely tactile rhythms and use it to tap regularly with the beat induced by such sequences. This finding represents a challenge for present models of rhythm processing.


Subject(s)
Movement/physiology , Music/psychology , Periodicity , Psychomotor Performance/physiology , Time Perception/physiology , Touch/physiology , Acoustic Stimulation , Female , Fingers/physiology , Humans , Male , Mechanoreceptors/physiology , Muscle, Skeletal/innervation , Muscle, Skeletal/physiology , Neuropsychological Tests , Observer Variation , Physical Stimulation , Reaction Time/physiology , Sensory Receptor Cells/physiology
14.
Percept Mot Skills ; 106(1): 171-87, 2008 Feb.
Article in English | MEDLINE | ID: mdl-18459366

ABSTRACT

This study assessed the influence of tempo on the selection of a sound sequence. In Exp. 1, synchronization with one of two regular subsequences embedded in a complex sequence was measured. Thirty participants preferred to synchronize with the faster subsequence when the subsequences were in a slow tempo range (IOIs ≥ 500 msec) and with the slower subsequence when they were in a fast tempo range (IOIs ≤ 300 msec). These results were replicated using a perceptual task (Exp. 2 and 3) in which 30 listeners had to detect a temporal irregularity in one of the two subsequences. Detection was better when the temporal irregularity was in the faster subsequence than in the slower one when the complex sequence was in a slow tempo range (IOIs ≥ 500 msec), and the reverse was obtained when the complex sequence was in a fast tempo range (IOIs ≤ 180 msec). These results have implications for the design of auditory alarms.
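The stimulus construction described here, a complex sequence made of two interleaved regular subsequences with different inter-onset intervals, can be sketched in a few lines. The IOI values and sequence duration below are illustrative, not the experiment's exact parameters.

```python
# Minimal sketch: onset times for a complex sequence made of two regular
# subsequences with different inter-onset intervals (IOIs). Values are
# illustrative only.
import numpy as np

duration = 10.0                       # total sequence duration (s)
ioi_fast, ioi_slow = 0.3, 0.5         # IOIs of the two subsequences (s)

onsets_fast = np.arange(0.0, duration, ioi_fast)
onsets_slow = np.arange(0.0, duration, ioi_slow)

# Merge into one event list, tagging which subsequence each tone belongs to.
events = sorted([(t, "fast") for t in onsets_fast] +
                [(t, "slow") for t in onsets_slow])
print(events[:10])
```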


Subject(s)
Acoustic Stimulation/methods , Auditory Perception/physiology , Choice Behavior/physiology , Pattern Recognition, Physiological , Psychomotor Performance/physiology , Sound , Adolescent , Adult , Attention/physiology , Discrimination, Psychological/physiology , Equipment Design/methods , Equipment Failure , Female , Humans , Male , Motor Skills/physiology , Psychoacoustics , Task Performance and Analysis , Time Perception
15.
Conscious Cogn ; 17(3): 790-7, 2008 Sep.
Article in English | MEDLINE | ID: mdl-17977748

ABSTRACT

There is growing interest in the effect of sound on visual motion perception. One such paradigm involves the illusion created when two identical objects moving towards each other on a two-dimensional visual display can be seen either to bounce off or to stream through each other. Previous studies show that the large bias normally seen toward the streaming percept can be modulated by the presentation of an auditory event at the moment of coincidence. However, no reports to date provide sufficient evidence to indicate whether the bounce-inducing effect of sound is due to a perceptual binding process or merely to an explicit inference resulting from the transient auditory stimulus resembling a physical collision of two objects. In the present study, we used a novel experimental design in which a subliminal sound was presented either 150 ms before, at, or 150 ms after the moment of coincidence of two disks moving towards each other. The results showed increased perception of bouncing (rather than streaming) when the subliminal sound was presented at or 150 ms after the moment of coincidence, compared to when no sound was presented. These findings provide the first empirical demonstration that activation of the human auditory system without reaching consciousness affects the perception of an ambiguous visual motion display.


Subject(s)
Motion Perception , Sound , Unconscious, Psychology , Visual Perception , Adult , Auditory Perception , Auditory Threshold , Female , Humans , Male
16.
Percept Mot Skills ; 98(3 Pt 1): 971-82, 2004 Jun.
Article in English | MEDLINE | ID: mdl-15209314

ABSTRACT

We measured brain activation during the perception of fingerspelled letters, printed letters, and abstract shapes (control condition) in six congenitally, profoundly deaf signers and six normally hearing subjects. Hearing subjects showed essentially extrastriate activation in the fingerspelled-letters and printed-letters conditions, whereas deaf subjects showed activation of a broader network in both conditions, comprising supplementary frontal areas (Brodmann Area 6), posterior areas, and the supramarginal gyrus. These results suggest that, on the one hand, different cerebral areas mediate the processing of printed letters in deaf and hearing subjects and, on the other hand, common cerebral areas are activated in deaf signers when they process fingerspelled or printed letters.


Subject(s)
Brain/anatomy & histology , Deafness , Magnetic Resonance Imaging , Perception , Sign Language , Humans
17.
Brain Cogn ; 54(2): 103-9, 2004 Mar.
Article in English | MEDLINE | ID: mdl-14980450

ABSTRACT

Recently, the relationship between music and nonmusical cognitive abilities has been highly debated. It has been reported that formal music training improves verbal, mathematical, or visuospatial performance in children. In the experiments described here, we tested whether visual perception and imagery abilities are enhanced in adult musicians compared with nonmusicians. In our main experiment, we measured reaction times of subjects who had to detect on which side of a horizontal or a vertical line a target dot was flashed. In the "imagery" condition, the reference line disappeared before the target dot was presented; in order to accomplish the task, subjects had to keep a mental image of the position of the line until the dot appeared. In the "perception" condition, the procedure and stimuli were the same except that the line remained on the screen until the dot was displayed. In both groups, reaction times were shorter for horizontal than for vertical discrimination, but reaction times were significantly shorter in musicians in all conditions. Moreover, discrimination on the vertical dimension, especially in the imagery condition, appeared to be greatly improved in the long term by musical expertise. Simple and choice visual reaction times indicate that this advantage could only be partly explained by better sensorimotor integration in adult musicians.


Subject(s)
Aptitude , Imagination , Music , Reaction Time , Space Perception , Visual Perception , Adult , Female , Humans , Male , Posture , Professional Competence
18.
Psychol Sci ; 14(4): 362-6, 2003 Jul.
Article in English | MEDLINE | ID: mdl-12807411

ABSTRACT

The phenomenon commonly known as subjective accenting refers to the fact that identical sound events within purely isochronous sequences are perceived as unequal. Although subjective accenting has been extensively explored using behavioral methods, no physiological evidence has ever been provided for it. In the present study, we tested the notion that these perceived irregularities are related to the dynamic deployment of attention. We disrupted listeners' expectancies in different positions of auditory equitone sequences and measured their responses through brain event-related potentials (ERPs). Significant differences in a late parietal (P3-like) ERP component were found between the responses elicited on odd-numbered versus even-numbered positions, suggesting that a default binary metric structure was perceived. Our findings indicate that this phenomenon has a rather cognitive, attention-dependent origin, partly affected by musical expertise.
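The core contrast reported here, ERPs elicited by tones in odd-numbered versus even-numbered sequence positions, can be sketched as a simple split-and-average over epochs. The array shapes, sampling rate, placeholder data, and P3 window are assumptions for illustration, not the study's recording or analysis parameters.

```python
# Minimal sketch: comparing ERPs to odd- vs. even-numbered positions of an
# isochronous sequence. Shapes, sampling rate, and the P3 window are assumed.
import numpy as np

fs = 250                                    # assumed sampling rate (Hz)
n_trials, n_samples = 200, fs               # 1-s epochs, one per tone
epochs = np.random.randn(n_trials, n_samples) * 5e-6   # placeholder EEG (V)
positions = np.arange(1, n_trials + 1)      # position of each tone in its sequence

erp_odd = epochs[positions % 2 == 1].mean(axis=0)
erp_even = epochs[positions % 2 == 0].mean(axis=0)

# Mean amplitude in a late parietal (P3-like) window, e.g. 300-500 ms.
window = slice(int(0.3 * fs), int(0.5 * fs))
print(erp_odd[window].mean(), erp_even[window].mean())
```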


Subject(s)
Attention/physiology , Auditory Perception/physiology , Cerebral Cortex/physiology , Electroencephalography , Music , Time Perception/physiology , Adult , Brain Mapping , Event-Related Potentials, P300/physiology , Female , Humans , Male , Parietal Lobe/physiology , Psychoacoustics , Set, Psychology , Sound Spectrography
19.
Cogn Neuropsychiatry ; 8(2): 89-106, 2003 May.
Article in English | MEDLINE | ID: mdl-16571553

ABSTRACT

INTRODUCTION: Patients with schizophrenia demonstrate a wide range of information-processing deficits. Most recent studies argue in favour of high-level deficits, including attention and context processing, whereas fewer studies have demonstrated deficits at earlier stages of processing, such as perceptual discrimination and organisation. This is the first study to investigate both high- and low-level processing, within a single paradigm, in the case of auditory temporal processing in schizophrenia. METHODS: Patients with schizophrenia were compared to controls on a series of tasks involving three auditory temporal processes ranging from low to higher level: (1) segregation of a complex sequence into component auditory streams; (2) detection of local temporal irregularities within a stream; (3) attentional focusing on one stream by the use of a cue preceding the complex sequence. RESULTS: The lowest level of processing examined here, stream segregation, appeared to function as well in patients as in controls. However, the higher-level processes, irregularity detection and attentional focusing, functioned in both groups but less efficiently in patients with schizophrenia. CONCLUSIONS: The results demonstrate abnormal auditory temporal processing in schizophrenia. Abnormal performance only on Processes 2 and 3 supports the hypothesis of higher-level rather than lower-level processing deficits in schizophrenia.

20.
J Exp Psychol Hum Percept Perform ; 25(6): 1742-1759, 1999 Dec.
Article in English | MEDLINE | ID: mdl-10641316

ABSTRACT

Previous findings on streaming are generalized to sequences composed of more than 2 subsequences. A new paradigm identified whether listeners perceive complex sequences as a single unit (integrative listening) or segregate them into 2 (or more) perceptual units (stream segregation). Listeners heard 2 complex sequences, each composed of 1, 2, 3, or 4 subsequences. Their task was to detect a temporal irregularity within 1 subsequence. In Experiment 1, the smallest frequency separation under which listeners were able to focus on 1 subsequence was unaffected by the number of co-occurring subsequences; nonfocused sounds were not perceptually organized into streams. In Experiment 2, detection improved progressively, not abruptly, as the frequency separation between subsequences increased from 0.25 to 6 auditory filters. The authors propose a model of perceptual organization of complex auditory sequences.
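The "auditory filters" unit used here can be made concrete with the ERB-rate scale: a common formula (Glasberg & Moore, 1990) maps frequency to ERB number, and the separation between two tones in auditory filters is the difference of their ERB numbers. Whether this is the exact filter model used in the paper is not stated in the abstract, so treat the sketch below as a generic illustration.

```python
# Minimal sketch: expressing the frequency separation between two tones in
# "auditory filters" using the ERB-rate scale (Glasberg & Moore, 1990).
# Generic illustration, not necessarily the paper's exact model.
import math

def erb_number(freq_hz: float) -> float:
    """ERB number (Cam) of a frequency, Glasberg & Moore (1990)."""
    return 21.4 * math.log10(4.37 * freq_hz / 1000.0 + 1.0)

def separation_in_filters(f1_hz: float, f2_hz: float) -> float:
    return abs(erb_number(f1_hz) - erb_number(f2_hz))

# Example: how far apart are 500 Hz and 630 Hz tones on this scale?
print(f"{separation_in_filters(500.0, 630.0):.2f} auditory filters")
```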


Subject(s)
Attention , Pitch Discrimination , Serial Learning , Adult , Female , Humans , Male , Psychoacoustics , Sound Spectrography , Time Perception