Results 1 - 7 of 7
1.
J Exp Psychol Hum Percept Perform ; 41(3): 761-89, 2015 Jun.
Article in English | MEDLINE | ID: mdl-25867505

ABSTRACT

Many studies have examined attention mechanisms involved in specific behavioral tasks (e.g., search, tracking, distractor inhibition). However, relatively little is known about the relationships among those attention mechanisms. Is there a fundamental attention faculty that makes a person superior or inferior at most types of attention tasks, or do relatively independent processes mediate different attention skills? We focused on individual differences in voluntary visual-attention abilities using a battery of 11 representative tasks. An application of parallel analysis, hierarchical-cluster analysis, and multidimensional scaling to the intertask correlation matrix revealed 4 functional clusters, representing spatiotemporal attention, global attention, transient attention, and sustained attention, organized along 2 dimensions, one contrasting spatiotemporal and global attention and the other contrasting transient and sustained attention. Comparison with the neuroscience literature suggests that the spatiotemporal-global dimension corresponds to the dorsal frontoparietal circuit and the transient-sustained dimension corresponds to the ventral frontoparietal circuit, with distinct subregions mediating the separate clusters within each dimension. We also obtained highly specific patterns of gender difference and of deficits for college students with elevated attention-deficit/hyperactivity disorder traits. These group differences suggest that different mechanisms of voluntary visual attention can be selectively strengthened or weakened based on genetic, experiential, and/or pathological factors.
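The cluster analysis described above can be sketched in miniature: convert an intertask correlation matrix to distances (1 - r) and merge the most similar clusters agglomeratively. This is a toy illustration, not the authors' data or pipeline; the four task names and all correlation values below are hypothetical, and single linkage stands in for whatever linkage the study used.

```python
# Hypothetical 4-task intertask correlation matrix (1.0 on the diagonal).
tasks = ["tracking", "search", "cueing", "vigilance"]
R = [
    [1.0, 0.7, 0.2, 0.1],
    [0.7, 1.0, 0.3, 0.2],
    [0.2, 0.3, 1.0, 0.6],
    [0.1, 0.2, 0.6, 1.0],
]

def cluster(names, R, n_clusters=2):
    """Single-linkage agglomerative clustering on distance d = 1 - r."""
    clusters = [[i] for i in range(len(names))]
    d = lambda i, j: 1.0 - R[i][j]
    while len(clusters) > n_clusters:
        # Find the pair of clusters with the smallest single-linkage distance.
        a, b = min(
            ((a, b) for a in range(len(clusters)) for b in range(a + 1, len(clusters))),
            key=lambda ab: min(d(i, j) for i in clusters[ab[0]] for j in clusters[ab[1]]),
        )
        clusters[a] += clusters.pop(b)  # merge b into a
    return [sorted(names[i] for i in c) for c in clusters]

groups = cluster(tasks, R)
print(groups)  # the two highly correlated task pairs end up in separate clusters
```

With these made-up correlations, tracking/search group together and cueing/vigilance group together, mirroring how functional clusters emerge from an intertask correlation structure.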


Subject(s)
Attention , Psychomotor Performance , Visual Perception , Adolescent , Adult , Aptitude , Attention Deficit Disorder with Hyperactivity/diagnosis , Attention Deficit Disorder with Hyperactivity/psychology , Female , Humans , Individuality , Male , Photic Stimulation , Reaction Time , Sex Factors , Young Adult
2.
Atten Percept Psychophys ; 76(8): 2221-8, 2014 Nov.
Article in English | MEDLINE | ID: mdl-24935805

ABSTRACT

Research has shown that information accessed from one sensory modality can influence perceptual and attentional processes in another modality. Here, we demonstrated a novel crossmodal influence of haptic-shape information on visual attention. Participants visually searched for a target object (e.g., an orange) presented among distractor objects, fixating the target as quickly as possible. While searching for the target, participants held an item of a specific shape in their hands, out of sight and never viewed. In two experiments, we demonstrated that the time for the eyes to reach a target, a measure of overt visual attention, was reduced when the shape of the held item (e.g., a sphere) was consistent with the shape of the visual target (e.g., an orange), relative to when the held shape was unrelated to the target (e.g., a hockey puck) or when no shape was held. This haptic-to-visual facilitation occurred despite the fact that the held shapes were not predictive of the visual targets' shapes, suggesting that the crossmodal influence occurred automatically, reflecting shape-specific haptic guidance of overt visual attention.


Subject(s)
Attention/physiology , Psychomotor Performance/physiology , Touch Perception/physiology , Visual Perception/physiology , Adult , Female , Humans , Male , Young Adult
3.
Psychon Bull Rev ; 20(1): 108-14, 2013 Feb.
Article in English | MEDLINE | ID: mdl-23090750

ABSTRACT

People naturally dance to music, and research has shown that rhythmic auditory stimuli facilitate production of precisely timed body movements. If motor mechanisms are closely linked to auditory temporal processing, just as auditory temporal processing facilitates movement production, producing action might reciprocally enhance auditory temporal sensitivity. We tested this novel hypothesis with a standard temporal-bisection paradigm, in which the slope of the temporal-bisection function provides a measure of temporal sensitivity. The bisection slope for auditory time perception was steeper when participants initiated each auditory stimulus sequence via a keypress than when they passively heard each sequence, demonstrating that initiating action enhances auditory temporal sensitivity. This enhancement is specific to the auditory modality, because voluntarily initiating each sequence did not enhance visual temporal sensitivity. A control experiment ruled out the possibility that tactile sensation associated with a keypress increased auditory temporal sensitivity. Taken together, these results demonstrate a unique reciprocal relationship between auditory time perception and motor mechanisms. As auditory perception facilitates precisely timed movements, generating action enhances auditory temporal sensitivity.
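The slope measure in the temporal-bisection paradigm can be illustrated with a minimal sketch: a steeper slope of the proportion of "long" responses against stimulus duration indicates finer temporal sensitivity. The durations and response proportions below are invented for illustration, and a simple least-squares slope stands in for a fitted psychometric function.

```python
# Hypothetical bisection data: proportion of "long" responses per duration.
durations = [300, 400, 500, 600, 700]            # stimulus duration (ms)
p_long_active = [0.02, 0.15, 0.50, 0.85, 0.98]   # self-initiated sequences
p_long_passive = [0.10, 0.25, 0.50, 0.75, 0.90]  # passively heard sequences

def slope(xs, ys):
    """Least-squares slope of y on x; steeper = greater temporal sensitivity."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

# The active (self-initiated) condition yields the steeper bisection function.
assert slope(durations, p_long_active) > slope(durations, p_long_passive)
```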


Subject(s)
Auditory Perception/physiology , Movement/physiology , Time Perception/physiology , Visual Perception/physiology , Female , Humans , Male , Touch Perception/physiology
4.
Acta Psychol (Amst) ; 137(2): 252-9, 2011 Jun.
Article in English | MEDLINE | ID: mdl-20864070

ABSTRACT

Auditory and visual processes demonstrably enhance each other based on spatial and temporal coincidence. Our recent results on visual search have shown that auditory signals also enhance visual salience of specific objects based on multimodal experience. For example, we tend to see an object (e.g., a cat) and simultaneously hear its characteristic sound (e.g., "meow"), to name an object when we see it, and to vocalize a word when we read it, but we do not tend to see a word (e.g., cat) and simultaneously hear the characteristic sound (e.g., "meow") of the named object. If auditory-visual enhancements occur based on this pattern of experiential associations, playing a characteristic sound (e.g., "meow") should facilitate visual search for the corresponding object (e.g., an image of a cat), hearing a name should facilitate visual search for both the corresponding object and corresponding word, but playing a characteristic sound should not facilitate visual search for the name of the corresponding object. Our present and prior results together confirmed these experiential association predictions. We also recently showed that the underlying object-based auditory-visual interactions occur rapidly (within 220 ms) and guide initial saccades towards target objects. If object-based auditory-visual enhancements are automatic and persistent, an interesting application would be to use characteristic sounds to facilitate visual search when targets are rare, such as during baggage screening. Our participants searched for a gun among other objects when a gun was presented on only 10% of the trials. The search time was speeded when a gun sound was played on every trial (primarily on gun-absent trials); importantly, playing gun sounds facilitated both gun-present and gun-absent responses, suggesting that object-based auditory-visual enhancements persistently increase the detectability of guns rather than simply biasing gun-present responses. Thus, object-based auditory-visual interactions that derive from experiential associations rapidly and persistently increase visual salience of corresponding objects.
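The detectability-versus-bias distinction drawn above is the standard signal-detection contrast between d' and the criterion c. As a minimal sketch with invented hit and false-alarm rates (not the study's data): a genuine sensitivity gain raises d' while leaving the criterion essentially in place, whereas a pure response bias would shift c toward "target present" without changing d'.

```python
from statistics import NormalDist

z = NormalDist().inv_cdf  # inverse standard-normal CDF

def dprime_and_criterion(hit_rate, fa_rate):
    """Classic equal-variance signal-detection indices."""
    d = z(hit_rate) - z(fa_rate)            # detectability
    c = -0.5 * (z(hit_rate) + z(fa_rate))   # response criterion
    return d, c

# Hypothetical rates: searching without vs. with the characteristic sound.
d_silent, c_silent = dprime_and_criterion(0.80, 0.10)
d_sound, c_sound = dprime_and_criterion(0.90, 0.05)

# d' rises with the sound while the criterion barely moves: a sensitivity
# gain, not a bias toward "target present" responses.
assert d_sound > d_silent
```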


Subject(s)
Auditory Perception/physiology , Eye Movements/physiology , Visual Perception/physiology , Acoustic Stimulation , Discrimination, Psychological/physiology , Humans , Photic Stimulation , Reaction Time/physiology , Young Adult
5.
Atten Percept Psychophys ; 72(7): 1736-41, 2010 Oct.
Article in English | MEDLINE | ID: mdl-20952773

ABSTRACT

When you are looking for an object, does hearing its characteristic sound make you find it more quickly? Our recent results supported this possibility by demonstrating that when a cat target, for example, was presented among other objects, a simultaneously presented "meow" sound (containing no spatial information) reduced the manual response time for visual localization of the target. To extend these results, we determined how rapidly an object-specific auditory signal can facilitate target detection in visual search. On each trial, participants fixated a specified target object as quickly as possible. The target's characteristic sound speeded the saccadic search time within 215-220 msec and also guided the initial saccade toward the target, compared with presentation of a distractor's sound or with no sound. These results suggest that object-based auditory-visual interactions rapidly increase the target object's salience in visual search.


Subject(s)
Association , Attention , Auditory Perception , Pattern Recognition, Visual , Reaction Time , Humans , Saccades
6.
J Vis ; 9(4): 1.1-12, 2009 Apr 03.
Article in English | MEDLINE | ID: mdl-19757910

ABSTRACT

The ability to track multiple moving objects with attention has been the focus of much research. However, the literature is relatively inconclusive regarding two key aspects of this ability, (1) whether the distribution of attention among the tracked targets is fixed during a period of tracking or is dynamically adjusted, and (2) whether motion information (direction and/or speed) is used to anticipate target locations even when velocities constantly change due to inter-object collisions. These questions were addressed by analyzing target-localization errors. Targets in crowded situations (i.e., those in danger of being lost) were localized more precisely than were uncrowded targets. Furthermore, the response vector (pointing from the target location to the reported location) was tuned to the direction of target motion, and observers with stronger direction tuning localized targets more precisely. Overall, our results provide evidence that multiple-object tracking mechanisms dynamically adjust the spatial distribution of attention in a demand-based manner (allocating more resources to targets in crowded situations) and utilize motion information (especially direction information) to anticipate target locations.
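The "response vector" analysis mentioned above can be sketched as a simple decomposition: the error from the true target location to the reported location is projected onto the target's motion direction, so a positive component means the report leads the target along its path. The coordinates and motion direction below are invented for illustration.

```python
import math

def along_motion_component(target, reported, motion_dir_deg):
    """Signed localization-error component along the target's motion direction."""
    ex, ey = reported[0] - target[0], reported[1] - target[1]
    th = math.radians(motion_dir_deg)
    return ex * math.cos(th) + ey * math.sin(th)

# Target moving rightward (0 deg); the report is displaced ahead of it,
# so the error component along the motion direction is positive.
err = along_motion_component((100.0, 100.0), (106.0, 101.0), 0.0)
assert err > 0
```

Aggregating this signed component across trials is one way to quantify whether response vectors are tuned to the direction of target motion.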


Subject(s)
Attention/physiology , Motion Perception/physiology , Visual Pathways/physiology , Fixation, Ocular/physiology , Humans , Models, Neurological , Photic Stimulation/methods
7.
Psychon Bull Rev ; 15(3): 548-54, 2008 Jun.
Article in English | MEDLINE | ID: mdl-18567253

ABSTRACT

In a natural environment, objects that we look for often make characteristic sounds. A hiding cat may meow, or the keys in the cluttered drawer may jingle when moved. Using a visual search paradigm, we demonstrated that characteristic sounds facilitated visual localization of objects, even when the sounds carried no location information. For example, finding a cat was faster when participants heard a meow sound. In contrast, sounds had no effect when participants searched for names rather than pictures of objects. For example, hearing "meow" did not facilitate localization of the word cat. These results suggest that characteristic sounds cross-modally enhance visual (rather than conceptual) processing of the corresponding objects. Our behavioral demonstration of object-based cross-modal enhancement complements the extensive literature on space-based cross-modal interactions. When looking for your keys next time, you might want to play jingling sounds.


Subject(s)
Sound Localization , Sound , Visual Perception , Attention , Humans , Reaction Time