1.
Cereb Cortex; 28(3): 805-818, 2018 Mar 1.
Article in English | MEDLINE | ID: mdl-28052922

ABSTRACT

When hearing knocking on a door, a listener typically identifies both the action (forceful and repeated impacts) and the object (a thick wooden board) causing the sound. The current work studied the neural bases of sound source identification by switching listeners' attention toward these different aspects of a set of simple sounds during functional magnetic resonance imaging (fMRI) scanning: participants either discriminated the action or the material that caused the sounds, or they simply discriminated meaningless scrambled versions of them. Overall, discriminating action and material elicited neural activity in a left-lateralized frontoparietal network found in other studies of sound identification, in which the inferior frontal sulcus and the ventral premotor cortex were modulated by selective attention and sensitive to task demands. More strikingly, discriminating materials elicited increased activity in cortical regions connecting auditory inputs to semantic, motor, and even visual representations, whereas discriminating actions did not increase activity in any region. These results indicate that discriminating and identifying materials requires deeper processing of the stimuli than discriminating actions, and they are consistent with previous studies suggesting that auditory perception is better suited to comprehending the actions than the objects that produce sounds in the listener's environment.


Subject(s)
Attention/physiology, Auditory Perception/physiology, Brain Mapping, Cerebral Cortex/physiology, Discrimination, Psychological/physiology, Sound, Acoustic Stimulation, Analysis of Variance, Cerebral Cortex/diagnostic imaging, Female, Functional Laterality, Humans, Image Processing, Computer-Assisted, Linear Models, Magnetic Resonance Imaging, Male, Oxygen/blood, Reaction Time/physiology
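As a rough illustration of the task design described in the abstract above, the sketch below enumerates the three attention conditions (action, material, scrambled control) and builds a same/different discrimination trial for each. It is a hypothetical reconstruction under assumed names (sound files, condition labels, response format), not the authors' stimulus code.

```python
# Hypothetical sketch of the three-condition discrimination design: the same
# sounds appear in every condition, and only the instructed feature (action,
# material, or a scrambled-sound control) changes. All file names and labels
# are invented for illustration.
import random

CONDITIONS = ("action", "material", "scrambled")
SOUNDS = {
    ("tapping", "wood"): "tap_wood.wav",
    ("tapping", "metal"): "tap_metal.wav",
    ("scraping", "wood"): "scrape_wood.wav",
    ("scraping", "metal"): "scrape_metal.wav",
}

def make_trial(condition):
    """Draw two sounds and compute the correct answer for the instructed feature."""
    (a1, m1), (a2, m2) = random.sample(list(SOUNDS), 2)
    if condition == "action":
        correct = "same" if a1 == a2 else "different"
    elif condition == "material":
        correct = "same" if m1 == m2 else "different"
    else:
        # Control condition: meaningless scrambled versions of the same sounds.
        correct = None
    return {"condition": condition,
            "sounds": (SOUNDS[(a1, m1)], SOUNDS[(a2, m2)]),
            "correct": correct}

if __name__ == "__main__":
    for condition in CONDITIONS:
        print(make_trial(condition))
```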
2.
Front Psychol; 7: 231, 2016.
Article in English | MEDLINE | ID: mdl-26941686

ABSTRACT

Actions that produce sounds pervade our daily lives. Some of these sounds are a natural consequence of physical interactions (such as a clang resulting from dropping a pan), but others are artificially designed (such as a beep resulting from a keypress). Although the relationship between actions and sounds has previously been examined, the frame of reference of these associations is still unknown, despite being a fundamental property of a psychological representation. For example, when an association is created between a keypress and a tone, it is unclear whether the frame of reference is egocentric (a gesture-sound association) or exocentric (a key-sound association). This question is especially important for artificially created associations, which occur in technology that pairs sounds with actions, such as gestural interfaces, virtual or augmented reality, and simple buttons that produce tones. The frame of reference could directly influence the learnability, ease of use, extent of immersion, and many other aspects of the interaction. To explore whether action-sound associations are egocentric or exocentric, an experiment was conducted using a computer keyboard's number pad in which moving a finger from one key to another produced a sound, thus creating an action-sound association. Half of the participants received egocentric instructions to move their finger with a particular gesture; the other half received exocentric instructions to move their finger to a particular number on the keypad. All participants performed the same actions; only the framing of the action, conveyed through the task instructions, varied between conditions. Participants in the egocentric condition learned the gesture-sound association, as revealed by a priming paradigm, whereas the exocentric condition showed no priming effect. This finding suggests that action-sound associations are egocentric in nature. A second part of the same session further confirmed the egocentric nature of these associations by showing no change in the priming effect after participants moved to a different starting location. Our findings are consistent with an egocentric representation of action-sound associations, which has implications for applications that rely on these associations.
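To make the two framings concrete, the sketch below codes one and the same keypad movement in both reference frames: egocentrically as a displacement of the finger (a gesture) and exocentrically as the identity of the target key. This is a minimal illustration under assumed names (the KEYPAD layout and tone file are invented), not the experimental software.

```python
# Hypothetical sketch: the same physical movement on a number pad can be
# described egocentrically (relative finger displacement, i.e. a gesture)
# or exocentrically (the world-anchored key that is reached).

# Number-pad layout: key -> (column, row), with row 0 at the bottom.
KEYPAD = {
    "7": (0, 2), "8": (1, 2), "9": (2, 2),
    "4": (0, 1), "5": (1, 1), "6": (2, 1),
    "1": (0, 0), "2": (1, 0), "3": (2, 0),
}

def egocentric_code(start_key, end_key):
    """Gesture-based description: displacement relative to the finger."""
    (x0, y0), (x1, y1) = KEYPAD[start_key], KEYPAD[end_key]
    return (x1 - x0, y1 - y0)        # e.g. (0, 1) means "one row up"

def exocentric_code(start_key, end_key):
    """Key-based description: the target location in the world."""
    return end_key                   # e.g. "8"

# Which of these two codes the sound becomes bound to is the question the
# experiment addresses; the pairing below is purely illustrative.
SOUND_BY_GESTURE = {(0, 1): "tone_A.wav"}   # egocentric association
SOUND_BY_KEY = {"8": "tone_A.wav"}          # exocentric association

if __name__ == "__main__":
    print(egocentric_code("5", "8"), exocentric_code("5", "8"))
```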

3.
PLoS One; 10(11): e0141791, 2015.
Article in English | MEDLINE | ID: mdl-26544884

ABSTRACT

We report a series of experiments on a little-studied type of compatibility effect between a stimulus and a response: the priming of manual gestures via sounds associated with these gestures. The goal was to investigate the plasticity of the gesture-sound associations mediating this type of priming. Five experiments used a primed choice-reaction task. Participants were cued by a stimulus to perform response gestures that produced response sounds; those sounds were also used as primes before the response cues. We compared arbitrary associations between gestures and sounds (key lifts and pure tones) created during the experiment (i.e., with no pre-existing knowledge) with ecological associations corresponding to the structure of the world (tapping gestures and sounds, scraping gestures and sounds) learned over the participants' lifetimes (and thus existing prior to the experiment). Two results were found. First, the priming effect exists for ecological as well as arbitrary associations between gestures and sounds. Second, the priming effect is greatly reduced for ecological associations and eliminated for arbitrary associations when the response gesture stops producing the associated sounds. These results provide evidence that auditory-motor priming is mainly created by rapid learning of the association between sounds and the gestures that produce them. Auditory-motor priming is therefore mediated by short-term associations between gestures and sounds that can be readily reconfigured regardless of prior knowledge.


Subject(s)
Auditory Perception/physiology, Gestures, Sound, Adolescent, Cues, Female, Humans, Male, Reaction Time, Young Adult
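For readers unfamiliar with how a priming effect is quantified in a primed choice-reaction task, the sketch below computes the reaction-time advantage for trials whose prime sound matches the subsequently cued gesture over trials where it does not. The data and field names are invented; this is not the authors' analysis code.

```python
# Hypothetical sketch: the priming effect is the mean reaction-time (RT)
# difference between incongruent trials (prime sound mismatches the cued
# gesture) and congruent trials (prime sound matches it).
from statistics import mean

def priming_effect(trials):
    """trials: iterable of dicts with 'prime', 'response', and 'rt_ms' keys."""
    congruent = [t["rt_ms"] for t in trials if t["prime"] == t["response"]]
    incongruent = [t["rt_ms"] for t in trials if t["prime"] != t["response"]]
    return mean(incongruent) - mean(congruent)   # positive = facilitation

# Invented example data with tapping and scraping gestures/sounds.
example_trials = [
    {"prime": "tap", "response": "tap", "rt_ms": 410},
    {"prime": "scrape", "response": "tap", "rt_ms": 455},
    {"prime": "scrape", "response": "scrape", "rt_ms": 420},
    {"prime": "tap", "response": "scrape", "rt_ms": 460},
]

if __name__ == "__main__":
    print(f"priming effect: {priming_effect(example_trials):.0f} ms")
```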