Results 1 - 20 of 41
1.
J Exp Psychol Learn Mem Cogn ; 49(7): 1068-1090, 2023 Jul.
Article in English | MEDLINE | ID: mdl-36521155

ABSTRACT

Involuntary musical imagery (INMI; more commonly known as "earworms" or having a song "stuck in your head") is a common musical phenomenon and one of the most salient examples of spontaneous cognition. Despite the ubiquitous nature of INMI in the general population, functional roles of INMI remain to be fully established and characterized. Findings that spontaneous reactivation of mental representations aids in memory consolidation raise the possibility that INMI also serves in this capacity. In three experiments, we manipulated the probability of experiencing INMI for novel music loops by first exposing participants to these loops during tasks that varied in attentional and sensorimotor demands. We measured INMI for loops and the quality of individual loop memories using different tasks both immediately following exposure and at a delay of 1 week. Across experiments, reduced exposure to music had the largest effect on INMI and loop memory. In Experiments 1 and 2, music encoding was resilient to manipulations of attentional focus; however, in Experiment 3, engaging sequence learning processes with an unrelated task during music exposure reduced the subsequent accuracy of loop memories and the likelihood of experiencing INMI. In each experiment, the amount of INMI experienced for a loop across the delay period predicted improvements in the accuracy of a loop memory over time. We thus provide evidence for a memory-consolidation role for INMI, in which the spontaneous replay of recently encoded music is related to the quality of music encoding and predicts changes in music memory over time. (PsycInfo Database Record (c) 2023 APA, all rights reserved).


Subject(s)
Music , Humans , Imagination/physiology , Cognition , Learning , Attention/physiology
2.
Front Hum Neurosci ; 16: 916551, 2022.
Article in English | MEDLINE | ID: mdl-35782041

ABSTRACT

Synchronization of movement enhances cooperation and trust between people. However, the degree to which individuals can synchronize with each other depends on their ability to perceive the timing of others' actions and produce movements accordingly. Here, we introduce an assistive device, a multi-person adaptive metronome, to facilitate synchronization abilities. The adaptive metronome is implemented on Arduino Uno circuit boards, allowing for negligible temporal latency between tapper input and adaptive sonic output. Across five experiments (two single-tapper and three four-tapper group experiments), we analyzed the effects of metronome adaptivity (percent correction based on the immediately preceding tap-metronome asynchrony) and auditory feedback on tapping performance and subjective ratings. In all experiments, tapper synchronization with the metronome was significantly enhanced with 25-50% adaptivity, compared to no adaptation. In group experiments with auditory feedback, synchrony remained enhanced even at 70-100% adaptivity; without feedback, synchrony at these high adaptivity levels returned to near baseline. Subjective ratings of being in the groove, in synchrony with the metronome, in synchrony with others, liking the task, and difficulty all reduced to one latent factor, which we termed enjoyment. This same factor structure replicated across all experiments. In predicting enjoyment, we found an interaction between auditory feedback and metronome adaptivity, with increased enjoyment at optimal levels of adaptivity only with auditory feedback and a severe decrease in enjoyment at higher levels of adaptivity, especially without feedback. Exploratory analyses relating person-level variables to tapping performance showed that musical sophistication and trait sadness contributed to the degree to which an individual differed in tapping stability from the group.
Nonetheless, individuals and groups benefited from adaptivity, regardless of their musical sophistication. Further, individuals who tapped less variably than the group (which only occurred ∼25% of the time) were more likely to feel "in the groove." Overall, this work replicates previous single-person adaptive metronome studies and extends them to group contexts, thereby contributing to our understanding of the temporal, auditory, psychological, and personal factors underlying interpersonal synchrony and subjective enjoyment during sensorimotor interaction. Finally, it provides an open-source tool for studying such factors in a controlled way.
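The adaptivity manipulation described above (percent correction of the immediately preceding tap-metronome asynchrony) can be sketched as a one-parameter phase-correction rule. This is a minimal illustration under assumed names and values, not the Arduino implementation used in the study:

```python
def next_metronome_onset(prev_onset, ioi, tap_time, alpha):
    """Schedule the next metronome click, correcting a fraction `alpha`
    (0.0 = fixed metronome, 1.0 = 100% adaptivity) of the asynchrony
    between the last tap and the last click."""
    asynchrony = tap_time - prev_onset  # positive = tapper was late
    return prev_onset + ioi + alpha * asynchrony

# A tapper who is consistently 20 ms late, with 25% adaptivity:
onset, ioi = 0.0, 0.5  # times in seconds; 500-ms base interval
for _ in range(3):
    tap = onset + 0.02
    onset = next_metronome_onset(onset, ioi, tap, alpha=0.25)
```

With alpha = 0 the metronome ignores the tapper entirely; at very high alpha it chases every tap, which in a group setting can destabilize the shared pulse, consistent with the drop in synchrony and enjoyment observed at 70-100% adaptivity without feedback.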

3.
Front Psychol ; 13: 901272, 2022.
Article in English | MEDLINE | ID: mdl-35898999

ABSTRACT

Motivation for bodily movement, physical activity and exercise varies from moment to moment. These motivation states may be "affectively-charged," ranging from instances of lower tension (e.g., desires, wants) to higher tension (e.g., cravings and urges). Currently, it is not known how often these states have been investigated in clinical populations (e.g., eating disorders, exercise dependence/addiction, Restless Legs Syndrome, diabetes, obesity) vs. healthy populations (e.g., in studies of motor control; groove in music psychology). The objective of this scoping review protocol is to quantify the literature on motivation states, to determine which topical areas are represented in investigations of clinical and healthy populations, and to discover pertinent details such as instrumentation, terminology, theories and conceptual models, correlates, and mechanisms of action. Iterative searches of scholarly databases will take place to determine which combination of search terms (e.g., "motivation states" and "physical activity"; "desire to be physically active," etc.) captures the greatest number of relevant results. Studies will be included if motivation states for movement (e.g., desires, urges) are specifically measured or addressed. Studies will be excluded if they refer to motivation as a trait. A charting data form was developed to scan all relevant documents for later data extraction. The primary outcome is simply the extent of the literature on the topic. Results will be stratified by population/condition. This scoping review will unify a diverse literature, which may result in the creation of unique models or paradigms that can be utilized to better understand motivation for bodily movement and exercise.

4.
J Exp Psychol Gen ; 151(1): 1-24, 2022 Jan.
Article in English | MEDLINE | ID: mdl-34110893

ABSTRACT

Why is music effective at evoking memories from one's past? Familiar music is a potent cue that can trigger, often involuntarily, the recollection of associated autobiographical memories. The mechanisms by which associations between music and nonmusical knowledge initially form and consolidate into long-term memory have not been elucidated. In three experiments, we linked two common musical phenomena, involuntary musical imagery (INMI; commonly called "earworms") and music-evoked remembering, in testing the hypothesis that such imagery aids in the consolidation of memory for events with which music becomes associated. We manipulated the probability of experiencing INMI for novel music loops by first exposing participants to these loops during tasks that varied in attentional and sensorimotor demands. Then, 1 week later, these loops served as soundtracks for unfamiliar movies. Immediately after movie viewing, and at subsequent delays of 1-4 weeks, participants recalled movie details, using the soundtracks as retrieval cues. The amount of INMI across the delay periods predicted both the accuracy of the memory for the music itself and the amount of recalled movie knowledge at the temporal granularity of the 30-s epochs during which individual loops played. We conclude that the replay of musical sequence memories during episodes of INMI serves as a consolidation mechanism both for the music and for associated episodic information. We thus demonstrate that spontaneous internally cued memory reactivation is a naturally occurring memory process that improves retention of real-world event knowledge. (PsycInfo Database Record (c) 2022 APA, all rights reserved).


Subject(s)
Memory, Episodic , Music , Cues , Humans , Memory, Long-Term , Mental Recall/physiology
5.
J Exp Psychol Hum Percept Perform ; 44(11): 1694-1711, 2018 Nov.
Article in English | MEDLINE | ID: mdl-30091636

ABSTRACT

Many environmental sounds, such as music or speech, are patterned in time. Dynamic attending theory, and supporting empirical evidence, suggests that a stimulus's temporal structure serves to orient attention to specific moments in time. One instantiation of this theory posits that attention synchronizes to the temporal structure of a stimulus in an oscillatory fashion, with optimal perception at salient time points or oscillation peaks. We examined whether a model consisting of damped linear oscillators succeeds at predicting temporal attention behavior in rhythmic multi-instrumental music. We conducted 3 experiments in which we mapped listeners' perceptual sensitivity by estimating detection thresholds for intensity deviants embedded at multiple time points within a stimulus pattern. We compared participants' thresholds for detecting intensity changes at various time points with the modeled salience prediction at each of those time points. Across all experiments, results showed that the resonator model predicted listener thresholds, such that listeners were more sensitive to probes at time points corresponding to greater model-predicted salience. This effect held for both intensity increment and decrement probes and for metrically simple and complex stimuli. Moreover, the resonator model explained the data better than did predictions based on canonical metric hierarchy or auditory scene density. Our results offer new insight into the temporal orienting of attention in complex auditory scenes using a parsimonious computational model for predicting attentional dynamics. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
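The damped-linear-oscillator idea described above can be sketched numerically: an oscillator driven by stimulus onsets accumulates energy when the onsets match its resonant period, and its instantaneous magnitude can stand in for predicted temporal salience. The parameters and integration scheme below are illustrative assumptions, not the authors' resonator model:

```python
import math

def oscillator_response(onsets, freq_hz, zeta, dur, dt=0.001):
    """Simulate one damped linear oscillator driven by an impulse train.
    `onsets` are event times in seconds; returns displacement per sample."""
    omega = 2.0 * math.pi * freq_hz
    onset_samples = {int(round(t / dt)) for t in onsets}
    x, v, out = 0.0, 0.0, []
    for i in range(int(dur / dt)):
        drive = 1.0 / dt if i in onset_samples else 0.0  # unit impulse
        a = drive - 2.0 * zeta * omega * v - omega * omega * x
        v += a * dt  # semi-implicit Euler keeps the oscillation stable
        x += v * dt
        out.append(x)
    return out

# Oscillator tuned to a 2 Hz pulse, driven by onsets every 500 ms;
# amplitude grows across kicks because the drive matches the resonance:
resp = oscillator_response([0.0, 0.5, 1.0, 1.5], freq_hz=2.0, zeta=0.05, dur=2.0)
```

A full model would use a bank of such resonators across periodicities and read out the summed response at probe times as the salience prediction.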


Subject(s)
Attention/physiology , Auditory Perception/physiology , Music , Adolescent , Adult , Female , Humans , Male , Models, Theoretical , Time Factors , Young Adult
6.
J Eye Mov Res ; 11(2)2018 Nov 20.
Article in English | MEDLINE | ID: mdl-33828695

ABSTRACT

Rhythm is a ubiquitous feature of music that induces specific neural modes of processing. In this paper, we assess the potential of a stimulus-driven linear oscillator model (57) to predict dynamic attention to complex musical rhythms on an instant-by-instant basis. We use perceptual thresholds and pupillometry as attentional indices against which to test our model predictions. During a deviance detection task, participants listened to continuously looping, multi-instrument, rhythmic patterns, while being eye-tracked. Their task was to respond any time they heard an increase in intensity (dB SPL). An adaptive thresholding algorithm adjusted deviant intensity at multiple probed temporal locations throughout each rhythmic stimulus. The oscillator model predicted participants' perceptual thresholds for detecting deviants at probed locations, with a low temporal salience prediction corresponding to a high perceptual threshold and vice versa. A pupil dilation response was observed for all deviants. Notably, the pupil dilated even when participants did not report hearing a deviant. Maximum pupil size and resonator model output were significant predictors of whether a deviant was detected or missed on any given trial. Besides the evoked pupillary response to deviants, we also assessed the continuous pupillary signal to the rhythmic patterns. The pupil exhibited entrainment at prominent periodicities present in the stimuli and followed each of the different rhythmic patterns in a unique way. Overall, these results replicate previous studies using the linear oscillator model to predict dynamic attention to complex auditory scenes and extend the utility of the model to the prediction of neurophysiological signals, in this case the pupillary time course; however, we note that the amplitude envelope of the acoustic patterns may serve as a similarly useful predictor.
To our knowledge, this is the first paper to show entrainment of pupil dynamics by demonstrating a phase relationship between musical stimuli and the pupillary signal.
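The adaptive thresholding algorithm is not specified in the abstract; a one-up/one-down staircase is a common choice and serves here as a hedged sketch (the actual rule and step size are assumptions):

```python
def staircase_update(intensity_db, detected, step_db=1.0):
    """One-up/one-down rule: make the deviant quieter after a hit and
    louder after a miss, so the track oscillates near the detection
    threshold (the 50% point for this particular rule)."""
    return intensity_db - step_db if detected else intensity_db + step_db

# Simulated observer whose true threshold is a 6 dB increment:
level = 10.0
for _ in range(20):
    level = staircase_update(level, detected=(level > 6.0))
```

After the initial descent, the track alternates around the simulated threshold; averaging the reversal points yields the threshold estimate at each probed location.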

7.
Cereb Cortex ; 28(11): 3939-3950, 2018 11 01.
Article in English | MEDLINE | ID: mdl-29028939

ABSTRACT

Classic psychedelic drugs (serotonin 2A, or 5HT2A, receptor agonists) have notable effects on music listening. In the current report, blood oxygen level-dependent (BOLD) signal was collected during music listening in 25 healthy adults after administration of placebo, lysergic acid diethylamide (LSD), and LSD pretreated with the 5HT2A antagonist ketanserin, to investigate the role of 5HT2A receptor signaling in the neural response to the time-varying tonal structure of music. Tonality-tracking analysis of BOLD data revealed that 5HT2A receptor signaling alters the neural response to music in brain regions supporting basic and higher-level musical and auditory processing, and areas involved in memory, emotion, and self-referential processing. This suggests a critical role of 5HT2A receptor signaling in supporting the neural tracking of dynamic tonal structure in music, as well as in supporting the associated increases in emotionality, connectedness, and meaningfulness in response to music that are commonly observed after the administration of LSD and other psychedelics. Together, these findings inform the neuropsychopharmacology of music perception and cognition, meaningful music listening experiences, and altered perception of music during psychedelic experiences.


Subject(s)
Auditory Perception/drug effects , Auditory Perception/physiology , Brain/drug effects , Brain/physiology , Lysergic Acid Diethylamide/administration & dosage , Music , Receptor, Serotonin, 5-HT2A/physiology , Serotonin Receptor Agonists/administration & dosage , Brain Mapping , Double-Blind Method , Emotions/drug effects , Hallucinogens/administration & dosage , Humans , Ketanserin/administration & dosage , Magnetic Resonance Imaging , Memory/drug effects , Serotonin 5-HT2 Receptor Antagonists/administration & dosage
8.
Neuropsychologia ; 91: 234-246, 2016 Oct.
Article in English | MEDLINE | ID: mdl-27526666

ABSTRACT

Nostalgia is an emotion that is most commonly associated with personally and socially relevant memories. It is primarily positive in valence and is readily evoked by music. It is also an idiosyncratic experience that varies between individuals based on affective traits. We identified frontal, limbic, paralimbic, and midbrain brain regions in which the strength of the relationship between ratings of nostalgia evoked by music and blood-oxygen-level-dependent (BOLD) signal was predicted by affective personality measures (nostalgia proneness and the sadness scale of the Affective Neuroscience Personality Scales) that are known to modulate the strength of nostalgic experiences. We also identified brain areas including the inferior frontal gyrus, substantia nigra, cerebellum, and insula in which time-varying BOLD activity correlated more strongly with the time-varying tonal structure of nostalgia-evoking music than with music that evoked no or little nostalgia. These findings illustrate one way in which the reward and emotion regulation networks of the brain are recruited during the experiencing of complex emotional experiences triggered by music. These findings also highlight the importance of considering individual differences when examining the neural responses to strong and idiosyncratic emotional experiences. Finally, these findings provide a further demonstration of the use of time-varying stimulus-specific information in the investigation of music-evoked experiences.


Subject(s)
Auditory Perception/physiology , Brain/physiology , Emotions/physiology , Individuality , Memory/physiology , Music/psychology , Adult , Brain Mapping , Cerebrovascular Circulation/physiology , Female , Humans , Linear Models , Magnetic Resonance Imaging , Male , Neuropsychological Tests , Oxygen/blood , Personality Tests , Self Report , Young Adult
9.
Handb Clin Neurol ; 129: 187-205, 2015.
Article in English | MEDLINE | ID: mdl-25726270

ABSTRACT

Music is a multifaceted psychological phenomenon, and separating the perceptual aspects of musical experiences from other aspects of those experiences is difficult, given music's propensity to trigger memories, movements, and emotions. Given that music is primarily an auditory phenomenon, it is reasonable to assume that the auditory cortex will play a major role in the representation of musical auditory scenes. The primary objective of this chapter was to survey and meta-analyze the neuroimaging literature in order to determine whether a delineation of the lateral temporal lobes emerges in terms of the processing of tonal, temporal, and timbral aspects of musical information. The meta-analysis revealed both overlapping and non-overlapping areas of auditory cortex, with a tendency for melodic and harmonic manipulations to activate areas outside the primary auditory cortex. Regions of the superior temporal gyrus and superior temporal sulcus rostral and ventral to the auditory cortex appear to play an important role in the perception of melodic intervals and patterns, and harmonies, but may not play a direct role in maintaining or evaluating higher-order tonal relationships that govern key membership or relationships between major and minor keys.


Subject(s)
Auditory Pathways/physiology , Auditory Perception/physiology , Brain Mapping , Brain/physiology , Music , Auditory Pathways/anatomy & histology , Humans
10.
Psychon Bull Rev ; 22(1): 163-9, 2015 Feb.
Article in English | MEDLINE | ID: mdl-24865280

ABSTRACT

Melody recognition entails the encoding of pitch intervals between successive notes. While it has been shown that a whole melodic sequence is better encoded than the sum of its constituent intervals, the underlying reasons have remained opaque. Here, we compared listeners' accuracy in encoding the relative pitch distance between two notes (for example, C, E) of an interval to listeners' accuracy under the following three modifications: (1) doubling the duration of each note (C - E -), (2) repetition of each note (C, C, E, E), and (3) adding a preceding note (G, C, E). Repeating (2) or adding an extra note (3) improved encoding of relative pitch distance when the melodic sequences were transposed to other keys, but lengthening the duration (1) did not improve encoding relative to the standard two-note interval sequences. Crucially, encoding accuracy was higher with the four-note sequences than with long two-note sequences despite the fact that sensory (pitch) information was held constant. We interpret the results to show that re-forming the Gestalts of two-note intervals into multi-note "melodies" results in more accurate encoding of relational pitch information due to a richer structural context in which to embed the interval.


Subject(s)
Auditory Perception , Music , Pitch Perception , Recognition, Psychology , Adult , Female , Humans , Male , Perceptual Distortion , Time Perception , Young Adult
11.
J Exp Psychol Hum Percept Perform ; 40(4): 1679-96, 2014 Aug.
Article in English | MEDLINE | ID: mdl-24979362

ABSTRACT

Music often evokes spontaneous movements in listeners that are synchronized with the music, a phenomenon that has been characterized as being in "the groove." However, the musical factors that contribute to listeners' initiation of stimulus-coupled action remain unclear. Evidence suggests that newly appearing objects in auditory scenes orient listeners' attention, and that in multipart music, newly appearing instrument or voice parts can engage listeners' attention and elicit arousal. We posit that attentional engagement with music can influence listeners' spontaneous stimulus-coupled movement. Here, 2 experiments (involving participants with and without musical training) tested the effect of staggering instrument entrances across time and varying the number of concurrent instrument parts within novel multipart music on listeners' engagement with the music, as assessed by spontaneous sensorimotor behavior and self-reports. Experiment 1 assessed listeners' moment-to-moment ratings of perceived groove, and Experiment 2 examined their spontaneous tapping and head movements. We found that, for both musically trained and untrained participants, music with more instruments led to higher ratings of perceived groove, and that music with staggered instrument entrances elicited both increased sensorimotor coupling and increased reports of perceived groove. Although untrained participants were more likely to rate music as higher in groove, trained participants showed greater propensity for tapping along, and they did so more accurately. The quality of synchronization of head movements with the music, however, did not differ as a function of training. Our results shed new light on the relationship between complex musical scenes, attention, and spontaneous sensorimotor behavior.


Subject(s)
Attention/physiology , Auditory Perception/physiology , Motor Activity/physiology , Music/psychology , Adult , Female , Humans , Male , Time Factors , Young Adult
12.
Psychol Rev ; 121(1): 33-65, 2014 Jan.
Article in English | MEDLINE | ID: mdl-24490788

ABSTRACT

Listeners' expectations for melodies and harmonies in tonal music are perhaps the most studied aspect of music cognition. Long debated has been whether faster response times (RTs) to more strongly primed events (in a music theoretic sense) are driven by sensory or cognitive mechanisms, such as repetition of sensory information or activation of cognitive schemata that reflect learned tonal knowledge, respectively. We analyzed over 300 stimuli from 7 priming experiments comprising a broad range of musical material, using a model that transforms raw audio signals through a series of plausible physiological and psychological representations spanning a sensory-cognitive continuum. We show that RTs are modeled, in part, by information in periodicity pitch distributions, chroma vectors, and activations of tonal space, a representation on a toroidal surface of the major/minor key relationships in Western tonal music. We show that in tonal space, melodies are grouped by their tonal rather than timbral properties, whereas the reverse is true for the periodicity pitch representation. While tonal space variables explained more of the variation in RTs than did periodicity pitch variables, suggesting a greater contribution of cognitive influences to tonal expectation, a stepwise selection model contained variables from both representations and successfully explained the pattern of RTs across stimulus categories in 4 of the 7 experiments. The addition of closure, a cognitive representation of a specific syntactic relationship, succeeded in explaining results from all 7 experiments. We conclude that multiple representational stages along a sensory-cognitive continuum combine to shape tonal expectations in music.
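The chroma vectors mentioned above summarize pitch content as a 12-bin pitch-class profile. A toy symbolic version follows; the model itself derives chroma from raw audio, so this MIDI-based helper is purely illustrative:

```python
def chroma_from_midi(midi_notes):
    """Fold MIDI note numbers into a normalized 12-bin pitch-class vector
    (bin 0 = C, 1 = C#, ..., 11 = B), discarding octave information."""
    bins = [0.0] * 12
    for note in midi_notes:
        bins[note % 12] += 1.0
    total = sum(bins)
    return [b / total for b in bins] if total else bins

# A C major triad (C4, E4, G4) activates pitch classes 0, 4, and 7:
chroma = chroma_from_midi([60, 64, 67])
```

Because octave is discarded, transpositions by an octave map to the same chroma vector, which is what makes chroma useful for tracking tonal rather than timbral structure.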


Subject(s)
Anticipation, Psychological/physiology , Models, Psychological , Music/psychology , Pitch Discrimination/physiology , Psychoacoustics , Repetition Priming/physiology , Acoustic Stimulation/methods , Acoustic Stimulation/psychology , Cognition/physiology , Data Interpretation, Statistical , Functional Neuroimaging , Humans , Periodicity , Reaction Time/physiology , Regression Analysis , Western World
13.
Neuroimage ; 84: 688-97, 2014 Jan 01.
Article in English | MEDLINE | ID: mdl-24064075

ABSTRACT

From everyday experience we know that it is generally easier to interact with someone who adapts to our behavior. Beyond this, achieving a common goal will very much depend on who adapts to whom and to what degree. Therefore, many joint action tasks such as musical performance prove to be more successful when defined leader-follower roles are established. In the present study, we present a novel approach to explore the mechanisms of how individuals lead and, using functional magnetic resonance imaging (fMRI), probe the neural correlates of leading. Specifically, we implemented an adaptive virtual partner (VP), an auditory pacing signal, with which individuals were instructed to tap in synchrony while maintaining a steady tempo. By varying the degree of temporal adaptation (period correction) implemented by the VP, we manipulated the objective control individuals had to exert to maintain the overall tempo of the pacing sequence (which was prone to tempo drift with high levels of period correction). Our imaging data revealed that perceiving greater influence and leading are correlated with right lateralized frontal activation of areas involved in cognitive control and self-related processing. Using participants' subjective ratings of influence and task difficulty, we classified a subgroup of our cohort as "leaders", individuals who found the task of synchronizing easier when they felt more in control. Behavioral tapping measures showed that leaders employed less error correction and focused more on self-tapping (prioritizing the instruction to maintain the given tempo) than on the stability of the interaction (prioritizing the instruction to synchronize with the VP), with correlated activity in areas involved in self-initiated action including the pre-supplementary motor area.
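The period-correction manipulation described above can be sketched as follows; unlike a one-off phase shift, a period change persists into all later intervals, which is why high correction levels left the sequence prone to tempo drift. Function names and values are illustrative assumptions, not the study's implementation:

```python
def next_vp_ioi(vp_ioi, asynchrony, period_correction):
    """Virtual partner's next inter-onset interval: shift the period by a
    fraction of the last tap-VP asynchrony. The change is cumulative, so
    a fully adaptive partner follows any systematic bias in the tapper."""
    return vp_ioi + period_correction * asynchrony

# A tapper who always anticipates by 10 ms drags a 100%-adaptive VP faster:
ioi = 0.5
for _ in range(5):
    ioi = next_vp_ioi(ioi, asynchrony=-0.01, period_correction=1.0)
# The base 500-ms period has drifted to 450 ms after five taps.
```

Maintaining the instructed tempo against such a partner requires the participant to resist the drift, which is the sense in which high VP adaptivity forces the participant to lead.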


Subject(s)
Brain Mapping , Brain/physiology , Interpersonal Relations , Leadership , Magnetic Resonance Imaging , Adult , Female , Humans , Male , Young Adult
14.
Psychol Assess ; 25(3): 826-43, 2013 Sep.
Article in English | MEDLINE | ID: mdl-23647046

ABSTRACT

The Affective Neuroscience Personality Scales (ANPS) were developed to measure behavioral traits related to 6 affective neurobiological systems (play, seek, care, fear, anger, and sadness). However, the ANPS has a number of problems, including an ill-defined factor structure, overly long scales, and items that are poorly worded, ambiguous, and of questionable content validity. To address these issues, we constructed an improved short form of the ANPS: the Brief ANPS (BANPS). Three studies demonstrated that the 33-item BANPS has a clear and coherent factor structure, relatively high reliabilities (for short scales), and theoretically meaningful correlations with a wide range of external criteria, supporting its convergent and discriminant validity. Unlike typical short-form scales, the BANPS improves upon the psychometric properties of the long form, and we recommend its use in all research contexts.


Subject(s)
Affect , Personality Assessment , Anger , Factor Analysis, Statistical , Fear/psychology , Female , Humans , Male , Models, Psychological , Personality , Personality Assessment/standards , Psychometrics , Reproducibility of Results
15.
Hum Factors ; 55(2): 356-72, 2013 Apr.
Article in English | MEDLINE | ID: mdl-23691831

ABSTRACT

OBJECTIVE: The aim of this study was to develop a sonification scheme to convey deviations in heart rate and oxygen saturation from a desired target level. BACKGROUND: Maintaining physiologic parameters, such as oxygen saturation, within desired ranges is challenging in many clinical situations. High rates of false positive alarms in clinical settings limit the utility of the alarms that trigger when thresholds are exceeded. Auditory displays that consider the semantic connotations of sounds and the processing limitations of human perception and cognition may improve monitoring. METHOD: Across two experiments, clinical practitioners were tested on their ability to (a) discriminate pairs of sounds (two-note discrimination task), (b) infer and discern the intended physiological connotation of each acoustic attribute (name-the-variable task), and (c) categorize the amount of change in an implied physiological variable into three levels of change: none, small, and large (change-magnitude task). RESULTS: Considerable variation in performance was observed across the set of practitioners, ranging from near-perfect performance on all tasks, even with no prior exposure to the stimuli, to failure to reach a target accuracy criterion of 87.5% after ∼80 min of training. On average, performance was well above chance on the name-the-variable and change-magnitude tasks during initial exposure and reached criterion within ∼20 min of training on each task. CONCLUSION: The described sonification strategy may effectively communicate information about current heart rate and oxygen saturation status relative to desired target levels. APPLICATION: The results can be applied to clinical monitoring settings in which a stream of discrete auditory informational items is indicated.


Subject(s)
Data Display , Discrimination, Psychological , Heart Rate , Oxygen/blood , Sound , Adult , Aged , Decision Making , Female , Humans , Male , Middle Aged , Task Performance and Analysis , Young Adult
16.
J Exp Psychol Hum Percept Perform ; 39(2): 399-412, 2013 Apr.
Article in English | MEDLINE | ID: mdl-22963230

ABSTRACT

Properties of auditory working memory for sounds that lack strong semantic associations and are not readily verbalized or sung are poorly understood. We investigated auditory working memory capacity for lists containing 2-6 easily discriminable abstract sounds synthesized within a constrained timbral space, at delays of 1-6 s (Experiment 1), and the effect of greater perceptual variability among list items on capacity estimates at delays of 1-6 s (Experiment 2). Working memory capacity estimates of 1-2 items were found in all conditions and increased significantly as the perceptual variability among the list items increased. Nonetheless, the capacity estimates were smaller than the commonly observed average working memory capacity limit of 3-5 items. Decay profiles in both experiments were comparable with those previously reported in the verbal and auditory working memory literature. The results help define boundary conditions on capacity estimates for nonverbalizable timbres that lack strong long-term memory associations.
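The abstract does not name the capacity estimator; Cowan's K is the standard choice for probe-based change-detection designs, so the following is a sketch under that assumption rather than the study's actual analysis:

```python
def cowans_k(set_size, hit_rate, false_alarm_rate):
    """Cowan's K for single-probe designs: K = N * (H - FA), an estimate
    of the number of list items effectively held in working memory."""
    return set_size * (hit_rate - false_alarm_rate)

# A listener probed on 4-sound lists, detecting 80% of changes with a
# 20% false-alarm rate, is credited with holding about 2.4 sounds:
k = cowans_k(4, 0.80, 0.20)
```

Subtracting the false-alarm rate corrects raw hit rates for guessing, which is why such estimates can fall well below the nominal list length.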


Subject(s)
Attention , Memory, Short-Term , Pitch Discrimination , Acoustic Stimulation , Adolescent , Female , Humans , Loudness Perception , Male , Sound Spectrography , Young Adult
17.
Cereb Cortex ; 23(11): 2592-600, 2013 Nov.
Article in English | MEDLINE | ID: mdl-22892422

ABSTRACT

Cooperation is intrinsic to the human ability to work together toward common goals, and depends on sensing and reacting to dynamically changing relationships between coacting partners. Using functional magnetic resonance imaging (fMRI) and a paradigm in which an adaptive pacing signal simulates a virtual partner, we examined the neural substrates underlying dynamic joint action. A single parameter controlled the degree to which the virtual partner adapted its behavior in relation to participant taps, thus simulating varying degrees of cooperativity. Analyses of fMRI data using objective and subjective measures of synchronization quality found the relative balance of activity in two distinct neural networks to depend on the degree of the virtual partner's adaptivity. At lower degrees of adaptivity, when the virtual partner was easier to synchronize with, cortical midline structures were activated in conjunction with premotor areas, suggesting a link between the action and socio-affective components of cooperation. By contrast, right lateral prefrontal areas associated with central executive control processes were recruited during more cognitively challenging interactions while synchronizing with an overly adaptive virtual partner. Together, the reduced adaptive sensorimotor synchronization paradigm and pattern of results illuminate neural mechanisms that may underlie the socio-emotional consequences of different degrees of entrainment success.


Subject(s)
Adaptation, Psychological/physiology , Brain/physiology , Cooperative Behavior , Emotions/physiology , Feedback, Sensory/physiology , User-Computer Interface , Acoustic Stimulation , Adult , Brain Mapping , Female , Humans , Magnetic Resonance Imaging , Male , Photic Stimulation , Young Adult
18.
Ann N Y Acad Sci ; 1252: 214-21, 2012 Apr.
Article in English | MEDLINE | ID: mdl-22524362

ABSTRACT

Singing in one's mind or forming expectations about upcoming notes both require that mental images of one or more pitches be generated. As with other musical abilities, the acuity with which such images are formed might be expected to vary across individuals and may depend on musical training. Results from several behavioral tasks involving intonation judgments indicate that multiple memory systems contribute to the formation of accurate mental images for pitch, and that the functionality of each is affected by musical training. Electrophysiological measures indicate that the ability to form accurate mental images is associated with greater engagement of auditory areas and associated error-detection circuitry when listeners imagine ascending scales and make intonation judgments about target notes. A view of auditory mental images is espoused in which unified mental image representations are distributed across multiple brain areas. Each brain area helps shape the acuity of the unified representation based on current behavioral demands and past experience.



Subject(s)
Imagination/physiology , Music/psychology , Pitch Perception/physiology , Attention/physiology , Brain/physiology , Electrophysiological Phenomena , Evoked Potentials/physiology , Humans , Neurosciences , Pitch Discrimination/physiology , Task Performance and Analysis
19.
J Exp Psychol Gen ; 141(1): 54-75, 2012 Feb.
Article in English | MEDLINE | ID: mdl-21767048

ABSTRACT

The urge to move in response to music, combined with the positive affect associated with the coupling of sensory and motor processes while engaging with music (referred to as sensorimotor coupling) in a seemingly effortless way, is commonly described as the feeling of being in the groove. Here, we systematically explore this compelling phenomenon in a population of young adults. We utilize multiple levels of analysis, comprising phenomenological, behavioral, and computational techniques. Specifically, we show (a) that the concept of the groove is widely appreciated and understood in terms of a pleasurable drive toward action, (b) that a broad range of musical excerpts can be appraised reliably for the degree of perceived groove, (c) that the degree of experienced groove is inversely related to experienced difficulty of bimanual sensorimotor coupling under tapping regimes with varying levels of expressive constraint, (d) that high-groove stimuli elicit spontaneous rhythmic movements, and (e) that quantifiable measures of the quality of sensorimotor coupling predict the degree of experienced groove. Our results complement traditional discourse regarding the groove, which has tended to take the psychological phenomenon for granted and has focused instead on the musical and especially the rhythmic qualities of particular genres of music that lead to the perception of groove. We conclude that groove can be treated as a psychological construct and model system that allows for experimental exploration of the relationship between sensorimotor coupling with music and emotion.
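Finding (e) above relies on "quantifiable measures of the quality of sensorimotor coupling." One generic circular-statistics measure of tapping consistency is the mean resultant length of tap phases relative to the beat; the sketch below assumes that measure for illustration and is not necessarily the exact metric the study used.

```python
import math

# Hedged sketch: mean resultant length R of relative phases (radians).
# R = 1 means perfectly consistent tap timing relative to the beat;
# R near 0 means phases are spread uniformly around the cycle.

def coupling_quality(relative_phases_rad: list) -> float:
    """Mean resultant length of a set of relative phases."""
    n = len(relative_phases_rad)
    c = sum(math.cos(p) for p in relative_phases_rad) / n
    s = sum(math.sin(p) for p in relative_phases_rad) / n
    return math.hypot(c, s)

# Identical phases give R = 1; phases evenly spread around the circle
# give R near 0.
```

Under this reading, high-groove excerpts would be those for which tapping yields a consistently high R with low reported effort.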


Subject(s)
Emotions/physiology , Music/psychology , Psychomotor Performance/physiology , Adolescent , Auditory Perception/physiology , Female , Humans , Male , Periodicity , Young Adult
20.
Cogn Behav Neurol ; 24(2): 74-84, 2011 Jun.
Article in English | MEDLINE | ID: mdl-21617528

ABSTRACT

OBJECTIVE: To compare music recognition in patients with frontotemporal dementia, semantic dementia, Alzheimer disease, and controls and to evaluate the relationship between music recognition and brain volume. BACKGROUND: Recognition of familiar music depends on several levels of processing. There are few studies about how patients with dementia recognize familiar music. METHODS: Subjects were administered tasks that assess pitch and melody discrimination, detection of pitch errors in familiar melodies, and naming of familiar melodies. RESULTS: There were no group differences on pitch and melody discrimination tasks. However, patients with semantic dementia had considerable difficulty naming familiar melodies and also scored the lowest when asked to identify pitch errors in the same melodies. Naming familiar melodies, but not other music tasks, was strongly related to measures of semantic memory. Voxel-based morphometry analysis of brain magnetic resonance imaging showed that difficulty in naming songs was associated with volume in the bilateral temporal lobes and inferior frontal gyrus, whereas difficulty in identifying pitch errors in familiar melodies correlated primarily with the right temporal lobe. CONCLUSIONS: The results support a view that the anterior temporal lobes play a role in familiar melody recognition, and that musical functions are affected differentially across forms of dementia.


Subject(s)
Alzheimer Disease/psychology , Auditory Perception/physiology , Brain/pathology , Frontotemporal Lobar Degeneration/psychology , Music , Recognition, Psychology/physiology , Aged , Alzheimer Disease/pathology , Alzheimer Disease/physiopathology , Brain Mapping , Female , Frontotemporal Lobar Degeneration/pathology , Frontotemporal Lobar Degeneration/physiopathology , Humans , Magnetic Resonance Imaging , Male , Middle Aged , Neuropsychological Tests