1.
Eur J Neurosci ; 57(9): 1529-1545, 2023 05.
Article in English | MEDLINE | ID: mdl-36895107

ABSTRACT

A growing body of evidence suggests that steady-state evoked potentials may be a useful measure of beat perception, particularly when obtaining traditional, explicit measures of beat perception is difficult, such as with infants or non-human animals. Although attending to a stimulus is not necessary for most traditional applications of steady-state evoked potentials, it is unknown how attention affects steady-state evoked potentials that arise in response to beat perception. Additionally, most applications of steady-state evoked potentials to measure beat perception have used repeating rhythms or real music. Therefore, it is unclear how the steady-state response relates to the robust beat perception that occurs with non-repeating rhythms. Here, we used electroencephalography to record participants' brain activity as they listened to non-repeating musical rhythms while either attending to the rhythms or while distracted by a concurrent visual task. Non-repeating auditory rhythms elicited steady-state evoked potentials at perceived beat frequencies (perception was validated in a separate sensorimotor synchronization task) that were larger when participants attended to the rhythms compared with when they were distracted by the visual task. Therefore, although steady-state evoked potentials appear to index beat perception to non-repeating musical rhythms, this technique may be limited to when participants are known to be attending to the stimulus.
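The frequency-tagging logic behind this kind of steady-state measure can be illustrated with a short sketch: average the EEG across trials, take the spectrum, and read out the amplitude at the perceived beat frequency relative to neighbouring bins. This is a minimal illustration, not the authors' pipeline; the array shapes, channel selection, and noise-correction window are assumptions.

```python
# Sketch: quantify a steady-state evoked potential at a hypothesized beat frequency.
# Assumes `eeg` is a (n_trials, n_samples) NumPy array from one channel, sampled at `fs` Hz,
# and that the perceived beat frequency (e.g., 2 Hz) is known from a synchronization task.
import numpy as np

def beat_ssep_amplitude(eeg: np.ndarray, fs: float, beat_hz: float) -> float:
    """Return the spectral amplitude at the beat frequency, averaged over trials."""
    evoked = eeg.mean(axis=0)                      # averaging boosts phase-locked activity
    spectrum = np.abs(np.fft.rfft(evoked)) / evoked.size
    freqs = np.fft.rfftfreq(evoked.size, d=1.0 / fs)
    target = np.argmin(np.abs(freqs - beat_hz))    # closest FFT bin to the beat frequency
    # Subtract the mean of neighbouring bins to correct for broadband noise.
    neighbours = np.r_[spectrum[max(target - 5, 1):target - 1], spectrum[target + 2:target + 6]]
    return spectrum[target] - neighbours.mean()

# Example: amplitude at a 2 Hz beat for simulated data (40 trials, 10 s at 256 Hz).
rng = np.random.default_rng(0)
fake_eeg = rng.normal(size=(40, 10 * 256))
print(beat_ssep_amplitude(fake_eeg, fs=256, beat_hz=2.0))
```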


Subject(s)
Evoked Potentials , Music , Electroencephalography , Auditory Perception/physiology , Attention/physiology
2.
Dev Sci ; 26(5): e13346, 2023 09.
Article in English | MEDLINE | ID: mdl-36419407

ABSTRACT

Music and language are two fundamental forms of human communication. Many studies examine the development of music- and language-specific knowledge, but few studies compare how listeners know they are listening to music or language. Although we readily differentiate these domains, how we distinguish music and language, and especially speech and song, is not obvious. In two studies, we asked how listeners categorize speech and song. Study 1 used online survey data to illustrate that 4- to 17-year-olds and adults have verbalizable distinctions for speech and song. At all ages, listeners described speech and song differences based on acoustic features, but compared with older children, 4- to 7-year-olds more often used volume to describe differences, suggesting that they are still learning to identify the features most useful for differentiating speech from song. Study 2 used a perceptual categorization task to demonstrate that 4- to 8-year-olds and adults readily categorize speech and song, but this ability improves with age, especially for identifying song. Despite generally rating song as more speech-like, 4- and 6-year-olds rated ambiguous speech-song stimuli as more song-like than 8-year-olds and adults did. Four acoustic features predicted song ratings: F0 instability, utterance duration, harmonicity, and spectral flux. However, 4- and 6-year-olds' song ratings were better predicted by F0 instability than by harmonicity and utterance duration. These studies characterize how children develop conceptual and perceptual understandings of speech and song and suggest that children under age 8 are still learning which features are important for categorizing utterances as speech or song. RESEARCH HIGHLIGHTS: Children and adults conceptually and perceptually categorize speech and song from age 4. Listeners use F0 instability, harmonicity, spectral flux, and utterance duration to determine whether vocal stimuli sound like song. Acoustic cue weighting changes with age, becoming adult-like at age 8 for perceptual categorization and at age 12 for conceptual differentiation. Young children are still learning to categorize speech and song, which leaves open the possibility that music- and language-specific skills are not so domain-specific.
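As a rough illustration of the four acoustic cues named above, the sketch below extracts F0 instability, utterance duration, harmonicity, and spectral flux from an audio file using librosa. The parameter choices, pitch range, and feature definitions are assumptions for illustration, not the measures computed in the study.

```python
# Sketch: extract the four acoustic cues discussed above (F0 instability, utterance
# duration, harmonicity, spectral flux) from a recorded utterance. Library calls and
# parameter choices are illustrative, not those used in the study.
import numpy as np
import librosa

def speech_song_features(path: str) -> dict:
    y, sr = librosa.load(path, sr=22050, mono=True)
    duration = len(y) / sr
    # F0 instability: variability of the voiced pitch track.
    f0, voiced, _ = librosa.pyin(y, fmin=librosa.note_to_hz("C2"), fmax=librosa.note_to_hz("C6"))
    f0_instability = np.nanstd(f0[voiced]) if np.any(voiced) else np.nan
    # Harmonicity: share of energy in the harmonic component after HPSS.
    harmonic, percussive = librosa.effects.hpss(y)
    harmonicity = np.sum(harmonic ** 2) / (np.sum(y ** 2) + 1e-12)
    # Spectral flux: mean positive frame-to-frame change in the magnitude spectrum.
    S = np.abs(librosa.stft(y))
    flux = np.mean(np.sum(np.maximum(np.diff(S, axis=1), 0.0), axis=0))
    return {"duration_s": duration, "f0_instability": f0_instability,
            "harmonicity": harmonicity, "spectral_flux": flux}
```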


Subject(s)
Music , Speech Perception , Voice , Adult , Child , Humans , Adolescent , Child, Preschool , Speech , Auditory Perception , Learning
3.
Neuroimage ; 252: 119049, 2022 05 15.
Article in English | MEDLINE | ID: mdl-35248707

ABSTRACT

Music is often described in the laboratory and in the classroom as a beneficial tool for memory encoding and retention, with a particularly strong effect when words are sung to familiar compared to unfamiliar melodies. However, the neural mechanisms underlying this memory benefit, especially as it relates to familiar music, are not well understood. The current study examined whether neural tracking of the slow syllable rhythms of speech and song is modulated by melody familiarity. Participants became familiar with twelve novel melodies over four days prior to MEG testing. Neural tracking of the same utterances spoken and sung revealed greater cerebro-acoustic phase coherence for sung compared to spoken utterances, but showed no effect of melody familiarity when stimuli were grouped by their assigned (trained) familiarity. However, when participants' subjective ratings of perceived familiarity were used to group stimuli, a large effect of familiarity was observed. This effect was not specific to song, as it was observed in both sung and spoken utterances. Exploratory analyses revealed some in-session learning of unfamiliar and spoken utterances, with increased neural tracking for untrained stimuli by the end of the MEG testing session. Our results indicate that top-down factors like familiarity are strong modulators of neural tracking for music and language. Participants' neural tracking was related to their perception of familiarity, which was likely driven by a combination of effects from repeated listening, stimulus-specific melodic simplicity, and individual differences. Beyond the acoustic features of music alone, top-down factors built into the music listening experience, like repetition and familiarity, play a large role in the way we attend to and encode information presented in a musical context.


Subject(s)
Music , Singing , Auditory Perception , Humans , Recognition, Psychology , Speech
4.
J Cogn Neurosci ; 33(8): 1595-1611, 2021 07 01.
Article in English | MEDLINE | ID: mdl-34496377

ABSTRACT

We investigated how familiarity alters music and language processing in the brain. We used fMRI to measure brain responses before and after participants were familiarized with novel music and language stimuli. To manipulate the presence of language and music in the stimuli, there were four conditions: (1) whole music (music and words together), (2) instrumental music (no words), (3) a cappella music (sung words, no instruments), and (4) spoken words. To manipulate participants' familiarity with the stimuli, we used novel stimuli and a familiarization paradigm designed to mimic "natural" exposure, while controlling for autobiographical memory confounds. Participants completed two fMRI scans that were separated by a stimulus training period. Behaviorally, participants learned the stimuli over the training period. However, there were no significant neural differences between the familiar and unfamiliar stimuli in either univariate or multivariate analyses. There were differences in neural activity in frontal and temporal regions based on the presence of language in the stimuli, and these differences replicated across the two scanning sessions. These results indicate that the way we engage with music is important for creating a memory of that music, and these aspects of engagement, over and above familiarity on its own, may be responsible for the robust nature of musical memory in the presence of neurodegenerative disorders such as Alzheimer disease.


Subject(s)
Music , Auditory Perception , Humans , Language , Recognition, Psychology , Temporal Lobe
5.
Exp Brain Res ; 239(8): 2419-2433, 2021 Aug.
Article in English | MEDLINE | ID: mdl-34106299

ABSTRACT

Rhythmic auditory stimulation (RAS) is a gait intervention in which gait-disordered patients synchronise footsteps to music or metronome cues. Musical 'groove', the tendency of music to induce movement, has previously been shown to be associated with faster gait; however, why groove affects gait remains unclear. One mechanism by which groove may affect gait is beat salience: music that is higher in groove has more salient musical beats, and higher beat salience might reduce the cognitive demands of perceiving the beat and synchronising footsteps to it. If groove's effects on gait are driven primarily by the impact of beat salience on cognitive demands, then groove's effects might be present only in contexts in which reducing cognitive demands is relevant. Such contexts could include task parameters that increase cognitive demands (such as the requirement to synchronise to the beat), or individual differences that may make synchronisation more cognitively demanding. Here, we examined whether high beat salience can account for the effects of high-groove music on gait. First, we increased the beat salience of low-groove music to be similar to that of high-groove music by embedding metronome beats in low- and high-groove music. We examined whether low-groove music with high beat salience elicited similar effects on gait as high-groove music. Second, we examined the effect of removing the requirement to synchronise footsteps to the beat (i.e., allowing participants to walk freely with the music), which is thought to remove the cognitive demand of synchronising movements to the beat. We tested two populations thought to be sensitive to the cognitive demands of synchronisation: weak beat-perceivers and older adults. We found that increasing the beat salience of low-groove music increased stride velocity, but strides were still slower than with high-groove music. Similarly, removing the requirement to synchronise elicited faster, less variable gait and reduced bias for stability, but high-groove music still elicited faster strides than low-groove music. These findings suggest that beat salience contributes to groove's effect on gait but does not fully account for it. Despite reducing task difficulty by equalising beat salience and removing the requirement to synchronise, high-groove music still elicited faster, less variable gait. Therefore, other properties of groove also appear to play a role in its effect on gait.
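For context, the two gait outcomes emphasised above (stride velocity and stride-to-stride variability) can be computed from simple heel-strike data, as in the hypothetical sketch below; the input arrays and the coefficient-of-variation definition of variability are assumptions, not the study's instrumented-walkway pipeline.

```python
# Sketch: compute the two gait outcomes discussed above from heel-strike data.
# `strike_times` (s) and `strike_positions` (m along the walkway) are hypothetical
# inputs; the study's instrumented-walkway pipeline is not reproduced here.
import numpy as np

def stride_outcomes(strike_times: np.ndarray, strike_positions: np.ndarray) -> dict:
    # A stride spans two successive heel strikes of the same foot (every other contact).
    stride_times = np.diff(strike_times[::2])
    stride_lengths = np.diff(strike_positions[::2])
    velocity = stride_lengths / stride_times
    cv = 100 * stride_times.std(ddof=1) / stride_times.mean()  # variability as CV (%)
    return {"stride_velocity_mps": velocity.mean(), "stride_time_cv_pct": cv}

times = np.array([0.0, 0.55, 1.12, 1.66, 2.25, 2.80, 3.40])
positions = np.array([0.0, 0.65, 1.30, 1.98, 2.64, 3.30, 3.96])
print(stride_outcomes(times, positions))
```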


Subject(s)
Music , Acoustic Stimulation , Aged , Auditory Perception , Cues , Gait , Humans , Walking
6.
Behav Brain Sci ; 44: e73, 2021 09 30.
Article in English | MEDLINE | ID: mdl-34588047

ABSTRACT

Music uses the evolutionarily unique temporal sensitivity of the auditory system and its tight coupling to the motor system to create a common neurophysiological clock between individuals that facilitates action coordination. We propose that this shared common clock arises from entrainment to musical rhythms, the process by which partners' brains and bodies become temporally aligned to the same rhythmic pulse.


Subject(s)
Music , Auditory Perception , Brain , Humans
7.
Neuroimage ; 214: 116767, 2020 07 01.
Article in English | MEDLINE | ID: mdl-32217165

ABSTRACT

Neural activity synchronizes with the rhythmic input of many environmental signals, but the capacity of neural activity to entrain to the slow rhythms of speech is particularly important for successful communication. Compared to speech, song has greater rhythmic regularity, a more stable fundamental frequency, discrete pitch movements, and a metrical structure; these features may provide a temporal framework that helps listeners neurally track information better than the less regular rhythms of speech. The current study used EEG to examine whether entrainment to the syllable rate of linguistic utterances, as indexed by cerebro-acoustic phase coherence, was greater when listeners heard sung than spoken sentences. We assessed listeners' phase-locking in both easy (no time compression) and hard (50% time-compression) utterance conditions. Adults phase-locked equally well to speech and song in the easy listening condition. However, in the time-compressed condition, phase-locking was greater for sung than spoken utterances in the theta band (3.67-5 Hz). Thus, the musical temporal and spectral characteristics of song were related to better phase-locking to the slow phrasal and syllabic information (4-7 Hz) in the speech stream. These results highlight the possibility of using song as a tool for improving speech processing in individuals with language processing deficits, such as dyslexia.
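Cerebro-acoustic phase coherence of the kind indexed here is often computed by band-pass filtering the neural signal and the speech amplitude envelope, extracting instantaneous phases with the Hilbert transform, and measuring the consistency of their phase difference. The sketch below is a simplified, single-channel illustration under those assumptions, not the study's analysis code.

```python
# Sketch: cerebro-acoustic phase coherence between an EEG channel and the speech
# amplitude envelope in a given band (e.g., 3.67-5 Hz). A simplified illustration,
# not the study's pipeline; variable names are assumptions.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def phase_coherence(eeg: np.ndarray, envelope: np.ndarray, fs: float,
                    band=(3.67, 5.0)) -> float:
    """Return the length of the mean resultant vector of the phase difference (0-1)."""
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="bandpass")
    phi_eeg = np.angle(hilbert(filtfilt(b, a, eeg)))
    phi_env = np.angle(hilbert(filtfilt(b, a, envelope)))
    return float(np.abs(np.mean(np.exp(1j * (phi_eeg - phi_env)))))
```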


Subject(s)
Auditory Perception/physiology , Brain/physiology , Electroencephalography Phase Synchronization/physiology , Music , Singing , Speech Perception/physiology , Acoustic Stimulation/methods , Adolescent , Adult , Attention/physiology , Electroencephalography/methods , Female , Humans , Male , Periodicity , Young Adult
8.
Annu Rev Psychol ; 69: 51-75, 2018 01 04.
Article in English | MEDLINE | ID: mdl-29035690

ABSTRACT

The urge to move to music is universal among humans. Unlike visual art, which is manifest across space, music is manifest across time. When listeners get carried away by the music, either through movement (such as dancing) or through reverie (such as trance), it is usually the temporal qualities of the music (its pulse, tempo, and rhythmic patterns) that put them in this state. In this article, we review studies addressing rhythm, meter, movement, synchronization, entrainment, the perception of groove, and other temporal factors that constitute a first step to understanding how and why music literally moves us. The experiments we review span a range of methodological techniques, including neuroimaging, psychophysics, and traditional behavioral experiments, and we also summarize current studies of animal synchronization, engaging an evolutionary perspective on human rhythmic perception and cognition.


Subject(s)
Auditory Perception/physiology , Dancing/psychology , Movement/physiology , Music/psychology , Periodicity , Dancing/physiology , Humans , Time Perception/physiology
9.
Int Psychogeriatr ; 31(9): 1287-1303, 2019 Sep.
Article in English | MEDLINE | ID: mdl-30520404

ABSTRACT

BACKGROUND: People with dementia fall twice as often and have more serious fall-related injuries than healthy older adults. While gait impairment as a generic term is understood as a fall risk factor in this population, a clear elaboration of the specific components of gait that are associated with fall risk is needed for knowledge translation to clinical practice and the development of fall prevention strategies for people with dementia. OBJECTIVE: To review gait parameters and characteristics associated with falls in people with dementia. METHODS: Electronic databases CINAHL, EMBASE, MEDLINE, PsycINFO, and PubMed were searched (from inception to April 2017) to identify prospective cohort studies evaluating the association between gait and falls in people with dementia. RESULTS: Increased double support time variability, use of mobility aids, walking outdoors, higher scores on the Unified Parkinson's Disease Rating Scale, and lower average walking bouts were associated with elevated risk of any fall. Increased double support time and step length variability were associated with recurrent falls. The reviewed articles do not support using the Performance Oriented Mobility Assessment and the Timed Up-and-Go tests to predict any fall in this population. There is limited research on the use of dual-task gait assessments for predicting falls in people with dementia. CONCLUSION: This systematic review identifies the specific spatiotemporal gait parameters and features that are associated with falls in people with dementia. Future research is recommended to focus on developing specialized treatment methods for these specific gait impairments in this patient population.

10.
Exp Brain Res ; 236(1): 99-115, 2018 01.
Article in English | MEDLINE | ID: mdl-29075835

ABSTRACT

Anecdotal accounts suggest that individuals spontaneously synchronize their movements to the 'beat' of background music, often without intending to, and perhaps even without attending to the music at all. However, the question of whether intention and attention are necessary to synchronize to the beat remains unclear. Here, we compared synchronization of footsteps to the beat during overground walking when young healthy adults were explicitly instructed to synchronize (intention to synchronize) and when they were not (no intention) (Experiment 1: intention). We also examined whether reducing participants' attention to the music affected synchronization, again when participants were explicitly instructed to synchronize and when they were not (Experiment 2: attention/intention). Synchronization was much less frequent when no instructions to synchronize were given. Without explicit instructions to synchronize, there was no evidence of synchronization in 60% of the trials in Experiment 1 and 43% of the trials in Experiment 2. When instructed to synchronize, only 26% of trials in Experiment 1 and 14% of trials in Experiment 2 showed no evidence of synchronization. Because walking to music alters gait, we also examined how gait kinematics changed with and without instructions to synchronize and with full versus reduced attention to the music. Instructions to synchronize elicited slower, shorter, and more variable strides than walking in silence. Reducing attention to the music did not significantly affect synchronization of footsteps to the beat, but did elicit slower gait. Thus, during walking, intention, but not attention, appears to be necessary to synchronize footsteps to the beat, and synchronization elicits slower, shorter, and more variable strides, at least in young healthy adults.
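One common way to quantify whether footsteps are synchronized to a musical beat is with circular statistics: each step time is converted to a phase within its surrounding beat cycle, and the concentration of those phases is tested. The sketch below illustrates that idea with hypothetical step and beat times and a rough Rayleigh-style approximation; it is not the scoring procedure used in these experiments.

```python
# Sketch: test whether footsteps are synchronized to the musical beat by converting
# each step time to a phase relative to its surrounding beats and checking whether
# the phases cluster (Rayleigh-style test). Inputs are hypothetical.
import numpy as np

def step_beat_synchrony(step_times: np.ndarray, beat_times: np.ndarray):
    idx = np.searchsorted(beat_times, step_times) - 1   # index of the beat preceding each step
    ok = (idx >= 0) & (idx < len(beat_times) - 1)
    prev, nxt = beat_times[idx[ok]], beat_times[idx[ok] + 1]
    phases = 2 * np.pi * (step_times[ok] - prev) / (nxt - prev)  # 0..2*pi within a beat cycle
    r = np.abs(np.mean(np.exp(1j * phases)))             # resultant vector length (0 = no locking)
    n = phases.size
    p_approx = np.exp(-n * r ** 2)                       # rough Rayleigh p-value approximation
    return r, p_approx
```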


Subject(s)
Attention/physiology , Gait/physiology , Intention , Music , Psychomotor Performance/physiology , Time Perception/physiology , Adult , Humans , Young Adult
11.
Arch Phys Med Rehabil ; 99(5): 945-951, 2018 05.
Article in English | MEDLINE | ID: mdl-29427575

ABSTRACT

OBJECTIVES: To assess rhythm abilities, to describe their relation to clinical presentation, and to determine if rhythm production independently contributes to temporal gait asymmetry (TGA) poststroke. DESIGN: Cross-sectional. SETTING: Large urban rehabilitation hospital and university. PARTICIPANTS: Individuals (N=60) with subacute and chronic stroke (n=39) and data for healthy adults extracted from a preexisting database (n=21). INTERVENTIONS: Not applicable. MAIN OUTCOME MEASURES: Stroke group: National Institutes of Health Stroke Scale (NIHSS), Chedoke-McMaster Stroke Assessment (CMSA) leg and foot scales, Montreal Cognitive Assessment (MoCA), rhythm perception and production (Beat Alignment Test [BAT]), and spatiotemporal gait parameters were assessed. TGA was quantified with the swing time symmetry ratio. Healthy group: age and beat perception scores assessed by the BAT. Rhythm perception of the stroke group and healthy adults was compared with analysis of variance. Spearman correlations quantified the relation between rhythm perception and production abilities and clinical measures. Multiple linear regression assessed the contribution of rhythm production, along with motor impairment and time poststroke, to TGA. RESULTS: Rhythm perception in the stroke group was worse than that of healthy adults (F(1,56)=17.5, P=.0001). Within the stroke group, rhythm perception was significantly correlated with CMSA leg (Spearman ρ=.33, P=.04) and foot (Spearman ρ=.49, P=.002) scores but not NIHSS or MoCA scores. The model for TGA was significant (F(3,35)=12.8, P<.0001), with CMSA leg scores, time poststroke, and asynchrony of rhythm production explaining 52% of the variance. CONCLUSIONS: Rhythm perception is impaired after stroke, and temporal gait asymmetry relates to impairments in producing rhythmic movement. These results may have implications for the use of auditory rhythmic stimuli to cue motor responses poststroke. Future work will explore brain responses to rhythm processing poststroke.
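The regression described above can be pictured with a small sketch: temporal gait asymmetry regressed on motor impairment, time poststroke, and rhythm-production asynchrony. The data frame, column names, and values below are entirely hypothetical and serve only to show the model form.

```python
# Sketch: the kind of model described above, regressing temporal gait asymmetry (swing
# time symmetry ratio) on CMSA leg score, time poststroke, and tapping asynchrony.
# The data frame and its values are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "tga": [1.05, 1.32, 1.10, 1.48, 1.21, 1.15],          # swing time symmetry ratio
    "cmsa_leg": [6, 3, 5, 2, 4, 5],                        # Chedoke-McMaster leg score
    "months_poststroke": [4, 18, 7, 30, 12, 9],
    "asynchrony_ms": [35, 90, 48, 120, 70, 55],            # rhythm production asynchrony
})
model = smf.ols("tga ~ cmsa_leg + months_poststroke + asynchrony_ms", data=df).fit()
print(model.summary())                                      # R-squared ~ variance explained
```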


Subject(s)
Gait Disorders, Neurologic/physiopathology , Periodicity , Stroke/physiopathology , Time Perception/physiology , Aged , Analysis of Variance , Cross-Sectional Studies , Female , Foot/physiopathology , Gait/physiology , Gait Disorders, Neurologic/etiology , Gait Disorders, Neurologic/psychology , Humans , Leg/physiopathology , Linear Models , Male , Middle Aged , Statistics, Nonparametric , Stroke/complications , Stroke/psychology
12.
Nature ; 465(7299): 775-8, 2010 Jun 10.
Article in English | MEDLINE | ID: mdl-20407435

ABSTRACT

'Brain training', or the goal of improved cognitive function through the regular use of computerized tests, is a multimillion-pound industry, yet in our view scientific evidence to support its efficacy is lacking. Modest effects have been reported in some studies of older individuals and preschool children, and video-game players outperform non-players on some tests of visual attention. However, the widely held belief that commercially available computerized brain-training programs improve general cognitive function in the wider population in our opinion lacks empirical support. The central question is not whether performance on cognitive tests can be improved by training, but rather, whether those benefits transfer to other untrained tasks or lead to any general improvement in the level of cognitive functioning. Here we report the results of a six-week online study in which 11,430 participants trained several times each week on cognitive tasks designed to improve reasoning, memory, planning, visuospatial skills and attention. Although improvements were observed in every one of the cognitive tasks that were trained, no evidence was found for transfer effects to untrained tasks, even when those tasks were cognitively closely related.


Subject(s)
Brain/physiology , Cognition/physiology , Exercise/physiology , Attention/physiology , Computers , Humans , Memory/physiology , Task Performance and Analysis , Thinking/physiology , Time Factors
13.
Cereb Cortex ; 23(4): 913-21, 2013 Apr.
Article in English | MEDLINE | ID: mdl-22499797

ABSTRACT

Perception of temporal patterns is critical for speech, movement, and music. In the auditory domain, perception of a regular pulse, or beat, within a sequence of temporal intervals is associated with basal ganglia activity. Two alternative accounts of this striatal activity are possible: "searching" for temporal regularity in early stimulus processing stages or "prediction" of the timing of future tones after the beat is found (relying on continuation of an internally generated beat). To distinguish between these accounts, we used functional magnetic resonance imaging (fMRI) to investigate different stages of beat perception. Participants heard a series of beat and nonbeat (irregular) monotone sequences. For each sequence, the preceding sequence provided a temporal beat context for the following sequence. Beat sequences were preceded by nonbeat sequences, requiring the beat to be found anew ("beat finding" condition), by beat sequences with the same beat rate ("beat continuation"), or by beat sequences with a different rate ("beat adjustment"). Detection of regularity is highest during beat finding, whereas generation and prediction are highest during beat continuation. We found the greatest striatal activity for beat continuation, less for beat adjustment, and the least for beat finding. Thus, the basal ganglia's response profile suggests a role in beat prediction, not in beat finding.


Subject(s)
Auditory Perception/physiology , Brain Mapping , Corpus Striatum/physiology , Emotions , Music , Signal Detection, Psychological/physiology , Acoustic Stimulation , Adult , Corpus Striatum/blood supply , Female , Functional Laterality , Humans , Image Processing, Computer-Assisted , Magnetic Resonance Imaging , Male , Oxygen , Periodicity , Predictive Value of Tests , Young Adult
14.
Adv Exp Med Biol ; 829: 325-38, 2014.
Article in English | MEDLINE | ID: mdl-25358718

ABSTRACT

The capacity to synchronize movements to the beat in music is a complex and apparently uniquely human characteristic. Synchronizing movements to the beat requires beat perception, which entails prediction of future beats in rhythmic sequences of temporal intervals. Absolute timing mechanisms, in which patterns of temporal intervals are encoded as a series of absolute durations, cannot fully explain beat perception. Beat perception seems better accounted for by relative timing mechanisms, in which the temporal intervals of a pattern are coded relative to a periodic beat interval. Evidence from behavioral, neuroimaging, brain stimulation, and neuronal cell recording studies suggests a functional dissociation between the neural substrates of absolute and relative timing. This chapter reviews current findings on relative timing in the context of rhythm and beat perception.
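The absolute-versus-relative distinction can be made concrete with a tiny example: the same rhythm coded either as raw durations or as ratios of an inferred beat interval, where only the relative code survives a tempo change. The rhythm and beat values below are illustrative.

```python
# Sketch: the distinction drawn above between absolute and relative (beat-based) timing,
# coding the same rhythm either as raw durations or as ratios of a beat interval.
# The example rhythm and the 500 ms beat are illustrative.
import numpy as np

intervals_ms = np.array([500, 250, 250, 1000, 500])   # absolute code: durations in ms
beat_ms = 500.0                                       # inferred periodic beat interval
relative_code = intervals_ms / beat_ms                # relative code: multiples of the beat
print(relative_code)                                  # [1.  0.5 0.5 2.  1. ]
# Scaling the tempo changes every absolute duration but leaves the relative code intact.
print((intervals_ms * 1.2) / (beat_ms * 1.2))         # identical relative pattern
```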


Subject(s)
Auditory Perception/physiology , Magnetic Resonance Imaging/methods , Motor Activity/physiology , Time Perception/physiology , Humans , Time
15.
Parkinsons Dis ; 2024: 3447009, 2024.
Article in English | MEDLINE | ID: mdl-38235044

ABSTRACT

Background: Freezing of gait (FOG) is an intractable motor symptom in Parkinson's disease (PD) that increases fall risk and impairs quality of life. FOG has been associated with anxiety, with experimental support for the notion that anxiety itself provokes FOG. We investigated the effect of acute anxiety reduction via alprazolam on FOG in PD. Methods: In ten patients with PD, FOG, and normal cognition, we administered 0.25 mg alprazolam in one session and placebo in another, on separate days and in counterbalanced order. At each session, patients walked on a pressure-sensitive walkway. Using Oculus Rift virtual-reality goggles, patients walked along a plank that appeared to be (a) level with the floor (low-anxiety condition) or (b) raised high above the ground (high-anxiety condition). In this way, we assessed the effects of anxiety and alprazolam (i.e., anxiety reduction) on FOG frequency and other gait parameters. Results: FOG events appeared only in the high-anxiety conditions. Alprazolam significantly reduced subjective and objective measures of anxiety, as well as the prevalence of FOG (p = 0.05). Furthermore, alprazolam improved swing time (p < 0.05) and gait variability in all conditions, particularly during the elevated plank trials. Interpretation: Our results suggest that (1) anxiety induces FOG, and (2) alprazolam concomitantly reduces anxiety and FOG. Alprazolam further improved gait stability (i.e., swing time and gait variability). These findings indicate that anxiety triggers FOG in PD. Treating anxiety can reduce FOG and improve gait stability, potentially offering new therapeutic avenues for this intractable and disabling symptom in PD.

16.
Atten Percept Psychophys ; 86(4): 1400-1416, 2024 May.
Article in English | MEDLINE | ID: mdl-38557941

ABSTRACT

Music training is associated with better beat processing in the auditory modality. However, it is unknown how rhythmic training that emphasizes visual rhythms, such as dance training, might affect beat processing, or whether training effects are modality specific in general. Here we examined how music and dance training interacted with modality during audiovisual integration and synchronization to auditory and visual isochronous sequences. In two experiments, musicians, dancers, and controls completed an audiovisual integration task and an audiovisual target-distractor synchronization task using dynamic visual stimuli (a bouncing figure). The groups performed similarly on the audiovisual integration tasks (Experiments 1 and 2). However, in the finger-tapping synchronization task (Experiment 1), musicians were more influenced by auditory distractors when synchronizing to visual sequences, while dancers were more influenced by visual distractors when synchronizing to auditory sequences. When participants synchronized with whole-body movements instead of finger-tapping (Experiment 2), all groups were more influenced by the visual distractor than the auditory distractor. Taken together, these findings highlight how training is associated with audiovisual processing and how different types of visual rhythmic stimuli and different movements alter beat perception and production outcome measures. Implications for the modality appropriateness hypothesis are discussed.


Subject(s)
Attention , Dancing , Music , Psychomotor Performance , Humans , Dancing/psychology , Female , Male , Young Adult , Attention/physiology , Psychomotor Performance/physiology , Adult , Auditory Perception/physiology , Time Perception , Practice, Psychological , Pattern Recognition, Visual/physiology , Adolescent , Visual Perception/physiology , Reaction Time
17.
Nat Hum Behav ; 8(5): 846-877, 2024 May.
Article in English | MEDLINE | ID: mdl-38438653

ABSTRACT

Music is present in every known society but varies from place to place. What, if anything, is universal to music cognition? We measured a signature of mental representations of rhythm in 39 participant groups in 15 countries, spanning urban societies and Indigenous populations. Listeners reproduced random 'seed' rhythms; their reproductions were fed back as the stimulus (as in the game of 'telephone'), such that their biases (the prior) could be estimated from the distribution of reproductions. Every tested group showed a sparse prior with peaks at integer-ratio rhythms. However, the importance of different integer ratios varied across groups, often reflecting local musical practices. Our results suggest a common feature of music cognition: discrete rhythm 'categories' at small-integer ratios. These discrete representations plausibly stabilize musical systems in the face of cultural transmission but interact with culture-specific traditions to yield the diversity that is evident when mental representations are probed across many cultures.
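The integer-ratio categories described here live on the 'ratio simplex': a rhythm's intervals are normalized to sum to one and compared against small-integer-ratio templates. The sketch below shows that mapping for a three-interval rhythm; the template set and distance measure are simplifying assumptions, not the paper's full iterated-reproduction analysis.

```python
# Sketch: place a three-interval rhythm on the ratio simplex and assign it to the
# nearest small-integer-ratio category. The category set here is a small illustrative
# subset, not the full analysis described in the study.
import numpy as np
from itertools import product

def nearest_integer_ratio(intervals, max_int=3):
    x = np.asarray(intervals, dtype=float)
    x = x / x.sum()                                   # project onto the ratio simplex
    categories = set(product(range(1, max_int + 1), repeat=3))
    best, dist = None, np.inf
    for cat in categories:
        c = np.array(cat, dtype=float)
        d = np.linalg.norm(x - c / c.sum())           # distance to the category's simplex point
        if d < dist:
            best, dist = cat, d
    return best, dist

print(nearest_integer_ratio([0.48, 0.27, 0.25]))      # close to the 2:1:1 category
```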


Subject(s)
Auditory Perception , Cross-Cultural Comparison , Music , Music/psychology , Humans , Male , Adult , Female , Auditory Perception/physiology , Young Adult , Cognition/physiology
18.
J Music Ther ; 60(1): 36-63, 2023 May 05.
Article in English | MEDLINE | ID: mdl-36610070

ABSTRACT

The purpose of this article was to report findings on note frequency and velocity measures during Improvised Active Music Therapy (IAMT) sessions with individuals with Parkinson's disease (PD). In this single-subject, multiple-baseline design across subjects, the article reports the note frequency (note count) and velocity of movement (mean note velocity) of three right-handed participants playing uninterrupted improvised music on a simplified electronic drum set. During baseline, the music therapist played rhythmic accompaniment on guitar using a low-moderate density of syncopation. During treatment, the music therapist introduced rhythms with a moderate-high density of syncopation. The music content of the sessions was transformed into digital music using a musical instrument digital interface (MIDI). Results indicated that all participants exhibited an increase in note count during baseline that plateaued during the treatment condition, and participants' note counts were significantly positively correlated with the music therapist's note count. All participants played more notes with the upper extremity (UE) across conditions than with the lower extremity. All participants also showed similar total mean velocity across conditions. Two participants demonstrated higher mean note velocity with the UE than with the right foot, whereas the other participant did not demonstrate this difference. Two participants also exhibited greater mean note velocity variability with the left foot within and across conditions. More research is required to identify commonalities in note count and mean note velocity measures in individuals with PD during IAMT sessions.
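Because the sessions were captured as MIDI, the two outcome measures (note count and mean note velocity) can in principle be read directly from the MIDI data. The sketch below shows one way to do this with the pretty_midi library; the file name is hypothetical and the study's own extraction pipeline may differ.

```python
# Sketch: read note count and mean note velocity from one session's MIDI file.
# `pretty_midi` is one possible tool; the file path is hypothetical.
import pretty_midi

def note_count_and_velocity(midi_path: str):
    pm = pretty_midi.PrettyMIDI(midi_path)
    notes = [note for inst in pm.instruments for note in inst.notes]
    count = len(notes)
    mean_velocity = sum(n.velocity for n in notes) / count if count else 0.0
    return count, mean_velocity

# Example (hypothetical file):
# count, vel = note_count_and_velocity("session_01.mid")
```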


Subject(s)
Music Therapy , Music , Parkinson Disease , Humans , Music Therapy/methods , Parkinson Disease/therapy , Acoustic Stimulation , Movement
19.
J Exp Psychol Hum Percept Perform ; 49(1): 108-128, 2023 Jan.
Article in English | MEDLINE | ID: mdl-36265031

ABSTRACT

Humans perceive ratios of spatial and temporal magnitudes, such as length and duration. Previous studies have shown that spatial ratios may be processed by a common ratio processing system. The aim of the current study was to determine whether ratio processing is a domain-general ability and consequently involves common processing of temporal and spatial magnitudes. Two hundred seventy-five participants completed a battery of spatial and temporal ratio estimation and magnitude discrimination tasks online. Structural equation modeling was used to analyze the relationship between ratio processing across domains while controlling for absolute magnitude discrimination ability. The four-factor higher-order model, consisting of spatial and temporal magnitude and ratio processing latent variables, showed adequate local and global fit, χ²(44) = 41.41, p = .626, root mean square error of approximation = .000. We found a significant relationship (r = .63) between spatial and temporal ratio processing, suggesting that ratio processing may be a domain-general ability. Additionally, absolute magnitude processing explained a large part (60-66%) of the variance in both spatial and temporal ratio processing factors. Overall, the findings suggest that representations of spatial and temporal ratios are highly related and point toward a common ratio-processing mechanism across different types of magnitudes. (PsycInfo Database Record (c) 2023 APA, all rights reserved).
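The reported fit indices can be sanity-checked with a one-line calculation: under the common RMSEA formula, a chi-square value below its degrees of freedom yields an RMSEA of zero, which matches the .000 reported above. The formula choice (N - 1 in the denominator) is an assumption about convention.

```python
# Sketch: why the reported RMSEA is .000. RMSEA is commonly computed as
# sqrt(max(chi2 - df, 0) / (df * (N - 1))); with chi2 below its degrees of freedom
# the numerator is truncated at zero. The values below are taken from the abstract.
import math

chi2, df, n = 41.41, 44, 275
rmsea = math.sqrt(max(chi2 - df, 0) / (df * (n - 1)))
print(round(rmsea, 3))   # 0.0, consistent with the reported RMSEA = .000
```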


Subject(s)
Time Perception , Humans , Latent Class Analysis
20.
Cortex ; 167: 51-64, 2023 10.
Article in English | MEDLINE | ID: mdl-37523965

ABSTRACT

We investigated how repeated exposure to a stimulus affects intersubject synchrony in the brains of young and older adults. We used functional magnetic resonance imaging (fMRI) to measure brain responses to familiar and novel stimuli. Young adults participated in a familiarization paradigm designed to mimic 'natural' exposure, while older adults were presented with stimuli they had known for more than 50 years. Intersubject synchrony was calculated to detect common stimulus-driven brain activity across young and older adults as they listened to the novel and familiar stimuli. Contrary to our hypotheses, synchrony was not related to the amount of stimulus exposure; both young and older adults showed more synchrony to novel than to familiar stimuli regardless of whether the stimuli had been heard once, known for a few weeks, or known for more than 50 years. In young adults, these synchrony differences were found across the brain, in the bilateral temporal lobes and the frontal orbital cortex. In older adults, the synchrony differences were found only in the bilateral temporal lobes. This reduction may be related to an increase in idiosyncratic responses after exposure to a stimulus but does not seem to be related to how well the stimuli are learned or to differences in attention. Until the effects of repeated exposure on synchrony are fully understood, future studies using intersubject synchrony, where the novelty of the stimuli cannot be guaranteed, may consider exposing all of their participants to the stimuli once before data are collected, to mitigate the effects of any systematic differences in stimulus exposure.
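Intersubject synchrony of the kind reported here is commonly quantified with leave-one-out intersubject correlation: each participant's time course is correlated with the average of everyone else's. The sketch below illustrates that computation on simulated data; the array layout and the use of Pearson correlation are assumptions, not the authors' exact method.

```python
# Sketch: leave-one-out intersubject correlation for one brain region, one common way
# of quantifying the stimulus-driven synchrony described above. `timeseries` is a
# hypothetical (n_subjects, n_timepoints) array for a single condition.
import numpy as np

def leave_one_out_isc(timeseries: np.ndarray) -> np.ndarray:
    n_subj = timeseries.shape[0]
    isc = np.empty(n_subj)
    for s in range(n_subj):
        others = np.delete(timeseries, s, axis=0).mean(axis=0)  # average of all other subjects
        isc[s] = np.corrcoef(timeseries[s], others)[0, 1]
    return isc                                                   # one r value per subject

rng = np.random.default_rng(1)
shared = rng.normal(size=200)                                    # common stimulus-driven signal
data = shared + rng.normal(scale=2.0, size=(10, 200))            # 10 subjects with noise
print(leave_one_out_isc(data).mean())
```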


Subject(s)
Brain Mapping , Magnetic Resonance Imaging , Young Adult , Humans , Aged , Brain Mapping/methods , Magnetic Resonance Imaging/methods , Brain/diagnostic imaging , Brain/physiology , Temporal Lobe , Learning