ABSTRACT
The linear mixed-effects model (LME) is a versatile approach to account for dependence among observations. Many large-scale neuroimaging datasets with complex designs have increased the need for LME; however, LME has seldom been used in whole-brain imaging analyses due to its heavy computational requirements. In this paper, we introduce a fast and efficient mixed-effects algorithm (FEMA) that makes whole-brain vertex-wise, voxel-wise, and connectome-wide LME analyses in large samples possible. We validate FEMA with extensive simulations, showing that the estimates of the fixed effects are equivalent to standard maximum likelihood estimates but obtained with orders-of-magnitude improvement in computational speed. We demonstrate the applicability of FEMA by studying the cross-sectional and longitudinal effects of age on region-of-interest level and vertex-wise cortical thickness, as well as on connectome-wide functional connectivity values derived from resting-state functional MRI, using longitudinal imaging data from the Adolescent Brain Cognitive Development (ABCD) Study release 4.0. Our analyses reveal distinct spatial patterns for the annualized changes in vertex-wise cortical thickness and connectome-wide connectivity values in early adolescence, highlighting a critical time of brain maturation. The simulations and application to real data show that FEMA enables advanced investigation of the relationships between large numbers of neuroimaging metrics and variables of interest while considering complex study designs, including repeated measures and family structures, in a fast and efficient manner. The source code for FEMA is available at https://github.com/cmig-research-group/cmig_tools/.
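FEMA itself is distributed as MATLAB code in the cmig_tools repository, and the abstract does not give its call signature, so the sketch below is only a hedged illustration of the model class FEMA accelerates: an LME with a family-level random intercept plus a subject-level variance component for repeated visits, fit with statsmodels. The variable names, simulated effect sizes, and design (200 families x 2 children x 2 visits) are hypothetical.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Toy longitudinal family design: 200 families x 2 children x 2 visits (hypothetical).
rng = np.random.default_rng(0)
rows = []
for fam in range(200):
    fam_eff = rng.normal(0, 0.3)                     # family-level random intercept
    for child in range(2):
        subj, subj_eff = f"{fam}_{child}", rng.normal(0, 0.5)
        base_age = rng.uniform(9, 11)
        for visit in range(2):
            age = base_age + 2 * visit
            y = 3.0 - 0.03 * age + fam_eff + subj_eff + rng.normal(0, 0.2)
            rows.append(dict(family=fam, subject=subj, age=age, thickness=y))
df = pd.DataFrame(rows)

# Random intercept for family, plus a variance component for subjects nested in families.
model = smf.mixedlm(
    "thickness ~ age", df,
    groups="family", re_formula="1",
    vc_formula={"subject": "0 + C(subject)"},
)
print(model.fit(reml=True).summary())                # fixed effect of age ~ -0.03, as simulated
```

FEMA's contribution, per the abstract, is recovering essentially the same fixed-effect estimates as such a fit while scaling to hundreds of thousands of vertex-, voxel-, or connectome-wise outcome measures.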
Subject(s)
Connectome, Magnetic Resonance Imaging, Adolescent, Humans, Magnetic Resonance Imaging/methods, Cross-Sectional Studies, Brain/diagnostic imaging, Neuroimaging/methods, Connectome/methods, Algorithms
ABSTRACT
Predicting and organizing patterns of events is important for humans to survive in a dynamically changing world. The motor system has been proposed to be actively, and necessarily, engaged not only in the production but also in the perception of rhythm, by organizing hierarchical timing that influences auditory responses. It is not yet well understood how the motor system interacts with the auditory system to perceive and maintain hierarchical structure in time. This study investigated the dynamic interaction between auditory and motor functional sources during the perception and imagination of musical meters. We pursued this using a novel method combining high-density EEG, EMG, and motion capture with independent component analysis to separate motor and auditory activity during meter imagery while robustly controlling against covert movement. We demonstrated that endogenous brain activity in both auditory and motor functional sources reflects the imagination of binary and ternary meters in the absence of corresponding acoustic cues or overt movement at the meter rate. We found clear evidence for the hypothesized motor-to-auditory information flow at the beat rate in all conditions, suggesting a top-down influence of the motor system on auditory processing of beat-based rhythms and reflecting an auditory-motor system with tight reciprocal informational coupling. These findings align with and further extend a set of motor hypotheses from beat perception to hierarchical meter imagination, adding supporting evidence for active engagement of the motor system in auditory processing, which may more broadly speak to the neural mechanisms of temporal processing in other human cognitive functions.
SIGNIFICANCE STATEMENT: Humans live in a world full of hierarchically structured temporal information, the accurate perception of which is essential for understanding speech and music. Music provides a window into the brain mechanisms of time perception, enabling us to examine how the brain groups musical beats into, for example, a march or a waltz. Using a novel paradigm combining measurement of electrical brain activity with data-driven analysis, this study directly investigates motor-auditory connectivity during meter imagination. Findings highlight the importance of the motor system in the active imagination of meter. This study sheds new light on a fundamental form of perception by demonstrating how auditory-motor interaction may support hierarchical timing processing, which may have clinical implications for speech and motor rehabilitation.
Subject(s)
Auditory Perception/physiology, Brain/physiology, Imagination/physiology, Music/psychology, Time Perception/physiology, Acoustic Stimulation, Electroencephalography, Electromyography, Female, Humans, Male, Skeletal Muscle/physiology, Periodicity, Young Adult
ABSTRACT
Music engagement is a powerful, influential experience that often begins early in life. Music engagement is moderately heritable in adults (~41-69%), but fewer studies have examined genetic influences on childhood music engagement, including their association with language and executive functions. Here we explored genetic and environmental influences on music listening and instrument playing (including singing) in the baseline assessment of the Adolescent Brain Cognitive Development (ABCD) Study. Parents reported on their 9-10-year-old children's music experiences (N = 11,876 children; N = 1,543 from twin pairs). Both music measures were explained primarily by shared environmental influences. Instrument exposure (but not frequency of instrument engagement) was associated with language skills (r = .27) and executive functions (r = .15-.17), and these associations were stronger than those for music listening, visual art, or soccer engagement. These findings highlight the role of shared environmental influences linking early music experiences, language, and executive function during a formative time in development.
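As a hedged, back-of-the-envelope illustration of how twin correlations separate genetic from shared-environmental influences (the study itself would rely on formal biometric twin modelling, and the correlation values below are hypothetical placeholders), Falconer's formulas give a quick ACE decomposition:

```python
def falconer_ace(r_mz: float, r_dz: float) -> dict:
    """Rough ACE variance decomposition from MZ and DZ twin correlations.

    A (additive genetic)         = 2 * (r_mz - r_dz)
    C (shared environment)       = r_mz - A
    E (unique environment/error) = 1 - r_mz
    """
    a = 2.0 * (r_mz - r_dz)
    return {"A": a, "C": r_mz - a, "E": 1.0 - r_mz}

# Hypothetical correlations in which DZ twins resemble each other almost as much
# as MZ twins -- the pattern that points to shared environment (C) dominating.
print(falconer_ace(r_mz=0.80, r_dz=0.75))   # ~ {'A': 0.10, 'C': 0.70, 'E': 0.20}
```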
Subject(s)
Executive Function, Music, Adolescent, Adult, Child, Humans, Brain, Cognition, Language, Music/psychology
ABSTRACT
Brain systems supporting body movement are active during music listening in the absence of overt movement. This covert motor activity is not well understood, but some theories propose a role in auditory timing prediction facilitated by motor simulation. One question is how music-related covert motor activity relates to motor activity during overt movement. We address this question using scalp electroencephalography by measuring mu rhythms, cortical field phenomena associated with the somatomotor system that appear over sensorimotor cortex. In foot-versus-hand movement paradigms, lateralized mu enhancement over hand sensorimotor cortex during or just before foot movement is thought to reflect inhibition of hand movement during current or prospective movement of another effector. The behavior of mu during music listening with movement suppressed has yet to be determined. We recorded 32-channel EEG (n = 17) during silence without movement, overt movement (foot/hand), and music listening without movement. Using an independent component analysis-based source equivalent dipole clustering technique, we identified three mu-related clusters, localized to left primary motor and right and midline premotor cortices. Right foot tapping was accompanied by mu enhancement in the left lateral source cluster, replicating previous work. Music listening was accompanied by similar mu enhancement in the left, as well as the midline, clusters. To our knowledge, we are the first to report, and to source-resolve, music-related mu modulation in the absence of overt movement. Covert music-related motor activity has been shown to play a role in beat perception (Ross JM, Iversen JR, Balasubramaniam R. Neurocase 22: 558-565, 2016). Our current results show enhancement in somatotopically organized mu, supporting overt motor inhibition during beat perception.
NEW & NOTEWORTHY: We are the first to report music-related mu enhancement in the absence of overt movements and the first to source-resolve mu activity during music listening. We suggest that music-related mu modulation reflects overt motor inhibition during passive music listening. This work is relevant for the development of theories relating to the involvement of covert motor system activity in predictive beat perception.
Subject(s)
Auditory Perception/physiology, Brain Waves/physiology, Electroencephalography, Motor Activity/physiology, Motor Cortex/physiology, Music, Adult, Drosophila Proteins, Female, Foot/physiology, Hand/physiology, Humans, Male, Ubiquitin-Protein Ligases, Young Adult
ABSTRACT
We investigated Bayesian modelling of human whole-body motion capture data recorded during an exploratory real-space navigation task in an "Audiomaze" environment (see the companion paper by Miyakoshi et al. in the same volume) to study the effect of map learning on navigation behaviour. There were three models: a feedback-only model (no map learning), a map-resetting model (single-trial limited map learning), and a map-updating model (map learning accumulated across three trials). The estimated behavioural variables included step sizes and turning angles. Results showed that the estimated step sizes were consistently more accurate using the map learning models than the feedback-only model. The same effect was confirmed for turning angle estimates, but only for data from the third trial. We interpreted these results as Bayesian evidence of the effect of human map learning on navigation behaviour. Furthermore, separating the participants into groups of egocentric and allocentric navigators revealed an advantage for the map-updating model in estimating step sizes, but only for the allocentric navigators. This interaction indicates that allocentric navigators may take more advantage of map learning than egocentric navigators do. We discuss the relationship of these results to the simultaneous localization and mapping (SLAM) problem.
Subject(s)
Augmented Reality, Spatial Navigation, Bayes Theorem, Humans, Learning, Space Perception
ABSTRACT
There is growing interest in how the brain's motor systems contribute to the perception of musical rhythms. The Action Simulation for Auditory Prediction (ASAP) hypothesis proposes that the dorsal auditory stream is involved in bidirectional interchange between auditory perception and beat-based prediction in motor planning structures via parietal cortex [Patel, A. D., & Iversen, J. R. The evolutionary neuroscience of musical beat perception: The Action Simulation for Auditory Prediction (ASAP) hypothesis. Frontiers in Systems Neuroscience, 8, 57, 2014]. To test for causal contributions to beat-based timing perception, we used a TMS protocol, continuous theta burst stimulation (cTBS), which is known to down-regulate cortical activity for up to 60 min following stimulation. cTBS target areas included the left posterior parietal cortex (lPPC), which is part of the dorsal auditory stream, and the left supplementary motor area (lSMA). We hypothesized that down-regulating lPPC would interfere with accurate beat-based perception by disrupting the dorsal auditory stream, and that absolute timing ability would be unaffected. We predicted that down-regulating lSMA, which is not part of the dorsal auditory stream but has been implicated in internally timed movements, would also interfere with accurate beat-based timing perception. We show (n = 25) that cTBS down-regulation of lPPC does interfere with beat-based timing ability, but only with the ability to detect shifts in beat phase, not changes in tempo. Down-regulation of lSMA, in contrast, did not interfere with beat-based timing. As expected, absolute interval timing ability was not impacted by down-regulation of lPPC or lSMA. These results support an essential role of the dorsal auditory stream in accurate phase perception in beat-based timing. We find no evidence of an essential role of parietal cortex or SMA in interval timing.
Subject(s)
Auditory Perception/physiology, Motor Cortex/physiology, Parietal Lobe/physiology, Time Perception/physiology, Acoustic Stimulation, Adolescent, Adult, Discrimination (Psychology), Female, Humans, Male, Music, Neural Pathways/physiology, Psychoacoustics, Transcranial Magnetic Stimulation, Young Adult
ABSTRACT
There is growing interest in whether the motor system plays an essential role in rhythm perception. The motor system is active during the perception of rhythms, but is such motor activity merely a sign of unexecuted motor planning, or does it play a causal role in shaping the perception of rhythm? We present evidence for a causal role of motor planning and simulation, and review theories of internal simulation for beat-based timing prediction. Brain stimulation studies have the potential to test conclusively whether the motor system plays a causal role in beat perception and to ground theories in their neural underpinnings.
Subject(s)
Auditory Perception/physiology, Psychological Models, Motor Activity/physiology, Music, Acoustic Stimulation, Humans, Transcranial Direct Current Stimulation
ABSTRACT
Synchronization of finger taps with periodically flashing visual stimuli is known to be much more variable than synchronization with an auditory metronome. When one of these rhythms is the synchronization target and the other serves as a distracter at various temporal offsets, strong auditory dominance is observed. However, it has recently been shown that visuomotor synchronization improves substantially with moving stimuli such as a continuously bouncing ball. The present study pitted a bouncing ball against an auditory metronome in a target-distracter synchronization paradigm, with participants who were auditory experts (musicians) or visual experts (video gamers and ball players). Synchronization was still less variable with auditory than with visual target stimuli in both groups. For musicians, auditory stimuli tended to be more distracting than visual stimuli, whereas the opposite was the case for the visual experts. Overall, there was no main effect of distracter modality. Thus, a distracting spatiotemporal visual rhythm can be as effective as a distracting auditory rhythm in its capacity to perturb synchronous movement, but its effectiveness also depends on modality-specific expertise.
Subject(s)
Attention/physiology, Auditory Perception/physiology, Psychomotor Performance/physiology, Visual Perception/physiology, Acoustic Stimulation, Adolescent, Adult, Female, Humans, Male, Photic Stimulation, Young Adult
ABSTRACT
Based on the idea that neural entrainment establishes regular attentional fluctuations that facilitate hierarchical processing in both music and language, we hypothesized that individual differences in syntactic (grammatical) skills would be partly explained by patterns of neural responses to musical rhythm. To test this hypothesis, we recorded neural activity using electroencephalography (EEG) while children (N = 25) listened passively to rhythmic patterns that induced different beat percepts. Analysis of evoked beta and gamma activity revealed that individual differences in the magnitude of neural responses to rhythm explained variance in six-year-olds' expressive grammar abilities, beyond, and complementary to, their performance on a behavioral rhythm perception task. These results reinforce the idea that mechanisms of neural beat entrainment may be a shared neural resource supporting hierarchical processing across music and language, and they suggest a relevant marker of the relationship between rhythm processing and grammar abilities in elementary-school-age children, previously observed only behaviorally.
Subject(s)
Individuality, Music, Humans, Child, Auditory Perception/physiology, Linguistics, Electroencephalography, Language
ABSTRACT
During late childhood, behavioral changes such as increased risk-taking and emotional reactivity have been associated with the maturation of cortico-cortical and cortico-subcortical circuits. Understanding microstructural changes in both white matter and subcortical regions may aid our understanding of how individual differences in these behaviors emerge. Restriction spectrum imaging (RSI) is a framework for modelling diffusion-weighted imaging that decomposes the diffusion signal from a voxel into hindered, restricted, and free compartments. This yields greater specificity than conventional methods of characterizing diffusion. Using RSI, we quantified voxelwise restricted diffusion across the brain and measured age associations in a large sample (n = 8,086) of participants aged 9-14 years from the Adolescent Brain Cognitive Development (ABCD) Study. Older participants showed a higher restricted signal fraction across the brain, with the largest associations in subcortical regions, particularly the basal ganglia and ventral diencephalon. Importantly, age associations varied with respect to the cytoarchitecture within white matter fiber tracts and subcortical structures; for example, age associations differed across thalamic nuclei. This suggests that age-related changes may map onto specific cell populations or circuits and highlights the utility of voxelwise compared to ROI-wise analyses. Future analyses will aim to understand the relevance of this microstructural development for behavioral outcomes.
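A schematic of the compartment decomposition may help fix ideas. The equation below is a simplified multi-compartment signal model, not the full RSI basis (which additionally models orientation structure within each compartment); the symbols are generic.

```latex
% Simplified multi-compartment form of the diffusion-weighted signal (schematic only):
\[
  \frac{S(b)}{S(0)} \;=\; f_{R}\, e^{-b D_{R}} \;+\; f_{H}\, e^{-b D_{H}} \;+\; f_{F}\, e^{-b D_{F}},
  \qquad f_{R} + f_{H} + f_{F} = 1,
\]
```

Here f_R, f_H, and f_F are the restricted, hindered, and free signal fractions with apparent diffusivities D_R < D_H < D_F; the "restricted signal fraction" analysed in the study corresponds to f_R in this simplified form.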
Subject(s)
White Matter, Adolescent, Brain, Child, Diffusion Magnetic Resonance Imaging, Humans, Individuality
ABSTRACT
Is engaging with music good for your mental health? This question has long been the topic of empirical clinical and nonclinical investigations, with studies indicating positive associations between music engagement and quality of life, reduced depression or anxiety symptoms, and less frequent substance use. However, many earlier investigations were limited by small samples and methodological shortcomings, and it has also been suggested that aspects of music engagement may even be associated with worse mental health outcomes. The purpose of this scoping review is first to summarize the existing state of research on music engagement and mental health, identifying its strengths and weaknesses. We focus on broad domains of mental health diagnoses, including internalizing psychopathology (e.g., depression and anxiety symptoms and diagnoses), externalizing psychopathology (e.g., substance use), and thought disorders (e.g., schizophrenia). Second, we propose a theoretical model to inform future work that describes the importance of simultaneously considering music-mental health associations at the levels of (1) correlated genetic and/or environmental influences vs. (bi)directional associations, (2) interactions with genetic risk factors, (3) treatment efficacy, and (4) mediation through brain structure and function. Finally, we describe how recent advances in large-scale data collection, including genetic, neuroimaging, and electronic health record studies, allow for a more rigorous examination of these associations that can also elucidate their neurobiological substrates.
Subject(s)
Mental Disorders, Music Therapy, Music, Anxiety, Humans, Mental Disorders/therapy, Mental Health, Quality of Life
ABSTRACT
The ability to integrate our perceptions across sensory modalities and across time, to execute and coordinate movements, and to adapt to a changing environment rests on temporal processing. Timing is essential for basic daily tasks, such as walking, social interaction, speech and language comprehension, and attention. Impaired temporal processing may contribute to various disorders, from attention-deficit hyperactivity disorder and schizophrenia to Parkinson's disease and dementia. The foundational importance of timing ability has yet to be fully understood, and popular tasks used to investigate behavioral timing ability, such as sensorimotor synchronization (SMS), engage a variety of processes in addition to the neural processing of time. The present study uses SMS in conjunction with a separate passive listening task that manipulates temporal expectancy while electroencephalographic data are recorded. Participants display a larger N1-P2 evoked potential complex to unexpected beats relative to temporally predictable beats, a differential we call the timing response index (TRI). The TRI correlates with performance on the SMS task: better synchronizers show a larger brain response to unexpected beats. The TRI, derived from the perceptually driven N1-P2 complex, disentangles the perceptual and motor components inherent in SMS and thus may serve as a neural marker of more general temporal processing.
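The abstract defines the TRI only as a differential N1-P2 response, so the hedged sketch below simply takes a per-participant amplitude difference between unexpected and predictable beats and correlates it with SMS tapping variability; the variable names and simulated values are hypothetical.

```python
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(1)
n = 20                                                   # hypothetical number of participants
# N1-P2 peak-to-peak amplitudes (microvolts) per participant and condition.
erp_expected = rng.normal(4.0, 1.0, n)
erp_unexpected = erp_expected + rng.normal(2.0, 1.0, n)  # larger response to unexpected beats
# SMS tapping variability (SD of tap asynchronies, ms; lower = better synchronizer).
sms_sd = 40.0 - 3.0 * (erp_unexpected - erp_expected) + rng.normal(0.0, 3.0, n)

tri = erp_unexpected - erp_expected                      # timing response index per participant
r, p = pearsonr(tri, sms_sd)
print(f"TRI vs. SMS variability: r = {r:.2f}, p = {p:.3f}")  # negative r: larger TRI, steadier tapping
```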
ABSTRACT
Spontaneous movement to music occurs in every human culture and is a foundation of dance [1]. This response to music is absent in most species (including monkeys), yet it occurs in parrots, perhaps because they (like humans, and unlike monkeys) are vocal learners whose brains contain strong auditory-motor connections, conferring sophisticated audiomotor processing abilities [2,3]. Previous research has shown that parrots can bob their heads or lift their feet in synchrony with a musical beat [2,3], but humans move to music using a wide variety of movements and body parts. Is this also true of parrots? If so, it would constrain theories of how movement to music is controlled by parrot brains. Specifically, as head bobbing is part of parrot courtship displays [4] and foot lifting is part of locomotion, these may be innate movements controlled by central pattern generators which become entrained by auditory rhythms, without the involvement of complex motor planning. This would be unlike humans, where movement to music engages cortical networks including frontal and parietal areas [5]. Rich diversity in parrot movement to music would suggest a strong contribution of forebrain regions to this behavior, perhaps including motor learning regions abutting the complex vocal-learning 'shell' regions that are unique to parrots among vocal learning birds [6]. Here we report that a sulphur-crested cockatoo (Cacatua galerita eleonora) responds to music with remarkably diverse spontaneous movements employing a variety of body parts, and suggest why parrots share this response with humans.
Subject(s)
Auditory Perception, Cockatoos, Movement, Music/psychology, Animals, Dancing
ABSTRACT
Growing evidence points to a link between musical abilities and certain phonetic and prosodic skills in language. However, the mechanisms that underlie these relations are not well understood. A recent study by Wong et al. suggests that musical training sharpens the subcortical encoding of linguistic pitch patterns. We consider the implications of their methods and findings for establishing a link between musical training and phonetic abilities more generally.
Subject(s)
Association Learning/physiology, Auditory Perception/physiology, Auditory Evoked Potentials/physiology, Language, Music, Humans, Phonetics, Speech Perception/physiology
ABSTRACT
Many aspects of perception are known to be shaped by experience, but others are thought to be innate universal properties of the brain. A specific example comes from rhythm perception, where one of the fundamental perceptual operations is the grouping of successive events into higher-level patterns, an operation critical to the perception of language and music. Grouping has long been thought to be governed by innate perceptual principles established a century ago. The current work demonstrates instead that grouping can be strongly dependent on culture. Native English and Japanese speakers were tested for their perception of grouping of simple rhythmic sequences of tones. Members of the two cultures showed different patterns of perceptual grouping, demonstrating that these basic auditory processes are not universal but are shaped by experience. It is suggested that the observed perceptual differences reflect the rhythms of the two languages, and that native language can exert an influence on general auditory perception at a basic level.
Subject(s)
Cross-Cultural Comparison, Language, Periodicity, Pitch Discrimination, Signal Detection (Psychological), Acoustic Stimulation, Adolescent, Adult, California, Cues (Psychology), Humans, Japan, Multilingualism, Young Adult
ABSTRACT
Quantification of dynamic causal interactions among brain regions constitutes an important component of conducting research and developing applications in experimental and translational neuroscience. Furthermore, in brain-computer interface (BCI) applications, cortical networks with dynamic causal connectivity offer a more comprehensive view of brain states implicated in behavior than do individual brain regions. However, models of cortical network dynamics are difficult to generalize across subjects because current electroencephalography (EEG) signal analysis techniques are limited in their ability to reliably localize sources across subjects. We propose an algorithmic and computational framework for identifying cortical networks across subjects in which dynamic causal connectivity is modeled among user-selected cortical regions of interest (ROIs). We demonstrate the strength of the proposed framework using a "reach/saccade to spatial target" cognitive task performed by 10 right-handed individuals. Modeling of causal cortical interactions was accomplished through measurement of cortical activity using EEG, application of independent component clustering to identify cortical ROIs as network nodes, estimation of cortical current density using cortically constrained low-resolution electromagnetic brain tomography (cLORETA), multivariate autoregressive (MVAR) modeling of representative cortical activity signals from each ROI, and quantification of the dynamic causal interactions among the identified ROIs using the short-time direct directed transfer function (SdDTF). The resulting cortical network and the computed causal dynamics among its nodes exhibited physiologically plausible behavior, consistent with past results reported in the literature. This physiological plausibility of the results strengthens the framework's applicability in reliably capturing complex brain functionality, which is required by applications such as diagnostics and BCI.
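The pipeline ends with the short-time direct directed transfer function (SdDTF). As a hedged, simplified stand-in, the sketch below fits a single stationary MVAR model with statsmodels and computes the classical DTF (neither short-time nor "direct") from its frequency-domain transfer matrix; the toy three-ROI signals, sampling rate, and coupling are assumptions.

```python
import numpy as np
from statsmodels.tsa.api import VAR

def dtf(coefs, n_freqs=64, fs=256.0):
    """Classical directed transfer function from MVAR lag matrices.

    coefs: array (p, k, k) of VAR coefficients A_1..A_p (statsmodels VARResults.coefs).
    Returns (freqs, dtf) with dtf[f, i, j] = normalized inflow from channel j to channel i.
    """
    p, k, _ = coefs.shape
    freqs = np.linspace(0.0, fs / 2.0, n_freqs)
    out = np.zeros((n_freqs, k, k))
    for fi, f in enumerate(freqs):
        a_f = np.eye(k, dtype=complex)
        for lag in range(p):
            a_f -= coefs[lag] * np.exp(-2j * np.pi * f * (lag + 1) / fs)
        h_f = np.linalg.inv(a_f)                          # spectral transfer matrix H(f)
        num = np.abs(h_f) ** 2
        out[fi] = num / num.sum(axis=1, keepdims=True)    # row-normalize: inflows to each node sum to 1
    return freqs, out

# Toy 3-"ROI" recording in which ROI 0 drives ROI 1 at a one-sample lag (hypothetical).
rng = np.random.default_rng(2)
x = rng.normal(size=(2000, 3))
x[1:, 1] += 0.8 * x[:-1, 0]

res = VAR(x).fit(maxlags=5, ic="aic")
freqs, d = dtf(res.coefs)
print("mean DTF 0 -> 1:", d[:, 1, 0].mean())   # clearly larger than the reverse direction
print("mean DTF 1 -> 0:", d[:, 0, 1].mean())
```

Roughly speaking, the "short-time" and "direct" refinements used in the paper add time-varying (windowed or adaptive) estimation and partial-coherence weighting on top of this basic quantity.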
ABSTRACT
A striking asymmetry in human sensorimotor processing is that humans synchronize movements to rhythmic sound with far greater precision than to temporally equivalent visual stimuli (e.g., to an auditory vs. a flashing visual metronome). Traditionally, this finding is thought to reflect a fundamental difference in auditory vs. visual processing, i.e., superior temporal processing by the auditory system and/or privileged coupling between the auditory and motor systems. It is unclear whether this asymmetry is an inevitable consequence of brain organization or whether it can be modified (or even eliminated) by stimulus characteristics or by experience. With respect to stimulus characteristics, we found that a moving, colliding visual stimulus (a silent image of a bouncing ball with a distinct collision point on the floor) was able to drive synchronization nearly as accurately as sound in hearing participants. To study the role of experience, we compared synchronization to flashing metronomes in hearing and profoundly deaf individuals. Deaf individuals performed better than hearing individuals when synchronizing with visual flashes, suggesting that cross-modal plasticity enhances the ability to synchronize with temporally discrete visual stimuli. Furthermore, when deaf (but not hearing) individuals synchronized with the bouncing ball, their tapping patterns suggest that visual timing may access higher-order beat perception mechanisms for deaf individuals. These results indicate that the auditory advantage in rhythmic synchronization is more experience- and stimulus-dependent than has been previously reported.
Subject(s)
Auditory Perception/physiology, Deafness/physiopathology, Psychomotor Performance/physiology, Time Perception/physiology, Visual Perception/physiology, Adult, Female, Humans, Male
ABSTRACT
Every human culture has some form of music with a beat: a perceived periodic pulse that structures the perception of musical rhythm and that serves as a framework for synchronized movement to music. What are the neural mechanisms of musical beat perception, and how did they evolve? One view, which dates back to Darwin and implicitly informs some current models of beat perception, is that the relevant neural mechanisms are relatively general and are widespread among animal species. On the basis of recent neural and cross-species data on musical beat processing, this paper argues for a different view. Here we argue that beat perception is a complex brain function involving temporally precise communication between auditory regions and motor planning regions of the cortex (even in the absence of overt movement). More specifically, we propose that simulation of periodic movement in motor planning regions provides a neural signal that helps the auditory system predict the timing of upcoming beats. This "action simulation for auditory prediction" (ASAP) hypothesis leads to testable predictions. We further suggest that ASAP relies on dorsal auditory pathway connections between auditory regions and motor planning regions via the parietal cortex, and that these connections may be stronger in humans than in non-human primates due to the evolution of vocal learning in our lineage. This suggestion motivates cross-species research to determine which species are capable of human-like beat perception, i.e., beat perception that involves accurate temporal prediction of beat times across a fairly broad range of tempi.
ABSTRACT
The planning of goal-directed movement towards targets in different parts of space is an important function of the brain. Such visuo-motor planning and execution is known to involve multiple brain regions, including visual, parietal, and frontal cortices. To understand how these brain regions work together to both plan and execute goal-directed movement, it is essential to describe the dynamic causal interactions among them. Here we model causal interactions of distributed cortical source activity derived from non-invasively recorded EEG, using a combination of ICA, minimum-norm distributed source localization (cLORETA), and dynamical modeling within the Source Information Flow Toolbox (SIFT). We differentiate the network causal connectivity of reach planning and execution by comparing the causal network in a speeded reaching task with that in a control task not requiring goal-directed movement. Analysis of a pilot dataset (n = 5) shows the utility of this technique and reveals increased connectivity between visual, motor, and frontal brain regions during reach planning, together with decreased cross-hemisphere visual coupling during planning and execution, possibly related to task demands.
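The key analytic step here is a condition contrast between the reaching task and the control task. As a hedged sketch of such a contrast (not the SIFT workflow itself), the snippet below takes hypothetical per-subject, band-averaged connectivity matrices for the two conditions and computes edge-wise paired differences; with a pilot n = 5, the statistics are descriptive only.

```python
import numpy as np
from scipy.stats import ttest_rel

# Hypothetical per-subject connectivity estimates (e.g., band-averaged DTF values),
# 5 pilot subjects x 4 ROIs x 4 ROIs, for the reach task and the control task.
rng = np.random.default_rng(3)
n_subj, n_roi = 5, 4
control = rng.uniform(0.1, 0.3, size=(n_subj, n_roi, n_roi))
reach = control + rng.normal(0.0, 0.02, size=control.shape)
reach[:, 2, 0] += 0.15                       # pretend the inflow to ROI 2 from ROI 0 strengthens

diff = reach - control                       # per-subject, edge-wise condition contrast
t, p = ttest_rel(reach.reshape(n_subj, -1),  # paired test across subjects for every edge
                 control.reshape(n_subj, -1), axis=0)
print("mean contrast for edge 0 -> 2:", diff[:, 2, 0].mean())
print("smallest uncorrected p:", p.min())    # with n = 5, treat as descriptive, not confirmatory
```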