Results 1 - 6 of 6
1.
Article in English | MEDLINE | ID: mdl-37917526

ABSTRACT

The concept of augmented reality (AR) assistants has captured the human imagination for decades, becoming a staple of modern science fiction. To pursue this goal, it is necessary to develop artificial intelligence (AI)-based methods that simultaneously perceive the 3D environment, reason about physical tasks, and model the performer, all in real-time. Within this framework, a wide variety of sensors are needed to generate data across different modalities, such as audio, video, depth, speech, and time-of-flight. The required sensors are typically part of the AR headset, providing performer sensing and interaction through visual, audio, and haptic feedback. AI assistants must not only record the performer as they carry out activities, but also rely on machine learning (ML) models to understand and assist the performer as they interact with the physical world. Developing such assistants is therefore a challenging task. We propose ARGUS, a visual analytics system to support the development of intelligent AR assistants. Our system was designed as part of a multi-year collaboration between visualization researchers and ML and AR experts. This co-design process has led to advances in the visualization of ML in AR. Our system allows for online visualization of object, action, and step detection, as well as offline analysis of previously recorded AR sessions. It visualizes not only the multimodal sensor data streams but also the output of the ML models, allowing developers to gain insights into performer activities and model behavior, and helping them troubleshoot, improve, and fine-tune the components of the AR assistant.
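
A core ingredient of such tooling is lining up heterogeneous, timestamped streams (raw sensor data and model outputs) on a common clock so they can be inspected side by side. A minimal Python sketch of that alignment step, with hypothetical names (Sample, nearest, align) that only illustrate the idea and are not the ARGUS API:

    from bisect import bisect_left
    from dataclasses import dataclass

    @dataclass
    class Sample:
        t: float         # capture time in seconds
        payload: object  # frame, audio chunk, or model output

    def nearest(samples, t):
        """Return the sample (list sorted by .t) whose timestamp is closest to t."""
        times = [s.t for s in samples]
        i = bisect_left(times, t)
        return min(samples[max(i - 1, 0):i + 1], key=lambda s: abs(s.t - t))

    def align(frames, detections):
        """Pair each video frame with the closest-in-time model output."""
        return [(f, nearest(detections, f.t)) for f in frames]

A real system would stream such structures live; the point here is only the nearest-timestamp pairing that lets raw data and model output be inspected together.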

2.
PLoS Comput Biol. 2023 Jun;19(6):e1011154.
Article in English | MEDLINE | ID: mdl-37285380

ABSTRACT

A musician's spontaneous rate of movement, called spontaneous motor tempo (SMT), can be measured while the musician freely plays a simple melody. Data show that the SMT influences the musician's tempo and synchronization. In this study we present a model that captures these phenomena. We review the results of three previously published studies: solo musical performance with a pacing metronome whose tempo differs from the SMT, solo musical performance without a metronome at a tempo faster or slower than the SMT, and duet musical performance between musicians with matching or mismatching SMTs. These studies showed, respectively, that the asynchrony between the pacing metronome and the musician's tempo grew as a function of the difference between the metronome tempo and the musician's SMT, that musicians drifted away from the initial tempo toward the SMT, and that absolute asynchronies were smaller when musicians had matching SMTs. We hypothesize that the SMT constantly acts as a pulling force on musical actions performed at a different tempo. To test this hypothesis, we developed a model consisting of a non-linear oscillator with Hebbian tempo learning and a pulling force toward the model's spontaneous frequency. The model's spontaneous frequency emulates the SMT, while elastic Hebbian learning allows the oscillator's frequency to adapt to match a stimulus frequency. We first fit model parameters to the data from the first of the three studies and then asked whether the same model would explain the data from the remaining two studies without further tuning. Results showed that the model's dynamics allowed it to explain all three experiments with the same set of parameters. Our theory offers a dynamical-systems explanation of how an individual's SMT affects synchronization in realistic music performance settings, and the model also enables predictions about performance settings not yet tested.
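
A minimal sketch of this class of model: a phase oscillator that entrains to a metronome, adapts (learns) its frequency, and is elastically pulled back toward its spontaneous frequency. The parameter values below are illustrative, not the ones fitted in the paper:

    import numpy as np

    f_smt, f_stim = 2.0, 2.2        # spontaneous (SMT) and metronome frequency, Hz
    F, eps, kappa = 1.0, 1.0, 0.5   # coupling, Hebbian learning rate, elastic pull
    dt, T = 0.001, 60.0

    omega0 = 2 * np.pi * f_smt      # spontaneous frequency, rad/s
    phi, omega = 0.0, omega0
    for step in range(int(T / dt)):
        err = np.sin(2 * np.pi * f_stim * step * dt - phi)    # phase error
        phi += dt * (omega + F * err)                         # entrainment
        omega += dt * (eps * err - kappa * (omega - omega0))  # learning + pull
    print(f"adapted frequency: {omega / (2 * np.pi):.2f} Hz")  # between SMT and stimulus

With these values the model phase-locks to the metronome while its frequency adapts only part of the way, leaving a steady asynchrony that grows with the SMT-metronome mismatch, qualitatively the pattern reported for the first study.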


Subjects
Music; Elasticity; Learning; Movement; Humans
3.
Front Comput Neurosci. 2023;17:1151895.
Article in English | MEDLINE | ID: mdl-37265781

ABSTRACT

Rhythmicity permeates large parts of human experience. Humans generate various motor and brain rhythms spanning a range of frequencies. We also experience and synchronize to externally imposed rhythmicity, for example from music and song or from the 24-h light-dark cycle of the sun. In the context of music, humans have the ability to perceive, generate, and anticipate rhythmic structures, for example, "the beat." Experimental and behavioral studies offer clues about the biophysical and neural mechanisms that underlie our rhythmic abilities and about the different brain areas involved, but many open questions remain. In this paper, we review several theoretical and computational approaches, each centered at a different level of description, that address specific aspects of musical rhythm generation, perception, attention, perception-action coordination, and learning. We survey methods and results from applications of dynamical systems theory, neuro-mechanistic modeling, and Bayesian inference. Some frameworks rely on the synchronization of intrinsic brain rhythms that span the relevant frequency range; some formulations involve real-time adaptation schemes for error correction to align the phase and frequency of a dedicated circuit; others involve learning and dynamically adjusting expectations to make rhythm-tracking predictions. Each of these approaches, while initially designed to answer specific questions, offers the possibility of being integrated into a larger framework that provides insights into our ability to perceive and generate rhythmic patterns.
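
As a concrete example of the second family (real-time error correction), here is a minimal sketch of the classic linear phase- and period-correction scheme for tapping to a metronome. It is a generic textbook formulation, not any one reviewed model, and the alpha/beta values are illustrative:

    import random

    def synchronize(stim_period=500.0, n_taps=50, alpha=0.5, beta=0.1, noise=10.0):
        """Tap asynchronies (ms) from a linear phase- and period-correcting timekeeper."""
        period, tap = 450.0, 0.0      # internal period starts mistuned
        asynchronies = []
        for n in range(n_taps):
            e = tap - n * stim_period             # asynchrony to metronome onset
            asynchronies.append(e)
            period -= beta * e                    # period (frequency) correction
            tap += period - alpha * e + random.gauss(0.0, noise)  # phase correction
        return asynchronies

    print(synchronize()[-5:])   # settles near zero despite the mistuned start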

4.
PLoS Comput Biol. 2019 Oct;15(10):e1007371.
Article in English | MEDLINE | ID: mdl-31671096

ABSTRACT

Dancing and playing music require people to coordinate actions with auditory rhythms. In laboratory perception-action coordination tasks, people are asked to synchronize taps with a metronome; they tend to anticipate stimulus onsets, tapping slightly before the stimulus. This anticipation tendency increases with stimulus periods of up to 3500 ms, but is less pronounced in trained individuals such as musicians than in non-musicians. Furthermore, external factors influence the timing of tapping, including the presence of auditory feedback from one's own taps, the presence of a partner performing coordinated joint tapping, and transmission latencies (TLs) between coordinating partners. Phenomena like the anticipation tendency can be explained by delay-coupled systems, which may be inherent to the sensorimotor system during perception-action coordination. Here we tested whether a dynamical systems model based on this hypothesis reproduces observed patterns of human synchronization. We simulated behavior with a model consisting of an oscillator that receives its own delayed activity as input. Three simulation experiments were conducted using previously published behavioral data from 1) simple tapping, 2) two-person alternating beat-tapping, and 3) two-person alternating rhythm-clapping in the presence of a range of constant auditory TLs. In Experiment 1, our model replicated the larger anticipation observed for longer stimulus intervals, and adjusting the amplitude of the delayed feedback reproduced the difference between musicians and non-musicians. In Experiment 2, by connecting two models we replicated the smaller anticipation observed in human joint tapping with bi-directional auditory feedback compared to joint tapping without feedback. In Experiment 3, we varied TLs between two models alternately receiving signals from one another; results showed reciprocal lags at points of alternation, consistent with behavioral patterns. Overall, our model explains various anticipatory behaviors and has the potential to inform theories of adaptive human synchronization.
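
A minimal sketch of the core mechanism hypothesized above: a phase oscillator that is driven by a metronome and also receives its own activity, delayed by tau, as input. The parameters below are illustrative rather than the paper's fitted values; with them, the delayed self-feedback shifts the locked phase so the oscillator settles slightly ahead of the stimulus:

    import numpy as np

    dt, T = 0.001, 60.0
    period = 1.0                      # stimulus period, s
    omega_s = 2 * np.pi / period
    tau, a, b = 0.6, 1.0, 2.0         # self-feedback delay (s) and couplings

    n, d = int(T / dt), int(tau / dt)
    phi = np.zeros(n)
    for i in range(n - 1):
        delayed = phi[i - d] if i >= d else phi[0]
        self_fb = a * np.sin(delayed - phi[i])        # delayed self-input
        stim = b * np.sin(omega_s * i * dt - phi[i])  # metronome coupling
        phi[i + 1] = phi[i] + dt * (omega_s + self_fb + stim)

    rel = np.angle(np.exp(1j * (phi[-1] - omega_s * (n - 1) * dt)))
    print(f"locked relative phase: {rel:.2f} rad")    # positive = leads the beat

With these values the oscillator settles at a small positive lead (about 0.3 rad, i.e., roughly 50 ms for a 1 s period); how the lead scales with interval and feedback amplitude is what the paper tested against behavioral data.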


Subjects
Acoustic Stimulation/methods; Auditory Perception/physiology; Time Perception/physiology; Activity Cycles; Anticipation, Psychological/physiology; Biobehavioral Sciences; Computer Simulation; Feedback; Feedback, Sensory/physiology; Humans; Music; Periodicity; Psychomotor Performance
5.
Front Neurosci. 2019;13:1088.
Article in English | MEDLINE | ID: mdl-31680824

ABSTRACT

Recent work on interpersonal coordination has revealed that neural oscillations, occurring spontaneously in the human brain, are modulated during the sensory, motor, and cognitive processes involved in interpersonal interactions. In particular, alpha-band (8-12 Hz) activity, linked to attention in general, is related to coordination dynamics and empathy traits. Researchers have also identified an association between each individual's attentiveness to their co-actor and the relative similarity of the co-actors' roles, which influences their behavioral synchronization patterns. We used music ensemble performance to evaluate patterns of behavioral and neural activity when roles between co-performers were systematically varied with complete counterbalancing. Specifically, we designed a piano duet task with three types of co-actor dissimilarity, or asymmetry: (1) musical role (starting vs. joining), (2) musical task similarity (similar vs. dissimilar melodic parts), and (3) performer animacy (human-to-human vs. human-to-non-adaptive computer). We examined how the experience of these asymmetries in four initial musical phrases, played in alternation by the co-performers, influenced the pianists' performance of a subsequent unison phrase. Electroencephalography was recorded simultaneously from both performers while they played keyboards. We evaluated note-onset timing and alpha modulation around the unison phrase, and investigated whether each individual's self-reported empathy was related to behavioral and neural activity. Our findings revealed closer behavioral synchronization when pianists played with a human rather than a computer partner, likely because the computer was non-adaptive. When performers played with a human partner, or a joining performer played with a computer partner, having a similar vs. dissimilar musical part had no significant effect on alpha modulation immediately prior to the unison. However, when starting performers played with a computer partner with a dissimilar rather than a similar part, there was significantly greater alpha synchronization; in other words, starting players attended less to a computer partner playing a similar accompaniment, operating in a solo-like mode. Moreover, this alpha difference based on melodic similarity was related to a difference in note-onset adaptivity, which was in turn correlated with performer trait empathy. Collectively, our results extend previous findings by showing that musical ensemble performance gives rise to a socialized context whose lasting effects encompass attentiveness, perceptual-motor coordination, and empathy.
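
The alpha measure referred to here can be quantified in a standard way: band-pass the EEG around 8-12 Hz, take the analytic amplitude via the Hilbert transform, and average the power in a window of interest. A minimal sketch under those generic assumptions (the filter order, window, and sampling rate below are illustrative, not the study's exact pipeline):

    import numpy as np
    from scipy.signal import butter, filtfilt, hilbert

    def alpha_power(eeg, fs, lo=8.0, hi=12.0):
        """Instantaneous alpha-band power of a 1-D EEG signal sampled at fs Hz."""
        b, a = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype="band")
        return np.abs(hilbert(filtfilt(b, a, eeg))) ** 2

    fs = 500
    eeg = np.random.randn(fs * 10)    # stand-in for a real recording
    onset = 8 * fs                    # hypothetical unison-phrase onset sample
    print(alpha_power(eeg, fs)[onset - fs:onset].mean())  # power in the prior second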

6.
Soc Neurosci. 2019 Aug;14(4):449-461.
Article in English | MEDLINE | ID: mdl-29938589

ABSTRACT

During joint action tasks, expectations for the outcomes of one's own and one's partner's actions are collectively monitored. Recent evidence suggests that trait empathy levels may also influence performance-monitoring processes. The present study investigated how outcome expectation and empathy interact during a turn-taking piano duet task, using simultaneous electroencephalography (EEG) recording. During the performances, one note in each player's part was altered in pitch to elicit the feedback-related negativity (FRN) and the subsequent P3 complex. Pianists memorized and performed pieces containing either a similar or a dissimilar sequence to their partner's. In additional blocks, pianists also played both sequence types with an audio-only computer partner. The FRN and P3a were larger in response to self than to other, while the P3b occurred only in response to self, suggesting greater online monitoring of self- compared to other-produced actions during turn-taking joint action. The P3a was larger when pianists played a similar sequence to their partner's. Finally, as trait empathy increased, the FRN in response to self decreased; this association was absent for the FRN in response to other. This may reflect a strategy by which highly empathic musicians suppress an exclusive focus on self-monitoring during joint performance.
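
For readers unfamiliar with these components: an ERP such as the FRN is typically measured by averaging EEG epochs time-locked to the eliciting events (here, the pitch-altered notes), baseline-correcting, and taking the mean amplitude in a component-specific window. A minimal sketch with generic choices (epoch bounds, a 230-320 ms window) that are illustrative rather than the study's exact parameters:

    import numpy as np

    def erp_amplitude(eeg, events, fs, t_pre=0.2, t_post=0.6, win=(0.23, 0.32)):
        """Mean ERP amplitude in `win` (s) for epochs time-locked to `events`."""
        pre, post = int(t_pre * fs), int(t_post * fs)
        epochs = np.stack([eeg[e - pre:e + post] for e in events])
        epochs -= epochs[:, :pre].mean(axis=1, keepdims=True)  # baseline-correct
        erp = epochs.mean(axis=0)                              # average over trials
        i0, i1 = pre + int(win[0] * fs), pre + int(win[1] * fs)
        return erp[i0:i1].mean()

    fs = 500
    eeg = np.random.randn(fs * 120)           # stand-in for one EEG channel
    events = fs + np.arange(100) * fs // 2    # hypothetical altered-note samples
    print(erp_amplitude(eeg, events, fs))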


Subjects
Acoustic Stimulation/methods; Cooperative Behavior; Electroencephalography/methods; Event-Related Potentials, P300/physiology; Music/psychology; Psychomotor Performance/physiology; Adult; Female; Humans; Male; Young Adult