Results 1 - 20 of 65
1.
Front Psychol ; 15: 1327992, 2024.
Article in English | MEDLINE | ID: mdl-38515976

ABSTRACT

In this perspective paper, we explore the use of haptic feedback to enhance human-human interaction during musical tasks. We start by providing an overview of the theoretical foundation that underpins our approach, which is rooted in the embodied music cognition framework, and by briefly presenting the concepts of action-perception loop, sensorimotor coupling and entrainment. Thereafter, we focus on the role of haptic information in music playing and we discuss the use of wearable technologies, namely lightweight exoskeletons, for the exchange of haptic information between humans. We present two experimental scenarios in which the effectiveness of this technology for enhancing musical interaction and learning might be validated. Finally, we briefly discuss some of the theoretical and pedagogical implications of the use of technologies for haptic communication in musical contexts, while also addressing the potential barriers to the widespread adoption of exoskeletons in such contexts.

3.
Sensors (Basel) ; 23(23)2023 Dec 03.
Article in English | MEDLINE | ID: mdl-38067961

ABSTRACT

Within the broader context of improving interactions between artificial intelligence and humans, the question has arisen regarding whether auditory and rhythmic support could increase attention for visual stimuli that do not stand out clearly from an information stream. To this end, we designed an experiment inspired by pip-and-pop but more appropriate for eliciting attention and P3a event-related potentials (ERPs). In this study, the aim was to distinguish between targets and distractors based on the subject's electroencephalography (EEG) data. We achieved this objective by employing different machine learning (ML) methods for both individual-subject (IS) and cross-subject (CS) models. Finally, we investigated which EEG channels and time points were used by the model to make its predictions, using saliency maps. We were able to successfully perform the aforementioned classification task for both the IS and CS scenarios, reaching classification accuracies of up to 76%. In accordance with the literature, the model primarily used the parietal-occipital electrodes between 200 ms and 300 ms after the stimulus to make its prediction. The findings from this research contribute to the development of more effective P300-based brain-computer interfaces. Furthermore, they validate the EEG data collected in our experiment.
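The abstract does not give implementation details, so the following is only a minimal sketch of individual-subject (IS) target-versus-distractor classification from epoched EEG, assuming epochs shaped (trials, channels, time points) and scikit-learn as the ML backend; channel/time importance is approximated here by the absolute weights of a linear classifier rather than the saliency-map method used in the study.

```python
# Minimal sketch of individual-subject (IS) target-vs-distractor EEG classification.
# Assumptions (not from the paper): epochs X with shape (n_trials, n_channels, n_times),
# labels y (1 = target, 0 = distractor), and scikit-learn as the ML backend.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

def classify_subject(X, y, n_folds=5):
    n_trials, n_channels, n_times = X.shape
    X_flat = X.reshape(n_trials, n_channels * n_times)   # flatten channel x time features
    clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
    acc = cross_val_score(clf, X_flat, y, cv=n_folds).mean()
    # Rough proxy for "where the model looks": absolute weights per channel/time point
    clf.fit(X_flat, y)
    weights = np.abs(clf[-1].coef_).reshape(n_channels, n_times)
    return acc, weights
```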


Subject(s)
Artificial Intelligence, Electroencephalography, Humans, Acoustic Stimulation, Attention, P300 Event-Related Potentials, Evoked Potentials
4.
iScience ; 26(11): 108099, 2023 Nov 17.
Article in English | MEDLINE | ID: mdl-37920667

ABSTRACT

Humans exhibit a strong tendency to synchronize movements with each other, with visual perspective potentially influencing interpersonal synchronization. By manipulating the visual scenes of participants engaged in a joint finger-tapping task, we examined the effects of 1st person and 2nd person visual perspectives on their coordination dynamics. We hypothesized that perceiving the partner's movements from their 1st person perspective would enhance spontaneous interpersonal synchronization, potentially mediated by the embodiment of the partner's hand. We observed significant differences in attractor dynamics across visual perspectives. Specifically, participants in 1st person coupling were unable to maintain de-coupled trajectories as effectively as in 2nd person coupling. Our findings suggest that visual perspective influences coordination dynamics in dyadic interactions, engaging error-correction mechanisms in individual brains as they integrate the partner's hand into their body representation. Our results have the potential to inform the development of applications for motor training and rehabilitation.
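The coordination-dynamics analysis is not described in enough detail to reproduce, so the sketch below shows only a common first step under stated assumptions: computing the continuous relative phase between two tappers from their tap onset times, with sampling rate and linear phase interpolation as illustrative choices, not the attractor-dynamics method of the study.

```python
# Hedged sketch: relative phase between two tappers from tap onset times (seconds).
# Each tap cycle is assigned 2*pi of phase; phase is interpolated linearly between taps.
import numpy as np

def tap_phase(tap_times, t):
    """Instantaneous phase (radians) at query times t, linear between successive taps."""
    cycle_idx = np.searchsorted(tap_times, t, side="right") - 1
    cycle_idx = np.clip(cycle_idx, 0, len(tap_times) - 2)
    t0, t1 = tap_times[cycle_idx], tap_times[cycle_idx + 1]
    return 2 * np.pi * (cycle_idx + (t - t0) / (t1 - t0))

def relative_phase(taps_a, taps_b, fs=100.0):
    """Time vector and wrapped phase difference between the two tappers."""
    t = np.arange(max(taps_a[0], taps_b[0]), min(taps_a[-1], taps_b[-1]), 1.0 / fs)
    dphi = tap_phase(np.asarray(taps_a), t) - tap_phase(np.asarray(taps_b), t)
    return t, np.angle(np.exp(1j * dphi))   # wrap to (-pi, pi]
```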

5.
Sci Rep ; 13(1): 21064, 2023 11 29.
Article in English | MEDLINE | ID: mdl-38030693

ABSTRACT

Sensorimotor synchronization strategies have frequently been used for gait rehabilitation in different neurological populations. Despite these positive effects on gait, the attentional processes required to dynamically attend to the auditory stimuli need further elaboration. Here, we investigate auditory attention in neurological populations compared to healthy controls, as quantified by EEG recordings. The literature was systematically searched in the PubMed and Web of Science databases. Inclusion criteria were cross-sectional studies investigating auditory attention quantified by EEG recordings in neurological populations. In total, 35 studies were included, covering participants with Parkinson's disease (PD), stroke, Traumatic Brain Injury (TBI), Multiple Sclerosis (MS), and Amyotrophic Lateral Sclerosis (ALS). A meta-analysis was performed separately on P3 amplitude and latency to examine the differences between neurological populations and healthy controls. Overall, neurological populations showed impairments in auditory processing, in terms of magnitude and delay, compared to healthy controls. We recommend considering individual auditory processes, and thereafter selecting and/or designing the auditory structure, in sensorimotor synchronization paradigms for neurological physical rehabilitation.
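The abstract does not specify the meta-analytic model, so the following is a hedged sketch of one common choice: a random-effects pooling (DerSimonian-Laird) of standardized mean differences in P3 amplitude between a neurological group and healthy controls, with study-level means, SDs, and sample sizes as assumed inputs.

```python
# Hedged sketch: random-effects meta-analysis (DerSimonian-Laird) of standardized
# mean differences (Hedges' g) in P3 amplitude between patients and controls.
import numpy as np

def hedges_g(m1, sd1, n1, m2, sd2, n2):
    sp = np.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / sp
    j = 1 - 3 / (4 * (n1 + n2) - 9)            # small-sample correction
    g = j * d
    var_g = (n1 + n2) / (n1 * n2) + g**2 / (2 * (n1 + n2))
    return g, var_g

def random_effects(g, var_g):
    g, var_g = np.asarray(g), np.asarray(var_g)
    w = 1 / var_g                               # fixed-effect weights
    q = np.sum(w * (g - np.sum(w * g) / np.sum(w))**2)
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (q - (len(g) - 1)) / c)     # between-study variance
    w_star = 1 / (var_g + tau2)
    pooled = np.sum(w_star * g) / np.sum(w_star)
    se = np.sqrt(1 / np.sum(w_star))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)
```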


Subject(s)
Attention, Parkinson Disease, Humans, Cross-Sectional Studies, Gait, Electroencephalography
6.
Comput Human Behav ; 146: 107810, 2023 Sep.
Article in English | MEDLINE | ID: mdl-37663430

ABSTRACT

The acquisition of advanced gestures is a challenge in various domains of proficient sensorimotor performance. For example, orchestral violinists must move in sync with the lead violinist's gestures. To help train these gestures, an educational music play-back system was developed using an AR environment simulated on a HoloLens 2 and an avatar representation of the lead violinist. This study aimed to investigate the impact of using a 2D or 3D representation of the lead violinist's avatar on students' learning experience in the AR environment. To assess the learning outcome, the study employed a longitudinal experimental design in which eleven participants practiced two pieces of music in four trials, evenly spaced over a month. Participants were asked to mimic the avatar's bow-arm gestures as closely as possible, including bowing, articulations, and dynamics. The study compared the similarities between the avatar's gestures and those of the participants at the biomechanical level, using motion capture measurements, as well as the smoothness of the participants' movements. Additionally, presence and perceived difficulty were assessed using questionnaires. The results suggest that using a 3D representation of the avatar leads to better gesture resemblance and a stronger experience of presence compared to a 2D representation. The 2D representation, however, showed a learning effect that was not observed in the 3D condition. The findings suggest that the 3D condition benefits from stereoscopic information that enhances spatial cognition, making it more effective for sensorimotor performance. Overall, the 3D condition had a greater impact on performance than on learning. This work concludes with recommendations for future AR-based advanced gesture training, addressing challenges related to measurement methodology and participants' feedback on the AR application.
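The abstract mentions movement smoothness but not the specific metric, so the following is a hedged sketch of one widely used option, the velocity-based log dimensionless jerk, assuming a 1-D speed profile sampled at a fixed rate; the study may well have used a different measure.

```python
# Hedged sketch: log dimensionless jerk (velocity-based) as a movement-smoothness
# measure; more negative values indicate less smooth movement. Assumes a 1-D speed
# profile (m/s) sampled at fs Hz; not necessarily the metric used in the study.
import numpy as np

def log_dimensionless_jerk(speed, fs):
    speed = np.asarray(speed, dtype=float)
    dt = 1.0 / fs
    duration = len(speed) * dt
    jerk = np.gradient(np.gradient(speed, dt), dt)      # second derivative of speed
    dlj = (duration**3 / np.max(speed)**2) * np.trapz(jerk**2, dx=dt)
    return -np.log(dlj)
```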

7.
Neuroimage ; 277: 120226, 2023 08 15.
Article in English | MEDLINE | ID: mdl-37321359

ABSTRACT

Neural entrainment, defined as unidirectional synchronization of neural oscillations to an external rhythmic stimulus, is a topic of major interest in the field of neuroscience. Despite broad scientific consensus on its existence, on its pivotal role in sensory and motor processes, and on its fundamental definition, empirical research struggles to quantify it with non-invasive electrophysiology. To date, broadly adopted state-of-the-art methods still fail to capture the dynamics underlying the phenomenon. Here, we present event-related frequency adjustment (ERFA) as a methodological framework to induce and to measure neural entrainment in human participants, optimized for multivariate EEG datasets. By applying dynamic phase and tempo perturbations to isochronous auditory metronomes during a finger-tapping task, we analyzed adaptive changes in the instantaneous frequency of entrained oscillatory components during error correction. Spatial filter design allowed us to untangle, from the multivariate EEG signal, perceptual and sensorimotor oscillatory components attuned to the stimulation frequency. Both components dynamically adjusted their frequency in response to perturbations, tracking the stimulus dynamics by slowing down and speeding up the oscillation over time. Source separation revealed that sensorimotor processing enhanced the entrained response, supporting the notion that the active engagement of the motor system plays a critical role in processing rhythmic stimuli. In the case of phase shifts, motor engagement was a necessary condition to observe any response, whereas sustained tempo changes induced frequency adjustment even in the perceptual oscillatory component. Although the magnitude of the perturbations was controlled across positive and negative directions, we observed a general bias in the frequency adjustments towards positive changes, which points to the effect of intrinsic dynamics constraining neural entrainment. We conclude that our findings provide compelling evidence for neural entrainment as a mechanism underlying overt sensorimotor synchronization, and highlight that our methodology offers a paradigm and a measure for quantifying its oscillatory dynamics by means of non-invasive electrophysiology, rigorously informed by the fundamental definition of entrainment.
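The ERFA pipeline is not spelled out here beyond spatial filtering and instantaneous-frequency tracking, so the following is only a hedged sketch of the last step: estimating the instantaneous frequency of a single, already spatially filtered oscillatory component around the stimulation rate via the Hilbert analytic signal. The band-pass settings are assumptions.

```python
# Hedged sketch: instantaneous frequency of one (already spatially filtered) EEG
# component, to track slowing down / speeding up around a stimulation frequency.
# Assumes a 1-D component time series and SciPy; the spatial-filter step is omitted.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def instantaneous_frequency(component, fs, f_stim, half_bw=1.0):
    # Narrow band-pass around the stimulation frequency before the Hilbert transform
    low, high = (f_stim - half_bw) / (fs / 2), (f_stim + half_bw) / (fs / 2)
    b, a = butter(4, [low, high], btype="band")
    narrow = filtfilt(b, a, component)
    phase = np.unwrap(np.angle(hilbert(narrow)))
    return np.gradient(phase) * fs / (2 * np.pi)        # instantaneous frequency in Hz
```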


Subject(s)
Electroencephalography, Periodicity, Humans, Acoustic Stimulation/methods
8.
NPJ Sci Learn ; 8(1): 22, 2023 Jun 27.
Article in English | MEDLINE | ID: mdl-37369691

ABSTRACT

Music performance requires high levels of motor control. Professional musicians use body movements not only to support technical efficiency, but also to shape expressive interpretation. Here, we recorded motion and audio data of twenty participants performing four musical fragments varying in the degree of technical difficulty, to analyze how knee flexion is employed by expert saxophone players. Using a computational model of the auditory periphery, we extracted emergent acoustical properties of sound to infer critical cognitive patterns of music processing and relate them to the motion data. Results showed that knee flexion is causally linked to tone expectations and correlated with rhythmical density, suggesting that this gesture is associated with expressive and facilitative purposes. Furthermore, when instructed to play immobile, participants tended to microflex (>1 Hz) more frequently than when playing expressively, possibly indicating a natural urge to move to the music. These results underline the robustness of body movement in musical performance, providing valuable insights for the understanding of communicative processes and the development of motor learning cues.

9.
Eur J Neurosci ; 2023 Apr 28.
Article in English | MEDLINE | ID: mdl-37118877

ABSTRACT

Pupil size covaries with the diffusion rate of the cholinergic and noradrenergic neurons throughout the brain, which are essential to arousal. Recent findings suggest that slow pupil fluctuations during locomotion are an index of sustained activity in cholinergic axons, whereas phasic dilations are related to the activity of noradrenergic axons. Here, we investigated movement-induced arousal (i.e., by singing and swaying to music), hypothesising that actively engaging in musical behaviour would provoke stronger emotional engagement in participants and lead to different qualitative patterns of tonic and phasic pupil activity. A challenge in the analysis of pupil data is the turbulent behaviour of pupil diameter due to exogenous ocular activity commonly encountered during motor tasks, and the high variability typically found between individuals. To address this, we developed an algorithm that adaptively estimates and removes pupil responses to ocular events, as well as a functional data methodology, derived from Pfaff's generalised arousal, that provides a new statistical dimension on how pupil data can be interpreted according to putative neuromodulatory signalling. We found that actively engaging in singing enhanced slow cholinergic-related pupil dilations, and that having the opportunity to move one's body while performing amplified the effect of singing on pupil activity. Phasic pupil oscillations during motor execution attenuated over time, which is often interpreted as a measure of sense of agency over movement.
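The paper's adaptive ocular-event removal and functional-data method cannot be reproduced from the abstract; the sketch below is only a much simpler, hedged illustration of separating slow (tonic) and fast (phasic) pupil components after interpolating over blinks, assuming a pupil-diameter trace with NaNs marking blink samples and an illustrative 0.1 Hz cut-off.

```python
# Hedged, simplified sketch (NOT the paper's adaptive algorithm): interpolate over
# blink samples (NaNs), then split pupil diameter into a slow "tonic" trend and a
# faster "phasic" residual with a low-pass filter. The 0.1 Hz cut-off is an assumption.
import numpy as np
from scipy.signal import butter, filtfilt

def tonic_phasic(pupil, fs, cutoff_hz=0.1):
    pupil = np.asarray(pupil, dtype=float)
    t = np.arange(len(pupil))
    ok = ~np.isnan(pupil)
    clean = np.interp(t, t[ok], pupil[ok])                # fill blink gaps
    b, a = butter(2, cutoff_hz / (fs / 2), btype="low")
    tonic = filtfilt(b, a, clean)                          # slow fluctuations
    phasic = clean - tonic                                 # fast dilations
    return tonic, phasic
```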

10.
PLoS One ; 18(4): e0284387, 2023.
Article in English | MEDLINE | ID: mdl-37071622

ABSTRACT

Several studies have addressed motor coordination in dance, but few have addressed the influence of musical context on micro-timing during sensorimotor synchronization (SMS) in classical ballet. In this study, we analyze the Promenade in Arabesque of the Odile variations, first as a dance-music fragment not embedded in a musical context, then as a dance-music fragment embedded in a musical context at two different instances. Given the musical structure of the fragments, there are repeats of patterns between and within the fragments. Four dancers were invited to perform the three fragments in twelve successive performances. The beats of the music were extracted and compared with the timing of the dancers' heel movements, using circular-linear smooth regression modelling and circular statistics. The results reveal an effect of repeat within fragments, and an effect of musical context between fragments, on micro-timing anticipation in SMS. The methodology offers a framework for future work on dynamical aspects of SMS.
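The circular-linear regression model itself is not reproduced here; the following is only a hedged sketch of the circular part of such an analysis, converting heel-movement event times into phase angles relative to the enclosing beat interval and summarizing anticipation with the circular mean and resultant vector length.

```python
# Hedged sketch: micro-timing of heel events relative to musical beats, expressed as
# circular phase angles. A negative mean angle indicates anticipation of the next beat.
import numpy as np

def beat_relative_phase(event_times, beat_times):
    beat_times = np.asarray(beat_times)
    phases = []
    for e in np.asarray(event_times):
        i = np.searchsorted(beat_times, e) - 1
        if 0 <= i < len(beat_times) - 1:
            ibi = beat_times[i + 1] - beat_times[i]                 # inter-beat interval
            phi = 2 * np.pi * (e - beat_times[i]) / ibi
            phases.append(np.angle(np.exp(1j * phi)))               # wrap to (-pi, pi]
    return np.asarray(phases)

def circular_summary(phases):
    r_vec = np.mean(np.exp(1j * phases))
    return np.angle(r_vec), np.abs(r_vec)    # mean phase (rad), resultant length in [0, 1]
```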


Subject(s)
Dancing, Music, Movement
11.
Neuroimage ; 257: 119326, 2022 08 15.
Article in English | MEDLINE | ID: mdl-35667334

ABSTRACT

Across a broad spectrum of interactions, humans exhibit a prominent tendency to synchronize their movements with one another. Traditionally, this phenomenon has been explained from the perspectives of predictive coding or dynamical systems theory. While these theories diverge with respect to whether individuals hold internal models of each other, they both assume a predictive or anticipatory mechanism enabling rhythmic interactions. However, the neural bases underpinning interpersonal synchronization are still a subject under active investigation. Here we provide evidence that the brain relies on a common oscillatory mechanism to pace self-generated rhythmic movements and to track the movements produced by a partner. By performing dual-electroencephalography recordings during a joint finger-tapping task, we identified an oscillatory component in the beta range (∼ 20 Hz), which was significantly modulated by both self-generated and other-generated movement. In conditions where the partners perceived each other, we observed periodic fluctuations of beta power as a function of the reciprocal movement cycles. Crucially, this modulation occurred both in visually and in auditorily coupled conditions, and was accompanied by recurrent periods of dyadic synchronized behavior. Our results show that periodic beta power modulations may be a critical mechanism underlying interpersonal synchronization, possibly enabling mutual predictions between coupled individuals, leading to co-regulation of timing and overt mutual adaptation. Our findings thus provide a potential bridge between influential theories attempting to explain interpersonal coordination, and a concrete connection to its neurophysiological bases.
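The dual-EEG source separation used in the study is not reproducible from the abstract; the following is only a hedged sketch of extracting movement-locked beta-band (~15-25 Hz) power from a single EEG channel or component, with the band limits and ±0.5 s window as illustrative assumptions.

```python
# Hedged sketch: beta-band (~15-25 Hz) power around tap onsets from a single EEG
# channel/component, to visualize power modulation across the movement cycle.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def beta_power_around_taps(eeg, fs, tap_times, win=(-0.5, 0.5)):
    b, a = butter(4, [15 / (fs / 2), 25 / (fs / 2)], btype="band")
    power = np.abs(hilbert(filtfilt(b, a, eeg))) ** 2      # beta envelope power
    n0, n1 = int(win[0] * fs), int(win[1] * fs)
    epochs = [power[int(t * fs) + n0: int(t * fs) + n1]
              for t in tap_times
              if int(t * fs) + n0 >= 0 and int(t * fs) + n1 <= len(power)]
    return np.mean(epochs, axis=0)                          # average time course around taps
```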


Subject(s)
Electroencephalography, Movement, Brain/physiology, Brain Mapping, Humans, Movement/physiology
12.
Front Psychol ; 13: 894366, 2022.
Article in English | MEDLINE | ID: mdl-35756201

ABSTRACT

In a century in which humans and machines, powered by artificial intelligence or not, increasingly work together, it is of interest to understand human processing of multi-sensory stimuli in relation to attention and working memory. This paper explores whether and when supporting visual information with rhythmic auditory stimuli can optimize multi-sensory information processing. In turn, this can make the interaction between humans, or between machines and humans, more engaging, rewarding and activating. For this purpose, a novel working memory paradigm was developed in which participants are presented with a series of five target digits randomly interspersed with five distractor digits. Their goal is to remember the target digits and recall them orally. Depending on the condition, support is provided by audio and/or rhythm. It is expected that the sound will lead to better performance, that this effect will differ between rhythmic and non-rhythmic sound, and that some variability will occur across participants. The experimental data were analyzed with classical statistics, and predictive models were also developed to predict outcomes from a range of input variables related to the experiment and the participant. The effect of auditory support was confirmed, but no difference was observed between rhythmic and non-rhythmic sounds. Overall performance was indeed affected by individual differences, such as visual dominance or perceived task difficulty. Surprisingly, a music education did not significantly affect performance and even tended toward a negative effect. To better understand the underlying processes of attention, brain activation data, e.g., by means of electroencephalography (EEG), should also be recorded; this is left for future work.

13.
Ann N Y Acad Sci ; 1513(1): 153-169, 2022 07.
Article in English | MEDLINE | ID: mdl-35437776

ABSTRACT

Given the prevalence of motor and cognitive impairments in persons with multiple sclerosis (PwMS), we proposed that the theoretical framework of embodiment could provide a rehabilitation avenue to train motor and cognitive functions as one functional unit. PwMS (n = 31) and age- and gender-matched healthy controls (n = 30) underwent an embodied learning protocol, which involved learning a cognitive sequence while performing it through bodily stepping movements under three feedback conditions (melody, sound, and visual). Cognitive and movement performance was assessed by a delayed recall 15 min after the embodied learning protocol. Half of the participants correctly recalled the sequence in all three conditions, while 70% of healthy controls achieved correct recall in the melody condition. Balance impairment predicted the speed of executing the sequence irrespective of learning, most apparent in the melody condition. Information processing speed predicted the speed of executing the sequence in the melody and sound conditions, both between participants and over time. Those who learned the sequence performed it faster in the melody condition only and, overall, became faster over time. We propose how embodied learning could expand the current context of rehabilitation of cognitive and motor control in PwMS.


Subject(s)
Multiple Sclerosis, Cognition, Sensory Feedback, Humans, Learning, Movement
14.
J Sports Sci ; 40(7): 808-820, 2022 Apr.
Article in English | MEDLINE | ID: mdl-35172692

ABSTRACT

This study assessed centre of pressure (COP) behaviour and its relationship with impact severity during heel-toe running in conventional athletic footwear. We hypothesized that the COP behaviour depends on its location at foot strike, which would be associated with the vertical loading rate and peak tibial accelerations in heel-toe running. Ground reaction force and tibial acceleration were measured in 104 distance runners running level at ~3.2 m/s. High-speed plantar pressure captured at high temporal resolution (500 Hz) and spatial resolution (7.62 × 5.08 mm per sensor) allowed for localization of the COP directly in the footprint during running in self-selected athletic footwear. More lateral X-coordinates of the COP at first foot contact were, in general, associated with more anterior Y-coordinates (adj. R² = 0.609). In heel-toe running, a more anterior foot strike had a greater refined strike index, which was associated with a quicker roll-over in the rearfoot zone. This strike index contributed to greater maximum vertical loading rates (R² = 0.121), and greater axial (R² = 0.047) and resultant (R² = 0.247) peak tibial accelerations. These findings indicate that (1) the COP progression is dependent on the COP location at foot strike; and (2) more anterior rearfoot strikes are more likely to have greater impact severity than posterior rearfoot strikes.


Subject(s)
Heel, Running, Biomechanical Phenomena, Foot, Gait, Humans, Toes
15.
Scand J Med Sci Sports ; 32(4): 698-709, 2022 Apr.
Article in English | MEDLINE | ID: mdl-34982842

ABSTRACT

BACKGROUND: Running retraining with the use of biofeedback on an impact measure has previously been executed or evaluated in the biomechanics laboratory. Here, the execution and evaluation of feedback-driven retraining are taken out of the laboratory. PURPOSE: To determine whether biofeedback can reduce peak tibial acceleration, with or without affecting the running cadence, in a 3-week retraining protocol. STUDY DESIGN: Quasi-randomized controlled trial. METHODS: Twenty runners with high peak tibial acceleration were allocated to either the retraining (n = 10, 32.1 ± 7.8 years, 10.9 ± 2.8 g) or control (n = 10, 39.1 ± 10.4 years, 13.0 ± 3.9 g) group. They performed six running sessions in an athletic training environment. A body-worn system collected axial tibial acceleration and provided real-time feedback. The retraining group received music-based biofeedback in a faded feedback scheme: pink noise was superimposed on tempo-synchronized music when the peak tibial acceleration was ≥70% of the runner's baseline. The control group received tempo-synchronized music, which acted as a placebo for blinding purposes. Speed feedback was provided to obtain a stable running speed of ~2.9 m·s⁻¹. Peak tibial acceleration and running cadence were evaluated. RESULTS: A significant group-by-feedback interaction effect was detected for peak tibial acceleration. The experimental group showed a decrease in peak tibial acceleration of 25.5% (mean: 10.9 ± 2.8 g versus 8.1 ± 3.9 g, p = 0.008, d = 1.08, mean difference = 2.77 [0.94, 4.61]) without changing the running cadence. The control group showed no statistically significant change in peak tibial acceleration or running cadence. CONCLUSION: The retraining protocol was effective at reducing peak tibial acceleration in high-impact runners by reacting to music-based biofeedback provided in real time via wearable technology in a training environment. This reduction magnitude may have a meaningful influence on injury risk.
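A minimal sketch of the feedback rule stated in the abstract: the music is distorted with pink noise whenever a step's peak axial tibial acceleration reaches at least 70% of the runner's baseline peak. Step segmentation and audio handling are simplified placeholders, not the wearable system's actual implementation.

```python
# Hedged sketch of the described feedback rule: superimpose pink noise on the
# tempo-synchronized music when a step's peak axial tibial acceleration is >= 70%
# of the runner's baseline peak. Step detection and playback are out of scope here.
import numpy as np

class ImpactFeedback:
    def __init__(self, baseline_peak_g, threshold=0.70):
        self.limit = threshold * baseline_peak_g

    def step_peak(self, accel_segment_g):
        """Peak axial tibial acceleration (g) within one detected step."""
        return float(np.max(accel_segment_g))

    def needs_noise(self, accel_segment_g):
        """True -> superimpose pink noise on the music for this step."""
        return self.step_peak(accel_segment_g) >= self.limit

# Example: baseline 10.9 g -> limit = 0.7 * 10.9 ≈ 7.6 g. A step peaking at 8.0 g
# would trigger the noise; a step peaking at 7.0 g would leave the music clean.
```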


Subject(s)
Music, Acceleration, Biofeedback (Psychology), Biomechanical Phenomena, Gait, Humans, Tibia
16.
Mult Scler ; 28(3): 492-495, 2022 Mar.
Article in English | MEDLINE | ID: mdl-34726562

ABSTRACT

BACKGROUND: In a case report of a patient with progressive multiple sclerosis and cerebellar impairments, we reported that synchronisation of steps to beats was possible only at -12% of the usual walking cadence during 1 minute of walking. OBJECTIVES AND METHODS: Here, we investigate the effect of synchronisation using two different alignment approaches on the patient's gait pattern over 2 minutes of walking, compared to walking in silence. RESULTS AND CONCLUSION: This proof of concept showed that the adaptive approach was successful, resulting in an improved gait pattern compared to the other conditions and providing preliminary evidence to support a full-scale intervention study.


Subject(s)
Cerebellar Ataxia, Multiple Sclerosis, Music, Cerebellar Ataxia/etiology, Gait, Humans, Walking
17.
Sci Rep ; 11(1): 18355, 2021 09 15.
Article in English | MEDLINE | ID: mdl-34526522

ABSTRACT

Rhythmic joint coordination is ubiquitous in daily-life human activities. In order to coordinate their actions towards shared goals, individuals need to co-regulate their timing and move together at the collective level of behavior. Remarkably, basic forms of coordinated behavior tend to emerge spontaneously as long as two individuals are exposed to each other's rhythmic movements. The present study investigated the dynamics of spontaneous dyadic entrainment, and more specifically how they depend on the sensory modalities mediating informational coupling. By means of a novel interactive paradigm, we showed that dyadic entrainment systematically takes place during a minimalistic rhythmic task despite explicit instructions to ignore the partner. Crucially, the interaction was organized by clear dynamics in a modality-dependent fashion. Our results showed highly consistent coordination patterns in visually-mediated entrainment, whereas we observed more chaotic and more variable profiles in the auditorily-mediated counterpart. The proposed experimental paradigm yields empirical evidence for the overwhelming tendency of dyads to behave as coupled rhythmic units. In the context of our experimental design, it showed that coordination dynamics differ according to availability and nature of perceptual information. Interventions aimed at rehabilitating, teaching or training sensorimotor functions can be ultimately informed and optimized by such fundamental knowledge.

18.
Front Neurosci ; 15: 667838, 2021.
Article in English | MEDLINE | ID: mdl-34335155

ABSTRACT

Life and social sciences often focus on the social nature of music (and language alike). In biology, for example, the three main evolutionary hypotheses about music (i.e., sexual selection, parent-infant bond, and group cohesion) stress its intrinsically social character (Honing et al., 2015). Neurobiology thereby has investigated the neuronal and hormonal underpinnings of musicality for more than two decades (Chanda and Levitin, 2013; Salimpoor et al., 2015; Mehr et al., 2019). In line with these approaches, the present paper aims to suggest that the proper way to capture the social interactive nature of music (and, before it, musicality), is to conceive of it as an embodied language, rooted in culturally adapted brain structures (Clarke et al., 2015; D'Ausilio et al., 2015). This proposal heeds Ian Cross' call for an investigation of music as an "interactive communicative process" rather than "a manifestation of patterns in sound" (Cross, 2014), with an emphasis on its embodied and predictive (coding) aspects (Clark, 2016; Leman, 2016; Koelsch et al., 2019). In the present paper our goal is: (i) to propose a framework of music as embodied language based on a review of the major concepts that define joint musical action, with a particular emphasis on embodied music cognition and predictive processing, along with some relevant neural underpinnings; (ii) to summarize three experiments conducted in our laboratories (and recently published), which provide evidence for, and can be interpreted according to, the new conceptual framework. In doing so, we draw on both cognitive musicology and neuroscience to outline a comprehensive framework of musical interaction, exploring several aspects of making music in dyads, from a very basic proto-musical action, like tapping, to more sophisticated contexts, like playing a jazz standard and singing a hocket melody. Our framework combines embodied and predictive features, revolving around the concept of joint agency (Pacherie, 2012; Keller et al., 2016; Bolt and Loehr, 2017). If social interaction is the "default mode" by which human brains communicate with their environment (Hari et al., 2015), music and musicality conceived of as an embodied language may arguably provide a route toward its navigation.

19.
Front Hum Neurosci ; 15: 668918, 2021.
Article in English | MEDLINE | ID: mdl-34177492

ABSTRACT

Understanding rhythmic behavior in the context of coupled auditory and motor systems has been of interest to neurological rehabilitation, in particular to facilitate walking. Recent work based on behavioral measures revealed an entrainment effect of auditory rhythms on motor rhythms. In this study, we propose a method to compute the neural component of such a process from an electroencephalographic (EEG) signal. A simple auditory-motor synchronization paradigm was used, in which 28 healthy participants were instructed to synchronize their finger-tapping with a metronome. The computation of the neural outcome measure was carried out in two blocks. In the first block, we used Generalized Eigendecomposition (GED) to reduce the data dimensionality to the component that maximally entrained to the metronome frequency. The scalp topography pointed to brain activity over contralateral sensorimotor regions. In the second block, we computed the instantaneous frequency from the analytic signal of the extracted component. This returned a time-varying measure of frequency fluctuations, whose standard deviation provided our "stability index" as a neural outcome measure of auditory-motor coupling. Finally, the proposed neural measure was validated by conducting a correlation analysis with a set of behavioral outcomes from the synchronization task: resultant vector length, relative phase angle, mean asynchrony, and tempo matching. Significant moderate negative correlations were found with the first three measures, suggesting that the stability index provided a quantifiable neural outcome measure of entrainment, with selectivity towards phase-correction mechanisms. We address further adoption of the proposed approach, especially with populations in which sensorimotor abilities are compromised by an underlying pathological condition. The stability index can potentially be used as an outcome measure to assess rehabilitation protocols, and possibly provide further insight into neuropathological models of auditory-motor coupling.
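A hedged sketch of the two computational blocks named in the abstract: a generalized eigendecomposition contrasting narrow-band and broadband covariance to obtain the component most entrained to the metronome frequency, followed by the standard deviation of its instantaneous frequency as a "stability index". Filter settings and regularization are assumptions, not the authors' exact parameters.

```python
# Hedged sketch of the two analysis blocks named in the abstract:
# (1) GED spatial filter emphasizing activity at the metronome frequency,
# (2) stability index = SD of the instantaneous frequency of that component.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert
from scipy.linalg import eigh

def _bandpass(x, fs, f_stim, half_bw, axis=-1):
    b, a = butter(4, [(f_stim - half_bw) / (fs / 2), (f_stim + half_bw) / (fs / 2)], btype="band")
    return filtfilt(b, a, x, axis=axis)

def ged_component(eeg, fs, f_stim, half_bw=1.0):
    """eeg: (n_channels, n_samples). Returns the maximally entrained component."""
    narrow = _bandpass(eeg, fs, f_stim, half_bw, axis=1)
    S = np.cov(narrow)                                   # covariance at the stimulation frequency
    R = np.cov(eeg) + 1e-6 * np.eye(eeg.shape[0])        # broadband reference (regularized)
    evals, evecs = eigh(S, R)                            # generalized eigendecomposition
    w = evecs[:, np.argmax(evals)]                       # spatial filter with max S-to-R ratio
    return w @ eeg

def stability_index(component, fs, f_stim, half_bw=1.0):
    phase = np.unwrap(np.angle(hilbert(_bandpass(component, fs, f_stim, half_bw))))
    inst_freq = np.gradient(phase) * fs / (2 * np.pi)
    return float(np.std(inst_freq))                      # lower = more stable coupling
```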

20.
Front Psychol ; 12: 647929, 2021.
Article in English | MEDLINE | ID: mdl-34108911

ABSTRACT

Musical life became disrupted in 2020 due to the COVID-19 pandemic. Many musicians and venues turned to online alternatives, such as livestreaming. In this study, three livestreamed concerts were organized to examine separate yet interconnected concepts (agency, presence, and social context) and to ascertain which components of livestreamed concerts facilitate social connectedness. Hierarchical Bayesian modeling was conducted on 83 complete responses to examine the effects of the manipulations on feelings of social connectedness with the artist and the audience. Results showed that in concert 1, where half of the participants were allowed to vote for the final song to be played, this option did not result in the experience of more agency. Instead, if their preferred song was played (regardless of voting ability), participants experienced greater connectedness to the artist. In concert 2, participants who attended the concert with virtual reality headsets experienced greater feelings of physical presence, as well as greater feelings of connectedness with the artist, than those who viewed a normal YouTube livestream. In concert 3, attendance through Zoom led to a greater experience of social presence, but predicted less connectedness with the artist, compared to a normal YouTube livestream. Crucially, a greater negative impact of COVID-19 (e.g., loneliness) predicted feelings of connectedness with the artist, possibly because participants fulfilled their social needs with this parasocial interaction. Examining data from all concerts suggested that physical presence was a predictor of connectedness with both the artist and the audience, while social presence only predicted connectedness with the audience. Correlational analyses revealed that reductions in loneliness and isolation were associated with feelings of shared agency, physical and social presence, and connectedness to the audience. Overall, the findings suggest that in order to reduce feelings of loneliness and increase connectedness, concert organizers and musicians could tune elements of their livestreams to facilitate feelings of physical and social presence.
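The authors' exact model specification is not given in the abstract, so the following is only a hedged sketch of one plausible hierarchical Bayesian regression in PyMC: connectedness-with-the-artist ratings regressed on physical and social presence with a varying intercept per concert. Variable names and priors are illustrative assumptions.

```python
# Hedged sketch (not the authors' model): hierarchical Bayesian regression of
# artist-connectedness ratings on physical and social presence, with a varying
# intercept per concert. Priors and variable names are illustrative assumptions.
import numpy as np
import pymc as pm

def fit_model(connectedness, physical_presence, social_presence, concert_idx, n_concerts=3):
    with pm.Model():
        mu_a = pm.Normal("mu_a", mu=0.0, sigma=1.0)
        sigma_a = pm.HalfNormal("sigma_a", sigma=1.0)
        a = pm.Normal("a", mu=mu_a, sigma=sigma_a, shape=n_concerts)   # concert intercepts
        b_phys = pm.Normal("b_phys", mu=0.0, sigma=1.0)
        b_soc = pm.Normal("b_soc", mu=0.0, sigma=1.0)
        sigma = pm.HalfNormal("sigma", sigma=1.0)
        mu = a[concert_idx] + b_phys * physical_presence + b_soc * social_presence
        pm.Normal("y", mu=mu, sigma=sigma, observed=connectedness)
        trace = pm.sample(1000, tune=1000, target_accept=0.9)
    return trace
```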
