Results 1 - 8 of 8
1.
Behav Res Methods; 2023 Aug 22.
Article in English | MEDLINE | ID: mdl-37608235

ABSTRACT

Eye tracking is prevalent in scientific and commercial applications. Recent computer vision and deep learning methods enable eye tracking with off-the-shelf webcams and reduce dependence on expensive, restrictive hardware. However, such deep learning methods have not yet been applied and evaluated for remote, online psychological experiments. In this study, we tackle critical challenges faced in remote eye tracking setups and systematically evaluate appearance-based deep learning methods of gaze tracking and blink detection. From their own homes and laptops, 65 participants performed a battery of eye tracking tasks including (i) fixation, (ii) zone classification, (iii) free viewing, (iv) smooth pursuit, and (v) blink detection. Webcam recordings of the participants performing these tasks were processed offline through appearance-based models of gaze and blink detection. The task battery required different eye movements that characterized gaze and blink prediction accuracy over a comprehensive list of measures. We find the best gaze accuracy to be 2.4° and precision of 0.47°, which outperforms previous online eye tracking studies and reduces the gap between laboratory-based and online eye tracking performance. We release the experiment template, recorded data, and analysis code with the motivation to escalate affordable, accessible, and scalable eye tracking that has the potential to accelerate research in the fields of psychological science, cognitive neuroscience, user experience design, and human-computer interfaces.
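The reported accuracy (2.4°) and precision (0.47°) correspond to standard angular-error metrics for eye tracking. Below is a minimal sketch of how such measures are commonly computed, assuming gaze estimates and fixation targets are already expressed in degrees of visual angle; it is not necessarily the authors' exact pipeline.

```python
import numpy as np

def gaze_accuracy_precision(gaze_deg, target_deg):
    """Accuracy/precision for one fixation, in degrees of visual angle.

    gaze_deg   : (N, 2) array of estimated gaze points (x, y) in degrees
    target_deg : (2,) ground-truth fixation target in degrees
    """
    gaze_deg = np.asarray(gaze_deg, dtype=float)
    target_deg = np.asarray(target_deg, dtype=float)
    # Accuracy: mean angular offset between gaze samples and the target.
    offsets = np.linalg.norm(gaze_deg - target_deg, axis=1)
    accuracy = offsets.mean()
    # Precision: RMS of sample-to-sample dispersion within the fixation.
    diffs = np.diff(gaze_deg, axis=0)
    precision = np.sqrt(np.mean(np.sum(diffs ** 2, axis=1)))
    return accuracy, precision
```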

2.
Cognition; 239: 105537, 2023 Oct.
Article in English | MEDLINE | ID: mdl-37487303

ABSTRACT

Compared to audio only (AO) conditions, audiovisual (AV) information can enhance the aesthetic experience of a music performance. However, such beneficial multimodal effects have yet to be studied in naturalistic music performance settings. Further, peripheral physiological correlates of aesthetic experiences are not well-understood. Here, participants were invited to a concert hall for piano performances of Bach, Messiaen, and Beethoven, which were presented in two conditions: AV and AO. They rated their aesthetic experience (AE) after each piece (Experiment 1 and 2), while peripheral signals (cardiorespiratory measures, skin conductance, and facial muscle activity) were continuously measured (Experiment 2). Factor scores of AE were significantly higher in the AV condition in both experiments. LF/HF ratio, a heart rhythm that represents activation of the sympathetic nervous system, was higher in the AO condition, suggesting increased arousal, likely caused by less predictable sound onsets in the AO condition. We present partial evidence that breathing was faster and facial muscle activity was higher in the AV condition, suggesting that observing a performer's movements likely enhances motor mimicry in these more voluntary peripheral measures. Further, zygomaticus ('smiling') muscle activity was a significant predictor of AE. Thus, we suggest physiological measures are related to AE, but at different levels: the more involuntary measures (i.e., heart rhythms) may reflect more sensory aspects, while the more voluntary measures (i.e., muscular control of breathing and facial responses) may reflect the liking aspect of an AE. In summary, we replicate and extend previous findings that AV information enhances AE in a naturalistic music performance setting. We further show that a combination of self-report and peripheral measures benefit a meaningful assessment of AE in naturalistic music performance settings.
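The LF/HF ratio mentioned above is the ratio of low-frequency (≈0.04-0.15 Hz) to high-frequency (≈0.15-0.4 Hz) power in the heart-rate-variability spectrum. A minimal Python sketch of one common way to compute it from RR intervals follows; the paper does not specify its exact pipeline, so the resampling rate and band edges here are conventional assumptions.

```python
import numpy as np
from scipy.interpolate import interp1d
from scipy.signal import welch

def lf_hf_ratio(rr_s, fs=4.0):
    """LF/HF ratio of heart rate variability from RR intervals (in seconds)."""
    rr_s = np.asarray(rr_s, dtype=float)
    beat_times = np.cumsum(rr_s)
    # Resample the irregularly spaced RR series onto an even grid for Welch's method.
    grid = np.arange(beat_times[0], beat_times[-1], 1.0 / fs)
    rr_even = interp1d(beat_times, rr_s, kind="cubic")(grid)
    f, pxx = welch(rr_even - rr_even.mean(), fs=fs, nperseg=min(256, len(rr_even)))
    lf = pxx[(f >= 0.04) & (f < 0.15)].sum()   # low-frequency band power
    hf = pxx[(f >= 0.15) & (f < 0.40)].sum()   # high-frequency band power
    return lf / hf
```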


Subjects
Music, Humans, Auditory Perception/physiology, Arousal/physiology, Sympathetic Nervous System, Movement
3.
Psychophysiology; 60(10): e14350, 2023 Oct.
Article in English | MEDLINE | ID: mdl-37381918

ABSTRACT

Affective sciences often make use of self-reports to assess subjective states. Seeking a more implicit measure for states and emotions, our study explored spontaneous eye blinking during music listening. However, blinking is understudied in the context of research on subjective states. Therefore, a second goal was to explore different ways of analyzing blink activity recorded from infra-red eye trackers, using two additional data sets from earlier studies differing in blinking and viewing instructions. We first replicate the effect of increased blink rates during music listening in comparison with silence and show that the effect is not related to changes in self-reported valence, arousal, or to specific musical features. Interestingly, but in contrast, felt absorption reduced participants' blinking. The instruction to inhibit blinking did not change results. From a methodological perspective, we make suggestions about how to define blinks from data loss periods recorded by eye trackers and report a data-driven outlier rejection procedure and its efficiency for subject-mean analyses, as well as trial-based analyses. We ran a variety of mixed effects models that differed in how trials without blinking were treated. The main results largely converged across accounts. The broad consistency of results across different experiments, outlier treatments, and statistical models demonstrates the reliability of the reported effects. As recordings of data loss periods come for free when interested in eye movements or pupillometry, we encourage researchers to pay attention to blink activity and contribute to the further understanding of the relation between blinking, subjective states, and cognitive processing.
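A common way to define blinks from data loss periods, as discussed above, is to treat contiguous runs of missing pupil samples as blinks when their duration falls within a plausible range. The sketch below illustrates that idea; the duration bounds are illustrative assumptions, not the criteria used in the study.

```python
import numpy as np

def blinks_from_data_loss(pupil, fs, min_dur=0.05, max_dur=0.5):
    """Label runs of missing samples as blinks if their duration is plausible.

    pupil : 1-D pupil-size trace with NaN wherever the tracker lost the eye
    fs    : sampling rate in Hz
    Returns a list of (onset_s, offset_s) tuples.
    """
    lost = np.isnan(np.asarray(pupil, dtype=float))
    # Find the edges of contiguous data-loss runs.
    edges = np.diff(lost.astype(int))
    starts = np.flatnonzero(edges == 1) + 1
    ends = np.flatnonzero(edges == -1) + 1
    if lost[0]:
        starts = np.r_[0, starts]
    if lost[-1]:
        ends = np.r_[ends, lost.size]
    blinks = []
    for s, e in zip(starts, ends):
        dur = (e - s) / fs
        if min_dur <= dur <= max_dur:   # too short = dropout, too long = look-away
            blinks.append((s / fs, e / fs))
    return blinks
```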


Subjects
Blinking, Music, Humans, Reproducibility of Results, Eye Movements, Emotions
4.
Front Hum Neurosci; 16: 916551, 2022.
Article in English | MEDLINE | ID: mdl-35782041

ABSTRACT

Synchronization of movement enhances cooperation and trust between people. However, the degree to which individuals can synchronize with each other depends on their ability to perceive the timing of others' actions and produce movements accordingly. Here, we introduce an assistive device-a multi-person adaptive metronome-to facilitate synchronization abilities. The adaptive metronome is implemented on Arduino Uno circuit boards, allowing for negligible temporal latency between tapper input and adaptive sonic output. Across five experiments-two single-tapper, and three group (four tapper) experiments, we analyzed the effects of metronome adaptivity (percent correction based on the immediately preceding tap-metronome asynchrony) and auditory feedback on tapping performance and subjective ratings. In all experiments, tapper synchronization with the metronome was significantly enhanced with 25-50% adaptivity, compared to no adaptation. In group experiments with auditory feedback, synchrony remained enhanced even at 70-100% adaptivity; without feedback, synchrony at these high adaptivity levels returned to near baseline. Subjective ratings of being in the groove, in synchrony with the metronome, in synchrony with others, liking the task, and difficulty all reduced to one latent factor, which we termed enjoyment. This same factor structure replicated across all experiments. In predicting enjoyment, we found an interaction between auditory feedback and metronome adaptivity, with increased enjoyment at optimal levels of adaptivity only with auditory feedback and a severe decrease in enjoyment at higher levels of adaptivity, especially without feedback. Exploratory analyses relating person-level variables to tapping performance showed that musical sophistication and trait sadness contributed to the degree to which an individual differed in tapping stability from the group. Nonetheless, individuals and groups benefitted from adaptivity, regardless of their musical sophistication. Further, individuals who tapped less variably than the group (which only occurred ∼25% of the time) were more likely to feel "in the groove." Overall, this work replicates previous single person adaptive metronome studies and extends them to group contexts, thereby contributing to our understanding of the temporal, auditory, psychological, and personal factors underlying interpersonal synchrony and subjective enjoyment during sensorimotor interaction. Further, it provides an open-source tool for studying such factors in a controlled way.
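The adaptivity manipulation ("percent correction based on the immediately preceding tap-metronome asynchrony") corresponds to a phase-correction rule. The sketch below shows one common formulation; it is an assumption, not necessarily the exact update implemented on the Arduino boards, and for group tapping the asynchrony would typically be aggregated across tappers.

```python
def next_metronome_onset(prev_onset, ioi, tap_time, alpha):
    """Phase-correcting metronome: shift the next click toward the tapper.

    prev_onset : time of the last metronome click (s)
    ioi        : nominal inter-onset interval (s)
    tap_time   : time of the tap paired with that click (s)
    alpha      : adaptivity, 0.0 (fixed metronome) to 1.0 (full correction)
    """
    asynchrony = tap_time - prev_onset        # positive if the tap came late
    # Move the next onset a fraction alpha of the way toward the tapper's timing.
    return prev_onset + ioi + alpha * asynchrony
```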

5.
Sci Rep; 11(1): 22457, 2021 Nov 17.
Article in English | MEDLINE | ID: mdl-34789746

ABSTRACT

While there is an increasing shift in cognitive science to study perception of naturalistic stimuli, this study extends this goal to naturalistic contexts by assessing physiological synchrony across audience members in a concert setting. Cardiorespiratory, skin conductance, and facial muscle responses were measured from participants attending live string quintet performances of full-length works from Viennese Classical, Contemporary, and Romantic styles. The concert was repeated on three consecutive days with different audiences. Using inter-subject correlation (ISC) to identify reliable responses to music, we found that highly correlated responses depicted typical signatures of physiological arousal. By relating physiological ISC to quantitative values of music features, logistic regressions revealed that high physiological synchrony was consistently predicted by faster tempi (which had higher ratings of arousing emotions and engagement), but only in Classical and Romantic styles (rated as familiar) and not the Contemporary style (rated as unfamiliar). Additionally, highly synchronised responses across all three concert audiences occurred during important structural moments in the music-identified using music theoretical analysis-namely at transitional passages, boundaries, and phrase repetitions. Overall, our results show that specific music features induce similar physiological responses across audience members in a concert context, which are linked to arousal, engagement, and familiarity.
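Inter-subject correlation (ISC) is typically computed as the mean pairwise correlation of time-aligned signals across audience members. A minimal sketch of that definition follows; the paper may use a different ISC estimator (e.g., correlated-components analysis), so treat this as illustrative.

```python
import numpy as np
from itertools import combinations

def inter_subject_correlation(signals):
    """Mean pairwise Pearson correlation across subjects.

    signals : (n_subjects, n_samples) array of one physiological measure,
              time-aligned to the same performance.
    """
    signals = np.asarray(signals, dtype=float)
    r = [np.corrcoef(signals[i], signals[j])[0, 1]
         for i, j in combinations(range(len(signals)), 2)]
    return float(np.mean(r))
```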


Subjects
Arousal/physiology, Auditory Perception/physiology, Emotions/physiology, Facial Muscles/physiology, Heart Rate/physiology, Music/psychology, Recognition (Psychology)/physiology, Respiratory Rate/physiology, Acoustic Stimulation/methods, Adolescent, Adult, Aged, Aged 80 and over, Female, Humans, Logistic Models, Male, Middle Aged, Young Adult
6.
J Eye Mov Res; 11(2), 2019 Feb 15.
Article in English | MEDLINE | ID: mdl-33828684

ABSTRACT

Though eye-tracking is typically a methodology applied in the visual research domain, recent studies suggest its relevance in the context of music research. A community of researchers from varied disciplinary backgrounds, scattered across the globe, is interested in this kind of research. Therefore, in August 2017, an international conference was held at the Max Planck Institute for Empirical Aesthetics in Frankfurt, Germany, to bring this research community together. The conference was dedicated to the topic of music and eye-tracking, asking the question: what do eye movements, pupil dilation, and blinking activity tell us about musical processing? This special issue comprises top-scoring research from the conference and spans a range of music-related topics. From tracking the gaze of performers in musical trios to basic research on how eye movements are affected by background music, the contents of this special issue highlight a variety of experimental approaches and possible applications of eye-tracking in music research.

7.
J Exp Psychol Hum Percept Perform; 44(11): 1694-1711, 2018 Nov.
Article in English | MEDLINE | ID: mdl-30091636

ABSTRACT

Many environmental sounds, such as music or speech, are patterned in time. Dynamic attending theory, and supporting empirical evidence, suggests that a stimulus's temporal structure serves to orient attention to specific moments in time. One instantiation of this theory posits that attention synchronizes to the temporal structure of a stimulus in an oscillatory fashion, with optimal perception at salient time points or oscillation peaks. We examined whether a model consisting of damped linear oscillators succeeds at predicting temporal attention behavior in rhythmic multi-instrumental music. We conducted 3 experiments in which we mapped listeners' perceptual sensitivity by estimating detection thresholds for intensity deviants embedded at multiple time points within a stimulus pattern. We compared participants' thresholds for detecting intensity changes at various time points with the modeled salience prediction at each of those time points. Across all experiments, results showed that the resonator model predicted listener thresholds, such that listeners were more sensitive to probes at time points corresponding to greater model-predicted salience. This effect held for both intensity increment and decrement probes and for metrically simple and complex stimuli. Moreover, the resonator model explained the data better than did predictions based on canonical metric hierarchy or auditory scene density. Our results offer new insight into the temporal orienting of attention in complex auditory scenes using a parsimonious computational model for predicting attentional dynamics. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
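As a rough illustration of the kind of model described, the sketch below drives a single damped linear oscillator with a stimulus amplitude envelope and reads out its instantaneous amplitude as a salience index. The authors' resonator model is a bank of such oscillators with its own parameterization, so this is only a toy version under stated assumptions.

```python
import numpy as np

def resonator_salience(envelope, fs, freq_hz, damping=0.5):
    """Drive one damped linear oscillator with a stimulus amplitude envelope.

    The oscillator's instantaneous amplitude serves as a rough index of
    temporal salience at its resonant frequency.
    """
    w0 = 2 * np.pi * freq_hz
    dt = 1.0 / fs
    x, v = 0.0, 0.0
    amp = np.empty(len(envelope))
    for n, drive in enumerate(envelope):
        # Damped, driven harmonic oscillator: x'' + 2*b*w0*x' + w0^2*x = drive(t)
        a = drive - 2 * damping * w0 * v - w0 ** 2 * x
        v += a * dt
        x += v * dt
        amp[n] = np.hypot(x, v / w0)   # envelope of the oscillation
    return amp
```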


Subjects
Attention/physiology, Auditory Perception/physiology, Music, Adolescent, Adult, Female, Humans, Male, Theoretical Models, Time Factors, Young Adult
8.
J Eye Mov Res; 11(2), 2018 Nov 20.
Article in English | MEDLINE | ID: mdl-33828695

ABSTRACT

Rhythm is a ubiquitous feature of music that induces specific neural modes of processing. In this paper, we assess the potential of a stimulus-driven linear oscillator model (57) to predict dynamic attention to complex musical rhythms on an instant-by-instant basis. We use perceptual thresholds and pupillometry as attentional indices against which to test our model predictions. During a deviance detection task, participants listened to continuously looping, multi-instrument, rhythmic patterns, while being eye-tracked. Their task was to respond any time they heard an increase in intensity (dB SPL). An adaptive thresholding algorithm adjusted deviant intensity at multiple probed temporal locations throughout each rhythmic stimulus. The oscillator model predicted participants' perceptual thresholds for detecting deviants at probed locations, with a low temporal salience prediction corresponding to a high perceptual threshold and vice versa. A pupil dilation response was observed for all deviants. Notably, the pupil dilated even when participants did not report hearing a deviant. Maximum pupil size and resonator model output were significant predictors of whether a deviant was detected or missed on any given trial. Besides the evoked pupillary response to deviants, we also assessed the continuous pupillary signal to the rhythmic patterns. The pupil exhibited entrainment at prominent periodicities present in the stimuli and followed each of the different rhythmic patterns in a unique way. Overall, these results replicate previous studies using the linear oscillator model to predict dynamic attention to complex auditory scenes and extend the utility of the model to the prediction of neurophysiological signals, in this case the pupillary time course; however, we note that the amplitude envelope of the acoustic patterns may serve as a similarly useful predictor. To our knowledge, this is the first paper to show entrainment of pupil dynamics by demonstrating a phase relationship between musical stimuli and the pupillary signal.
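The adaptive thresholding algorithm is not specified in the abstract; as an assumption-laden stand-in, the sketch below shows a generic n-down/1-up staircase of the sort commonly used to estimate detection thresholds for intensity deviants.

```python
def staircase_update(level_db, correct, step_db, correct_streak, n_down=2):
    """One step of an n-down / 1-up staircase on deviant intensity (dB).

    Returns the next deviant level and the updated streak counter.
    """
    if correct:
        correct_streak += 1
        if correct_streak >= n_down:          # make it harder after n_down hits
            return level_db - step_db, 0
        return level_db, correct_streak
    return level_db + step_db, 0              # make it easier after a miss
```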
