Results 1 - 2 of 2
1.
Front Psychol; 13: 901900, 2022.
Article in English | MEDLINE | ID: mdl-36017441

ABSTRACT

Although we frequently acquire knowledge and skills through social interactions, most research on learning focuses on individual learning. Here we characterize Interaction Based Learning (IBL), the acquisition of knowledge or skill through social interaction, and compare it to Observational Learning (OL), that is, learning by observation. To that end, we designed a movement synchronization paradigm in which participants learned Tai Chi-inspired movement sequences from trained teachers in two separate sessions. We used a motion capture system to track the movements of 40 dyads, each composed of a teacher and a learner, randomly assigned to the OL or IBL group, and calculated time-varying synchrony of three-dimensional movement velocity. In the IBL group the learner and the teacher could see each other through transparent glass, whereas in the OL group dyads interacted through a one-way mirror, so the learners could observe the teacher but the teacher could not see the learners. Although the number of movements recalled did not differ between groups, movement smoothness was higher in the IBL group than in the OL group, indicating better movement acquisition in the IBL group. In addition, motor synchronization levels in dyads improved over time, indicating that movement synchronization can be learned and retained. In the first session, the IBL group, but not the OL group, showed a significant improvement in synchronization. This suggests that dyadic interaction is important for learning movement sequences, and that bidirectional communication of signals and mutual feedback are essential for the consolidation of motor learning.
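
The abstract describes computing time-varying synchrony of three-dimensional movement velocity from motion-capture trajectories but does not specify the exact computation. Purely as an illustration, here is a minimal sketch of one common approach (sliding-window correlation of velocity magnitudes); the function names, window length, and sampling rate are hypothetical assumptions, not taken from the paper.

```python
# Illustrative sketch (not the authors' pipeline): time-varying synchrony of
# 3-D movement velocity between a teacher and a learner trajectory.
# Window length, step, and capture rate below are hypothetical choices.
import numpy as np

def velocity(positions: np.ndarray, fs: float) -> np.ndarray:
    """Frame-to-frame 3-D velocity magnitude from an (n_frames, 3) position array."""
    return np.linalg.norm(np.diff(positions, axis=0), axis=1) * fs

def windowed_synchrony(vel_a: np.ndarray, vel_b: np.ndarray,
                       win: int = 120, step: int = 30) -> np.ndarray:
    """Pearson correlation of the two velocity profiles in sliding windows."""
    sync = []
    for start in range(0, min(len(vel_a), len(vel_b)) - win + 1, step):
        a = vel_a[start:start + win]
        b = vel_b[start:start + win]
        sync.append(np.corrcoef(a, b)[0, 1])
    return np.asarray(sync)

if __name__ == "__main__":
    fs = 120.0                                                # hypothetical capture rate (Hz)
    rng = np.random.default_rng(0)
    teacher = np.cumsum(rng.normal(size=(1200, 3)), axis=0)   # toy trajectories
    learner = teacher + rng.normal(scale=0.5, size=(1200, 3))
    sync = windowed_synchrony(velocity(teacher, fs), velocity(learner, fs))
    print(f"mean windowed synchrony: {sync.mean():.2f}")
```

Other synchrony measures (for example, lagged cross-correlation or phase-based indices) would fit the same sliding-window interface.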

2.
BMC Biol; 20(1): 159, 2022 Jul 12.
Article in English | MEDLINE | ID: mdl-35820848

ABSTRACT

BACKGROUND: Various mammalian species emit ultrasonic vocalizations (USVs), which reflect their emotional state and mediate social interactions. USVs are usually analyzed by manual or semi-automated methodologies that categorize discrete USVs according to their structure in the frequency-time domain. This laborious analysis hinders the effective use of USVs as a readout for high-throughput analysis of behavioral changes in animals.

RESULTS: Here we present TrackUSF, a novel automated open-source tool that takes a different approach to USV analysis. To validate TrackUSF, we analyzed calls from different animal species, namely mice, rats, and bats, recorded in various settings, and compared the results with manual analysis by a trained observer. We found that TrackUSF detected the majority of USVs, with less than 1% false-positive detections. We then employed TrackUSF to analyze social vocalizations in Shank3-deficient rats, a rat model of autism, and revealed that these vocalizations exhibit a spectrum of deviations from appetitive calls towards aversive calls.

CONCLUSIONS: TrackUSF is a simple and easy-to-use system that may be used for high-throughput comparison of ultrasonic vocalizations between groups of animals of any kind, in any setting, with no prior assumptions.
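
TrackUSF's actual detection pipeline is not described in this abstract. Purely as an illustration of the general idea behind automated USV detection (flagging time bins whose spectral energy above an ultrasonic cutoff exceeds a baseline), a sketch with hypothetical parameters and file names, not TrackUSF's real API, might look like:

```python
# Illustrative sketch (not TrackUSF itself): flag candidate ultrasonic call
# segments by thresholding spectral energy above a cutoff frequency.
# The cutoff, threshold factor, and file name are hypothetical.
import numpy as np
from scipy.io import wavfile
from scipy.signal import spectrogram

def detect_usv_segments(wav_path: str, cutoff_hz: float = 30_000.0,
                        power_factor: float = 5.0):
    """Return (start_s, end_s) pairs where high-frequency power exceeds baseline."""
    fs, audio = wavfile.read(wav_path)
    freqs, times, sxx = spectrogram(audio.astype(float), fs=fs, nperseg=512)
    hf_power = sxx[freqs >= cutoff_hz].sum(axis=0)      # energy above the cutoff
    threshold = power_factor * np.median(hf_power)      # simple adaptive baseline
    active = hf_power > threshold
    segments, start = [], None
    for t, flag in zip(times, active):
        if flag and start is None:
            start = t
        elif not flag and start is not None:
            segments.append((start, t))
            start = None
    if start is not None:
        segments.append((start, times[-1]))
    return segments

# Example usage (hypothetical file, recorded at a rate high enough to cover rodent USVs):
# for start, end in detect_usv_segments("cage_recording.wav"):
#     print(f"call candidate: {start:.3f}-{end:.3f} s")
```

A real pipeline such as TrackUSF would add feature extraction and group-level comparison on top of such raw detections.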


Subjects
Autistic Disorder, Ultrasonics, Animals, Emotions, Mammals, Mice, Microfilament Proteins, Nerve Tissue Proteins, Rats, Animal Vocalization