Results 1 - 4 of 4
1.
Nat Methods; 18(5): 564-573, 2021 May.
Article in English | MEDLINE | ID: mdl-33875887

ABSTRACT

Comprehensive descriptions of animal behavior require precise three-dimensional (3D) measurements of whole-body movements. Although two-dimensional approaches can track visible landmarks in restrictive environments, performance drops in freely moving animals due to occlusions and appearance changes. Therefore, we designed DANNCE to robustly track anatomical landmarks in 3D across species and behaviors. DANNCE uses projective geometry to construct inputs to a convolutional neural network that leverages learned 3D geometric reasoning. We trained and benchmarked DANNCE using a dataset of nearly seven million frames that relates color videos to rodent 3D poses. In rats and mice, DANNCE robustly tracked dozens of landmarks on the head, trunk, and limbs of freely moving animals in naturalistic settings. We extended DANNCE to datasets from rat pups, marmosets, and chickadees, and demonstrated quantitative profiling of behavioral lineage during development.


Subjects
Deep Learning; Image Processing, Computer-Assisted; Motor Activity; Animals; Biomechanical Phenomena; Video Recording
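The projective-geometry step described in the abstract above — relating 3D world coordinates to each camera's 2D image — can be sketched with a standard pinhole-camera projection. This is a generic illustration, not DANNCE's actual code; the camera parameters and landmark positions below are invented for the example.

```python
import numpy as np

def project_points(X_world, K, R, t):
    """Project Nx3 world points to pixel coordinates with a pinhole camera.

    K: 3x3 intrinsic matrix, R: 3x3 rotation, t: length-3 translation.
    """
    X_cam = X_world @ R.T + t      # world frame -> camera frame
    x = X_cam @ K.T                # apply intrinsics
    return x[:, :2] / x[:, 2:3]    # perspective divide

# Hypothetical camera: focal length 1000 px, principal point (320, 240),
# looking down the world z-axis from 2 m away.
K = np.array([[1000.0, 0.0, 320.0],
              [0.0, 1000.0, 240.0],
              [0.0, 0.0, 1.0]])
R = np.eye(3)
t = np.array([0.0, 0.0, 2.0])

landmarks = np.array([[0.0, 0.0, 0.0],   # world origin
                      [0.1, 0.0, 0.0]])  # 10 cm along world x
print(project_points(landmarks, K, R, t))  # [[320. 240.] [370. 240.]]
```

Inverting this mapping across multiple calibrated cameras is what lets volumetric approaches reason about landmark positions in 3D rather than per-view 2D.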
2.
Nat Methods; 16(1): 117-125, 2019 Jan.
Article in English | MEDLINE | ID: mdl-30573820

ABSTRACT

The need for automated and efficient systems for tracking full animal pose has increased with the complexity of behavioral data and analyses. Here we introduce LEAP (LEAP estimates animal pose), a deep-learning-based method for predicting the positions of animal body parts. The framework consists of a graphical interface for labeling body parts and for training the network. LEAP offers fast prediction on new data, and training with as few as 100 frames results in 95% of peak performance. We validated LEAP using videos of freely behaving fruit flies and tracked 32 distinct points to describe the pose of the head, body, wings, and legs, with an error rate of <3% of body length. We recapitulated reported findings on insect gait dynamics and demonstrated LEAP's applicability for unsupervised behavioral classification. Finally, we extended the method to more challenging imaging situations and videos of freely moving mice.


Assuntos
Comportamento Animal , Aprendizado Profundo , Drosophila melanogaster/fisiologia , Redes Neurais de Computação , Reconhecimento Automatizado de Padrão/métodos , Algoritmos , Animais , Automação , Gráficos por Computador , Marcha , Locomoção , Masculino , Camundongos , Interface Usuário-Computador
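Pose-estimation networks of this kind typically output one confidence map per body part and decode each map's peak into an image coordinate. The sketch below shows only that generic decoding step on toy data; it is not LEAP's implementation, and the map sizes and peak locations are made up.

```python
import numpy as np

def heatmap_to_keypoints(heatmaps):
    """Decode per-part confidence maps of shape (P, H, W) into (P, 2) peaks.

    Returns integer (x, y) pixel coordinates of each map's maximum.
    """
    P, H, W = heatmaps.shape
    flat_idx = heatmaps.reshape(P, -1).argmax(axis=1)  # per-part argmax
    ys, xs = np.unravel_index(flat_idx, (H, W))        # back to 2D indices
    return np.stack([xs, ys], axis=1)

# Toy maps with one synthetic confidence peak per "body part".
maps = np.zeros((2, 64, 64))
maps[0, 10, 20] = 1.0   # part 0: peak at x=20, y=10
maps[1, 40, 5] = 1.0    # part 1: peak at x=5, y=40
print(heatmap_to_keypoints(maps))  # part 0 -> (20, 10), part 1 -> (5, 40)
```

In practice, decoders often refine the argmax to sub-pixel precision (e.g. by fitting around the peak), but the hard argmax above captures the core idea.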
3.
Neuron; 109(3): 420-437.e8, 2021 Feb 03.
Article in English | MEDLINE | ID: mdl-33340448

ABSTRACT

In mammalian animal models, high-resolution kinematic tracking is restricted to brief sessions in constrained environments, limiting our ability to probe naturalistic behaviors and their neural underpinnings. To address this, we developed CAPTURE (Continuous Appendicular and Postural Tracking Using Retroreflector Embedding), a behavioral monitoring system that combines motion capture and deep learning to continuously track the 3D kinematics of a rat's head, trunk, and limbs for week-long timescales in freely behaving animals. CAPTURE realizes 10- to 100-fold gains in precision and robustness compared with existing convolutional network approaches to behavioral tracking. We demonstrate CAPTURE's ability to comprehensively profile the kinematics and sequential organization of natural rodent behavior, its variation across individuals, and its perturbation by drugs and disease, including identifying perseverative grooming states in a rat model of fragile X syndrome. CAPTURE significantly expands the range of behaviors and contexts that can be quantitatively investigated, opening the door to a new understanding of natural behavior and its neural basis.


Assuntos
Comportamento Animal/fisiologia , Movimento/fisiologia , Animais , Fenômenos Biomecânicos/fisiologia , Asseio Animal/fisiologia , Ratos
4.
Curr Biol; 28(15): 2400-2412.e6, 2018 Aug 06.
Article in English | MEDLINE | ID: mdl-30057309

ABSTRACT

Deciphering how brains generate behavior depends critically on an accurate description of behavior. If distinct behaviors are lumped together, separate modes of brain activity can be wrongly attributed to the same behavior. Alternatively, if a single behavior is split into two, the same neural activity can appear to produce different behaviors. Here, we address this issue in the context of acoustic communication in Drosophila. During courtship, males vibrate their wings to generate time-varying songs, and females evaluate songs to inform mating decisions. For 50 years, Drosophila melanogaster song was thought to consist of only two modes, sine and pulse, but using unsupervised classification methods on large datasets of song recordings, we now establish the existence of at least three song modes: two distinct pulse types, along with a single sine mode. We show how this seemingly subtle distinction affects our interpretation of the mechanisms underlying song production and perception. Specifically, we show that visual feedback influences the probability of producing each song mode and that male song mode choice affects female responses and contributes to modulating his song amplitude with distance. At the neural level, we demonstrate how the activity of four separate neuron types within the fly's song pathway differentially affects the probability of producing each song mode. Our results highlight the importance of carefully segmenting behavior to map the underlying sensory, neural, and genetic mechanisms.


Assuntos
Comunicação Animal , Drosophila melanogaster/fisiologia , Neurônios Motores/fisiologia , Comportamento Sexual Animal/fisiologia , Animais , Corte
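The unsupervised classification the authors describe — letting the data reveal that "pulse" song splits into two distinct types — can be illustrated with a minimal k-means clustering of per-pulse features. This is a generic sketch, not the paper's method; the two feature dimensions (carrier frequency, duration) and all values below are invented for the toy example.

```python
import numpy as np

def two_means(X, iters=20):
    """Minimal 2-cluster k-means; returns a 0/1 label per row of X.

    Initialized with the first point and the point farthest from it,
    which is adequate for well-separated toy clusters.
    """
    centers = np.stack([X[0], X[np.linalg.norm(X - X[0], axis=1).argmax()]])
    for _ in range(iters):
        # Assign each point to its nearest center, then recompute centers.
        labels = np.linalg.norm(X[:, None] - centers[None], axis=2).argmin(1)
        centers = np.stack([X[labels == j].mean(axis=0) for j in (0, 1)])
    return labels

# Synthetic per-pulse features (carrier frequency in Hz, duration in ms);
# the two well-separated groups stand in for two hypothetical pulse types.
rng = np.random.default_rng(1)
type_a = rng.normal([350.0, 6.0], [10.0, 0.5], size=(50, 2))
type_b = rng.normal([220.0, 12.0], [10.0, 0.5], size=(50, 2))
labels = two_means(np.vstack([type_a, type_b]))
print(labels[:50].sum(), labels[50:].sum())  # 0 50: clusters match the types
```

With clusters this well separated, the labels recover the two underlying types exactly; on real song recordings one would use richer features (e.g. the pulse waveform itself) and validate the number of clusters rather than fixing it at two.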