Results 1 - 7 of 7
1.
Nature ; 601(7894): 595-599, 2022 01.
Article in English | MEDLINE | ID: mdl-34937941

ABSTRACT

Odours are a fundamental part of the sensory environment used by animals to guide behaviours such as foraging and navigation [1,2]. Primary olfactory (piriform) cortex is thought to be the main cortical region for encoding odour identity [3-8]. Here, using neural ensemble recordings in freely moving rats performing an odour-cued spatial choice task, we show that posterior piriform cortex neurons carry a robust spatial representation of the environment. Piriform spatial representations have features of a learned cognitive map, being most prominent near odour ports, stable across behavioural contexts and independent of olfactory drive or reward availability. The accuracy of spatial information carried by individual piriform neurons was predicted by the strength of their functional coupling to the hippocampal theta rhythm. Ensembles of piriform neurons concurrently represented odour identity as well as spatial locations of animals, forming an odour-place map. Our results reveal a function for piriform cortex in spatial cognition and suggest that it is well-suited to form odour-place associations and guide olfactory-cued spatial navigation.
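The reported link between a neuron's spatial coding accuracy and its coupling to the hippocampal theta rhythm rests on a standard circular-statistics phase-locking measure. As a rough illustration only (the paper's exact analysis pipeline is not reproduced here), a minimal sketch of spike-theta phase locking via the mean resultant vector length might look like this; the filter settings, sampling rate, and variable names are assumptions:

```python
# Hedged sketch: quantifying how strongly a neuron's spikes lock to the
# hippocampal theta rhythm using the mean resultant length of spike phases.
# All names and parameters are illustrative assumptions, not the authors' method.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def theta_phase_locking(spike_times, lfp, fs, band=(6.0, 10.0)):
    """Return mean resultant length (0-1) of spike phases in the theta band."""
    # Band-pass filter the LFP around theta and extract instantaneous phase.
    b, a = butter(3, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    theta = filtfilt(b, a, lfp)
    phase = np.angle(hilbert(theta))

    # Look up the theta phase at each spike time (spike_times in seconds).
    spike_idx = np.clip((np.asarray(spike_times) * fs).astype(int), 0, len(lfp) - 1)
    spike_phases = phase[spike_idx]

    # Mean resultant length: 1 = perfect phase locking, 0 = uniform phases.
    return np.abs(np.mean(np.exp(1j * spike_phases)))

# Example with synthetic data: a 10 s LFP trace at 1 kHz and random spike times.
fs = 1000.0
t = np.arange(0, 10, 1 / fs)
lfp = np.sin(2 * np.pi * 8 * t) + 0.5 * np.random.randn(t.size)
spikes = np.sort(np.random.uniform(0, 10, 200))
print(f"phase locking: {theta_phase_locking(spikes, lfp, fs):.2f}")
```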


Subject(s)
Olfactory Cortex, Piriform Cortex, Spatial Navigation, Animals, Odorants, Olfactory Bulb/physiology, Olfactory Cortex/physiology, Olfactory Pathways/physiology, Piriform Cortex/physiology, Rats, Smell/physiology
2.
Nat Methods ; 20(3): 403-407, 2023 03.
Article in English | MEDLINE | ID: mdl-36864199

ABSTRACT

We describe an architecture for organizing, integrating and sharing neurophysiology data within a single laboratory or across a group of collaborators. It comprises a database linking data files to metadata and electronic laboratory notes; a module collecting data from multiple laboratories into one location; a protocol for searching and sharing data; and a module for automatic analyses that populates a website. These modules can be used together or individually, by single laboratories or worldwide collaborations.
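As a minimal sketch of the central idea, assuming nothing about the published implementation, the database component can be pictured as a single table linking raw data files to searchable metadata; the table and column names below are illustrative, not the architecture's actual schema:

```python
# Minimal sketch: a relational table that links data files on disk to
# experiment metadata, so collaborators can search and share recordings.
# Table and column names are illustrative assumptions only.
import sqlite3

conn = sqlite3.connect("lab_data.sqlite")
conn.execute("""
    CREATE TABLE IF NOT EXISTS datasets (
        id INTEGER PRIMARY KEY,
        lab TEXT NOT NULL,          -- which laboratory produced the file
        subject TEXT NOT NULL,      -- animal or participant identifier
        session_date TEXT NOT NULL, -- ISO date of the recording session
        data_type TEXT NOT NULL,    -- e.g. 'spikes', 'video', 'notes'
        file_path TEXT NOT NULL     -- location of the raw file
    )
""")

# Register a file, then query across laboratories by metadata alone.
conn.execute(
    "INSERT INTO datasets (lab, subject, session_date, data_type, file_path) "
    "VALUES (?, ?, ?, ?, ?)",
    ("lab_A", "mouse_001", "2023-01-15", "spikes", "/data/lab_A/mouse_001/spikes.npy"),
)
rows = conn.execute(
    "SELECT lab, subject, file_path FROM datasets WHERE data_type = 'spikes'"
).fetchall()
print(rows)
```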


Subject(s)
Laboratories, Neurophysiology, Databases, Factual
3.
PLoS Comput Biol ; 17(9): e1009439, 2021 09.
Article in English | MEDLINE | ID: mdl-34550974

ABSTRACT

Recent neuroscience studies demonstrate that a deeper understanding of brain function requires a deeper understanding of behavior. Detailed behavioral measurements are now often collected using video cameras, resulting in an increased need for computer vision algorithms that extract useful information from video data. Here we introduce a new video analysis tool that combines the output of supervised pose estimation algorithms (e.g. DeepLabCut) with unsupervised dimensionality reduction methods to produce interpretable, low-dimensional representations of behavioral videos that extract more information than pose estimates alone. We demonstrate this tool by extracting interpretable behavioral features from videos of three different head-fixed mouse preparations, as well as a freely moving mouse in an open field arena, and show how these interpretable features can facilitate downstream behavioral and neural analyses. We also show how the behavioral features produced by our model improve the precision and interpretation of these downstream analyses compared to using the outputs of either fully supervised or fully unsupervised methods alone.
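The general recipe, sketched below under simplifying assumptions, is to pair supervised pose estimates with an unsupervised embedding of the video itself; the PCA-on-residual-frames step stands in for the paper's model, and the array shapes and names are invented for illustration:

```python
# Hedged sketch: combine supervised pose estimates (e.g. DeepLabCut keypoints)
# with an unsupervised dimensionality reduction of the video frames to capture
# variability that the keypoints miss. A simplified stand-in, not the paper's model.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

# Assumed inputs: flattened video frames and tracked keypoints per frame.
n_frames, n_pixels, n_keypoints = 1000, 64 * 64, 8
frames = np.random.rand(n_frames, n_pixels)            # (time, pixels)
keypoints = np.random.rand(n_frames, 2 * n_keypoints)  # (time, x/y coordinates)

# Remove the part of each frame that is linearly predictable from the pose...
pose_model = LinearRegression().fit(keypoints, frames)
residual = frames - pose_model.predict(keypoints)

# ...then summarize what remains with a low-dimensional unsupervised embedding.
pca = PCA(n_components=10)
unsupervised_features = pca.fit_transform(residual)

# Final behavioral representation: pose estimates plus residual video features.
behavior_features = np.hstack([keypoints, unsupervised_features])
print(behavior_features.shape)  # (1000, 26)
```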


Subject(s)
Algorithms, Artificial Intelligence/statistics & numerical data, Behavior, Animal, Video Recording, Animals, Computational Biology, Computer Simulation, Markov Chains, Mice, Models, Statistical, Neural Networks, Computer, Supervised Machine Learning/statistics & numerical data, Unsupervised Machine Learning/statistics & numerical data, Video Recording/statistics & numerical data
4.
Commun Med (Lond) ; 2: 34, 2022.
Article in English | MEDLINE | ID: mdl-35603293

ABSTRACT

Background: Key to curtailing the COVID-19 pandemic are wide-scale screening strategies. An ideal screen is one that would not rely on transporting, distributing, and collecting physical specimens. Given the olfactory impairment associated with COVID-19, we developed a perceptual measure of olfaction that relies on smelling household odorants and rating them online. Methods: Each participant was instructed to select 5 household items and rate their perceived odor pleasantness and intensity using an online visual analogue scale. We used these data to assign an olfactory perceptual fingerprint, a value that reflects the perceived difference between odorants. We tested the performance of this real-time tool in a total of 13,484 participants (462 COVID-19 positive) from 134 countries who provided 178,820 perceptual ratings of 60 different household odorants. Results: We observe that olfactory ratings are indicative of COVID-19 status in a country, significantly correlating with national infection rates over time. More importantly, we observe indicative power at the individual level (79% sensitivity and 87% specificity). Critically, this olfactory screen remains effective in participants with COVID-19 but without symptoms, and in participants with symptoms but without COVID-19. Conclusions: The current odorant-based olfactory screen adds a component to online symptom-checkers, potentially providing a first line of defense that can help fight disease progression at the population level. The data derived from this tool may allow better understanding of the link between COVID-19 and olfaction.
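One plausible reading of the perceptual fingerprint, sketched below with invented ratings, is a vector of pairwise distances between odorants in the (pleasantness, intensity) rating space; the study's exact formula is not reproduced here:

```python
# Hedged sketch of an "olfactory perceptual fingerprint": each selected
# household odorant gets a (pleasantness, intensity) rating on a 0-100 visual
# analogue scale, and the fingerprint summarizes pairwise perceived differences.
# The ratings and the distance metric here are illustrative assumptions.
import numpy as np
from scipy.spatial.distance import pdist

# Assumed ratings for 5 household items: rows = odorants,
# columns = (pleasantness, intensity), both on a 0-100 scale.
ratings = np.array([
    [80, 60],   # e.g. vanilla
    [20, 90],   # e.g. vinegar
    [55, 40],   # e.g. soap
    [70, 75],   # e.g. coffee
    [35, 30],   # e.g. paper
], dtype=float)

# Pairwise Euclidean distances between the 5 odorants (10 values),
# summarizing how different the odorants smell to this participant.
fingerprint = pdist(ratings, metric="euclidean")
print(fingerprint.round(1))
```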

5.
Elife ; 10, 2021 05 20.
Article in English | MEDLINE | ID: mdl-34011433

ABSTRACT

Progress in science requires standardized assays whose results can be readily shared, compared, and reproduced across laboratories. Reproducibility, however, has been a concern in neuroscience, particularly for measurements of mouse behavior. Here, we show that a standardized task to probe decision-making in mice produces reproducible results across multiple laboratories. We adopted a task for head-fixed mice that assays perceptual and value-based decision making, and we standardized the training protocol and the experimental hardware, software, and procedures. We trained 140 mice across seven laboratories in three countries, and we collected 5 million mouse choices into a publicly available database. Learning speed was variable across mice and laboratories, but once training was complete there were no significant differences in behavior across laboratories. Mice in different laboratories adopted similar reliance on visual stimuli, on past successes and failures, and on estimates of stimulus prior probability to guide their choices. These results reveal that a complex mouse behavior can be reproduced across multiple laboratories. They establish a standard for reproducible rodent behavior, and provide an unprecedented dataset and open-access tools to study decision-making in mice. More generally, they indicate a path toward achieving reproducibility in neuroscience through collaborative open-science approaches.


In science, it is of vital importance that multiple studies corroborate the same result. Researchers therefore need to know all the details of previous experiments in order to implement the procedures as exactly as possible. However, reproducibility is becoming a major problem in neuroscience, as animal studies of behavior have proven hard to reproduce, and most experiments are never replicated by other laboratories.

Mice are increasingly being used to study the neural mechanisms of decision making, taking advantage of the genetic, imaging and physiological tools that are available for mouse brains. Yet the lack of standardized behavioral assays leads to inconsistent results between laboratories. This makes it challenging to carry out the kind of large-scale collaborations that have led to massive breakthroughs in other fields such as physics and genetics. To help make these studies more reproducible, the International Brain Laboratory (a collaborative research group) developed a standardized approach for investigating decision making in mice that incorporates every step of the process, from the training protocol to the software used to analyze the data.

In the experiment, mice were shown images with different contrast and had to indicate, using a steering wheel, whether the image appeared on their right or left. The mice then received a drop of sugar water for every correct decision. When the image contrast was high, mice could rely on their vision. However, when the image contrast was very low or zero, they needed to consider the information of previous trials and choose the side that had recently appeared more frequently. This method was used to train 140 mice in seven laboratories from three different countries. The results showed that learning speed differed across mice and laboratories, but once training was complete the mice behaved consistently, relying on visual stimuli and past experience to guide their choices in a similar way.

These results show that complex behaviors in mice can be reproduced across multiple laboratories, providing an unprecedented dataset and open-access tools for studying decision making. This work could serve as a foundation for other groups, paving the way to a more collaborative approach in the field of neuroscience that could help tackle complex research challenges.
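The task logic described above can be caricatured in a few lines; the toy decision rule and parameter values below are assumptions for illustration, not the published training protocol or analysis:

```python
# Hedged toy simulation of the task: on each trial a stimulus of some contrast
# appears on the left or right, the simulated mouse reports a side, and correct
# choices are rewarded. At low or zero contrast the agent falls back on the
# recently more frequent side (the prior). Everything here is an assumption.
import random

def simulate_trial(contrast, side, prior_right=0.5, sensitivity=5.0):
    """Return (choice, rewarded) for one trial of the toy task."""
    if contrast > 0 and random.random() < min(1.0, sensitivity * contrast):
        choice = side                      # stimulus detected: use vision
    else:
        choice = "right" if random.random() < prior_right else "left"
    return choice, choice == side          # reward only correct choices

# A block where the stimulus appears on the right 80% of the time.
random.seed(0)
results = []
for _ in range(1000):
    side = "right" if random.random() < 0.8 else "left"
    contrast = random.choice([0.0, 0.0625, 0.25, 1.0])
    results.append(simulate_trial(contrast, side, prior_right=0.8))

accuracy = sum(rewarded for _, rewarded in results) / len(results)
print(f"accuracy in right-biased block: {accuracy:.2f}")
```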


Subject(s)
Behavior, Animal, Biomedical Research/standards, Decision Making, Neurosciences/standards, Animals, Cues, Female, Learning, Male, Mice, Inbred C57BL, Models, Animal, Observer Variation, Photic Stimulation, Reproducibility of Results, Time Factors, Visual Perception
7.
Front Neuroinform ; 9: 7, 2015.
Article in English | MEDLINE | ID: mdl-25904861

ABSTRACT

The design of modern scientific experiments requires the control and monitoring of many different data streams. However, the serial execution of programming instructions in a computer makes it a challenge to develop software that can deal with the asynchronous, parallel nature of scientific data. Here we present Bonsai, a modular, high-performance, open-source visual programming framework for the acquisition and online processing of data streams. We describe Bonsai's core principles and architecture and demonstrate how it allows for the rapid and flexible prototyping of integrated experimental designs in neuroscience. We specifically highlight some applications that require the combination of many different hardware and software components, including video tracking of behavior, electrophysiology and closed-loop control of stimulation.
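Bonsai itself is a visual language built around reactive data streams, so the sketch below is only a conceptual analogue in Python of the problem it addresses: several asynchronous sources acquired in parallel and processed online as events arrive; the stream rates and names are invented:

```python
# Conceptual illustration only (not Bonsai code): two asynchronous data streams,
# simulated video frames and electrophysiology samples, produced in parallel and
# consumed online by a single processor regardless of their relative timing.
import asyncio, random

async def video_stream(queue):
    """Simulated camera producing frames at ~30 Hz."""
    for frame_id in range(90):
        await asyncio.sleep(1 / 30)
        await queue.put(("video", frame_id))

async def ephys_stream(queue):
    """Simulated electrophysiology samples arriving in ~100 Hz batches."""
    for _ in range(300):
        await asyncio.sleep(0.01)
        await queue.put(("ephys", random.gauss(0.0, 1.0)))

async def online_processor(queue, n_events):
    """Consume both streams as events arrive, whatever their order."""
    counts = {"video": 0, "ephys": 0}
    for _ in range(n_events):
        source, _payload = await queue.get()
        counts[source] += 1
    print(counts)

async def main():
    queue = asyncio.Queue()
    await asyncio.gather(
        video_stream(queue), ephys_stream(queue), online_processor(queue, 390)
    )

asyncio.run(main())
```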
