Results 1 - 13 of 13
1.
Cell ; 162(3): 648-61, 2015 Jul 30.
Article in English | MEDLINE | ID: mdl-26232230

ABSTRACT

We describe automated technologies to probe the structure of neural tissue at nanometer resolution and use them to generate a saturated reconstruction of a sub-volume of mouse neocortex in which all cellular objects (axons, dendrites, and glia) and many sub-cellular components (synapses, synaptic vesicles, spines, spine apparati, postsynaptic densities, and mitochondria) are rendered and itemized in a database. We explore these data to study physical properties of brain tissue. For example, by tracing the trajectories of all excitatory axons and noting their juxtapositions, both synaptic and non-synaptic, with every dendritic spine we refute the idea that physical proximity is sufficient to predict synaptic connectivity (the so-called Peters' rule). This online minable database provides general access to the intrinsic complexity of the neocortex and enables further data-driven inquiries.
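To make the Peters' rule test described above concrete, here is a minimal, purely illustrative Python sketch with made-up toy data (not the paper's reconstruction) comparing physical axon-spine contacts against actual synapses:

```python
import numpy as np

rng = np.random.default_rng(0)
n_axons, n_spines = 100, 80

# Toy stand-ins: in a real reconstruction these come from tracing axons and spines.
contact = rng.random((n_axons, n_spines)) < 0.15              # physical juxtaposition
synapse = contact & (rng.random((n_axons, n_spines)) < 0.2)   # synapses form only at contacts

p_syn_given_contact = synapse.sum() / contact.sum()
print(f"P(synapse | contact) = {p_syn_given_contact:.2f}")
# Peters' rule treats proximity as sufficient for connectivity; a low and
# structured conditional probability is evidence against that assumption.
```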


Subjects
Scanning Electron Microscopy/methods, Microtomy/methods, Neocortex/ultrastructure, Neurons/ultrastructure, Animals, Automation, Axons/ultrastructure, Dendrites/ultrastructure, Mice, Neocortex/cytology, Synapses/ultrastructure, Synaptic Vesicles/ultrastructure
2.
Annu Rev Neurosci ; 43: 441-464, 2020 07 08.
Article in English | MEDLINE | ID: mdl-32283996

ABSTRACT

As acquiring bigger data becomes easier in experimental brain science, computational and statistical brain science must achieve similar advances to fully capitalize on these data. Tackling these problems will benefit from a more explicit and concerted effort to work together. Specifically, brain science can be further democratized by harnessing the power of community-driven tools, which both are built by and benefit from many different people with different backgrounds and expertise. This perspective can be applied across modalities and scales and enables collaborations across previously siloed communities.


Subjects
Big Data, Brain/physiology, Computational Biology, Nerve Net/physiology, Animals, Computational Biology/methods, Genetic Databases, Gene Expression/physiology, Humans
3.
Nature ; 545(7654): 345-349, 2017 05 18.
Article in English | MEDLINE | ID: mdl-28489821

ABSTRACT

High-resolution serial-section electron microscopy (ssEM) makes it possible to investigate the dense meshwork of axons, dendrites, and synapses that form neuronal circuits. However, the imaging scale required to comprehensively reconstruct these structures is more than ten orders of magnitude smaller than the spatial extents occupied by networks of interconnected neurons, some of which span nearly the entire brain. Difficulties in generating and handling data for large volumes at nanoscale resolution have thus restricted vertebrate studies to fragments of circuits. These efforts were recently transformed by advances in computing, sample handling, and imaging techniques, but high-resolution examination of entire brains remains a challenge. Here, we present ssEM data for the complete brain of a larval zebrafish (Danio rerio) at 5.5 days post-fertilization. Our approach utilizes multiple rounds of targeted imaging at different scales to reduce acquisition time and data management requirements. The resulting dataset can be analysed to reconstruct neuronal processes, permitting us to survey all myelinated axons (the projectome). These reconstructions enable precise investigations of neuronal morphology, which reveal remarkable bilateral symmetry in myelinated reticulospinal and lateral line afferent axons. We further set the stage for whole-brain structure-function comparisons by co-registering functional reference atlases and in vivo two-photon fluorescence microscopy data from the same specimen. All obtained images and reconstructions are provided as an open-access resource.


Subjects
Brain/ultrastructure, Electron Microscopy, Zebrafish, Artistic Anatomy, Animals, Atlases as Topic, Axons/metabolism, Axons/ultrastructure, Brain/anatomy & histology, Brain/cytology, Datasets as Topic, Larva/anatomy & histology, Larva/cytology, Larva/ultrastructure, Multiphoton Fluorescence Microscopy, Open Access Publishing, Zebrafish/anatomy & histology, Zebrafish/growth & development
4.
Nature ; 497(7450): 466-9, 2013 May 23.
Article in English | MEDLINE | ID: mdl-23698445

ABSTRACT

The idea of 'frozen-in' magnetic field lines for ideal plasmas is useful to explain diverse astrophysical phenomena, for example the shedding of excess angular momentum from protostars by twisting of field lines frozen into the interstellar medium. Frozen-in field lines, however, preclude the rapid changes in magnetic topology observed at high conductivities, as in solar flares. Microphysical plasma processes are a proposed explanation of the observed high rates, but it is an open question whether such processes can rapidly reconnect astrophysical flux structures much greater in extent than several thousand ion gyroradii. An alternative explanation is that turbulent Richardson advection brings field lines implosively together from distances far apart to separations of the order of gyroradii. Here we report an analysis of a simulation of magnetohydrodynamic turbulence at high conductivity that exhibits Richardson dispersion. This effect of advection in rough velocity fields, which appear non-differentiable in space, leads to line motions that are completely indeterministic or 'spontaneously stochastic', as predicted in analytical studies. The turbulent breakdown of standard flux freezing at scales greater than the ion gyroradius can explain fast reconnection of very large-scale flux structures, both observed (solar flares and coronal mass ejections) and predicted (the inner heliosheath, accretion disks, γ-ray bursts and so on). For laminar plasma flows with smooth velocity fields or for low turbulence intensity, stochastic flux freezing reduces to the usual frozen-in condition.
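For reference, the Richardson dispersion invoked above is usually summarized by the two-particle scaling law below; this is the standard textbook form, not an equation quoted from the paper:

```latex
% Richardson two-particle dispersion: the mean-square separation of fluid
% (or field-line) elements grows superdiffusively with time and, at late times,
% forgets the initial separation r_0 -- the root of "spontaneous stochasticity".
\begin{equation}
  \langle |\mathbf{r}(t) - \mathbf{r}_0|^2 \rangle \;\sim\; g\,\varepsilon\, t^3 ,
\end{equation}
% where \varepsilon is the turbulent energy cascade rate and g is the
% dimensionless Richardson constant.
```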

6.
Nat Commun ; 12(1): 2872, 2021 05 17.
Article in English | MEDLINE | ID: mdl-34001899

ABSTRACT

To solve key biomedical problems, experimentalists now routinely measure millions or billions of features (dimensions) per sample, with the hope that data science techniques will be able to build accurate data-driven inferences. Because sample sizes are typically orders of magnitude smaller than the dimensionality of these data, valid inferences require finding a low-dimensional representation that preserves the discriminating information (e.g., whether the individual suffers from a particular disease). There is a lack of interpretable supervised dimensionality reduction methods that scale to millions of dimensions with strong statistical theoretical guarantees. We introduce an approach to extending principal components analysis by incorporating class-conditional moment estimates into the low-dimensional projection. The simplest version, Linear Optimal Low-Rank Projection, incorporates the class-conditional means. We prove, and substantiate with both synthetic and real data benchmarks, that Linear Optimal Low-Rank Projection and its generalizations lead to improved data representations for subsequent classification, while maintaining computational efficiency and scalability. Using multiple brain imaging datasets consisting of more than 150 million features, and several genomics datasets with more than 500,000 features, Linear Optimal Low-Rank Projection outperforms other scalable linear dimensionality reduction techniques in terms of accuracy, while only requiring a few minutes on a standard desktop computer.
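A rough sketch of the idea as stated in the abstract (augmenting principal components with class-conditional mean information) might look like the following; function and variable names are illustrative assumptions, not the authors' reference implementation:

```python
import numpy as np

def lol_project(X, y, d):
    """Project n x p data X with binary labels y onto d dimensions."""
    mu0 = X[y == 0].mean(axis=0)
    mu1 = X[y == 1].mean(axis=0)
    delta = (mu1 - mu0)[:, None]                  # class-conditional mean difference
    Xc = X - np.where(y[:, None] == 1, mu1, mu0)  # center each class at its own mean
    # Top principal directions of the class-centered data (via thin SVD).
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    A = np.hstack([delta, Vt[: d - 1].T])         # mean direction first, then PCA directions
    Q, _ = np.linalg.qr(A)                        # orthonormalize the projection basis
    return X @ Q                                  # n x d low-dimensional representation

# Toy usage with assumed synthetic data:
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 1000))
y = rng.integers(0, 2, size=200)
X[y == 1] += 0.5                                  # shift the class-1 means
Z = lol_project(X, y, d=10)
print(Z.shape)                                    # (200, 10)
```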

7.
Gigascience ; 6(5): 1-10, 2017 05 01.
Article in English | MEDLINE | ID: mdl-28327935

ABSTRACT

Modern technologies are enabling scientists to collect extraordinary amounts of complex and sophisticated data across a huge range of scales like never before. With this onslaught of data, we can allow the focal point to shift from data collection to data analysis. Unfortunately, a lack of standardized sharing mechanisms and practices often makes reproducing or extending scientific results very difficult. With the creation of data organization structures and tools that drastically improve code portability, we now have the opportunity to design such a framework for communicating extensible scientific discoveries. Our proposed solution leverages these existing technologies and standards, and provides an accessible and extensible model for reproducible research, called 'science in the cloud' (SIC). Exploiting scientific containers, cloud computing, and cloud data services, we show the capability to compute in the cloud and run a web service that enables intimate interaction with the tools and data presented. We hope this model will inspire the community to produce reproducible and, importantly, extensible results that will enable us to collectively accelerate the rate at which scientific breakthroughs are discovered, replicated, and extended.


Subjects
Cloud Computing, Science, Connectome, Humans, Computer-Assisted Image Processing, Internet, Magnetic Resonance Imaging, Software
8.
Sci Data ; 2: 150046, 2015.
Article in English | MEDLINE | ID: mdl-26347348

ABSTRACT

Resurgent interest in synaptic circuitry and plasticity has emphasized the importance of 3D reconstruction from serial section electron microscopy (3DEM). Three volumes of hippocampal CA1 neuropil from adult rat were imaged at an X-Y resolution of ~2 nm on serial sections of ~50-60 nm thickness. These are the first densely reconstructed hippocampal volumes. All axons, dendrites, glia, and synapses were reconstructed in a cube (~10 µm³) surrounding a large dendritic spine, a cylinder (~43 µm³) surrounding an oblique dendritic segment (3.4 µm long), and a parallelepiped (~178 µm³) surrounding an apical dendritic segment (4.9 µm long). The data provide standards for identifying ultrastructural objects in 3DEM, realistic reconstructions for modeling biophysical properties of synaptic transmission, and a test bed for enhancing reconstruction tools. Representative synapses are quantified from varying section planes, and microtubules, polyribosomes, smooth endoplasmic reticulum, and endosomes are identified and reconstructed in a subset of dendrites. The original images, traces, and Reconstruct software and files are freely available and visualized at the Open Connectome Project (Data Citation 1).


Subjects
Hippocampus/anatomy & histology, Neuropil, Animals, Computer-Assisted Image Processing, Electron Microscopy, Rats, Software
9.
Front Neuroinform ; 9: 20, 2015.
Article in English | MEDLINE | ID: mdl-26321942

ABSTRACT

Reconstructing a map of neuronal connectivity is a critical challenge in contemporary neuroscience. Recent advances in high-throughput serial section electron microscopy (EM) have produced massive 3D image volumes of nanoscale brain tissue for the first time. The resolution of EM allows for individual neurons and their synaptic connections to be directly observed. Recovering neuronal networks by manually tracing each neuronal process at this scale is unmanageable, and therefore researchers are developing automated image processing modules. Thus far, state-of-the-art algorithms focus only on the solution to a particular task (e.g., neuron segmentation or synapse identification). In this manuscript we present the first fully-automated images-to-graphs pipeline (i.e., a pipeline that begins with an imaged volume of neural tissue and produces a brain graph without any human interaction). To evaluate overall performance and select the best parameters and methods, we also develop a metric to assess the quality of the output graphs. We evaluate a set of algorithms and parameters, searching possible operating points to identify the best available brain graph for our assessment metric. Finally, we deploy a reference end-to-end version of the pipeline on a large, publicly available data set. This provides a baseline result and framework for community analysis and future algorithm development and testing. All code and data derivatives have been made publicly available in support of eventually unlocking new biofidelic computational primitives and understanding of neuropathologies.
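As a schematic illustration of the images-to-graphs endpoint described above, the following snippet assembles a connectivity graph from hypothetical synapse detections; the data structures are assumptions for illustration, not the paper's pipeline:

```python
import networkx as nx

# Hypothetical detections: (presynaptic neuron ID, postsynaptic neuron ID),
# as would be produced by segmentation plus synapse-partner assignment.
synapse_detections = [(3, 7), (3, 7), (5, 7), (2, 3)]

G = nx.DiGraph()
for pre, post in synapse_detections:
    if G.has_edge(pre, post):
        G[pre][post]["weight"] += 1   # multiple synapses between the same pair
    else:
        G.add_edge(pre, post, weight=1)

print(G.number_of_nodes(), G.number_of_edges())
print(G[3][7]["weight"])              # 2 synapses from neuron 3 onto neuron 7
```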

10.
Neuron ; 83(6): 1249-52, 2014 Sep 17.
Article in English | MEDLINE | ID: mdl-25233306

ABSTRACT

The analysis of data requires computation: originally by hand and more recently by computers. Different models of computing are designed and optimized for different kinds of data. In data-intensive science, the scale and complexity of data exceeds the comfort zone of local data stores on scientific workstations. Thus, cloud computing emerges as the preeminent model, utilizing data centers and high-performance clusters, enabling remote users to access and query subsets of the data efficiently. We examine how data-intensive computational systems originally built for cosmology, the Sloan Digital Sky Survey (SDSS), are now being used in connectomics, at the Open Connectome Project. We list lessons learned and outline the top challenges we expect to face. Success in computational connectomics would drastically reduce the time between idea and discovery, as SDSS did in cosmology.


Subjects
Computers, Connectome/methods, Information Systems, Software, Statistics as Topic/methods, Animals, Computational Biology/methods, Humans
11.
Sci Data ; 1: 140046, 2014.
Article in English | MEDLINE | ID: mdl-25977797

ABSTRACT

A major question in neuroscience is how diverse subsets of synaptic connections in neural circuits are affected by experience-dependent plasticity to form the basis for behavioral learning and memory. Differences in protein expression patterns at individual synapses could constitute a key to understanding both synaptic diversity and the effects of plasticity at different synapse populations. Our approach to this question leverages the immunohistochemical multiplexing capability of array tomography (ATomo) and the columnar organization of mouse barrel cortex to create a dataset comprising high-resolution volumetric images of spared and deprived cortical whisker barrels stained for over a dozen synaptic molecules each. This dataset has been made available through the Open Connectome Project for interactive online viewing, and may also be downloaded for offline analysis using web, Matlab, and other interfaces.


Subjects
Somatosensory Cortex/chemistry, Synapses/chemistry, Animals, Learning, Memory, Mice, Neuronal Plasticity, Somatosensory Cortex/physiology, Synapses/physiology, X-Ray Computed Tomography
12.
ICS ; 2013.
Article in English | MEDLINE | ID: mdl-24402052

ABSTRACT

We describe a storage system that removes I/O bottlenecks to achieve more than one million IOPS based on a user-space file abstraction for arrays of commodity SSDs. The file abstraction refactors I/O scheduling and placement for extreme parallelism and non-uniform memory and I/O. The system includes a set-associative, parallel page cache in the user space. We redesign page caching to eliminate CPU overhead and lock contention in non-uniform memory architecture machines. We evaluate our design on a 32-core NUMA machine with four eight-core processors. Experiments show that our design delivers 1.23 million 512-byte read IOPS. The page cache realizes the scalable IOPS of Linux asynchronous I/O (AIO) and increases user-perceived I/O performance linearly with cache hit rates. The parallel, set-associative cache matches the cache hit rates of the global Linux page cache under real workloads.
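The following toy Python sketch illustrates the general shape of a set-associative page cache with per-set locking, which is the contention-avoidance idea the abstract describes; sizes, eviction policy, and structure are illustrative assumptions, not the paper's implementation:

```python
import threading
from collections import OrderedDict

class SetAssociativeCache:
    def __init__(self, num_sets=1024, ways=8):
        self.ways = ways
        self.sets = [OrderedDict() for _ in range(num_sets)]
        self.locks = [threading.Lock() for _ in range(num_sets)]

    def get(self, page_id, load_page):
        s = hash(page_id) % len(self.sets)
        with self.locks[s]:                      # contention is confined to one set
            cache = self.sets[s]
            if page_id in cache:
                cache.move_to_end(page_id)       # LRU update within the set
                return cache[page_id]
            data = load_page(page_id)            # miss: read the page (caller-supplied I/O)
            cache[page_id] = data
            if len(cache) > self.ways:
                cache.popitem(last=False)        # evict the least recently used page
            return data

cache = SetAssociativeCache()
print(len(cache.get(42, lambda pid: b"\x00" * 512)))   # 512
```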

13.
Article in English | MEDLINE | ID: mdl-24401992

ABSTRACT

We describe a scalable database cluster for the spatial analysis and annotation of high-throughput brain imaging data, initially for 3-d electron microscopy image stacks, but for time-series and multi-channel data as well. The system was designed primarily for workloads that build connectomes (neural connectivity maps of the brain) using the parallel execution of computer vision algorithms on high-performance compute clusters. These services and open-science data sets are publicly available at openconnecto.me. The system design inherits much from NoSQL scale-out and data-intensive computing architectures. We distribute data to cluster nodes by partitioning a spatial index. We direct I/O to different systems (reads to parallel disk arrays and writes to solid-state storage) to avoid I/O interference and maximize throughput. All programming interfaces are RESTful Web services, which are simple and stateless, improving scalability and usability. We include a performance evaluation of the production system, highlighting the effectiveness of spatial data organization.
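As one concrete way to picture partitioning by a spatial index, the sketch below interleaves 3-D cuboid coordinates into a Z-order (Morton) key and maps keys to cluster nodes; Z-order is a common choice for such indexes, and the paper's exact scheme may differ:

```python
def morton3d(x, y, z, bits=16):
    """Interleave the low `bits` bits of x, y, z into a Z-order (Morton) key."""
    key = 0
    for i in range(bits):
        key |= ((x >> i) & 1) << (3 * i)
        key |= ((y >> i) & 1) << (3 * i + 1)
        key |= ((z >> i) & 1) << (3 * i + 2)
    return key

def node_for_cuboid(x, y, z, num_nodes=8):
    # A simple modulus spreads cuboids across nodes for load balance;
    # range-partitioning the keys would instead preserve spatial locality.
    return morton3d(x, y, z) % num_nodes

print(node_for_cuboid(10, 20, 30))
```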
