ABSTRACT
SNT is an end-to-end framework for neuronal morphometry and whole-brain connectomics that supports tracing, proof-editing, visualization, quantification and modeling of neuroanatomy. With an open architecture, a large user base, community-based documentation, support for complex imagery and several model organisms, SNT is a flexible resource for the broad neuroscience community. SNT is both a desktop application and a multi-language scripting library, and it is available through the Fiji distribution of ImageJ.
Subjects
Brain/anatomy & histology, Neurons/cytology, Animals, Brain/cytology, Connectome, Humans, Single-Cell Analysis

ABSTRACT
To quantitatively understand biological processes that occur over many hours or days, it is desirable to image multiple samples simultaneously, and to automatically process and analyse the resulting datasets. Here, we present a complete multi-sample preparation, imaging, processing and analysis workflow to determine the development of the vascular volume in zebrafish. Up to five live embryos were mounted and imaged simultaneously over several days using selective plane illumination microscopy (SPIM). The resulting multi-terabyte image dataset was processed automatically on a high-performance computer cluster and segmented using a novel segmentation approach that uses images of red blood cells as training data. This analysis yielded a precise quantification of the growth characteristics of the whole vascular network, the head vasculature and the tail vasculature over development. Our multi-sample platform demonstrates effective upgrades to conventional single-sample imaging platforms and paves the way for diverse quantitative long-term imaging studies.
Subjects
Cardiovascular System/embryology, Image Processing, Computer-Assisted/methods, Microscopy, Fluorescence/methods, Animals, Biological Phenomena, Cluster Analysis, Embryo, Nonmammalian, Green Fluorescent Proteins/metabolism, Software, Zebrafish

ABSTRACT
Modern microscopes create a data deluge, with gigabytes of data generated each second and terabytes per day. Storing and processing this data is a severe bottleneck, not fully alleviated by data compression. We argue that this is because images are processed as grids of pixels. To address this, we propose a content-adaptive representation of fluorescence microscopy images, the Adaptive Particle Representation (APR). The APR replaces pixels with particles positioned according to image content. The APR overcomes storage bottlenecks, as data compression does, but additionally overcomes memory and processing bottlenecks. Using noisy 3D images, we show that the APR adaptively represents the content of an image while maintaining image quality, and that it enables orders-of-magnitude benefits across a range of image processing tasks. The APR provides a simple and efficient content-aware representation of fluorescence microscopy images.
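The core idea of a content-adaptive representation can be illustrated with a toy sketch (this is not the actual APR algorithm or library, and the function names below are hypothetical): instead of storing every pixel, keep "particles" only where the signal changes quickly, and reconstruct flat regions by interpolation.

```python
# Toy content-adaptive sampling of a 1-D "image" (illustrative only; the
# real APR uses a multi-resolution particle structure in 3-D).

def adaptive_particles(signal, threshold=0.5):
    """Keep a particle wherever the value departs from the last kept
    particle by more than `threshold`; always keep both endpoints."""
    particles = [(0, signal[0])]
    for i in range(1, len(signal)):
        if abs(signal[i] - particles[-1][1]) > threshold:
            particles.append((i, signal[i]))
    if particles[-1][0] != len(signal) - 1:
        particles.append((len(signal) - 1, signal[-1]))
    return particles

def reconstruct(particles, length):
    """Linearly interpolate between particles back onto a dense grid."""
    out = [0.0] * length
    for (x0, v0), (x1, v1) in zip(particles, particles[1:]):
        for x in range(x0, x1 + 1):
            t = (x - x0) / (x1 - x0) if x1 > x0 else 0.0
            out[x] = v0 + t * (v1 - v0)
    return out

# A signal that is flat, ramps up sharply, then is flat again:
# flat regions need almost no particles, the ramp needs many.
signal = [0.0] * 20 + [float(i) for i in range(10)] + [9.0] * 20
particles = adaptive_particles(signal, threshold=0.5)
print(len(signal), len(particles))
```

In flat regions the representation stores almost nothing, while near the edge it keeps every sample, which is the same intuition that lets the APR cut storage, memory, and processing cost at once rather than only storage, as pixel-grid compression does.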