Results 1 - 2 of 2

1.
Biol Reprod; 110(6): 1041-1054, 2024 Jun 12.
Article in English | MEDLINE | ID: mdl-38159104

ABSTRACT

New microscopy techniques, in combination with tissue clearing protocols and emerging analytical approaches, have given researchers the tools to understand dynamic biological processes in a three-dimensional context. This paves the way for exploring new research questions in reproductive biology for which previous techniques provided only approximate resolution, as the new methodologies allow contextualized analysis of far larger volumes than was previously possible. Together, tissue optical clearing and three-dimensional imaging promise to bridge molecular mechanisms, macroscopic morphogenic development, and the maintenance of reproductive function into one cohesive, comprehensive understanding of the biology of the reproductive system. In this review, we survey the various tissue clearing techniques and imaging systems as they have been applied to the developing and adult reproductive system. We provide an overview of tools available for analyzing experimental data, with particular attention to the emergence of artificial intelligence-assisted methods and their applicability to image analysis. We conclude by evaluating how novel image analysis approaches applied to other organ systems could be incorporated into future experimental evaluation of reproductive biology.
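
As an illustration of the kind of volumetric inspection this review surveys, the sketch below loads a cleared whole-organ stack and opens it in a 3D viewer with physical voxel scaling. The file name, voxel spacing, and rendering settings are illustrative assumptions, and tifffile/napari stand in for whichever file format and viewer a given lab uses; this is not code from the article.

# Minimal sketch: load a cleared-tissue 3D volume and inspect it in 3D.
# The file path and voxel spacing are placeholders, not values from the review.
import tifffile
import napari

volume = tifffile.imread("cleared_ovary_stack.tif")   # (z, y, x) image stack
voxel_size_um = (4.0, 1.3, 1.3)                       # assumed anisotropic spacing, in micrometres

viewer = napari.Viewer(ndisplay=3)
viewer.add_image(
    volume,
    name="cleared ovary",
    scale=voxel_size_um,     # keeps distances and volumes in physical units
    colormap="magma",
    rendering="mip",         # maximum-intensity projection rendering
)
napari.run()                 # blocks until the viewer window is closed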


Subjects
Genitalia, Three-Dimensional Imaging, Animals, Genitalia/diagnostic imaging, Three-Dimensional Imaging/methods, Humans, Reproduction/physiology, Female, Computer-Assisted Image Processing/methods
2.
bioRxiv; 2024 May 14.
Article in English | MEDLINE | ID: mdl-38798456

ABSTRACT

The number and distribution of ovarian follicles in each growth stage provide a reliable readout of ovarian health and function. Leveraging techniques for three-dimensional (3D) imaging of ovaries in toto has the potential to uncover total, accurate ovarian follicle counts. However, because of the size and holistic nature of these images, counting oocytes is time-consuming and difficult. The advent of deep-learning algorithms has allowed for the rapid development of ultra-fast, automated methods to analyze microscopy images. In recent years, these pipelines have become more user-friendly and accessible to non-specialists. We used these tools to create OoCount, a high-throughput, open-source method for automatic oocyte segmentation and classification from fluorescent 3D microscopy images of whole mouse ovaries, based on a deep-learning convolutional neural network (CNN). We developed a fast tissue-clearing and spinning disk confocal-based imaging protocol to obtain 3D images of whole-mount perinatal and adult mouse ovaries. Fluorescently labeled oocytes in the 3D images were manually annotated in Napari to build a machine learning training dataset. This dataset was used to retrain the StarDist CNN within DL4MicEverywhere to automatically label all oocytes in the ovary. In a second phase, we used Accelerated Pixel and Object Classification, a Napari plugin, to classify the labeled oocytes and sort them into growth stages. Here, we provide an end-to-end protocol for producing high-quality 3D images of the perinatal and adult mouse ovary and for obtaining follicle counts and stage classifications. We also demonstrate how to customize OoCount to fit images produced in any lab. Using OoCount, we can obtain accurate counts of oocytes in each growth stage in the perinatal and adult ovary, improving our ability to study ovarian function and fertility.

Summary sentence: This protocol introduces OoCount, a high-throughput, open-source method for automatic oocyte segmentation and classification from fluorescent 3D microscopy images of whole mouse ovaries using a machine learning-based approach.
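
A minimal sketch of an OoCount-style analysis is given below, assuming a StarDist3D model has already been trained on the manually annotated Napari dataset (for example through the DL4MicEverywhere notebooks). The file paths, model name, feature choices, and the use of scikit-learn's random forest as a stand-in for the Accelerated Pixel and Object Classification object classifier are all illustrative assumptions, not the authors' code.

# Sketch of an OoCount-style pipeline: 3D oocyte segmentation with a trained
# StarDist model, then per-oocyte classification into growth stages.
# Paths, model name, features, and the sklearn classifier are illustrative.
import numpy as np
import tifffile
from csbdeep.utils import normalize
from stardist.models import StarDist3D
from skimage.measure import regionprops_table
from sklearn.ensemble import RandomForestClassifier

# 1) Segment oocytes in a whole-mount ovary volume (z, y, x).
volume = tifffile.imread("ovary_oocyte_channel.tif")
model = StarDist3D(None, name="oocount_stardist", basedir="models")  # previously trained model (assumed path)
labels, _ = model.predict_instances(
    normalize(volume, 1, 99.8),   # percentile normalization expected by StarDist
    n_tiles=(2, 4, 4),            # tile large volumes to fit in memory
)

# 2) Extract simple per-oocyte features (voxel count, intensity, diameter).
props = regionprops_table(labels, intensity_image=volume,
                          properties=("label", "area", "mean_intensity"))
diameter = (6 * props["area"] / np.pi) ** (1 / 3)   # equivalent-sphere diameter, in voxels
features = np.column_stack([props["area"], props["mean_intensity"], diameter])

# 3) Classify oocytes into growth stages. Stage annotations for a manually
#    labeled subset come from a hypothetical file; -1 marks unlabeled oocytes.
annotations = np.load("stage_annotations.npy")
train = annotations >= 0
clf = RandomForestClassifier(n_estimators=100).fit(features[train], annotations[train])
stages = clf.predict(features)
print(dict(zip(*np.unique(stages, return_counts=True))))  # follicle count per stage

In the published workflow, the final step is handled interactively inside Napari by the Accelerated Pixel and Object Classification plugin; the scikit-learn classifier above is only a scriptable approximation of that stage-assignment step.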
