1.
bioRxiv; 2024 May 29.
Article in English | MEDLINE | ID: mdl-38854106

ABSTRACT

Chromosomal instability (CIN) is a hallmark of cancer that drives metastasis, immune evasion, and treatment resistance. CIN results from chromosome mis-segregation events during anaphase, in which excess chromatin is packaged into micronuclei (MN) that can be enumerated to quantify CIN. Despite recent advances in automation through computer vision and machine learning, the assessment of CIN remains a predominantly manual and time-consuming task, hampering important work in the field. Here, we present micronuclAI, a novel pipeline for automated and reliable quantification of MN of varying size, morphology, and location from DNA-only stained images. In micronuclAI, single-cell crops are extracted from high-resolution microscopy images with the help of segmentation masks and then used to train a convolutional neural network (CNN) that outputs the number of MN associated with each cell. The pipeline was evaluated against manual single-cell-level counts by experts and against the routinely used MN ratio over the complete image. The classifier achieved a weighted F1 score of 0.937 on the test dataset, and the complete pipeline approaches human-level performance on various datasets derived from multiple human and murine cancer cell lines. The pipeline achieved a root-mean-square error (RMSE) of 0.0041, an R2 of 0.87, and a Pearson's correlation of 0.938 on images obtained at 10X magnification. We tested the approach in otherwise isogenic cell lines in which we genetically dialed up or down CIN rates, and also on a publicly available image dataset (obtained at 100X), achieving an RMSE of 0.0159, an R2 of 0.90, and a Pearson's correlation of 0.951. Given the increasing interest in developing therapies for CIN-driven cancers, this method provides an important, scalable, and rapid approach to quantifying CIN on routinely obtained images. We release a GUI implementation for easy access and utilization of the pipeline.
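The abstract reports agreement between predicted and expert MN ratios as RMSE, R2, and Pearson's correlation. As a minimal sketch of how such an evaluation can be computed (the function name and input arrays are hypothetical, not from the paper's released code):

```python
import numpy as np

def evaluate_mn_ratios(predicted, observed):
    """Compare predicted vs. manually counted micronuclei ratios per image.

    Returns (RMSE, R^2 as coefficient of determination, Pearson's r).
    """
    predicted = np.asarray(predicted, dtype=float)
    observed = np.asarray(observed, dtype=float)

    # Root-mean-square error between the two ratio series.
    rmse = np.sqrt(np.mean((predicted - observed) ** 2))

    # Coefficient of determination: 1 - residual SS / total SS.
    ss_res = np.sum((observed - predicted) ** 2)
    ss_tot = np.sum((observed - observed.mean()) ** 2)
    r2 = 1.0 - ss_res / ss_tot

    # Pearson correlation from the 2x2 correlation matrix.
    pearson = np.corrcoef(predicted, observed)[0, 1]
    return rmse, r2, pearson
```

With a perfect prediction the function returns an RMSE of 0 and R2 and Pearson's r of 1, which makes it easy to sanity-check before applying it to real per-image MN ratios.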

2.
ArXiv; 2024 Feb 08.
Article in English | MEDLINE | ID: mdl-38351940

ABSTRACT

Together with the molecular knowledge of genes and proteins, biological images promise to significantly enhance the scientific understanding of complex cellular systems and to advance predictive and personalized therapeutic products for human health. For this potential to be realized, quality-assured image data must be shared among labs at a global scale to be compared, pooled, and reanalyzed, thus unleashing untold potential beyond the original purpose for which the data were generated. There are two broad sets of requirements to enable image data sharing in the life sciences. One set is articulated in the companion White Paper entitled "Enabling Global Image Data Sharing in the Life Sciences," which is published in parallel and addresses the need to build the cyberinfrastructure for sharing the digital array data (arXiv:2401.13023 [q-bio.OT], https://doi.org/10.48550/arXiv.2401.13023). In this White Paper, we detail the other set of requirements, which involves collecting, managing, presenting, and propagating the contextual information essential to assess the quality, understand the content, interpret the scientific implications, and reuse image data in the context of the experimental details. We start by providing an overview of the main lessons learned to date through international community activities, which have recently made considerable progress toward generating community standard practices for imaging Quality Control (QC) and metadata. We then provide a clear set of recommendations for amplifying this work. The driving goal is to address remaining challenges and to democratize access to common practices and tools for a spectrum of biomedical researchers, regardless of their expertise, access to resources, and geographical location.
