Results 1 - 2 of 2
1.
Comput Biol Med; 180: 108967, 2024 Sep.
Article in English | MEDLINE | ID: mdl-39111154

ABSTRACT

BACKGROUND AND OBJECTIVE: Papanicolaou staining has been used successfully to assist early detection of cervical cancer for several decades. We postulate that this staining technique can also be used to assist early detection of oral cancer, which is responsible for about 300,000 deaths every year. The rationale for this claim rests on two key observations: (i) nuclear atypia, i.e., changes in volume, shape, and staining properties of the cell nuclei, can be linked to rapid cell proliferation and genetic instability; and (ii) Papanicolaou staining allows one to reliably segment cell nuclei and cytoplasms. While Papanicolaou staining is an attractive tool due to its low cost, its interpretation requires a trained pathologist. Our goal is to automate the segmentation and classification of the morphological features needed to evaluate the use of Papanicolaou staining for early detection of mouth cancer.

METHODS: We built a convolutional neural network (CNN) for automatic segmentation and classification of cells in Papanicolaou-stained images. Our CNN was trained and evaluated on a new image dataset of cells from the oral mucosa consisting of 1,563 Full HD images from 52 patients, annotated by specialists. The effectiveness of our model was evaluated against a group of experts. Its robustness was also demonstrated on five public datasets of cervical images captured with different microscopes and cameras and having different resolutions, colors, background intensities, and noise levels.

RESULTS: Our CNN model achieved expert-level performance in a comparison with a group of three human experts on a set of 400 Papanicolaou-stained images of the oral mucosa from 20 patients. The results of this experiment exhibited high Intraclass Correlation Coefficient (ICC) values. Despite being trained on images from the oral mucosa, the model produced high-quality segmentation and plausible classification for five public datasets of cervical cells. Our Papanicolaou-stained image dataset is the most diverse publicly available image dataset of the oral mucosa in terms of number of patients.

CONCLUSION: Our solution provides the means for exploring the potential of Papanicolaou staining as a powerful and inexpensive tool for early detection of oral cancer. We are currently using our system to detect suspicious cells and cell clusters in oral mucosa slide images. Our trained model, code, and dataset are available and can help practitioners and stimulate research in early oral cancer detection.
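The RESULTS section reports agreement between the CNN and three human experts as Intraclass Correlation Coefficient (ICC) values, but the abstract does not say which ICC form was used. As a minimal sketch, assuming ICC(2,1) (two-way random effects, absolute agreement, single rater) and purely illustrative counts, agreement between a model and several raters can be computed in plain NumPy as follows; all names and numbers below are hypothetical, not taken from the paper.

```python
import numpy as np

def icc_2_1(ratings: np.ndarray) -> float:
    """ICC(2,1): two-way random effects, absolute agreement, single rater.

    ratings has shape (n_targets, k_raters): one row per image, one column
    per rater (e.g. the CNN plus each human expert), each cell a count/score.
    """
    ratings = np.asarray(ratings, dtype=float)
    n, k = ratings.shape
    grand = ratings.mean()
    row_means = ratings.mean(axis=1)   # per image
    col_means = ratings.mean(axis=0)   # per rater

    # Two-way ANOVA decomposition of the total sum of squares.
    ss_rows = k * np.sum((row_means - grand) ** 2)
    ss_cols = n * np.sum((col_means - grand) ** 2)
    ss_total = np.sum((ratings - grand) ** 2)
    ss_err = ss_total - ss_rows - ss_cols

    ms_rows = ss_rows / (n - 1)
    ms_cols = ss_cols / (k - 1)
    ms_err = ss_err / ((n - 1) * (k - 1))

    return (ms_rows - ms_err) / (
        ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n
    )

# Toy usage: per-image cell counts from the model and three hypothetical experts.
counts = np.array([
    [12, 11, 12, 13],
    [30, 29, 31, 30],
    [ 7,  8,  7,  7],
    [22, 21, 22, 23],
    [15, 15, 14, 16],
])
print(f"ICC(2,1) = {icc_2_1(counts):.3f}")
```

Values close to 1 indicate that the automated counts are interchangeable with those of the human raters, which is the sense in which the abstract claims expert-level performance.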


Subjects
Mouth Neoplasms; Papanicolaou Test; Humans; Mouth Neoplasms/pathology; Mouth Neoplasms/diagnostic imaging; Female; Neural Networks, Computer; Image Processing, Computer-Assisted/methods; Staining and Labeling/methods; Early Detection of Cancer/methods
2.
Comput Methods Programs Biomed; 242: 107788, 2023 Dec.
Article in English | MEDLINE | ID: mdl-37738838

ABSTRACT

BACKGROUND AND OBJECTIVE: Oral cancer is the sixth most common kind of human cancer. Brush cytology for counting Argyrophilic Nucleolar Organizer Regions (AgNORs) can help early detection of mouth cancer, lowering patient mortality. However, the manual counting of AgNORs still in use today is time-consuming, labor-intensive, and error-prone. The goal of our work is to address these shortcomings by proposing a convolutional neural network (CNN) based method that automatically segments individual nuclei and AgNORs in microscope slide images and counts the number of AgNORs within each nucleus.

METHODS: We systematically defined, trained, and tested 102 CNNs in the search for a high-performing solution. This included the evaluation of 51 network architectures combining 17 encoders with 3 decoders and 2 loss functions. These CNNs were trained and evaluated on a new AgNOR-stained image dataset of epithelial cells from the oral mucosa containing 1,171 images from 48 patients, with ground truth annotated by specialists. The annotations were greatly facilitated by a semi-automatic procedure developed in our project. Overlapping nuclei, which tend to hide AgNORs and thus affect their true count, were discarded using an automatic solution also developed in our project. Besides the evaluation on the test dataset, the robustness of the best performing model was assessed against the results produced by a group of human experts on a second dataset.

RESULTS: The best performing CNN model on the test dataset was a DenseNet-169 + LinkNet with Focal Loss (DenseNet-169 as encoder and LinkNet as decoder). It obtained a Dice score of 0.90 and an intersection over union (IoU) of 0.84. The counting of nuclei and AgNORs achieved precision and recall of 0.94 and 0.90 for nuclei, and 0.82 and 0.74 for AgNORs, respectively. Our solution achieved a performance similar to human experts on a set of 291 images from 6 new patients, obtaining an Intraclass Correlation Coefficient (ICC) of 0.91 for nuclei and 0.81 for AgNORs, with 95% confidence intervals of [0.89, 0.93] and [0.77, 0.84], respectively, and p-values < 0.001, confirming statistical significance. Our AgNOR-stained image dataset is the most diverse publicly available AgNOR-stained image dataset in terms of number of patients and the first for oral cells.

CONCLUSIONS: CNN-based joint segmentation and quantification of nuclei and NORs in AgNOR-stained images achieves expert-like performance levels while being orders of magnitude faster than the latter. Our solution demonstrated this by showing strong agreement with the results produced by a group of specialists, highlighting its potential to accelerate diagnostic workflows. Our trained model, code, and dataset are available and can stimulate new research in early oral cancer detection.
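The best model named in the RESULTS section combines a DenseNet-169 encoder, a LinkNet decoder, and Focal Loss. The sketch below shows how such a model can be assembled with the segmentation_models_pytorch library; this is an assumption about tooling for illustration only, not necessarily the authors' implementation, and the class count, ImageNet pre-training, learning rate, and batch shapes are likewise illustrative.

```python
import torch
import segmentation_models_pytorch as smp

# LinkNet decoder over a DenseNet-169 encoder, as reported in the abstract.
model = smp.Linknet(
    encoder_name="densenet169",   # encoder named in the abstract
    encoder_weights="imagenet",   # assumption: ImageNet pre-training
    in_channels=3,                # RGB microscope images
    classes=2,                    # assumption: background vs. nucleus (per-task masks)
)

# Focal Loss, the loss function reported for the best model.
loss_fn = smp.losses.FocalLoss(mode="multiclass")
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

# One illustrative training step on a dummy batch.
images = torch.randn(4, 3, 256, 256)          # (batch, channels, H, W)
masks = torch.randint(0, 2, (4, 256, 256))    # integer class label per pixel

optimizer.zero_grad()
logits = model(images)                        # (4, 2, 256, 256)
loss = loss_fn(logits, masks)
loss.backward()
optimizer.step()

# Dice (F1) and IoU, the segmentation metrics reported in the abstract.
preds = logits.argmax(dim=1)
tp, fp, fn, tn = smp.metrics.get_stats(preds, masks, mode="multiclass", num_classes=2)
print("Dice:", smp.metrics.f1_score(tp, fp, fn, tn, reduction="micro").item())
print("IoU:", smp.metrics.iou_score(tp, fp, fn, tn, reduction="micro").item())
```

The reported search over 17 encoders, 3 decoders, and 2 loss functions corresponds to swapping the encoder_name, the decoder class (e.g. smp.Unet or smp.Linknet), and the loss in a grid of such configurations.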


Subjects
Mouth Neoplasms; Nucleolus Organizer Region; Humans; Silver Staining/methods; Mouth Neoplasms/diagnostic imaging; Neural Networks, Computer