Results 1 - 2 of 2

1.
Diagnostics (Basel) ; 13(23)2023 Nov 27.
Article in English | MEDLINE | ID: mdl-38066780

ABSTRACT

(1) Background: The categorization of odontogenic keratocysts (OKCs) as recurrent or non-recurrent is complex and challenging for both clinicians and pathologists. What sets this cyst apart is its aggressive nature and high likelihood of recurrence. Although various predictive clinical, radiological, and histopathological parameters have been identified, clinicians still face difficulties in therapeutic management because of the cyst's inherently aggressive behavior. This research aims to build a pipeline that accurately distinguishes recurring from non-recurring OKCs. (2) Objective: To automate the risk stratification of OKCs as recurring or non-recurring from whole slide images (WSIs) using an attention-based image sequence analyzer (ABISA). (3) Materials and methods: The presented architecture combines transformer-based self-attention with sequential modeling using a long short-term memory (LSTM) network to predict the class label. Self-attention captures spatial dependencies within image patches, while the LSTM captures sequential dependencies across patches, making the combination well suited to this analysis. The model was trained and evaluated on a custom dataset of 48 labeled WSIs (508 tiled images) generated from the highest zoom level of each WSI. (4) Results: The proposed ABISA algorithm attained a testing accuracy, recall, and area under the curve of 0.98, 1.0, and 0.98, respectively, whereas VGG16, VGG19, Inception V3, and a standard vision transformer attained testing accuracies of 0.80, 0.73, 0.82, and 0.91, respectively. ABISA used 58% fewer trainable parameters than the standard vision transformer. (5) Conclusions: The proposed ABISA algorithm was integrated into a risk stratification pipeline that automates the detection of recurring OKCs, allowing the pathologist to stratify risk significantly faster.
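A minimal sketch of the attention-plus-LSTM idea described in this abstract, written in PyTorch. The patch dimension, embedding size, layer counts, and sequence length are illustrative assumptions, not the authors' settings; this is not the published ABISA implementation.

import torch
import torch.nn as nn

class AttentionLSTMClassifier(nn.Module):
    # Self-attention over a sequence of patch embeddings followed by an LSTM,
    # loosely following the ABISA description; all sizes are assumptions.
    def __init__(self, patch_dim=768, embed_dim=256, num_heads=4,
                 lstm_hidden=128, num_classes=2):
        super().__init__()
        self.embed = nn.Linear(patch_dim, embed_dim)  # project flattened patches
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=embed_dim, nhead=num_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=2)
        self.lstm = nn.LSTM(embed_dim, lstm_hidden, batch_first=True)
        self.head = nn.Linear(lstm_hidden, num_classes)  # recurring vs non-recurring

    def forward(self, patches):
        # patches: (batch, seq_len, patch_dim), one row of flattened tiles per WSI
        x = self.encoder(self.embed(patches))   # spatial dependencies via self-attention
        _, (h, _) = self.lstm(x)                # sequential dependencies across patches
        return self.head(h[-1])                 # classify from the last hidden state

# Example: 4 slides, each represented by 32 flattened 16x16x3 patches.
model = AttentionLSTMClassifier()
print(model(torch.randn(4, 32, 768)).shape)  # torch.Size([4, 2])

The LSTM summarizes the attended patch sequence into a single hidden state, which is one plausible way to combine the two mechanisms the abstract names.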

2.
Diagnostics (Basel) ; 13(21)2023 Nov 04.
Article in English | MEDLINE | ID: mdl-37958281

ABSTRACT

The microscopic differentiation of odontogenic cysts from other cysts is intricate and can be perplexing for both clinicians and pathologists. Of particular interest is the odontogenic keratocyst (OKC), a developmental cyst with unique histopathological and clinical characteristics. What distinguishes this cyst is its aggressive nature and high tendency for recurrence. Clinicians encounter challenges in managing this frequently encountered jaw lesion, as there is no consensus on surgical treatment. Accurate and early diagnosis of such cysts therefore benefits clinicians in treatment planning and spares patients the distress of aggressive OKCs, which impair their quality of life. The objective of this research is to develop an automated OKC diagnostic system that can serve as a decision support tool for pathologists, whether they are working locally or remotely, by providing additional data and insights to strengthen their decision-making. The research delivers an automated pipeline for classifying whole-slide images (WSIs) of OKCs and non-keratocysts (non-KCs: dentigerous and radicular cysts). Deep-learning-based OKC diagnosis and prognosis from the histopathological analysis of WSIs is an emerging research area; WSIs have the unique advantage of magnifying tissue at high resolution without losing information. The contribution of this research is a novel, efficient, deep-learning-based algorithm that reduces the number of trainable parameters and, in turn, the memory footprint. This is achieved by combining principal component analysis (PCA) and the ReliefF feature selection algorithm with a convolutional neural network (CNN), yielding a model named P-C-ReliefF. The proposed model reduces the trainable parameters compared to a standard CNN while achieving 97% classification accuracy.
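A rough sketch of the feature-reduction idea behind P-C-ReliefF as summarized above: CNN-derived tile features are compressed with PCA, ranked with ReliefF, and fed to a lightweight classifier. The exact way the authors embed these steps in the CNN is not described here; the feature sizes, the placeholder data, and the use of scikit-learn and skrebate are assumptions for illustration only.

import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline
from skrebate import ReliefF

# Stand-in for pooled CNN tile features (e.g. 508 tiles x 1024 features) with
# binary OKC / non-KC labels; in practice these would come from a CNN backbone.
X = np.random.rand(508, 1024)
y = np.random.randint(0, 2, size=508)

pipeline = Pipeline([
    ("pca", PCA(n_components=64)),                                   # compress correlated features
    ("relieff", ReliefF(n_features_to_select=16, n_neighbors=10)),   # keep the most discriminative components
    ("clf", LogisticRegression(max_iter=1000)),                      # lightweight classification head
])
pipeline.fit(X, y)
print(pipeline.score(X, y))

Shrinking the feature space before the final classifier is what cuts the trainable parameters and memory footprint that the abstract highlights.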
