Results 1 - 2 of 2
1.
Sensors (Basel); 23(11), 2023 May 26.
Article in English | MEDLINE | ID: mdl-37299826

ABSTRACT

The preoperative differentiation of breast phyllodes tumors (PTs) from fibroadenomas (FAs) plays a critical role in selecting the appropriate surgical treatment. Although several imaging modalities are available, reliable differentiation between PT and FA remains a major challenge for radiologists in clinical practice. Artificial intelligence (AI)-assisted diagnosis has shown promise in distinguishing PT from FA, but previous studies used very small sample sizes. In this work, we retrospectively enrolled 656 breast tumors (372 FAs and 284 PTs) with 1945 ultrasound images in total. Two experienced ultrasound physicians independently evaluated the ultrasound images, and three deep-learning models (ResNet, VGG, and GoogLeNet) were applied to classify FAs and PTs. Model robustness was assessed by fivefold cross-validation, and performance was measured with the receiver operating characteristic (ROC) curve; the area under the curve (AUC), accuracy, sensitivity, specificity, positive predictive value (PPV), and negative predictive value (NPV) were also calculated. Among the three models, ResNet yielded the highest AUC (0.91), with 95.3% accuracy, 96.2% sensitivity, and 94.7% specificity on the test set. In contrast, the two physicians achieved an average AUC of 0.69, with 70.7% accuracy, 54.4% sensitivity, and 53.2% specificity. Our findings indicate that deep learning outperforms physicians in distinguishing PTs from FAs, suggesting that AI is a valuable tool for aiding clinical diagnosis and advancing precision therapy.
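The accuracy, sensitivity, specificity, PPV, and NPV reported above are all derived from a binary confusion matrix. The paper does not publish its evaluation code; the following is a minimal sketch of how such metrics are computed (the function name and the label convention, 1 = PT and 0 = FA, are illustrative):

```python
def binary_metrics(y_true, y_pred):
    """Confusion-matrix metrics for a binary classifier (1 = PT, 0 = FA)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    return {
        "accuracy":    (tp + tn) / (tp + tn + fp + fn),
        "sensitivity": tp / (tp + fn),  # true positive rate
        "specificity": tn / (tn + fp),  # true negative rate
        "ppv":         tp / (tp + fp),  # positive predictive value
        "npv":         tn / (tn + fn),  # negative predictive value
    }
```

The AUC additionally requires the models' continuous prediction scores, not just the thresholded labels, which is why it can rank the models differently than accuracy alone.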


Subjects
Breast Neoplasms , Deep Learning , Fibroadenoma , Phyllodes Tumor , Physicians , Female , Humans , Phyllodes Tumor/diagnostic imaging , Phyllodes Tumor/pathology , Retrospective Studies , Fibroadenoma/diagnostic imaging , Fibroadenoma/pathology , Artificial Intelligence , Differential Diagnosis , Breast Neoplasms/diagnostic imaging
2.
Phys Med Biol; 69(9), 2024 Apr 17.
Article in English | MEDLINE | ID: mdl-38537298

ABSTRACT

Objective. Accurate assessment of the pleural line is crucial for the application of lung ultrasound (LUS) in monitoring lung diseases; the aim of this study was therefore to develop a quantitative and qualitative analysis method for the pleural line. Approach. A novel cascaded deep-learning model based on convolution and multilayer perceptron was proposed to locate and segment the pleural line in LUS images, and its outputs were used for quantitative analysis of textural and morphological features, respectively. Using the gray-level co-occurrence matrix and self-designed statistical methods, eight textural and three morphological features were generated to characterize the pleural lines. Furthermore, machine-learning classifiers were employed to qualitatively evaluate the lesion degree of the pleural line in LUS images. Main results. We prospectively evaluated 3770 LUS images acquired from 31 pneumonia patients. Experimental results demonstrated that the proposed pleural line extraction and evaluation methods perform well, with a Dice coefficient of 0.87 and an accuracy of 94.47%, respectively, and comparison with previous methods showed statistically significant differences (P < 0.001 for all). Meanwhile, generalization experiments confirmed the feasibility of the proposed method across multiple data scenarios. Significance. The proposed method has great application potential for assessing the pleural line in LUS images and for aiding lung disease diagnosis and treatment.
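The textural analysis rests on the gray-level co-occurrence matrix (GLCM), which counts how often pairs of gray levels co-occur at a fixed pixel offset; Haralick-style statistics are then read off the normalized matrix. The paper's exact feature set is not published, so this is only a sketch of the underlying technique (the offset, level count, and the two features shown are illustrative choices, not the authors'):

```python
import numpy as np

def glcm(img, dx=1, dy=0, levels=8):
    """Symmetric gray-level co-occurrence matrix for one pixel offset
    (dx, dy), normalized to a joint probability distribution.
    `img` must be an integer array with values in [0, levels)."""
    m = np.zeros((levels, levels), dtype=float)
    h, w = img.shape
    for y in range(h - dy):
        for x in range(w - dx):
            i, j = img[y, x], img[y + dy, x + dx]
            m[i, j] += 1
            m[j, i] += 1  # count both orderings -> symmetric matrix
    return m / m.sum()

def texture_features(p):
    """Two classic Haralick-style features from a normalized GLCM."""
    i, j = np.indices(p.shape)
    contrast = float(np.sum(p * (i - j) ** 2))        # local intensity variation
    homogeneity = float(np.sum(p / (1.0 + (i - j) ** 2)))  # closeness to diagonal
    return contrast, homogeneity
```

A perfectly uniform region puts all GLCM mass on the diagonal, giving zero contrast and homogeneity of 1; a bright/dark pleural-line boundary moves mass off the diagonal and raises contrast.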


Subjects
Lung , Pneumonia , Humans , Lung/diagnostic imaging , Thorax , Ultrasonography/methods , Neural Networks (Computer)