Results 1 - 4 of 4
1.
Biomed Eng Online ; 20(1): 112, 2021 Nov 18.
Article in English | MEDLINE | ID: mdl-34794443

ABSTRACT

BACKGROUND: The rapid development of artificial intelligence has improved automatic breast cancer diagnosis beyond what traditional machine learning methods achieve. Convolutional neural networks (CNNs) can automatically learn highly discriminative features, which raises the level of computer-aided diagnosis (CAD) and improves the discrimination of benign from malignant breast ultrasound (BUS) tumor images, making rapid breast tumor screening possible.

RESULTS: The classification model was evaluated on a separate dataset of 100 BUS tumor images (50 benign and 50 malignant) that was not used in network training. Evaluation metrics were accuracy, sensitivity, specificity, and area under the curve (AUC). On this set, the Fus2Net model classified BUS tumor images with an accuracy of 92%, a sensitivity of 95.65%, a specificity of 88.89%, and an AUC of 0.97.

CONCLUSIONS: Compared with existing CNN classification architectures, the Fus2Net architecture we customized offers better overall performance. The results demonstrate that the proposed Fus2Net classification method can better assist radiologists in diagnosing benign and malignant BUS tumor images.

METHODS: Existing public datasets are small and suffer from class imbalance. In this paper, we provide a relatively large dataset of 1052 ultrasound images (696 benign and 356 malignant) collected from a local hospital. We propose a novel CNN, Fus2Net, for benign-malignant classification of BUS tumor images; it contains two self-designed feature extraction modules. To evaluate how the classifier generalizes on the experimental dataset, we used the training set (646 benign and 306 malignant cases) for tenfold cross-validation. To mitigate the class imbalance, the training data were augmented before being fed into Fus2Net. In the experiments, we used hyperparameter fine-tuning and regularization to make Fus2Net converge.
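As a concrete illustration of the evaluation protocol described above, here is a minimal Python sketch of stratified tenfold cross-validation with training-time augmentation. Fus2Net itself is not reproduced; the stand-in network, the specific transforms, and the function names are illustrative assumptions, not the authors' code.

```python
import torch.nn as nn
from sklearn.model_selection import StratifiedKFold
from torchvision import transforms

# Training-time augmentation, applied only to the training folds; these
# particular transforms are assumptions, not the paper's recipe.
train_tf = transforms.Compose([
    transforms.RandomHorizontalFlip(),
    transforms.RandomRotation(10),
    transforms.ToTensor(),
])

class TinyBUSNet(nn.Module):
    """Stand-in for Fus2Net (hypothetical): any binary-output CNN fits here."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(32, 2)  # benign vs. malignant

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))

def cross_validate(images, labels, n_splits=10):
    """images: list of grayscale PIL images; labels: list of 0/1 ints."""
    skf = StratifiedKFold(n_splits=n_splits, shuffle=True, random_state=0)
    for fold, (train_idx, val_idx) in enumerate(skf.split(images, labels)):
        model = TinyBUSNet()
        # Train on train_tf-augmented images[train_idx]; report accuracy,
        # sensitivity, specificity, and AUC on the untouched images[val_idx].
        print(f"fold {fold}: {len(train_idx)} train / {len(val_idx)} val")
```

Augmentation is applied only inside the training folds so the held-out fold measures generalization, matching the protocol the abstract describes.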


Subject(s)
Artificial Intelligence; Breast Neoplasms; Breast Neoplasms/diagnostic imaging; Female; Humans; Machine Learning; Neural Networks, Computer; Ultrasonography, Mammary
2.
J Imaging Inform Med ; 2024 Feb 21.
Article in English | MEDLINE | ID: mdl-38381383

ABSTRACT

The purpose of this study was to fuse conventional radiomic and deep features from digital breast tomosynthesis craniocaudal projection (DBT-CC) and ultrasound (US) images to establish a multimodal benign-malignant classification model and evaluate its clinical value. Data were obtained from 487 patients at three centers, each of whom underwent DBT-CC and US examinations. A total of 322 patients from dataset 1 were used to construct the model, while 165 patients from datasets 2 and 3 formed the prospective testing cohort. Two radiologists with 10-20 years of experience and three sonographers with 12-20 years of experience semiautomatically segmented the lesions with ITK-SNAP software, taking the surrounding tissue into account. We extracted conventional radiomic and deep features from tumors in the DBT-CC and US images using PyRadiomics and Inception-v3, and additionally extracted conventional radiomic features from four peritumoral layers around each tumor in both modalities. Features from the intratumoral and peritumoral regions were fused separately. For classification, we tested SVM, KNN, decision tree, random forest (RF), XGBoost, and LightGBM classifiers, employing early fusion and late fusion (ensemble and stacking) strategies for feature fusion. Using the SVM classifier, stacking fusion of deep features and radiomic features from three peritumoral layers in the DBT-CC and US images achieved the optimal performance, with an accuracy of 0.953, an AUC of 0.959 [CI: 0.886-0.996], a sensitivity of 0.952 [CI: 0.888-0.992], a specificity of 0.955 [CI: 0.868-0.985], and a precision of 0.976. These results indicate that fusing deep features with peritumoral radiomic features from tumors in DBT-CC and US images shows promise for differentiating benign and malignant breast tumors.
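One plausible reading of the best-performing configuration above (stacking fusion with an SVM meta-classifier) can be sketched with scikit-learn. Feature extraction with PyRadiomics and Inception-v3 is assumed to have already produced per-lesion arrays; the choice of base learners, the array names, and the hyperparameters below are illustrative assumptions.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier, StackingClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def build_stacker():
    # Two of the classifiers named in the abstract serve as base learners;
    # an SVM acts as the meta-classifier (one reading of "using the SVM
    # classifier" together with stacking fusion).
    base = [
        ("rf", RandomForestClassifier(n_estimators=200, random_state=0)),
        ("knn", KNeighborsClassifier(n_neighbors=5)),
    ]
    return StackingClassifier(
        estimators=base,
        final_estimator=make_pipeline(StandardScaler(), SVC(probability=True)),
        cv=5,  # out-of-fold base predictions feed the meta-learner
    )

def fit_fusion(deep_feats, radiomic_feats, labels):
    """deep_feats: (n, d1) Inception-v3 features; radiomic_feats: (n, d2)
    peritumoral PyRadiomics features; labels: (n,) 0=benign, 1=malignant."""
    X = np.hstack([deep_feats, radiomic_feats])  # concatenate per lesion
    return build_stacker().fit(X, labels)
```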

3.
Med Eng Phys ; 125: 104117, 2024 Mar.
Article in English | MEDLINE | ID: mdl-38508797

ABSTRACT

This study aims to establish an effective benign-malignant classification model for breast tumor ultrasound images using conventional radiomics and transfer learning features. In collaboration with a local hospital, we collected a base dataset (Dataset A) of 1050 single-lesion 2D ultrasound images from patients, with a total of 593 benign and 357 malignant tumor cases. The experimental approach comprises three main parts: conventional radiomics, transfer learning, and feature fusion. We also assessed the model's generalizability using multicenter data from Datasets B and C. With conventional radiomics, the SVM classifier achieved the highest balanced accuracy of 0.791, while XGBoost obtained the highest AUC of 0.854. For transfer learning, we extracted deep features from ResNet50, Inception-v3, DenseNet121, MNASNet, and MobileNet. Among these models, MNASNet, with 640-dimensional deep features, yielded the best performance: a balanced accuracy of 0.866, an AUC of 0.937, a sensitivity of 0.819, and a specificity of 0.913. In the feature fusion phase, we trained SVM, ExtraTrees, XGBoost, and LightGBM classifiers on early-fused features and combined them by weighted voting, achieving the highest balanced accuracy of 0.964 and AUC of 0.981. Combining conventional radiomics and transfer learning features shows clear advantages over individual feature sets for breast tumor ultrasound image classification. This automated diagnostic model can ease the burden on patients and provide additional diagnostic support to radiologists, and its performance encourages future prospective research in this domain.
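The fusion-phase ensemble described above can be sketched as early fusion (feature concatenation) followed by weighted soft voting over the four named classifiers. The equal voting weights and the classifier hyperparameters below are placeholder assumptions, not the reported configuration.

```python
import numpy as np
from lightgbm import LGBMClassifier
from sklearn.ensemble import ExtraTreesClassifier, VotingClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from xgboost import XGBClassifier

def weighted_vote_model(weights=(1, 1, 1, 1)):
    # The four classifiers named in the abstract; equal weights are a
    # placeholder, since the actual weighting scheme is not given here.
    members = [
        ("svm", make_pipeline(StandardScaler(), SVC(probability=True))),
        ("extratrees", ExtraTreesClassifier(n_estimators=300, random_state=0)),
        ("xgb", XGBClassifier(eval_metric="logloss")),
        ("lgbm", LGBMClassifier()),
    ]
    return VotingClassifier(members, voting="soft", weights=list(weights))

def fit_early_fusion(radiomic, deep, y):
    """radiomic: (n, d1) conventional features; deep: (n, d2) MNASNet
    features; y: (n,) labels. Early fusion = plain concatenation."""
    X = np.hstack([radiomic, deep])
    return weighted_vote_model().fit(X, y)
```

Soft voting averages the members' predicted probabilities, so a single strong classifier (here, plausibly the boosted trees) can be upweighted without discarding the others.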


Subject(s)
Breast Neoplasms; Radiomics; Humans; Female; Retrospective Studies; Ultrasonography, Mammary; Machine Learning; Breast Neoplasms/diagnostic imaging
4.
Phys Eng Sci Med ; 46(3): 995-1013, 2023 Sep.
Article in English | MEDLINE | ID: mdl-37195403

ABSTRACT

Breast and thyroid cancers are among the most common cancers in women worldwide, and their early clinical diagnosis often relies on ultrasonography. Most ultrasound images of breast and thyroid cancer lack specificity, which reduces the accuracy of clinical ultrasound diagnosis. This study attempts to develop an effective convolutional neural network (E-CNN) for classifying benign and malignant breast and thyroid tumors in ultrasound images. Two-dimensional (2D) ultrasound images of 1052 breast tumors were collected, and 8245 2D tumor images were obtained from 76 thyroid cases. We performed tenfold cross-validation on the breast and thyroid data, obtaining mean classification accuracies of 0.932 and 0.902, respectively. In addition, the proposed E-CNN was applied to classify 9297 mixed images (breast and thyroid combined), yielding a mean classification accuracy of 0.875 and a mean area under the curve (AUC) of 0.955. Exploiting the shared modality, we transferred the breast model to classify the typical tumor images of the 76 thyroid patients; the fine-tuned model achieved a mean classification accuracy of 0.945 and a mean AUC of 0.958. Likewise, the transferred thyroid model achieved a mean classification accuracy of 0.932 and a mean AUC of 0.959 on the 1052 breast tumor images. These results demonstrate the ability of the E-CNN to learn the features of, and classify, breast and thyroid tumors, and suggest that transferring models within the same modality is a promising route to classifying benign and malignant tumors in ultrasound images.
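A hedged sketch of the same-modality transfer step: weights learned on the source ultrasound task are reloaded, the classification head is re-initialized for the target task, and the backbone is fine-tuned at a lower learning rate. The E-CNN is not reproduced here; the checkpoint path, the `classifier` attribute name, and the learning rates are assumptions for illustration.

```python
import torch
import torch.nn as nn

def finetune_for_target(model: nn.Module, checkpoint_path: str,
                        lr_backbone: float = 1e-4, lr_head: float = 1e-3):
    """Reload source-task weights, reset the head, and return an optimizer
    with a smaller learning rate on the transferred backbone."""
    model.load_state_dict(torch.load(checkpoint_path))
    head = model.classifier  # assumes a final nn.Linear named `classifier`
    nn.init.xavier_uniform_(head.weight)  # both tasks are binary, so the
    nn.init.zeros_(head.bias)             # head shape can be kept as-is
    backbone_params = [p for n, p in model.named_parameters()
                       if not n.startswith("classifier")]
    return torch.optim.Adam([
        {"params": backbone_params, "lr": lr_backbone},
        {"params": head.parameters(), "lr": lr_head},
    ])

# e.g. optimizer = finetune_for_target(my_model, "breast_model.pt")
# (model and checkpoint name are hypothetical)
```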


Subject(s)
Breast; Thyroid Neoplasms; Humans; Female; Breast/diagnostic imaging; Neural Networks, Computer; Ultrasonography; Diagnosis, Computer-Assisted