Fully automatic classification of automated breast ultrasound (ABUS) imaging according to BI-RADS using a deep convolutional neural network.
Hejduk, Patryk; Marcon, Magda; Unkelbach, Jan; Ciritsis, Alexander; Rossi, Cristina; Borkowski, Karol; Boss, Andreas.
Affiliations
  • Hejduk P; Institute of Diagnostic and Interventional Radiology, University Hospital Zurich, Rämistr. 100, 8091, Zurich, Switzerland. patryk.hejduk@usz.ch.
  • Marcon M; Institute of Diagnostic and Interventional Radiology, University Hospital Zurich, Rämistr. 100, 8091, Zurich, Switzerland.
  • Unkelbach J; Department of Radiation Oncology, University Hospital Zurich, Rämistr. 100, 8091, Zurich, Switzerland.
  • Ciritsis A; Institute of Diagnostic and Interventional Radiology, University Hospital Zurich, Rämistr. 100, 8091, Zurich, Switzerland.
  • Rossi C; Institute of Diagnostic and Interventional Radiology, University Hospital Zurich, Rämistr. 100, 8091, Zurich, Switzerland.
  • Borkowski K; Institute of Diagnostic and Interventional Radiology, University Hospital Zurich, Rämistr. 100, 8091, Zurich, Switzerland.
  • Boss A; Institute of Diagnostic and Interventional Radiology, University Hospital Zurich, Rämistr. 100, 8091, Zurich, Switzerland.
Eur Radiol; 32(7): 4868-4878, 2022 Jul.
Article in English | MEDLINE | ID: mdl-35147776
ABSTRACT

PURPOSE:

The aim of this study was to develop and test a post-processing technique for the detection and classification of lesions according to the BI-RADS atlas in automated breast ultrasound (ABUS), based on deep convolutional neural networks (dCNNs).

METHODS AND MATERIALS:

In this retrospective study, 645 ABUS datasets from 113 patients were included; 55 patients had lesions classified as having a high probability of malignancy. Lesions were categorized as BI-RADS 2 (no suspicion of malignancy), BI-RADS 3 (probability of malignancy < 3%), or BI-RADS 4/5 (probability of malignancy > 3%). A deep convolutional neural network was trained, after data augmentation, on images of lesions and of normal breast tissue, and a sliding-window approach for lesion detection was implemented. The algorithm was applied to a test dataset of 128 images, and its performance was compared with the readings of two experienced radiologists.
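A minimal sketch of the sliding-window idea described above (not the authors' code): a trained patch classifier is slid over each 2-D slice, and windows whose most probable class is not normal tissue mark candidate lesion positions. The placeholder network, window size, stride, and class indices are illustrative assumptions.

    # Sliding-window lesion detection sketch (assumptions: 4 classes,
    # index 0 = normal tissue; 64-px windows with stride 32; the tiny
    # network below merely stands in for the trained dCNN).
    import torch
    import torch.nn as nn

    NUM_CLASSES = 4  # normal, BI-RADS 2, BI-RADS 3, BI-RADS 4/5 (assumed)

    # Placeholder network standing in for the trained classifier.
    model = nn.Sequential(
        nn.Conv2d(1, 8, 3, padding=1), nn.ReLU(),
        nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        nn.Linear(8, NUM_CLASSES),
    )
    model.eval()

    def sliding_window_classify(image, window=64, stride=32):
        """Classify every window of a 2-D slice; return positions and probabilities."""
        h, w = image.shape
        positions, scores = [], []
        with torch.no_grad():
            for y in range(0, h - window + 1, stride):
                for x in range(0, w - window + 1, stride):
                    patch = image[y:y + window, x:x + window]
                    logits = model(patch.unsqueeze(0).unsqueeze(0))  # (1,1,H,W)
                    scores.append(torch.softmax(logits, dim=1).squeeze(0))
                    positions.append((y, x))
        return positions, torch.stack(scores)

    # Usage on a dummy slice: windows not classified as normal tissue
    # (class 0) are candidate lesion locations, giving slice-wise positions.
    slice_2d = torch.rand(256, 256)
    positions, probs = sliding_window_classify(slice_2d)
    candidates = [(pos, int(p.argmax())) for pos, p in zip(positions, probs)
                  if p.argmax() != 0]
    print(f"{len(candidates)} candidate lesion windows")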

RESULTS:

Calculations performed on single images showed an accuracy of 79.7% and an AUC of 0.91 [95% CI 0.85-0.96] for categorization according to BI-RADS. Moderate agreement between the dCNN and the ground truth was achieved (κ 0.57 [95% CI 0.50-0.64]), which is comparable with human readers. Analysis of the whole dataset improved the categorization accuracy to 90.9% with an AUC of 0.91 [95% CI 0.77-1.00] and almost perfect agreement with the ground truth (κ 0.82 [95% CI 0.69-0.95]), on par with human readers. Furthermore, the object localization technique allowed slice-wise detection of lesion positions.
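For illustration, the two reported metrics (multi-class AUC and Cohen's κ) can be computed with scikit-learn as sketched below; the labels and probabilities are made-up stand-ins, not the study's data.

    # Evaluation-metric sketch with hypothetical data: ground-truth BI-RADS
    # labels (0 = BI-RADS 2, 1 = BI-RADS 3, 2 = BI-RADS 4/5) and predicted
    # class probabilities for a handful of images.
    import numpy as np
    from sklearn.metrics import roc_auc_score, cohen_kappa_score

    y_true = np.array([0, 0, 1, 2, 2, 1, 0, 2])
    y_prob = np.array([
        [0.8, 0.1, 0.1],
        [0.6, 0.3, 0.1],
        [0.2, 0.7, 0.1],
        [0.1, 0.2, 0.7],
        [0.1, 0.1, 0.8],
        [0.3, 0.5, 0.2],
        [0.7, 0.2, 0.1],
        [0.2, 0.2, 0.6],
    ])
    y_pred = y_prob.argmax(axis=1)

    # One-vs-rest AUC, a common choice for multi-class classifiers.
    auc = roc_auc_score(y_true, y_prob, multi_class="ovr")
    # Cohen's kappa measures chance-corrected agreement with the ground
    # truth (0.41-0.60 "moderate", 0.81-1.00 "almost perfect").
    kappa = cohen_kappa_score(y_true, y_pred)
    print(f"AUC = {auc:.2f}, kappa = {kappa:.2f}")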

CONCLUSIONS:

Our results show that a dCNN can be trained to detect lesions in ABUS and to classify them according to BI-RADS with an accuracy similar to that of experienced radiologists.

KEY POINTS:

  • A deep convolutional neural network (dCNN) was trained to classify ABUS lesions according to the BI-RADS atlas.
  • A sliding-window approach allows accurate automatic detection and classification of lesions in ABUS examinations.

Full text: 1 Database: MEDLINE Main subject: Breast Neoplasms / Ultrasonography, Mammary Language: English Year of publication: 2022 Document type: Article