BI-RADS-Based Classification of Mammographic Soft Tissue Opacities Using a Deep Convolutional Neural Network.
Sabani, Albin; Landsmann, Anna; Hejduk, Patryk; Schmidt, Cynthia; Marcon, Magda; Borkowski, Karol; Rossi, Cristina; Ciritsis, Alexander; Boss, Andreas.
Affiliation
  • Sabani A; Institute of Diagnostic and Interventional Radiology, University Hospital of Zurich, University of Zurich, 8091 Zurich, Switzerland.
  • Landsmann A; Institute of Diagnostic and Interventional Radiology, University Hospital of Zurich, University of Zurich, 8091 Zurich, Switzerland.
  • Hejduk P; Institute of Diagnostic and Interventional Radiology, University Hospital of Zurich, University of Zurich, 8091 Zurich, Switzerland.
  • Schmidt C; Institute of Diagnostic and Interventional Radiology, University Hospital of Zurich, University of Zurich, 8091 Zurich, Switzerland.
  • Marcon M; Institute of Diagnostic and Interventional Radiology, University Hospital of Zurich, University of Zurich, 8091 Zurich, Switzerland.
  • Borkowski K; Institute of Diagnostic and Interventional Radiology, University Hospital of Zurich, University of Zurich, 8091 Zurich, Switzerland.
  • Rossi C; Institute of Diagnostic and Interventional Radiology, University Hospital of Zurich, University of Zurich, 8091 Zurich, Switzerland.
  • Ciritsis A; Institute of Diagnostic and Interventional Radiology, University Hospital of Zurich, University of Zurich, 8091 Zurich, Switzerland.
  • Boss A; Institute of Diagnostic and Interventional Radiology, University Hospital of Zurich, University of Zurich, 8091 Zurich, Switzerland.
Diagnostics (Basel) ; 12(7)2022 Jun 28.
Article in En | MEDLINE | ID: mdl-35885470
ABSTRACT
The aim of this study was to investigate the potential of a machine learning algorithm to classify breast cancer solely by the presence of soft tissue opacities in mammograms, independent of other morphological features, using a deep convolutional neural network (dCNN). Soft tissue opacities were classified by their radiological appearance according to the ACR BI-RADS atlas. We included 1744 mammograms from 438 patients and created 7242 icons by manual labeling. The icons were sorted into three categories: "no opacities" (BI-RADS 1), "probably benign opacities" (BI-RADS 2/3), and "suspicious opacities" (BI-RADS 4/5). A dCNN was trained (70% of the data), validated (20%), and tested (10%). A sliding-window approach was applied to create colored probability maps for visual assessment. The diagnostic performance of the dCNN was compared to the readout of experienced radiologists on a "real-world" dataset. The accuracies of the models on the test dataset ranged from 73.8% to 89.8%. Compared to human readout, the dCNN achieved a higher specificity (100%, 95% CI 85.4-100%; reader 1: 86.2%, 95% CI 67.4-95.5%; reader 2: 79.3%, 95% CI 59.7-91.3%), whereas its sensitivity (84.0%, 95% CI 63.9-95.5%) was lower than that of the human readers (reader 1: 88.0%, 95% CI 67.4-95.4%; reader 2: 88.0%, 95% CI 67.7-96.8%). In conclusion, a dCNN can be used for the automatic detection as well as the standardized, observer-independent classification of soft tissue opacities in mammograms, independent of the presence of microcalcifications. Human decision making in accordance with the BI-RADS classification can be mimicked by artificial intelligence.
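The sliding-window approach described in the abstract can be illustrated with a minimal sketch: a fixed-size window is moved across the image in strides, each patch is scored by the classifier, and the per-position class probabilities form the basis of a colored probability map. The code below is purely illustrative and is not the authors' implementation; `classify_patch` is a hypothetical stand-in for the trained dCNN, and the window/stride sizes are arbitrary assumptions.

```python
import numpy as np

# Hypothetical three-class labels mirroring the abstract's categories
CLASSES = ["no opacities (BI-RADS 1)",
           "probably benign (BI-RADS 2/3)",
           "suspicious (BI-RADS 4/5)"]

def classify_patch(patch):
    """Stand-in for the trained dCNN: returns a probability vector
    over the three classes. Patch statistics are used here purely
    for illustration; a real model would run a forward pass."""
    m = float(patch.mean())
    logits = np.array([1.0 - m, 0.5, m])
    p = np.exp(logits)
    return p / p.sum()

def probability_map(image, window=32, stride=16):
    """Slide a window over the image and record the class
    probabilities at each position -- the raw material for a
    colored overlay like the one described in the abstract."""
    h, w = image.shape
    rows = (h - window) // stride + 1
    cols = (w - window) // stride + 1
    pmap = np.zeros((rows, cols, len(CLASSES)))
    for i in range(rows):
        for j in range(cols):
            y, x = i * stride, j * stride
            pmap[i, j] = classify_patch(image[y:y + window, x:x + window])
    return pmap

if __name__ == "__main__":
    img = np.random.rand(128, 128)  # stand-in for a mammogram crop
    pmap = probability_map(img)
    print(pmap.shape)  # -> (7, 7, 3)
```

In practice the per-position probabilities would be upsampled and blended over the original mammogram to produce the visual heat map; overlapping strides (stride < window) smooth the result at the cost of more forward passes.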
Full text: 1 Collection: 01-internacional Database: MEDLINE Type of study: Guideline / Prognostic_studies Language: En Journal: Diagnostics (Basel) Year: 2022 Document type: Article Affiliation country: Switzerland