Automatic Detection of Post-Operative Clips in Mammography Using a U-Net Convolutional Neural Network.
Schnitzler, Tician; Ruppert, Carlotta; Hejduk, Patryk; Borkowski, Karol; Kajüter, Jonas; Rossi, Cristina; Ciritsis, Alexander; Landsmann, Anna; Zaytoun, Hasan; Boss, Andreas; Schindera, Sebastian; Burn, Felice.
Affiliation
  • Schnitzler T; Institute of Radiology, Cantonal Hospital Aarau, 5001 Aarau, Switzerland.
  • Ruppert C; Institute of Diagnostic and Interventional Radiology, University Hospital Zurich, 8091 Zurich, Switzerland.
  • Hejduk P; Institute of Diagnostic and Interventional Radiology, University Hospital Zurich, 8091 Zurich, Switzerland.
  • Borkowski K; Institute of Diagnostic and Interventional Radiology, University Hospital Zurich, 8091 Zurich, Switzerland.
  • Kajüter J; Institute of Diagnostic and Interventional Radiology, University Hospital Basel, 4031 Basel, Switzerland.
  • Rossi C; Institute of Diagnostic and Interventional Radiology, University Hospital Zurich, 8091 Zurich, Switzerland.
  • Ciritsis A; Institute of Diagnostic and Interventional Radiology, University Hospital Zurich, 8091 Zurich, Switzerland.
  • Landsmann A; Institute of Diagnostic and Interventional Radiology, University Hospital Zurich, 8091 Zurich, Switzerland.
  • Zaytoun H; Institute of Radiology, Cantonal Hospital Aarau, 5001 Aarau, Switzerland.
  • Boss A; Institute of Diagnostic and Interventional Radiology, University Hospital Zurich, 8091 Zurich, Switzerland.
  • Schindera S; Institute of Radiology, Cantonal Hospital Aarau, 5001 Aarau, Switzerland.
  • Burn F; Institute of Radiology, Cantonal Hospital Aarau, 5001 Aarau, Switzerland.
J Imaging; 10(6), 2024 Jun 19.
Article in English | MEDLINE | ID: mdl-38921624
ABSTRACT

BACKGROUND:

After breast conserving surgery (BCS), surgical clips indicate the tumor bed and, thereby, the most probable area for tumor relapse. The aim of this study was to investigate whether a U-Net-based deep convolutional neural network (dCNN) may be used to detect surgical clips in follow-up mammograms after BCS.

METHODS:

A total of 884 mammograms and 517 tomosynthesis images depicting surgical clips and calcifications were manually segmented and classified. A U-Net-based segmentation network was trained on 922 images and validated on 394 images. An external test dataset of 39 images was annotated by two radiologists with up to 7 years of experience in breast imaging. The network's performance was compared with that of the human readers using accuracy and interrater agreement (Cohen's kappa).
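The abstract does not report implementation details of the segmentation network. For orientation only, a minimal sketch of a generic U-Net encoder-decoder in PyTorch is shown below; the channel widths, depth, and the assumed 256×256 grayscale input are hypothetical choices, not the authors' published configuration.

```python
# Minimal U-Net sketch for binary segmentation of surgical clips.
# Architecture details (depth, channel widths, input size) are assumptions,
# not taken from the paper.
import torch
import torch.nn as nn


def double_conv(in_ch, out_ch):
    """Two 3x3 convolutions with batch norm and ReLU, as in standard U-Net blocks."""
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1),
        nn.BatchNorm2d(out_ch),
        nn.ReLU(inplace=True),
        nn.Conv2d(out_ch, out_ch, kernel_size=3, padding=1),
        nn.BatchNorm2d(out_ch),
        nn.ReLU(inplace=True),
    )


class UNet(nn.Module):
    def __init__(self, in_ch=1, out_ch=1):
        super().__init__()
        # Encoder: progressively downsample while widening the feature maps.
        self.enc1 = double_conv(in_ch, 64)
        self.enc2 = double_conv(64, 128)
        self.enc3 = double_conv(128, 256)
        self.pool = nn.MaxPool2d(2)
        # Bottleneck at the lowest resolution.
        self.bottleneck = double_conv(256, 512)
        # Decoder: upsample and fuse skip connections from the encoder.
        self.up3 = nn.ConvTranspose2d(512, 256, kernel_size=2, stride=2)
        self.dec3 = double_conv(512, 256)
        self.up2 = nn.ConvTranspose2d(256, 128, kernel_size=2, stride=2)
        self.dec2 = double_conv(256, 128)
        self.up1 = nn.ConvTranspose2d(128, 64, kernel_size=2, stride=2)
        self.dec1 = double_conv(128, 64)
        # 1x1 convolution maps to one logit per pixel (clip vs. background).
        self.head = nn.Conv2d(64, out_ch, kernel_size=1)

    def forward(self, x):
        e1 = self.enc1(x)
        e2 = self.enc2(self.pool(e1))
        e3 = self.enc3(self.pool(e2))
        b = self.bottleneck(self.pool(e3))
        d3 = self.dec3(torch.cat([self.up3(b), e3], dim=1))
        d2 = self.dec2(torch.cat([self.up2(d3), e2], dim=1))
        d1 = self.dec1(torch.cat([self.up1(d2), e1], dim=1))
        return self.head(d1)  # raw logits; apply sigmoid for a probability map


if __name__ == "__main__":
    model = UNet()
    dummy = torch.randn(1, 1, 256, 256)  # one hypothetical grayscale mammogram patch
    print(model(dummy).shape)  # expected: torch.Size([1, 1, 256, 256])
```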

RESULTS:

The overall classification accuracy on the validation set after 45 epochs ranged between 88.2% and 92.6%, indicating that the model's performance is comparable to that of a human reader. In 17.4% of cases, calcifications were misclassified as post-operative clips. The interrater reliability of the model compared with the radiologists showed substantial agreement (κ = 0.72 vs. reader 1, κ = 0.78 vs. reader 2), while the two readers compared with each other reached a Cohen's kappa of 0.84, indicating near-perfect agreement.
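Cohen's kappa measures agreement between two raters beyond what would be expected by chance. A minimal sketch with scikit-learn is shown below; the label vectors are invented placeholders for illustration, not data from the study.

```python
# Cohen's kappa between two raters (e.g., the model and a radiologist)
# over per-image classifications. Labels below are invented examples.
from sklearn.metrics import cohen_kappa_score

model_labels  = ["clip", "clip", "calcification", "clip", "calcification", "clip"]
reader_labels = ["clip", "clip", "calcification", "calcification", "calcification", "clip"]

kappa = cohen_kappa_score(model_labels, reader_labels)
print(f"Cohen's kappa: {kappa:.2f}")  # 1.0 = perfect agreement, 0 = chance-level agreement
```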

CONCLUSIONS:

With this study, we show that surgical clips can be adequately identified by an AI technique. Potential applications of the proposed technique include patient triage as well as the automatic exclusion of post-operative cases from PGMI (Perfect, Good, Moderate, Inadequate) image-quality evaluation, thereby improving the quality management workflow.
Keywords

Full text: 1 Collection: 01-international Database: MEDLINE Language: English Journal: J Imaging Year: 2024 Document type: Article Country of affiliation: Switzerland