Validating Automatic Concept-Based Explanations for AI-Based Digital Histopathology.
Sauter, Daniel; Lodde, Georg; Nensa, Felix; Schadendorf, Dirk; Livingstone, Elisabeth; Kukuk, Markus.
Affiliations
  • Sauter D; Department of Computer Science, Fachhochschule Dortmund, 44227 Dortmund, Germany.
  • Lodde G; Department of Dermatology, University Hospital Essen, 45147 Essen, Germany.
  • Nensa F; Institute for AI in Medicine (IKIM), University Hospital Essen, 45131 Essen, Germany.
  • Schadendorf D; Institute of Diagnostic and Interventional Radiology and Neuroradiology, University Hospital Essen, 45147 Essen, Germany.
  • Livingstone E; Department of Dermatology, University Hospital Essen, 45147 Essen, Germany.
  • Kukuk M; Department of Dermatology, University Hospital Essen, 45147 Essen, Germany.
Sensors (Basel) ; 22(14)2022 Jul 18.
Article in English | MEDLINE | ID: mdl-35891026
ABSTRACT
Digital histopathology poses several challenges for deep learning, such as label noise, class imbalance, limited availability of labelled data, and several latent biases, which negatively influence transparency, reproducibility, and classification performance. In particular, biases are well known to cause poor generalization. Existing tools from explainable artificial intelligence (XAI), bias detection, and bias discovery suffer from technical challenges, complexity, unintuitive usage, inherent biases, or a semantic gap. A promising XAI method not yet studied in the context of digital histopathology is automated concept-based explanation (ACE), which automatically extracts visual concepts from image data. Our objective is to evaluate ACE's technical validity following design science principles and to compare it to Guided Gradient-weighted Class Activation Mapping (Grad-CAM), a conventional pixel-wise explanation method. To that end, we created and studied five convolutional neural networks (CNNs) in four different skin cancer settings. Our results demonstrate that ACE is a valid tool for gaining insights into the decision process of histopathological CNNs that can go beyond the explanations provided by the control method. ACE validly visualized a class sampling ratio bias, a measurement bias, a sampling bias, and a class-correlated bias. Furthermore, its complementary use with Guided Grad-CAM offers several benefits. Finally, we propose practical solutions for several technical challenges. In contrast to results from the literature, we observed lower intuitiveness in some dermatopathology scenarios compared to concept-based explanations on real-world images.
Subjects
Keywords

Full text: 1 Database: MEDLINE Main subject: Skin Neoplasms / Artificial Intelligence Study type: Prognostic_studies Limit: Humans Language: English Journal: Sensors (Basel) Year of publication: 2022 Document type: Article Country of affiliation: Germany