An immunofluorescence-guided segmentation model in H&E images is enabled by tissue artifact correction by CycleGAN.
Wiedenmann, Marcel; Barch, Mariya; Chang, Patrick S; Giltnane, Jennifer; Risom, Tyler; Zijlstra, Andries.
Affiliation
  • Wiedenmann M; Department of Computer and Information Science, University of Konstanz, Universitätsstraße 10, 78464 Konstanz, Germany. Electronic address: marcelwiedenmann@gmail.com.
  • Barch M; Department of Research Pathology, Genentech, Inc., 1 DNA Way, South San Francisco, 94080, CA, USA. Electronic address: barchm1@gene.com.
  • Chang PS; Department of Research Pathology, Genentech, Inc., 1 DNA Way, South San Francisco, 94080, CA, USA. Electronic address: changp@gene.com.
  • Giltnane J; Department of Research Pathology, Genentech, Inc., 1 DNA Way, South San Francisco, 94080, CA, USA. Electronic address: giltnanj@gene.com.
  • Risom T; Department of Research Pathology, Genentech, Inc., 1 DNA Way, South San Francisco, 94080, CA, USA. Electronic address: risomt@gene.com.
  • Zijlstra A; Department of Pathology, Microbiology and Immunology, Vanderbilt University Medical Center, Nashville, TN, USA. Electronic address: andries.zijlstra@gmail.com.
Mod Pathol: 100591, 2024 Aug 13.
Article in English | MEDLINE | ID: mdl-39147031
ABSTRACT
Despite recent advances, the adoption of computer vision methods into clinical and commercial applications has been hampered by the limited availability of accurate ground truth tissue annotations required to train robust supervised models. Generating such ground truth can be accelerated by annotating tissue molecularly using immunofluorescence staining (IF) and mapping these annotations to a post-IF H&E (terminal H&E). Mapping the annotations between the IF and the terminal H&E increases both the scale and the accuracy with which ground truth can be generated. However, discrepancies between terminal H&E and conventional H&E caused by IF tissue processing have limited this implementation. We sought to overcome this challenge and achieve compatibility between these parallel modalities using synthetic image generation, in which a cycle-consistent generative adversarial network (CycleGAN) was applied to transfer the appearance of conventional H&E such that it emulates the terminal H&E. These synthetic emulations allowed us to train a deep learning (DL) model for the segmentation of epithelium in the terminal H&E that could be validated against the IF staining of epithelial cytokeratins. Combining this segmentation model with the CycleGAN stain transfer model enabled performant epithelium segmentation in conventional H&E images. The approach demonstrates that accurate segmentation models for the breadth of conventional H&E data can be trained without human-expert annotations by leveraging molecular annotation strategies such as IF, so long as the tissue impacts of the molecular annotation protocol are captured by generative models deployed prior to the segmentation step.
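The two-stage inference pipeline described in the abstract (CycleGAN stain transfer to emulate terminal H&E, followed by epithelium segmentation) can be summarized in a short, hypothetical PyTorch sketch. The model files, network exports, normalization, tile handling, and function names below are illustrative assumptions, not the authors' released code.

```python
# Hypothetical inference sketch of the abstract's pipeline: stain-transfer a
# conventional H&E tile to a terminal-H&E-like appearance, then segment
# epithelium. File names, architectures, and thresholds are assumptions.
import torch
from torchvision import transforms
from PIL import Image

device = "cuda" if torch.cuda.is_available() else "cpu"

# Assumed TorchScript exports of (1) the conventional-H&E -> terminal-H&E
# CycleGAN generator and (2) the epithelium segmentation network trained on
# terminal H&E with IF-cytokeratin-derived labels.
stain_transfer = torch.jit.load("cyclegan_generator_conv2term.pt").to(device).eval()
segmenter = torch.jit.load("epithelium_segmenter.pt").to(device).eval()

to_tensor = transforms.Compose([
    transforms.ToTensor(),                                # HWC uint8 -> CHW float in [0, 1]
    transforms.Normalize(mean=[0.5] * 3, std=[0.5] * 3),  # CycleGAN-style [-1, 1] range
])

def segment_conventional_he(tile_path: str) -> torch.Tensor:
    """Return a binary epithelium mask for one conventional H&E tile (assumed workflow)."""
    tile = to_tensor(Image.open(tile_path).convert("RGB")).unsqueeze(0).to(device)
    with torch.no_grad():
        # Step 1: make the conventional H&E tile emulate post-IF (terminal) H&E.
        terminal_like = stain_transfer(tile)
        # Step 2: segment epithelium on the emulated terminal-H&E appearance.
        # (Assumes the segmenter accepts the generator's output range directly.)
        logits = segmenter(terminal_like)
    return (torch.sigmoid(logits) > 0.5).squeeze(0).cpu()

mask = segment_conventional_he("conventional_he_tile.png")
```

In practice, whole-slide images would be tiled, each tile passed through both models, and the resulting masks stitched back together; the single-tile function above only illustrates the ordering of the stain transfer and segmentation steps.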
Full text: 1 Collections: 01-international Database: MEDLINE Language: En Journal: Mod Pathol Journal subject: PATHOLOGY Year of publication: 2024 Document type: Article
