Classification of Tumor Epithelium and Stroma by Exploiting Image Features Learned by Deep Convolutional Neural Networks.
Du, Yue; Zhang, Roy; Zargari, Abolfazl; Thai, Theresa C; Gunderson, Camille C; Moxley, Katherine M; Liu, Hong; Zheng, Bin; Qiu, Yuchen.
Affiliation
  • Du Y; School of Electrical and Computer Engineering, University of Oklahoma, Norman, OK, 73019, USA.
  • Zhang R; Department of Pathology, University of Oklahoma Health Sciences Center, Oklahoma City, OK, 73104, USA.
  • Zargari A; School of Electrical and Computer Engineering, University of Oklahoma, Norman, OK, 73019, USA.
  • Thai TC; Department of Radiology, University of Oklahoma Health Sciences Center, Oklahoma City, OK, 73104, USA.
  • Gunderson CC; Department of Obstetrics and Gynecology, University of Oklahoma Health Sciences Center, Oklahoma City, OK, 73104, USA.
  • Moxley KM; Department of Obstetrics and Gynecology, University of Oklahoma Health Sciences Center, Oklahoma City, OK, 73104, USA.
  • Liu H; School of Electrical and Computer Engineering, University of Oklahoma, Norman, OK, 73019, USA.
  • Zheng B; School of Electrical and Computer Engineering, University of Oklahoma, Norman, OK, 73019, USA.
  • Qiu Y; School of Electrical and Computer Engineering, University of Oklahoma, Norman, OK, 73019, USA. qiuyuchen@ou.edu.
Ann Biomed Eng ; 46(12): 1988-1999, 2018 Dec.
Article in English | MEDLINE | ID: mdl-30051247
The tumor-stroma ratio (TSR) measured on hematoxylin and eosin (H&E)-stained histological images is a potential prognostic factor for survival. Automatic image-processing techniques that allow high-throughput, precise discrimination of tumor epithelium and stroma are required to elevate the prognostic significance of the TSR. As a variant of deep learning, transfer learning leverages natural-image features learned by deep convolutional neural networks (CNNs) to relax the immense sample-size requirement of deep CNNs when handling biomedical classification problems. Herein, we studied different transfer learning strategies for accurately distinguishing epithelial and stromal regions of H&E-stained histological images acquired from either breast or ovarian cancer tissue. We compared the performance of important deep CNNs used either as fixed feature extractors or as architectures for fine-tuning on the target images. Moreover, we addressed the currently contradictory issue of whether higher-level features generalize worse than lower-level ones because they are more specific to the source-image domain. Under our experimental setting, the feature-extraction approach achieved an accuracy of 90.2% (vs. 91.1% for fine-tuning) with GoogLeNet, suggesting its feasibility for assisting pathology-based binary classification problems. Our results also show that whether the lower-level or the higher-level features were superior depended on the architecture of the deep CNN.
Full text: 1 Databases: MEDLINE Main subject: Ovarian Neoplasms / Image Processing, Computer-Assisted / Breast Neoplasms / Deep Learning Study type: Prognostic_studies Limits: Female / Humans Language: English Journal: Ann Biomed Eng Year: 2018 Document type: Article Country of affiliation: United States