Application of convolutional neural networks for evaluating the depth of invasion of early gastric cancer based on endoscopic images.
Hamada, Kenta; Kawahara, Yoshiro; Tanimoto, Takayoshi; Ohto, Akimitsu; Toda, Akira; Aida, Toshiaki; Yamasaki, Yasushi; Gotoda, Tatsuhiro; Ogawa, Taiji; Abe, Makoto; Okanoue, Shotaro; Takei, Kensuke; Kikuchi, Satoru; Kuroda, Shinji; Fujiwara, Toshiyoshi; Okada, Hiroyuki.
Affiliations
  • Hamada K; Department of Endoscopy, Okayama University Hospital, Okayama, Japan.
  • Kawahara Y; Department of Practical Gastrointestinal Endoscopy, Okayama University Graduate School of Medicine, Dentistry and Pharmaceutical Sciences, Okayama, Japan.
  • Tanimoto T; Department of Practical Gastrointestinal Endoscopy, Okayama University Graduate School of Medicine, Dentistry and Pharmaceutical Sciences, Okayama, Japan.
  • Ohto A; Business Strategy Division, Ryobi Systems Co., Ltd., Okayama, Japan.
  • Toda A; Health Care Company, Ryobi Systems Co., Ltd., Okayama, Japan.
  • Aida T; Business Strategy Division, Ryobi Systems Co., Ltd., Okayama, Japan.
  • Yamasaki Y; Okayama University Graduate School of Interdisciplinary Science and Engineering in Health Systems, Okayama, Japan.
  • Gotoda T; Department of Gastroenterology and Hepatology, Okayama University Graduate School of Medicine, Dentistry and Pharmaceutical Sciences, Okayama, Japan.
  • Ogawa T; Department of Gastroenterology and Hepatology, Okayama University Graduate School of Medicine, Dentistry and Pharmaceutical Sciences, Okayama, Japan.
  • Abe M; Department of Gastroenterology and Hepatology, Okayama University Graduate School of Medicine, Dentistry and Pharmaceutical Sciences, Okayama, Japan.
  • Okanoue S; Department of Gastroenterology and Hepatology, Okayama University Graduate School of Medicine, Dentistry and Pharmaceutical Sciences, Okayama, Japan.
  • Takei K; Department of Gastroenterology and Hepatology, Okayama University Graduate School of Medicine, Dentistry and Pharmaceutical Sciences, Okayama, Japan.
  • Kikuchi S; Department of Gastroenterology and Hepatology, Okayama University Graduate School of Medicine, Dentistry and Pharmaceutical Sciences, Okayama, Japan.
  • Kuroda S; Department of Gastroenterological Surgery, Okayama University Graduate School of Medicine, Dentistry and Pharmaceutical Sciences, Okayama, Japan.
  • Fujiwara T; Department of Gastroenterological Surgery, Okayama University Graduate School of Medicine, Dentistry and Pharmaceutical Sciences, Okayama, Japan.
  • Okada H; Department of Gastroenterological Surgery, Okayama University Graduate School of Medicine, Dentistry and Pharmaceutical Sciences, Okayama, Japan.
J Gastroenterol Hepatol; 37(2): 352-357, 2022 Feb.
Article in English | MEDLINE | ID: mdl-34713495
ABSTRACT
BACKGROUND AND AIM:

Recently, artificial intelligence (AI) has been used in endoscopic examination and is expected to help in endoscopic diagnosis. We evaluated the feasibility of AI using convolutional neural network (CNN) systems for evaluating the depth of invasion of early gastric cancer (EGC), based on endoscopic images.

METHODS:

This study used a deep CNN model, ResNet152. From patients who underwent treatment for EGC at our hospital between January 2012 and December 2016, we selected 100 consecutive patients with mucosal (M) cancers and 100 consecutive patients with cancers invading the submucosa (SM cancers). A total of 3508 non-magnifying endoscopic images of EGCs, including white-light imaging, linked color imaging, blue laser imaging-bright, and indigo-carmine dye contrast imaging, were included in this study. A total of 2288 images from 132 patients served as the development dataset, and 1220 images from 68 patients served as the testing dataset. Invasion depth was evaluated for each image and for each lesion; for the lesion-based evaluation, a majority vote was taken over the image-level results.
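
The abstract specifies the architecture (ResNet152) and the two evaluation levels (per image, and per lesion by majority vote) but not the implementation. The sketch below is a non-authoritative illustration assuming PyTorch/torchvision and ImageNet-pretrained weights; the framework, function names, and all hyperparameter choices are assumptions, not the authors' code.

# Hypothetical sketch: binary depth-of-invasion classifier (M vs SM) built on
# a pretrained ResNet152, plus per-lesion majority voting over image-level
# predictions. Framework choice (PyTorch/torchvision) and pretrained weights
# are assumptions; the abstract does not describe the training pipeline.
from collections import Counter

import torch
import torch.nn as nn
from torchvision import models

def build_model() -> nn.Module:
    """ResNet152 with its final layer replaced for 2 classes (M vs SM)."""
    model = models.resnet152(weights=models.ResNet152_Weights.IMAGENET1K_V1)
    model.fc = nn.Linear(model.fc.in_features, 2)  # 0 = M cancer, 1 = SM cancer
    return model

@torch.no_grad()
def predict_image(model: nn.Module, image: torch.Tensor) -> int:
    """Image-based evaluation: predicted class for one preprocessed image
    tensor of shape (3, H, W)."""
    model.eval()
    logits = model(image.unsqueeze(0))
    return int(logits.argmax(dim=1).item())

def predict_lesion(image_predictions: list[int]) -> int:
    """Lesion-based evaluation: majority vote over the image-level labels of
    all endoscopic images of one lesion (tie handling is not specified in the
    abstract; most_common simply returns the first of the tied classes)."""
    return Counter(image_predictions).most_common(1)[0][0]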

RESULTS:

The sensitivity, specificity, and accuracy for diagnosing M cancer were 84.9% (95% confidence interval [CI] 82.3%-87.5%), 70.7% (95% CI 66.8%-74.6%), and 78.9% (95% CI 76.6%-81.2%), respectively, for image-based evaluation, and 85.3% (95% CI 73.4%-97.2%), 82.4% (95% CI 69.5%-95.2%), and 83.8% (95% CI 75.1%-92.6%), respectively, for lesion-based evaluation.
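
For reference, the sketch below shows how image-based sensitivity, specificity, and accuracy for M cancer (treated as the positive class) could be computed from confusion counts. The abstract does not state how the 95% confidence intervals were derived, so a normal-approximation (Wald) interval is assumed here purely for illustration.

# Hypothetical sketch of the reported metrics: sensitivity, specificity, and
# accuracy for "M cancer" as the positive class, each with a 95% interval.
# The CI method is an assumption (Wald/normal approximation).
import math

def proportion_ci(successes: int, total: int, z: float = 1.96) -> tuple[float, float, float]:
    """Point estimate and 95% Wald interval for a proportion."""
    p = successes / total
    half = z * math.sqrt(p * (1 - p) / total)
    return p, max(0.0, p - half), min(1.0, p + half)

def depth_metrics(tp: int, fn: int, tn: int, fp: int) -> dict[str, tuple[float, float, float]]:
    """tp/fn count M-cancer cases, tn/fp count SM-cancer cases."""
    return {
        "sensitivity": proportion_ci(tp, tp + fn),               # correct among M cancers
        "specificity": proportion_ci(tn, tn + fp),               # correct among SM cancers
        "accuracy": proportion_ci(tp + tn, tp + fn + tn + fp),   # correct overall
    }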

CONCLUSIONS:

The application of AI using a CNN to evaluate the depth of invasion of EGC from endoscopic images is feasible, and further effort to bring this technology into practical use is warranted.

Full text: 1 Database: MEDLINE Main subject: Stomach Neoplasms / Neural Networks, Computer / Early Detection of Cancer Language: English Year of publication: 2022 Document type: Article