SurgT challenge: Benchmark of soft-tissue trackers for robotic surgery.
Cartucho, João; Weld, Alistair; Tukra, Samyakh; Xu, Haozheng; Matsuzaki, Hiroki; Ishikawa, Taiyo; Kwon, Minjun; Jang, Yong Eun; Kim, Kwang-Ju; Lee, Gwang; Bai, Bizhe; Kahrs, Lueder A; Boecking, Lars; Allmendinger, Simeon; Müller, Leopold; Zhang, Yitong; Jin, Yueming; Bano, Sophia; Vasconcelos, Francisco; Reiter, Wolfgang; Hajek, Jonas; Silva, Bruno; Lima, Estevão; Vilaça, João L; Queirós, Sandro; Giannarou, Stamatia.
Affiliation
  • Cartucho J; The Hamlyn Centre for Robotic Surgery, Imperial College London, United Kingdom. Electronic address: jmc19@ic.ac.uk.
  • Weld A; The Hamlyn Centre for Robotic Surgery, Imperial College London, United Kingdom.
  • Tukra S; The Hamlyn Centre for Robotic Surgery, Imperial College London, United Kingdom.
  • Xu H; The Hamlyn Centre for Robotic Surgery, Imperial College London, United Kingdom.
  • Matsuzaki H; Jmees, Japan.
  • Ishikawa T; Jmees, Japan.
  • Kwon M; Electronics and Telecommunications Research Institute (ETRI), Daejeon, South Korea.
  • Jang YE; Electronics and Telecommunications Research Institute (ETRI), Daejeon, South Korea.
  • Kim KJ; Electronics and Telecommunications Research Institute (ETRI), Daejeon, South Korea.
  • Lee G; Ajou University, Gyeonggi-do, South Korea.
  • Bai B; Medical Computer Vision and Robotics Lab, University of Toronto, Canada.
  • Kahrs LA; Medical Computer Vision and Robotics Lab, University of Toronto, Canada.
  • Boecking L; Karlsruher Institut für Technologie (KIT), Germany.
  • Allmendinger S; Karlsruher Institut für Technologie (KIT), Germany.
  • Müller L; Karlsruher Institut für Technologie (KIT), Germany.
  • Zhang Y; Surgical Robot Vision, University College London, United Kingdom.
  • Jin Y; Surgical Robot Vision, University College London, United Kingdom.
  • Bano S; Surgical Robot Vision, University College London, United Kingdom.
  • Vasconcelos F; Surgical Robot Vision, University College London, United Kingdom.
  • Reiter W; RIWOlink GmbH, Munich, Germany.
  • Hajek J; RIWOlink GmbH, Munich, Germany.
  • Silva B; Life and Health Sciences Research Institute (ICVS), School of Medicine, University of Minho, Braga, Portugal; ICVS/3B's - PT Government Associate Laboratory, Braga/Guimarães, Portugal; 2Ai - School of Technology, IPCA, Barcelos, Portugal.
  • Lima E; Life and Health Sciences Research Institute (ICVS), School of Medicine, University of Minho, Braga, Portugal; ICVS/3B's - PT Government Associate Laboratory, Braga/Guimarães, Portugal.
  • Vilaça JL; 2Ai - School of Technology, IPCA, Barcelos, Portugal.
  • Queirós S; Life and Health Sciences Research Institute (ICVS), School of Medicine, University of Minho, Braga, Portugal; ICVS/3B's - PT Government Associate Laboratory, Braga/Guimarães, Portugal.
  • Giannarou S; The Hamlyn Centre for Robotic Surgery, Imperial College London, United Kingdom.
Med Image Anal; 91: 102985, 2024 Jan.
Article in English | MEDLINE | ID: mdl-37844472
ABSTRACT
This paper introduces the "SurgT Surgical Tracking" challenge, which was organized in conjunction with the 25th International Conference on Medical Image Computing and Computer-Assisted Intervention (MICCAI 2022). The challenge was created for two purposes: (1) to establish the first standardized benchmark for the research community to assess soft-tissue trackers; and (2) to encourage the development of unsupervised deep learning methods, given the lack of annotated data in surgery. A dataset of 157 stereo endoscopic videos from 20 clinical cases, along with stereo camera calibration parameters, was provided. Participants were tasked with developing algorithms to track the movement of soft tissues, represented by bounding boxes, in stereo endoscopic videos. At the end of the challenge, the developed methods were assessed on a previously hidden test subset, using benchmarking metrics purposely developed for this challenge to verify the efficacy of unsupervised deep learning algorithms in tracking soft tissue. The metric used for ranking the methods was the Expected Average Overlap (EAO) score, which measures the average overlap between a tracker's predicted bounding boxes and the ground-truth bounding boxes.
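For intuition about the ranking metric, the following is a minimal sketch of the per-frame bounding-box overlap (Intersection over Union) that EAO-style scores average over a sequence. The (x, y, w, h) box format and the function names are illustrative assumptions; the official SurgT benchmarking tool defines the exact protocol.

```python
def iou(box_a, box_b):
    """Intersection over Union of two axis-aligned boxes in (x, y, w, h) format."""
    ax, ay, aw, ah = box_a
    bx, by, bw, bh = box_b
    # Intersection rectangle (width/height clamped at zero for disjoint boxes)
    iw = max(0.0, min(ax + aw, bx + bw) - max(ax, bx))
    ih = max(0.0, min(ay + ah, by + bh) - max(ay, by))
    inter = iw * ih
    union = aw * ah + bw * bh - inter
    return inter / union if union > 0 else 0.0


def mean_overlap(pred_boxes, gt_boxes):
    """Average per-frame overlap for one sequence -- a simplified stand-in
    for the cross-sequence averaging that the EAO score performs."""
    overlaps = [iou(p, g) for p, g in zip(pred_boxes, gt_boxes)]
    return sum(overlaps) / len(overlaps)
```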
The winning entry was the deep learning submission by ICVS-2Ai, with the highest EAO score of 0.617. This method employs ARFlow to estimate unsupervised dense optical flow from cropped images, using photometric and regularization losses.
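As a rough illustration of that unsupervised training signal, the PyTorch sketch below combines a brightness-constancy photometric term with a first-order smoothness regularizer on the flow field. It is a simplified stand-in written for this summary, not ARFlow's full objective.

```python
import torch
import torch.nn.functional as F


def warp(img2, flow):
    """Backward-warp img2 towards frame 1 using dense flow of shape (B, 2, H, W)."""
    b, _, h, w = img2.shape
    ys, xs = torch.meshgrid(torch.arange(h), torch.arange(w), indexing="ij")
    grid = torch.stack((xs, ys)).float().to(img2.device)  # (2, H, W), (x, y) order
    coords = grid.unsqueeze(0) + flow                     # sampling locations in img2
    # Normalize coordinates to [-1, 1], as required by grid_sample
    gx = 2.0 * coords[:, 0] / (w - 1) - 1.0
    gy = 2.0 * coords[:, 1] / (h - 1) - 1.0
    return F.grid_sample(img2, torch.stack((gx, gy), dim=-1), align_corners=True)


def unsupervised_flow_loss(img1, img2, flow, alpha=0.1):
    """Photometric loss plus smoothness regularization; no ground-truth flow needed."""
    photometric = (img1 - warp(img2, flow)).abs().mean()
    smoothness = (flow[..., :, 1:] - flow[..., :, :-1]).abs().mean() + \
                 (flow[..., 1:, :] - flow[..., :-1, :]).abs().mean()
    return photometric + alpha * smoothness
```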
Second place went to Jmees, with an EAO of 0.583, using deep learning for surgical tool segmentation on top of a non-deep-learning baseline tracker: CSRT. CSRT by itself scores a similar EAO of 0.563. The results of this challenge show that non-deep-learning methods are currently still competitive. The dataset and benchmarking tool created for this challenge have been made publicly available at https://surgt.grand-challenge.org/. This challenge is expected to contribute to the development of autonomous robotic surgery and other digital surgical technologies.
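For reference, CSRT is available off the shelf in OpenCV (the opencv-contrib-python package). The monocular sketch below shows basic usage; the video path and initial bounding box are hypothetical placeholders, and SurgT itself evaluates a box tracked in each channel of a stereo endoscopic video.

```python
import cv2

cap = cv2.VideoCapture("endoscopy.mp4")   # hypothetical input video
ok, frame = cap.read()
init_box = (320, 240, 64, 64)             # (x, y, w, h) around the target tissue

tracker = cv2.TrackerCSRT_create()
tracker.init(frame, init_box)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    found, box = tracker.update(frame)    # box is returned as (x, y, w, h)
    if found:
        x, y, w, h = map(int, box)
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("CSRT baseline", frame)
    if cv2.waitKey(1) == 27:              # Esc to quit
        break

cap.release()
cv2.destroyAllWindows()
```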
Subject(s)
Keywords

Full text: 1 Databases: MEDLINE Main subject: Robotic Surgical Procedures Limit: Humans Language: English Journal: Med Image Anal Journal subject: Diagnostic Imaging Year: 2024 Document type: Article