Towards multimodal graph neural networks for surgical instrument anticipation.
Wagner, Lars; Schneider, Dennis N; Mayer, Leon; Jell, Alissa; Müller, Carolin; Lenz, Alexander; Knoll, Alois; Wilhelm, Dirk.
Affiliation
  • Wagner L; Technical University of Munich, TUM School of Medicine and Health, Klinikum rechts der Isar, Research Group MITI, Munich, Germany. lars.wagner@tum.de.
  • Schneider DN; Technical University of Munich, TUM School of Medicine and Health, Klinikum rechts der Isar, Research Group MITI, Munich, Germany.
  • Mayer L; Technical University of Munich, TUM School of Medicine and Health, Klinikum rechts der Isar, Research Group MITI, Munich, Germany.
  • Jell A; Technical University of Munich, TUM School of Medicine and Health, Klinikum rechts der Isar, Research Group MITI, Munich, Germany.
  • Müller C; Technical University of Munich, TUM School of Medicine and Health, Klinikum rechts der Isar, Department of Surgery, Munich, Germany.
  • Lenz A; Technical University of Munich, TUM School of Medicine and Health, Klinikum rechts der Isar, Research Group MITI, Munich, Germany.
  • Knoll A; Technical University of Munich, TUM School of Computation, Information and Technology, Chair of Robotics, Artificial Intelligence and Real-Time Systems, Munich, Germany.
  • Wilhelm D; Technical University of Munich, TUM School of Computation, Information and Technology, Chair of Robotics, Artificial Intelligence and Real-Time Systems, Munich, Germany.
Int J Comput Assist Radiol Surg ; 19(10): 1929-1937, 2024 Oct.
Article in English | MEDLINE | ID: mdl-38985412
ABSTRACT

PURPOSE:

Decision support systems and context-aware assistance in the operating room have emerged as key clinical applications supporting surgeons in their daily work, but they are generally based on single modalities. The model- and knowledge-based integration of multimodal data as a basis for decision support systems that can dynamically adapt to the surgical workflow has not yet been established. Therefore, we propose a knowledge-enhanced method for fusing multimodal data for anticipation tasks.

METHODS:

We developed a holistic, multimodal graph-based approach combining imaging and non-imaging information in a knowledge graph representing the intraoperative scene of a surgery. Node and edge features of the knowledge graph are extracted from suitable data sources in the operating room using machine learning. A spatiotemporal graph neural network architecture subsequently allows for interpretation of relational and temporal patterns within the knowledge graph. We apply our approach to the downstream task of instrument anticipation while presenting a suitable modeling and evaluation strategy for this task.
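The abstract does not specify the network internals, so the following is only a minimal sketch of the general idea: a graph convolution applied to the operating-room knowledge graph at each time step, followed by a recurrent unit over each node's history and a per-node prediction head for instrument anticipation. All dimensions, class counts, and the adjacency normalization are illustrative assumptions, not the authors' implementation.

```python
# Sketch of a spatiotemporal GNN over a knowledge graph (assumed design, not the paper's code).
import torch
import torch.nn as nn


class SpatioTemporalGNN(nn.Module):
    def __init__(self, in_dim: int, hidden_dim: int, num_classes: int):
        super().__init__()
        self.spatial = nn.Linear(in_dim, hidden_dim)    # shared weights for GCN-style propagation
        self.temporal = nn.GRU(hidden_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, num_classes)  # e.g. anticipation classes per node

    def forward(self, x_seq: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # x_seq: (T, N, in_dim) node features per time step; adj: (N, N) normalized adjacency.
        spatial_out = [torch.relu(adj @ self.spatial(x_t)) for x_t in x_seq]
        h = torch.stack(spatial_out, dim=1)   # (N, T, hidden_dim): one temporal sequence per node
        h, _ = self.temporal(h)               # temporal reasoning over each node's history
        return self.head(h[:, -1])            # predict from the most recent time step


# Illustrative shapes: 16 scene nodes, 10 time steps, 32-dim features, 3 classes.
model = SpatioTemporalGNN(in_dim=32, hidden_dim=64, num_classes=3)
x_seq = torch.randn(10, 16, 32)
adj = torch.softmax(torch.randn(16, 16), dim=-1)  # stand-in for a normalized adjacency matrix
logits = model(x_seq, adj)                         # (16, 3)
```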

RESULTS:

Our approach achieves an F1 score of 66.86% for instrument anticipation, supporting a seamless surgical workflow and providing valuable input for surgical decision support systems. A resting recall of 63.33% indicates that the anticipations are not premature.
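For reference, the F1 score is the harmonic mean of precision and recall. The sketch below shows a standard computation from binary per-frame anticipation labels with hypothetical data; the paper-specific resting-recall metric is not defined in the abstract and is not reproduced here.

```python
# Minimal, illustrative F1 computation for binary anticipation labels (not the paper's evaluation code).
def f1_score(y_true: list[int], y_pred: list[int]) -> float:
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return 2 * precision * recall / (precision + recall) if precision + recall else 0.0


# Hypothetical per-frame labels: F1 = 0.667
print(f1_score([1, 0, 1, 1, 0], [1, 0, 0, 1, 1]))
```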

CONCLUSION:

This work shows how multimodal data can be combined with the topological properties of an operating room in a graph-based approach. Our multimodal graph architecture serves as a basis for context-sensitive decision support systems in laparoscopic surgery that take the comprehensive intraoperative scene into account.

Full text: 1 Database: MEDLINE Main subject: Neural Networks, Computer Limits: Humans Language: English Journal: Int J Comput Assist Radiol Surg Journal subject: Radiology Year: 2024 Document type: Article Country of affiliation: Germany