Hirenkumar Nakawala1, Roberto Bianchi2, Laura Erica Pescatori3, Ottavio De Cobelli2, Giancarlo Ferrigno3, Elena De Momi3. 1. Department of Electronics, Information and Bioengineering (DEIB), Politecnico di Milano, Piazza Leonardo da Vinci 32, 20133, Milan, Italy. hirenkumar.nakawala@polimi.it. 2. Department of Urology, European Institute of Oncology (IEO), Via Giuseppe Ripamonti, 435, 20141, Milan, Italy. 3. Department of Electronics, Information and Bioengineering (DEIB), Politecnico di Milano, Piazza Leonardo da Vinci 32, 20133, Milan, Italy.
Abstract
PURPOSE: Surgical workflow recognition and context-aware systems could allow better decision making and surgical planning by providing focused information, which may eventually enhance surgical outcomes. While current developments in computer-assisted surgical systems mostly focus on recognizing surgical phases, they lack recognition of the surgical workflow sequence and of other contextual elements, e.g., "Instruments." Our study proposes a hybrid approach, i.e., deep learning combined with knowledge representation, to facilitate recognition of the surgical workflow. METHODS: We implemented the "Deep-Onto" network, an ensemble of deep learning models and knowledge management tools, i.e., an ontology and production rules. As a prototypical scenario, we chose robot-assisted partial nephrectomy (RAPN). We annotated RAPN videos with surgical entities, e.g., "Step." We performed different experiments, including inter-subject variability, to recognize surgical steps. The corresponding subsequent steps, along with other surgical contexts, i.e., "Actions," "Phase" and "Instruments," were also recognized. RESULTS: The system recognized 10 RAPN steps with a prevalence-weighted macro-average (PWMA) recall of 0.83, PWMA precision of 0.74, PWMA F1 score of 0.76, and an accuracy of 74.29% on 9 RAPN videos. CONCLUSION: We found that the combined use of deep learning and knowledge representation techniques is a promising approach for the multi-level recognition of the RAPN surgical workflow.
Keywords:
Deep learning; Knowledge representation; Robot-assisted partial nephrectomy; Surgical workflow
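The prevalence-weighted macro-average (PWMA) metrics reported in the abstract are per-class scores averaged with weights proportional to each class's prevalence (support) in the ground truth. A minimal sketch of this computation in plain Python, using hypothetical RAPN step labels for illustration:

```python
from collections import Counter

def pwma_scores(y_true, y_pred):
    """Prevalence-weighted macro-average precision, recall and F1:
    per-class scores weighted by each class's support in y_true."""
    support = Counter(y_true)
    n = len(y_true)
    precision = recall = f1 = 0.0
    for cls, count in support.items():
        # True positives and predicted positives for this class
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == cls and p == cls)
        pred_pos = sum(1 for p in y_pred if p == cls)
        p = tp / pred_pos if pred_pos else 0.0
        r = tp / count
        f = 2 * p * r / (p + r) if (p + r) else 0.0
        w = count / n  # prevalence weight
        precision += w * p
        recall += w * r
        f1 += w * f
    return precision, recall, f1

# Toy example with hypothetical step labels (not the paper's data)
y_true = ["resection", "suturing", "suturing", "clamping"]
y_pred = ["resection", "suturing", "clamping", "clamping"]
prec, rec, f1 = pwma_scores(y_true, y_pred)
# PWMA recall equals overall accuracy, since the weighted sum of
# per-class recalls reduces to total true positives over n.
```

This corresponds to the "weighted" averaging mode of common evaluation toolkits (e.g., scikit-learn's `average='weighted'`).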