Chinedu Innocent Nwoye1, Didier Mutter2, Jacques Marescaux2, Nicolas Padoy3. 1. ICube, University of Strasbourg, CNRS, IHU, Strasbourg, France. nwoye.chinedu@gmail.com. 2. University Hospital of Strasbourg, IRCAD, IHU, Strasbourg, France. 3. ICube, University of Strasbourg, CNRS, IHU, Strasbourg, France.
Abstract
PURPOSE: Real-time surgical tool tracking is a core component of the future intelligent operating room (OR), because it is highly instrumental in analyzing and understanding surgical activities. Current methods for surgical tool tracking in videos need to be trained on data in which the spatial positions of the tools are manually annotated. Generating such training data is difficult and time-consuming. Instead, we propose to use solely binary presence annotations to train a tool tracker for laparoscopic videos.

METHODS: The proposed approach is composed of a CNN + Convolutional LSTM (ConvLSTM) neural network trained end to end, but weakly supervised on tool binary presence labels only. We use the ConvLSTM to model the temporal dependencies in the motion of the surgical tools and leverage its spatiotemporal ability to smooth the class peak activations in the localization heat maps (Lh-maps).

RESULTS: We build a baseline tracker on top of the CNN model and demonstrate that our approach based on the ConvLSTM outperforms the baseline in tool presence detection, spatial localization, and motion tracking by over [Formula: see text], [Formula: see text], and [Formula: see text], respectively.

CONCLUSIONS: In this paper, we demonstrate that binary presence labels are sufficient for training a deep learning tracking model using our proposed method. We also show that the ConvLSTM can leverage the spatiotemporal coherence of consecutive image frames across a surgical video to improve tool presence detection, spatial localization, and motion tracking.
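The link between binary presence labels and localization can be illustrated with a minimal sketch. It assumes (as is common in weakly supervised localization, though the paper's exact pooling is not reproduced here) that the network emits a per-class localization heat map (Lh-map), that the presence score is obtained by spatial max-pooling over that map, and that the tool position is read off as the coordinates of the peak activation. The function name and the example map are illustrative, not the authors' implementation.

```python
def presence_and_position(lh_map):
    """Weak-supervision readout for one tool class.

    lh_map: 2D list (H x W) of activation scores produced by the
    CNN+ConvLSTM backbone (not shown).

    Returns (presence_score, (row, col)): the presence score is the
    maximum activation (spatial max-pooling), which is what the binary
    presence loss supervises; the peak coordinates fall out for free
    as a localization signal, even though no spatial labels were used.
    """
    best_score, best_pos = float("-inf"), (0, 0)
    for r, row in enumerate(lh_map):
        for c, v in enumerate(row):
            if v > best_score:
                best_score, best_pos = v, (r, c)
    return best_score, best_pos


# Example: a 4x4 Lh-map whose peak activation sits at (row=1, col=2).
heat = [
    [0.0, 0.1, 0.0, 0.0],
    [0.1, 0.3, 0.9, 0.2],
    [0.0, 0.2, 0.4, 0.1],
    [0.0, 0.0, 0.1, 0.0],
]
score, pos = presence_and_position(heat)
# score == 0.9, pos == (1, 2); thresholding score yields binary presence,
# and tracking follows the peak position across consecutive frames.
```

In this view, the ConvLSTM's role is to smooth these peak activations over time so that the per-frame peaks form a temporally coherent track rather than jumping between spurious maxima.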