
Lightweight Deep Neural Network for Articulated Joint Detection of Surgical Instrument in Minimally Invasive Surgical Robot.

Yanwen Sun1, Bo Pan2, Yili Fu1.   

Abstract

Vision-based detection and tracking of surgical instruments is attractive because it relies purely on the visual information already available in the operating scene. Visual knowledge of the surgical instruments is a crucial component of surgical task understanding, autonomous robot control, and human-robot collaborative surgery aimed at enhancing surgical outcomes. In this work, a novel multitask lightweight deep neural network framework is developed for surgical instrument articulated joint detection. The model has an end-to-end architecture with two branches that share the high-level visual features provided by a lightweight backbone while holding task-specific layers. A novel subnetwork comprising a joint detection branch and an instrument classification branch is designed to fully exploit the relatedness of the instrument presence detection and articulated joint detection tasks. The lightweight joint detection branch efficiently locates the articulated joint positions at low computational cost, while the instrument classification branch is introduced to boost joint detection performance. The outputs of the two branches are merged to yield the articulated joint locations together with the corresponding instrument type. Extensive validation demonstrates the promising performance of the proposed method. This work shows the feasibility of real-time surgical instrument articulated joint detection using the components of a surgical robot system, providing a reference for further surgical intelligence.
© 2022. The Author(s) under exclusive licence to Society for Imaging Informatics in Medicine.
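The shared-backbone, two-branch design described in the abstract can be illustrated with a toy sketch. The shapes, joint count, and class count below are hypothetical placeholders, and the random linear "backbone" merely stands in for the paper's lightweight CNN; the sketch only shows how one shared feature map can feed both a per-joint heatmap head and an instrument-classification head.

```python
import numpy as np

rng = np.random.default_rng(0)

H = W = 16        # assumed feature-map resolution (placeholder)
C_FEAT = 32       # shared feature channels (placeholder)
N_JOINTS = 5      # assumed number of articulated joints
N_CLASSES = 7     # assumed number of instrument types

def shared_backbone(image):
    """Stand-in for a lightweight CNN backbone: a per-pixel random
    linear projection followed by ReLU (illustration only)."""
    proj = rng.standard_normal((image.shape[-1], C_FEAT))
    return np.maximum(image @ proj, 0.0)        # (H, W, C_FEAT)

def joint_head(feats):
    """Joint detection branch: a 1x1-convolution-style channel matmul
    producing one heatmap per articulated joint."""
    w = rng.standard_normal((C_FEAT, N_JOINTS))
    return feats @ w                            # (H, W, N_JOINTS)

def class_head(feats):
    """Classification branch: global average pool + linear + softmax."""
    pooled = feats.mean(axis=(0, 1))            # (C_FEAT,)
    w = rng.standard_normal((C_FEAT, N_CLASSES))
    logits = pooled @ w
    e = np.exp(logits - logits.max())
    return e / e.sum()                          # (N_CLASSES,)

image = rng.standard_normal((H, W, 3))          # fake RGB patch
feats = shared_backbone(image)                  # computed once, shared
heatmaps = joint_head(feats)
probs = class_head(feats)

# Merge the two branches: argmax joint locations plus instrument type.
joints = [np.unravel_index(heatmaps[..., j].argmax(), (H, W))
          for j in range(N_JOINTS)]
instrument = int(probs.argmax())
```

Because both heads consume the same feature tensor, the backbone is evaluated once per frame, which is the main source of the computational savings a multitask design offers over two separate networks.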

Keywords:  Articulated joint detection; Convolutional neural network; Surgical instrument; Surgical robot

Year:  2022        PMID: 35266089      PMCID: PMC9485358          DOI: 10.1007/s10278-022-00616-9

Source DB:  PubMed          Journal:  J Digit Imaging        ISSN: 0897-1889            Impact factor:   4.903


References:  6 in total

1.  Evolution of autonomous and semi-autonomous robotic surgical systems: a review of the literature.

Authors:  G P Moustris; S C Hiridis; K M Deliparaschos; K M Konstantinidis
Journal:  Int J Med Robot       Date:  2011-08-03       Impact factor: 2.547

2.  Real-Time Multi-Guidewire Endpoint Localization in Fluoroscopy Images.

Authors:  Rui-Qi Li; Xiao-Liang Xie; Xiao-Hu Zhou; Shi-Qi Liu; Zhen-Liang Ni; Yan-Jie Zhou; Gui-Bin Bian; Zeng-Guang Hou
Journal:  IEEE Trans Med Imaging       Date:  2021-07-30       Impact factor: 10.048

3.  Toward detection and localization of instruments in minimally invasive surgery.

Authors:  Max Allan; Sébastien Ourselin; Steve Thompson; David J Hawkes; John Kelly; Danail Stoyanov
Journal:  IEEE Trans Biomed Eng       Date:  2012-11-21       Impact factor: 4.538

4.  Detection and Localization of Robotic Tools in Robot-Assisted Surgery Videos Using Deep Neural Networks for Region Proposal and Detection.

Authors:  Duygu Sarikaya; Jason J Corso; Khurshid A Guru
Journal:  IEEE Trans Med Imaging       Date:  2017-02-08       Impact factor: 10.048

5.  3-D Pose Estimation of Articulated Instruments in Robotic Minimally Invasive Surgery.

Authors:  M Allan; S Ourselin; D J Hawkes; J D Kelly; D Stoyanov
Journal:  IEEE Trans Med Imaging       Date:  2018-05       Impact factor: 10.048

6.  Articulated Multi-Instrument 2-D Pose Estimation Using Fully Convolutional Networks.

Authors:  Xiaofei Du; Thomas Kurmann; Ping-Lin Chang; Maximilian Allan; Sebastien Ourselin; Raphael Sznitman; John D Kelly; Danail Stoyanov
Journal:  IEEE Trans Med Imaging       Date:  2018-05       Impact factor: 10.048

