Liang Qiu, Changsheng Li, Hongliang Ren.
Abstract
Image-based surgical instrument tracking in robot-assisted surgery is an active and challenging research area. Real-time knowledge of the surgical instrument's location is an essential part of a computer-assisted intervention system. Tracking can be used as visual feedback for servo control of a surgical robot or transformed into haptic feedback for surgeon–robot interaction. In this Letter, the authors apply a multi-domain convolutional neural network for fast 2D surgical instrument tracking, considering applications with multiple surgical tools, and use a focal loss to reduce the influence of easy negative examples. They further introduce a new dataset, built from m2cai16-tool and their cadaver experiments, to address the lack of an established public surgical tool tracking dataset despite significant progress in the field. Their method is evaluated on the introduced dataset and outperforms state-of-the-art real-time trackers.
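The focal loss mentioned in the abstract follows the standard formulation of Lin et al. (2017), which down-weights well-classified (easy) examples so they dominate the loss less. A minimal sketch is below; the `gamma` and `alpha` values are common illustrative defaults, not settings reported in this Letter:

```python
import numpy as np

def focal_loss(p, y, gamma=2.0, alpha=0.25):
    """Binary focal loss: -alpha_t * (1 - p_t)^gamma * log(p_t).

    p: predicted probability of the positive class; y: label in {0, 1}.
    The (1 - p_t)^gamma factor shrinks the loss of easy examples.
    """
    p_t = np.where(y == 1, p, 1.0 - p)            # probability of the true class
    alpha_t = np.where(y == 1, alpha, 1.0 - alpha)
    return -alpha_t * (1.0 - p_t) ** gamma * np.log(p_t)

# An easy negative (p close to 0) contributes far less than a hard one:
easy = focal_loss(np.array([0.05]), np.array([0]))
hard = focal_loss(np.array([0.60]), np.array([0]))
```

With `gamma = 0` this reduces to an alpha-weighted cross-entropy; increasing `gamma` suppresses easy negatives more aggressively.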
Keywords: active research area; challenging research area; computer-assisted intervention system; established public surgical tool tracking dataset; medical robotics; multi-domain convolutional neural network; multiple surgical tools; neural nets; real-time knowledge; robot-assisted surgery; surgeon–robot interaction; surgery; surgical instrument location; surgical robot; real-time surgical instrument tracking
Year: 2019 PMID: 32038850 PMCID: PMC6945802 DOI: 10.1049/htl.2019.0068
Source DB: PubMed Journal: Healthc Technol Lett ISSN: 2053-3713
Fig. 1 Our dataset is built from cadaver experiment videos of transoral surgery and selected laparoscopic surgery videos from the m2cai16-tool dataset. Corresponding examples of surgical tools are shown here
Properties of our dataset
| Data sources | Surgical tool types | No. of videos | No. of frames |
|---|---|---|---|
| m2cai-tool-tracking | bipolar | 4 | |
| | clipper | 2 | |
| | grasper | 3 | |
| | hook | 4 | |
| | irrigator | 4 | |
| | scissor | 4 | |
| robot-assisted-tracking | grasper/electrocoagulator/sensor/monopolar electrotomes | 7 | |
| total | | 28 | 12,347 |
| test set | | 17 | 10,244 |
| total | | 45 | 22,591 |
Fig. 2 Our robot-assisted surgery framework, which exploits a multi-domain CNN to track surgical tools with endoscopic images as input and surgical tool locations as output. The output bounding boxes are further utilised in the processing unit to provide 6D pose estimation, which benefits surgical tool navigation
Fig. 3 Precision and success plots using OPE
a, b Ablation study: our method compared with RT-MDNet and the corresponding version without instance embedding loss on our STT dataset
c–h Quantitative results of six real-time trackers on the m2cai-tool-tracking sub-dataset, the robot-assisted-tracking sub-dataset and the full STT dataset
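The precision and success measures in the OPE (one-pass evaluation) plots are the standard OTB-style tracking metrics: success counts frames whose predicted box overlaps the ground truth above an IoU threshold, and precision counts frames whose centre-location error is within a pixel threshold. A minimal sketch, assuming the conventional thresholds of 0.5 IoU and 20 px (the plots themselves sweep these thresholds):

```python
import numpy as np

def iou(box_a, box_b):
    """Intersection-over-union of two [x, y, w, h] boxes."""
    xa, ya = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    xb = min(box_a[0] + box_a[2], box_b[0] + box_b[2])
    yb = min(box_a[1] + box_a[3], box_b[1] + box_b[3])
    inter = max(0.0, xb - xa) * max(0.0, yb - ya)
    union = box_a[2] * box_a[3] + box_b[2] * box_b[3] - inter
    return inter / union if union > 0 else 0.0

def success_rate(pred_boxes, gt_boxes, threshold=0.5):
    """Fraction of frames whose box overlap exceeds the IoU threshold."""
    overlaps = [iou(p, g) for p, g in zip(pred_boxes, gt_boxes)]
    return float(np.mean([o > threshold for o in overlaps]))

def precision_rate(pred_boxes, gt_boxes, threshold=20.0):
    """Fraction of frames whose centre-location error is within the pixel threshold."""
    def center(b):
        return np.array([b[0] + b[2] / 2.0, b[1] + b[3] / 2.0])
    errors = [np.linalg.norm(center(p) - center(g))
              for p, g in zip(pred_boxes, gt_boxes)]
    return float(np.mean([e <= threshold for e in errors]))
```

The success plot reports `success_rate` as the IoU threshold varies from 0 to 1 (trackers are often ranked by the area under this curve), while the precision plot varies the pixel threshold.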
Fig. 4 Success plots of six real-time trackers over eight tracking challenges
a Illumination variation
b Background clutter
c Deformation
d Occlusion
e In-plane rotation
f Scale variation
g Out-of-plane rotation
h Motion blur
Quantitative comparisons of six real-time trackers on the STT dataset
| Trackers | SiamFC | DSST | BACF | ECO-HC | RT-MDNet | Ours |
|---|---|---|---|---|---|---|
| succ (%) | 25.1 | 46.6 | 56.4 | 59.1 | 53.8 | 61.5 |
| prec (%) | 48.6 | 45.9 | 65.1 | 65.9 | 61.0 | 67.5 |
| FPS | 19.4 | 11.5 | 9.9 | 32.6 | 13.9 | 14.0 |
Fig. 5 Qualitative evaluation of six real-time trackers on example frames, showing that our method outperforms the state-of-the-art on the STT dataset