| Literature DB >> 27237604 |
Nicola Rieke, David Joseph Tan, Chiara Amat di San Filippo, Federico Tombari, Mohamed Alsheakhali, Vasileios Belagiannis, Abouzar Eslami, Nassir Navab.
Abstract
Real-time visual tracking of a surgical instrument holds great potential for improving the outcome of retinal microsurgery by enabling new possibilities for computer-aided techniques such as augmented reality and automatic assessment of instrument manipulation. Due to high magnification and illumination variations, retinal microsurgery images usually entail a high level of noise and appearance changes. As a result, real-time tracking of the surgical instrument remains challenging in in-vivo sequences. To overcome these problems, we present a method that builds on random forests and addresses the task by modelling the instrument as an articulated object. A multi-template tracker reduces the region of interest to a rectangular area around the instrument tip by relating the movement of the instrument to the induced changes in image intensities. Within this bounding box, a gradient-based pose estimation infers the location of the instrument parts from image features. In this way, the algorithm not only provides the location of the instrument but also the positions of the tool tips in real time. Various experiments on a novel dataset comprising 18 in-vivo retinal microsurgery sequences demonstrate the robustness and generalizability of our method. The comparison on two publicly available datasets indicates that the algorithm can outperform the current state of the art.
Keywords: Pose estimation; Retinal microsurgery; Visual tracking
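The two-stage pipeline the abstract describes (a coarse tracker that yields a bounding box around the tip, followed by a random-forest regressor that maps gradient features inside that box to tool-tip positions) can be illustrated with a minimal sketch. This is not the authors' implementation: the intensity-peak "tracker", the synthetic blob frames, and all function names here are assumptions for illustration, with `sklearn.ensemble.RandomForestRegressor` standing in for the paper's random forests.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

def make_frame(tip, size=64):
    """Synthetic frame: a bright blob standing in for the instrument tip."""
    y, x = np.mgrid[0:size, 0:size]
    img = np.exp(-((x - tip[0]) ** 2 + (y - tip[1]) ** 2) / 50.0)
    return img + 0.05 * rng.standard_normal((size, size))

def track_roi(img, box=24):
    """Stage 1 stand-in: return a rectangular ROI around the intensity peak
    (the paper uses a multi-template tracker driven by intensity changes)."""
    cy, cx = np.unravel_index(np.argmax(img), img.shape)
    x0 = int(np.clip(cx - box // 2, 0, img.shape[1] - box))
    y0 = int(np.clip(cy - box // 2, 0, img.shape[0] - box))
    return x0, y0, box

def gradient_features(img, roi):
    """Stage 2 features: image gradients inside the ROI, flattened."""
    x0, y0, box = roi
    patch = img[y0:y0 + box, x0:x0 + box]
    gy, gx = np.gradient(patch)
    return np.concatenate([gx.ravel(), gy.ravel()])

# Train: the forest regresses the tip offset relative to the ROI origin.
X, Y = [], []
for _ in range(200):
    tip = rng.uniform(16, 48, size=2)
    img = make_frame(tip)
    roi = track_roi(img)
    X.append(gradient_features(img, roi))
    Y.append([tip[0] - roi[0], tip[1] - roi[1]])
forest = RandomForestRegressor(n_estimators=50, random_state=0).fit(X, Y)

# Inference on a new frame: ROI first, then the forest localizes the tip.
tip_true = np.array([30.0, 25.0])
img = make_frame(tip_true)
roi = track_roi(img)
off = forest.predict([gradient_features(img, roi)])[0]
tip_pred = np.array([roi[0] + off[0], roi[1] + off[1]])
err = np.linalg.norm(tip_pred - tip_true)
```

The design point the sketch preserves is the decoupling: the cheap tracker keeps the regression problem small (features are computed only inside the box), which is what makes per-frame pose estimation feasible in real time.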
Year: 2016 PMID: 27237604 DOI: 10.1016/j.media.2016.05.003
Source DB: PubMed Journal: Med Image Anal ISSN: 1361-8415 Impact factor: 8.545