
Simultaneous Multi-Structure Segmentation and 3D Nonrigid Pose Estimation in Image-Guided Robotic Surgery.

Masoud S Nosrati, Rafeef Abugharbieh, Jean-Marc Peyrat, Julien Abinahed, Osama Al-Alao, Abdulla Al-Ansari, Ghassan Hamarneh.   

Abstract

In image-guided robotic surgery, segmenting the endoscopic video stream into meaningful parts provides important contextual information that surgeons can exploit to enhance their perception of the surgical scene. This information provides surgeons with real-time decision-making guidance before initiating critical tasks such as tissue cutting. Segmenting endoscopic video is a challenging problem due to a variety of complications, including significant noise attributed to bleeding and smoke from cutting, poor appearance contrast between different tissue types, occluding surgical tools, and limited visibility of the objects' geometries in the projected camera views. In this paper, we propose a multi-modal approach to segmentation in which preoperative 3D computed tomography scans and intraoperative stereo-endoscopic video data are jointly analyzed. The idea is to segment multiple poorly visible structures in the stereo/multi-channel endoscopic videos by fusing reliable prior knowledge captured from the preoperative 3D scans. More specifically, we estimate and track the pose of the preoperative models in 3D and account for the models' non-rigid deformations so that they match corresponding visual cues in the multi-channel endoscopic video, thereby segmenting the objects of interest. Further, contrary to most augmented reality frameworks in endoscopic surgery that assume known camera parameters, an assumption that is often violated during surgery due to non-optimal camera calibration and changes in camera focus/zoom, our method embeds these parameters into the optimization, thereby correcting the calibration parameters within the segmentation process. We evaluate our technique on synthetic data, ex vivo lamb kidney datasets, and data from an in vivo clinical partial nephrectomy, with results demonstrating high accuracy and robustness.
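The abstract's key point about calibration is that camera parameters are treated as unknowns inside the pose optimization rather than fixed inputs. A minimal sketch of that idea, under strong simplifying assumptions (rigid pose only, a single focal-length parameter, known 2D correspondences; the paper's full method additionally handles non-rigid deformation and segmentation energies, which are not reproduced here). All function names and synthetic data below are illustrative, not from the paper:

```python
import numpy as np
from scipy.optimize import least_squares

def rodrigues(rvec):
    """Rotation matrix from an axis-angle (Rodrigues) vector."""
    theta = np.linalg.norm(rvec)
    if theta < 1e-12:
        return np.eye(3)
    k = rvec / theta
    K = np.array([[0.0, -k[2], k[1]],
                  [k[2], 0.0, -k[0]],
                  [-k[1], k[0], 0.0]])
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)

def project(points3d, rvec, t, f):
    """Pinhole projection with focal length f (principal point at origin)."""
    P = points3d @ rodrigues(rvec).T + t
    return f * P[:, :2] / P[:, 2:3]

def residuals(params, points3d, observed2d):
    # params = [rotation (3), translation (3), focal length (1)]
    rvec, t, f = params[:3], params[3:6], params[6]
    return (project(points3d, rvec, t, f) - observed2d).ravel()

# Synthetic ground truth: random model points in front of the camera,
# with a known pose and focal length used to generate 2D observations.
rng = np.random.default_rng(0)
pts = rng.uniform(-1.0, 1.0, (30, 3)) + np.array([0.0, 0.0, 5.0])
true = np.concatenate([[0.1, -0.2, 0.05], [0.2, -0.1, 0.5], [800.0]])
obs = project(pts, true[:3], true[3:6], true[6])

# Start from a deliberately miscalibrated focal length (f = 700) and a
# rough pose, then jointly refine pose AND calibration by minimizing
# the 2D reprojection error.
x0 = np.concatenate([[0.0, 0.0, 0.0], [0.0, 0.0, 0.1], [700.0]])
sol = least_squares(residuals, x0, args=(pts, obs))
print("recovered focal length:", round(float(sol.x[6]), 1))
```

With noise-free synthetic correspondences the residual drops to near zero and the focal length is recovered alongside the pose, which is the essence of folding calibration correction into the estimation loop; real endoscopic data would of course require the robust, multi-structure energy described in the paper.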


Year:  2015        PMID: 26151933     DOI: 10.1109/TMI.2015.2452907

Source DB:  PubMed          Journal:  IEEE Trans Med Imaging        ISSN: 0278-0062            Impact factor:   10.048


  3 in total

1.  3D imaging applications for robotic urologic surgery: an ESUT YAUWP review.

Authors:  Enrico Checcucci; Daniele Amparore; Cristian Fiori; Matteo Manfredi; Morra Ivano; Michele Di Dio; Gabriel Niculescu; Federico Piramide; Giovanni Cattaneo; Pietro Piazzolla; Giovanni Enrico Cacciamani; Riccardo Autorino; Francesco Porpiglia
Journal:  World J Urol       Date:  2019-08-27       Impact factor: 4.226

2.  Projective biomechanical depth matching for soft tissue registration in laparoscopic surgery.

Authors:  Daniel Reichard; Dominik Häntsch; Sebastian Bodenstedt; Stefan Suwelack; Martin Wagner; Hannes Kenngott; Beat Müller-Stich; Lena Maier-Hein; Rüdiger Dillmann; Stefanie Speidel
Journal:  Int J Comput Assist Radiol Surg       Date:  2017-05-26       Impact factor: 2.924

3.  A survey of augmented reality methods to guide minimally invasive partial nephrectomy.

Authors:  Abderrahmane Khaddad; Jean-Christophe Bernhard; Gaëlle Margue; Clément Michiels; Solène Ricard; Kilian Chandelon; Franck Bladou; Nicolas Bourdel; Adrien Bartoli
Journal:  World J Urol       Date:  2022-07-01       Impact factor: 4.226

