| Literature DB >> 27548770 |
Matthias Nathanaël van Oosterom1, Myrthe Adriana Engelen2, Nynke Sjoerdtje van den Berg3, Gijs Hendrik KleinJan3, Henk Gerrit van der Poel4, Thomas Wendler5, Cornelis Jan Hadde van de Velde6, Nassir Navab7, Fijs Willem Bernhard van Leeuwen3.
Abstract
Robot-assisted laparoscopic surgery is becoming an established technique for prostatectomy and is increasingly being explored for other types of cancer. Linking intraoperative imaging techniques, such as fluorescence guidance, with the three-dimensional insights provided by preoperative imaging remains a challenge. Navigation technologies may provide a solution, especially when directly linked to both the robotic setup and the fluorescence laparoscope. We evaluated the feasibility of such a setup. Preoperative single-photon emission computed tomography/X-ray computed tomography (SPECT/CT) or intraoperative freehand SPECT (fhSPECT) scans were used to navigate an optically tracked robot-integrated fluorescence laparoscope via an augmented reality overlay in the laparoscopic video feed. The navigation accuracy was evaluated in soft tissue phantoms, followed by studies in a human-like torso phantom. Navigation accuracies found for SPECT/CT-based navigation were 2.25 mm (coronal) and 2.08 mm (sagittal). For fhSPECT-based navigation, these were 1.92 mm (coronal) and 2.83 mm (sagittal). All errors remained below the 1-cm detection limit for fluorescence imaging, allowing refinement of the navigation process using fluorescence findings. The phantom experiments performed suggest that SPECT-based navigation of the robot-integrated fluorescence laparoscope is feasible and may aid fluorescence-guided surgery procedures.
Year: 2016 PMID: 27548770 DOI: 10.1117/1.JBO.21.8.086008
Source DB: PubMed Journal: J Biomed Opt ISSN: 1083-3668 Impact factor: 3.170