| Literature DB >> 32556919 |
Baoru Huang, Ya-Yen Tsai, João Cartucho, Kunal Vyas, David Tuch, Stamatia Giannarou, Daniel S Elson.
Abstract
PURPOSE: In surgical oncology, complete cancer resection and lymph node identification are challenging due to the lack of reliable intraoperative visualization. Recently, endoscopic radio-guided cancer resection has been introduced where a novel tethered laparoscopic gamma detector can be used to determine the location of tracer activity, which can complement preoperative nuclear imaging data and endoscopic imaging. However, these probes do not clearly indicate where on the tissue surface the activity originates, making localization of pathological sites difficult and increasing the mental workload of the surgeons. Therefore, a robust real-time gamma probe tracking system integrated with augmented reality is proposed.Entities:
Keywords: Image-guided surgery; Minimally invasive surgery; Pose estimation; Prostate cancer; Tethered laparoscopic gamma probe; Tracking
Year: 2020 PMID: 32556919 PMCID: PMC7351835 DOI: 10.1007/s11548-020-02205-z
Source DB: PubMed Journal: Int J Comput Assist Radiol Surg ISSN: 1861-6410 Impact factor: 2.924
Fig. 1a An example of a tethered probe being used in MIS; b the gamma probe marker; c example detected circular dots and chessboard vertices; d the local coordinates defined on the probe
Fig. 2 Feature detection algorithm workflow
Fig. 6 The hardware setup, including laparoscope, image monitor, prostate phantom, 'SENSEI' probe, and control unit, showing a, b a higher radiation level when the probe was pointing to and placed closer to the radioactive source; and c, d a lower radiation level when the probe was pointing to the edge of the source. The grey dashed circles in b, d show the position of the radioactive Cobalt-57 source, while the green circles represent the intersection of the gamma probe axis with the tissue
Fig. 3a Hardware setup for experiments; b the transformation matrices between the laparoscope, OptiTrack system, optical sensors, and designed marker
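The calibration in Fig. 3b chains rigid-body transforms between coordinate frames: once the laparoscope and the probe marker are both expressed in the OptiTrack frame, the marker pose in the camera frame follows by composing 4×4 homogeneous matrices. A minimal sketch of that composition (the frame names and numeric poses below are made-up illustrations, not values from the paper):

```python
import numpy as np

def homogeneous(R, t):
    """Build a 4x4 rigid-body transform from a 3x3 rotation and a translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def rot_z(deg):
    """Rotation about the z-axis by `deg` degrees."""
    a = np.deg2rad(deg)
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

# Hypothetical poses measured in the OptiTrack frame (units: mm)
T_opti_cam = homogeneous(rot_z(30.0), [100.0, 0.0, 50.0])      # laparoscope
T_opti_marker = homogeneous(rot_z(90.0), [120.0, 40.0, 60.0])  # probe marker

# Marker pose in the laparoscope frame: invert one chain link and compose
T_cam_marker = np.linalg.inv(T_opti_cam) @ T_opti_marker
print(np.round(T_cam_marker, 3))
```

The same pattern extends to the full chain in the figure by multiplying in the remaining links (e.g. optical sensor to marker).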
Fig. 4a Tracking results in the case of occlusion; b the experimental results for different testing distances between the probe and camera
Summary of pose estimation error

| Different marker | Translation mean error ± STD (mm) | | Rotation mean error ± STD (°) | |
|---|---|---|---|---|
| | Circular dots | Chessboard vertices | Circular dots | Chessboard vertices |
| Our hybrid marker | | | | |
| Previous hybrid marker [ | | | | |
3D tip distance when the cone tip is fixed

| Different marker | 3D projection error | | |
|---|---|---|---|
| | Mean error ± STD (mm) | Maximum error (mm) | Minimum error (mm) |
| Previous hybrid marker [ | | 137.72 | 0.00 |
| Previous hybrid marker [ | | 5.41 | 0.00 |
| Our hybrid marker | | 1.90 | 0.00 |
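The tip-distance statistics above can be reproduced from a set of estimated tip positions: with the physical cone tip held fixed, each pose estimate yields one 3D tip point, and the distances from those points to the reference tip give the mean, maximum, and minimum errors. A minimal sketch using synthetic points (not the paper's data):

```python
import numpy as np

rng = np.random.default_rng(0)
reference_tip = np.array([10.0, -5.0, 120.0])  # fixed cone-tip position (made-up, mm)

# Synthetic per-frame tip estimates scattered around the reference
estimates = reference_tip + rng.normal(scale=0.8, size=(200, 3))

# Euclidean 3D distance of each estimate from the reference tip
errors = np.linalg.norm(estimates - reference_tip, axis=1)
print(f"mean ± STD: {errors.mean():.2f} ± {errors.std():.2f} mm, "
      f"max: {errors.max():.2f} mm, min: {errors.min():.2f} mm")
```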
Maximum detectable distance and rotation angle around different axes

| Rotation axis | Previous work [ | Dual-pattern marker (ours) |
|---|---|---|
| Roll (°) | | |
| Pitch (°) | | |
| Yaw (°) | | |
| Distance to camera (mm) | 60–200 | 50–220 |
Fig. 5 Examples where the pose estimation is more accurate using a the circular dots pattern and b the chessboard vertices. Examples where tracking failed for c the circular dots pattern and d the chessboard vertices. In e, both the vertices and the dots patterns are detected in three adjacent marker lines