Zhuo Zhao, Jasmin Poyhonen, Xin Chen Cai, Frances Sophie Woodley Hooper, Yangmyung Ma, Yihua Hu, Hongliang Ren, Wenzhan Song, Zion Tsz Ho Tse.
Abstract
Image-guided therapies have been on the rise in recent years, as they can achieve higher accuracy and are less invasive than traditional methods. By combining augmented reality technology with image-guided therapy, surgeons can observe more organs and tissues, improving surgical accuracy. In this review, 233 publications (dated from 2015 to 2020) on the design and application of augmented reality-based systems for image-guided therapy, including both research prototypes and commercial products, were considered. Based on their functions and applications, sixteen studies were selected for detailed review. The engineering specifications and applications of each study were analyzed and summarized. Finally, future directions and existing challenges in the field were summarized and discussed.
Keywords: Augmented reality; image-guided therapy; surgery navigation
Year: 2021 PMID: 34304631 PMCID: PMC8573682 DOI: 10.1177/09544119211034357
Source DB: PubMed Journal: Proc Inst Mech Eng H ISSN: 0954-4119 Impact factor: 1.617
Figure 1. Research flowchart of this study.
Summary of the applications of commercial AR products in image-guided interventions.
| Application | Working principle | Limitation | Ref. |
|---|---|---|---|
| Computer-assisted orthopedic surgery | Bone location and CT data are visualized as 3D AR through HoloLens. | A trackable target needs to be attached to the bone. The system has a narrow FOV and poor depth perception. | El-Hariri et al. |
| Irreversible electroporation in the pancreas | AR guides for needle insertion and the target needle trajectory are visualized with HoloLens. | There is a lack of reliable APIs for tracking moving objects or accessing raw IR sensor data. The system has a limited FOV and experiences tracking lag. | Kuzhagaliyev et al. |
| Prostate interventions | Holographic MRI and US images are visualized with HoloLens. | HoloLens has limited computational power and memory. More extensive studies in clinical practice are required. | Mojica et al. |
| Needle targeting | A HoloLens headset-based device uses Unity, Mixed Reality Toolkit Foundation, and Vuforia to achieve visualization, interaction, and registration. | Patient motion, respiration, needle bending, and tissue deformation were not considered in this evaluation. | Park et al. |
| Needle targeting | Real-time positions of fiducial markers and the needle handle, combined with pre-acquired CT images, are visualized with Endosight (see the registration sketch below). | To achieve good needle targeting results, breathing control is needed. | Solbiati et al. |
| External osteoplastic approaches to the frontal sinus | A 3D hologram of the head is visualized in Magic Leap AR goggles based on pre-acquired CT images. | Accuracy is not high enough for use in clinical practice. | Neves et al. |
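Several of the systems above (the Endosight workflow in particular) register pre-acquired CT data to the patient by matching tracked fiducial positions against the same fiducials localized in CT space. Below is a minimal sketch of the underlying paired-point rigid registration, using the closed-form SVD (Kabsch/Horn) solution; the function names and the fiducial registration error (FRE) helper are illustrative, not taken from any cited system.

```python
import numpy as np

def register_fiducials(tracker_pts, ct_pts):
    """Paired-point rigid registration (rotation R, translation t).

    tracker_pts, ct_pts: (N, 3) arrays of corresponding fiducial
    positions in tracker space and CT space. Returns (R, t) such
    that R @ p_tracker + t approximates p_ct in a least-squares sense.
    """
    src = tracker_pts - tracker_pts.mean(axis=0)
    dst = ct_pts - ct_pts.mean(axis=0)
    # Cross-covariance and SVD give the closed-form optimal rotation.
    U, _, Vt = np.linalg.svd(src.T @ dst)
    # Guard against a reflection appearing in the solution.
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = ct_pts.mean(axis=0) - R @ tracker_pts.mean(axis=0)
    return R, t

def fiducial_registration_error(R, t, tracker_pts, ct_pts):
    """RMS residual (FRE) commonly reported by navigation systems."""
    residuals = (R @ tracker_pts.T).T + t - ct_pts
    return float(np.sqrt((residuals ** 2).sum(axis=1).mean()))
```

With the transform in hand, any tracked instrument pose can be mapped into CT space and rendered in the AR overlay; a low FRE is necessary but not sufficient for low error at the surgical target.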
Figure 2. HoloLens-based image-guided interventions: (a) overview of the HoloLens-based AR system, (b) projected AR image with registered US probe, (c) topology (left) and photograph (right) of the holographic AR interface, showing the operator immersed in a scene that includes 3D holographic structures and an embedded 2D virtual display window, and (d) left: participant inserting the needle guided by the AR system; top right: view of the needle insertion procedure without the AR system; bottom right: view of the needle insertion procedure with the AR system.
Figure 3. (a) Illustration of the Endosight AR system with a porcine model for needle insertion. Top: real-time AR image, in which the red line is the needle, the blue line is the needle handle, the yellow line is the target, and the red curved line is the outline of the porcine model. Bottom: corresponding CT images, of which the leftmost was acquired before needle insertion and the rest after; the yellow arrow indicates the position of the target and needle. (b) The 3D hologram from the Magic Leap AR system superimposed on the cadaveric head.
Summary of the applications of AR in neurosurgery.
| Application | Working principle | Limitations | Ref. |
|---|---|---|---|
| Surgeries to treat aneurysms, arteriovenous malformations, and arteriovenous fistulae | A live view of the patient is captured by a camera and merged with preoperative volume-rendered vessels (see the blending sketch below). | Little value is added by the AR view in terms of localizing the large draining vein. Arteries are sometimes mistaken for veins and vice versa. | Kersten-Oertel et al. |
| Image-guided neurosurgery | The Intraoperative Brain Imaging System platform runs on a mobile device to provide real-time augmentation and user interactivity. | NA | Léger et al. |
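Once the preoperative vessel rendering is registered to the camera viewpoint, the AR view described by Kersten-Oertel et al. reduces to compositing the rendering over the live frame. The sketch below shows only that blending step with OpenCV; the file names, mask threshold, and alpha value are assumptions.

```python
import cv2
import numpy as np

# Hypothetical inputs: a live camera frame and a volume-rendered
# vessel image already registered to the same camera viewpoint.
frame = cv2.imread("camera_frame.png")        # live surgical view (BGR)
vessels = cv2.imread("rendered_vessels.png")  # registered vessel render

# Treat the renderer's near-black background as transparent by
# masking pixels where the vessel rendering carries information.
gray = cv2.cvtColor(vessels, cv2.COLOR_BGR2GRAY)
mask = (gray > 10).astype(np.float32)[..., None]

# Alpha-blend the vessels over the live view where the mask is set.
alpha = 0.6
ar_view = frame.astype(np.float32) * (1 - alpha * mask) \
        + vessels.astype(np.float32) * (alpha * mask)
cv2.imwrite("ar_view.png", ar_view.astype(np.uint8))
```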
Figure 4. AR technology for neurosurgeries: (a) the experimental setup of the AR system developed for neurovascular surgeries and (b) photographs of the AR system developed for image-guided neurosurgery.
Research prototypes for AR systems in laparoscopic surgery.
| Procedure | Working principle | Limitations | Ref. |
|---|---|---|---|
| Laparoscopic liver resection | Live laparoscopic ultrasound (LUS) is overlaid on the laparoscopic video, enabled by electromagnetic (EM) tracking (see the projection sketch below). | Lack of control over lesion creation with ablation. Long clinical workflow, since the AR system requires initial setup one day before surgery. | Lau et al. |
| Laparoscopy | Live LUS is overlaid on stereoscopic laparoscopic video, enabled by EM tracking. | Registration accuracy decreases as the EM sensor moves away from the center of the EM field, limiting the range of motion of the scope and LUS probe and the positioning of the patient. Long clinical workflow, since the AR system requires calibration a day prior to surgery. | Liu et al. |
| Laparoscopy | A support vector machine (SVM) is trained to perform multi-organ semantic segmentation and stereo surface reconstruction from live stereo endoscopy and preoperative 3D models. | Classification tests were completed on ex vivo tissue only, and full AR tests on phantom anatomy only. The AR system assumes no change in the patient's anatomy after the preoperative organ models are constructed. | Penza et al. |
| Robot-assisted laparoscopic radical prostatectomy (RALRP) | Preoperative MRI images are registered to live transrectal ultrasound (TRUS), and the TRUS coordinate system is calibrated to the robotic surgical suite. | Additional surgical time is required for TRUS acquisition (30 s) and preoperative MRI registration (2 min). Poor accuracy in measuring the endoscope's intrinsic and extrinsic parameters causes the AR scene to display beneath the endoscope's feed rather than overlaid on top. | Samei et al. |
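The EM-tracked LUS overlays above all rest on the same transform chain: the US image plane is mapped to the probe's EM sensor, then to the EM field generator, then to the calibrated laparoscope camera, and finally projected through the camera intrinsics. Below is a minimal sketch, assuming 4x4 homogeneous calibration matrices and a hypothetical intrinsic matrix K; none of the names come from the cited systems.

```python
import numpy as np

def to_homogeneous(pts):
    """(N, 3) points -> (N, 4) homogeneous points."""
    return np.hstack([pts, np.ones((len(pts), 1))])

def project_lus_corners(T_cam_em, T_em_sensor, T_sensor_image, K,
                        width_mm, depth_mm):
    """Project the corners of a LUS image plane into the laparoscope image.

    T_cam_em:       EM field generator -> camera (from calibration)
    T_em_sensor:    probe EM sensor -> EM field generator (live tracking)
    T_sensor_image: US image plane -> probe EM sensor (probe calibration)
    K:              3x3 laparoscope intrinsic matrix
    All transforms are assumed to be 4x4 homogeneous matrices.
    """
    # Corners of the US image plane in its own millimeter coordinates.
    corners = np.array([[0.0, 0.0, 0.0],
                        [width_mm, 0.0, 0.0],
                        [width_mm, depth_mm, 0.0],
                        [0.0, depth_mm, 0.0]])
    # Chain the calibrations: image plane -> sensor -> EM field -> camera.
    T_cam_image = T_cam_em @ T_em_sensor @ T_sensor_image
    cam_pts = (T_cam_image @ to_homogeneous(corners).T).T[:, :3]
    # Pinhole projection to pixel coordinates.
    uv = (K @ cam_pts.T).T
    return uv[:, :2] / uv[:, 2:3]
```

The projected corners define where the live US frame should be warped into the laparoscopic video, which is exactly where the cited limitation bites: any EM tracking error propagates directly into the overlay.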
Figure 5. AR for laparoscopic surgery. (a) Laparoscopic video and LUS operational setup. (b) Stereoscopic and LUS AR setup with EM tracking. (c) Context-aware AR scenes of the kidney (left to right, top to bottom): semantic segmentation, point cloud reconstruction, 3D model, and final scene. (d) AR scene with a 3D-rendered TRUS probe and instruments, TRUS plane orientation (blue), and prostate and MRI volume slice.
AR-based research prototypes in various interventional procedures.
| Application | Working principle | Limitations | Ref. |
|---|---|---|---|
| Needle positioning during radiofrequency ablation (RFA) of liver tumors | A virtual model of the patient's anatomy is reconstructed from CT images and registered to the patient using infrared retroreflective markers and an optical tracker, which also tracks the needle. | No phantom patient experimentation was performed due to misalignment of the virtual model and the dummy phantom. The measurement volume of the optical tracker is small (392 mm × 938 mm × 491 mm), its placement obstructs the surgical staff, and it can be occluded by the surgeon's body. The AR system assumes no change in the patient's anatomy between CT scanning and surgery and is unsuitable for moving or hollow non-rigid target organs. | De Paolis et al. |
| Needle positioning during ultrasound-guided interventions | Fiducial marker tracking is used to track the positions of the needle and US probe during in situ US imaging; the needle trajectory is calculated and visualized (see the guidance sketch below). | An optimal orthogonal tracking angle is not always possible. The needle tip location takes time to reach a steady state. | Kanithi et al. |
| Needle positioning during transperineal prostate procedures | An AR app for a smartphone or smart glasses overlays anatomical features and the surgical plan obtained from MRI/CT onto the patient by visually tracking an image marker fixed to the patient. | The system is difficult to use, provides an unstable display of virtual features with smart glasses, does not work properly in close-up view, and does not account for patient motion. | Li et al. |
| Needle insertion guidance | An AR smartphone app uses CT imaging data of a phantom, a 3D reference marker, and image analysis and visualization software to provide needle trajectory planning and real-time guidance. | Does not provide real-time correction for respiratory movement in mobile organs or for lesion movement. Needle bending causes navigational inaccuracies. | Hecht et al. |
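Most of the needle-guidance prototypes above ultimately compare a tracked needle axis with a planned entry-to-target trajectory. Below is a minimal sketch of that comparison, assuming a rigid needle (the bending noted in the limitations is ignored); all coordinates are illustrative.

```python
import numpy as np

def needle_guidance(tip, hub, entry, target):
    """Compare the tracked needle with the planned trajectory.

    tip, hub: tracked needle tip and handle positions (mm).
    entry, target: planned skin entry point and lesion target (mm).
    Returns the angle between the needle axis and the planned path
    (degrees) and the remaining tip-to-target distance (mm). A rigid
    needle is assumed; bending (a limitation noted above) is ignored.
    """
    needle_axis = (tip - hub) / np.linalg.norm(tip - hub)
    planned_axis = (target - entry) / np.linalg.norm(target - entry)
    cos_angle = np.clip(needle_axis @ planned_axis, -1.0, 1.0)
    angle_deg = float(np.degrees(np.arccos(cos_angle)))
    return angle_deg, float(np.linalg.norm(target - tip))

# Illustrative values only (mm).
angle, depth = needle_guidance(
    tip=np.array([10.0, 5.0, 40.0]), hub=np.array([10.0, 5.0, 0.0]),
    entry=np.array([10.0, 5.0, 2.0]), target=np.array([12.0, 6.0, 80.0]))
print(f"off-axis angle: {angle:.1f} deg, remaining depth: {depth:.1f} mm")
```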
Figure 6. AR technology for interventional procedures. (a) An occluded virtual window of the abdomen with surgical instrument distance information. (b) Projected AR view and virtual needle trajectory. (c) AR app interface. Left: lesion target overlaid on the prostate phantom. Right: planned needle trajectory. (d) Smartphone screen with intentionally off-axis needle placement. Green line: virtual needle trajectory. Red dot: target lesion. Yellow dot: entry point. Navy dot: depth marker. Proper insertion depth is achieved when the needle hub base (arrowhead) coincides with the virtual depth marker (navy dot).