Mathias Unberath, Javad Fotouhi, Jonas Hajek, Andreas Maier, Greg Osgood, Russell Taylor, Mehran Armand, Nassir Navab.
Abstract
Interventional C-arm imaging is crucial to percutaneous orthopedic procedures, as it enables the surgeon to monitor the progress of surgery at the level of the anatomy. Minimally invasive interventions require repeated acquisition of X-ray images from different anatomical views to verify tool placement. Achieving and reproducing these views often comes at the cost of increased surgical time and radiation dose. We propose a marker-free 'technician-in-the-loop' Augmented Reality (AR) solution for C-arm repositioning. The X-ray technician operating the C-arm interventionally is equipped with a head-mounted display system capable of recording desired C-arm poses in 3D via an integrated infrared sensor. For C-arm repositioning to a target view, the recorded pose is restored as a virtual object and visualised in an AR environment, serving as a perceptual reference for the technician. Our proof-of-principle findings from a simulated trauma surgery indicate that the proposed system can reduce the average of 2.76 X-ray images currently required to re-align the scanner with an intra-operatively recorded C-arm view down to zero, suggesting a substantial reduction in radiation dose. The proposed AR solution is a first step towards facilitating communication between the surgeon and the surgical staff, improving the quality of surgical image acquisition, and enabling context-aware guidance for the surgery rooms of the future.
Keywords: C-arm interventionally; C-arm repositioning; X-ray images; X-ray technician; augmented reality; augmented reality-based feedback; biomedical equipment; computerised tomography; integrated infrared sensor; interventional C-arm imaging; intra-operatively recorded C-arm; marker-free technician-in-the-loop augmented reality solution; medical image processing; minimally invasive interventions; orthopaedic trauma surgery; orthopaedics; percutaneous orthopaedic procedures; radiation dose; surgery; surgical image acquisition; technician-in-the-loop design; wearable computers
Year: 2018 PMID: 30464844 PMCID: PMC6222181 DOI: 10.1049/htl.2018.5066
Source DB: PubMed Journal: Healthc Technol Lett ISSN: 2053-3713
Fig. 1 Spatial relations that must be estimated dynamically to enable the proposed AR environment. Transformations shown in black are estimated directly, while transformations shown in orange are derived
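As Fig. 1 indicates, some transformations are estimated directly while others are derived by composing them. A minimal numpy sketch of such a derivation is below; the frame names (world, HMD, C-arm) and the numeric values are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def make_transform(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Directly estimated transforms (identity rotations for simplicity)
T_world_hmd = make_transform(np.eye(3), [0.0, 1.6, 0.0])   # HMD pose in the world frame
T_hmd_carm = make_transform(np.eye(3), [1.0, -0.4, 2.0])   # C-arm pose seen from the HMD

# Derived transform: C-arm pose in the world frame, by composition
T_world_carm = T_world_hmd @ T_hmd_carm
print(T_world_carm[:3, 3])  # translation of the C-arm in world coordinates
```

Chaining homogeneous matrices in this way is the standard means of obtaining a pose that no single sensor observes directly.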
Fig. 2 All images are shown from the X-ray technician's point of view
a The live 3D point cloud computed from the infrared depth image is displayed in red. This intra-operative point cloud is saved for later re-use
b The C-arm has been moved to a different pose; the previously saved point cloud is visualised in green and serves as a reference for restoring the earlier pose
c After successful repositioning of the C-arm, the saved and current point clouds, shown in green and red, respectively, coincide, indicating that the C-arm has been repositioned appropriately
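The coincidence of the saved and live point clouds in Fig. 2c can be quantified; a simple brute-force sketch is to compute the mean nearest-neighbour distance between the two clouds. The function, thresholds, and synthetic data below are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def mean_nearest_distance(saved, live):
    """Mean distance from each live point to its nearest saved point (brute force)."""
    d = np.linalg.norm(live[:, None, :] - saved[None, :, :], axis=-1)
    return d.min(axis=1).mean()

rng = np.random.default_rng(0)
saved = rng.random((200, 3))                                    # saved reference cloud
live_aligned = saved + rng.normal(scale=1e-3, size=saved.shape) # well-repositioned C-arm
live_off = saved + np.array([0.05, 0.0, 0.0])                   # residual offset

print(mean_nearest_distance(saved, live_aligned))  # near zero: clouds coincide
print(mean_nearest_distance(saved, live_off))      # clearly larger: repositioning incomplete
```

For larger clouds a spatial index (e.g. a k-d tree) would replace the O(n²) distance matrix, but the acceptance criterion is the same.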
Fig. 3 X-ray technician operating the C-arm during the experiment. Typical angulations for pelvic trauma surgery were selected according to [6]
a, b We show inlet and outlet views,
c, d Caudal oblique views, and
e, f Cranial oblique views, respectively
C-arm pose differences as per infrared marker tracking
| | Proposed | Conventional |
|---|---|---|
| mean distance ± SD | 51.6 ± 19.2 mm | 16.7 ± 6.3 mm |
| angle ± SD | 1.54 ± 0.92° | 1.23 ± 0.45° |
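The angular pose difference reported above can be computed from two tracked orientations as the angle of the relative rotation. A minimal numpy sketch follows; the helper names and the example rotation about the z-axis are illustrative assumptions:

```python
import numpy as np

def rotation_angle_deg(R_a, R_b):
    """Angle (degrees) of the relative rotation between orientations R_a and R_b."""
    R_rel = R_a.T @ R_b
    # trace(R) = 1 + 2*cos(theta) for a rotation matrix; clip guards round-off
    cos_theta = np.clip((np.trace(R_rel) - 1.0) / 2.0, -1.0, 1.0)
    return np.degrees(np.arccos(cos_theta))

def rot_z(deg):
    """Rotation about the z-axis by the given angle in degrees."""
    r = np.radians(deg)
    c, s = np.cos(r), np.sin(r)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

print(rotation_angle_deg(rot_z(0.0), rot_z(1.54)))  # recovers the 1.54° offset
```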
Projection-domain keypoint displacement in pixels (px)
| | Proposed | Conventional (first try) | Conventional (final) |
|---|---|---|---|
| mean ± SD | 210 ± 105 px | 257 ± 171 px | 68 ± 36 px |