Javad Fotouhi (1,2), Mathias Unberath (3,4), Tianyu Song (3), Jonas Hajek (3,5), Sing Chun Lee (3,4), Bastian Bier (3,5), Andreas Maier (5), Greg Osgood (6), Mehran Armand (7,6), Nassir Navab (3,4,8). 1. Computer Aided Medical Procedures, Johns Hopkins University, Baltimore, USA. javad.fotouhi@jhu.edu. 2. Department of Computer Science, Johns Hopkins University, Baltimore, USA. 3. Computer Aided Medical Procedures, Johns Hopkins University, Baltimore, USA. 4. Department of Computer Science, Johns Hopkins University, Baltimore, USA. 5. Pattern Recognition Lab, Friedrich-Alexander-Universität Erlangen-Nürnberg, Erlangen, Germany. 6. Department of Orthopedic Surgery, Johns Hopkins Hospital, Baltimore, USA. 7. Applied Physics Laboratory, Johns Hopkins University, Baltimore, USA. 8. Computer Aided Medical Procedures, Technische Universität München, Munich, Germany.
Abstract
PURPOSE: Image-guided percutaneous interventions are safer alternatives to conventional orthopedic and trauma surgeries. To advance surgical tools through complex bony structures with confidence during these procedures, a large number of images are acquired. While image guidance is the de facto standard for guaranteeing an acceptable outcome, when these images are presented on monitors far from the surgical site their information content cannot easily be associated with the 3D patient anatomy. METHODS: In this article, we propose a collaborative augmented reality (AR) surgical ecosystem that jointly co-localizes the C-arm X-ray viewer and the surgeon viewer. The technical contributions of this work are (1) joint calibration of a visual tracker mounted on a C-arm scanner and the scanner's X-ray source via a hand-eye calibration strategy, and (2) inside-out co-localization of the human and X-ray observers in shared tracking and augmentation environments using vision-based simultaneous localization and mapping. RESULTS: We present a thorough evaluation of the hand-eye calibration procedure. Results suggest convergence when 50 or more pose pairs are used. The mean translation and rotation errors at convergence are 5.7 mm and [Formula: see text], respectively. Further, user-in-the-loop studies were conducted to estimate the end-to-end target augmentation error. The mean distance between landmarks in the real and virtual environments was 10.8 mm. CONCLUSIONS: The proposed AR solution provides a shared augmented experience between the human and X-ray viewers. The collaborative surgical AR system has the potential to simplify hand-eye coordination for surgeons and to intuitively inform C-arm technologists for prospective X-ray viewpoint planning.
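The hand-eye calibration in contribution (1) amounts to the classic AX = XB problem: given paired relative motions of the visual tracker (A) and the X-ray source (B), recover the fixed transform X between them. The abstract does not state which solver the authors use, so the following is only an illustrative sketch of a common two-step least-squares approach (rotation from motion axes, then translation); the function name and data layout are hypothetical.

```python
import numpy as np
from scipy.spatial.transform import Rotation


def hand_eye_calibrate(A_list, B_list):
    """Estimate X (4x4 homogeneous) such that A_i @ X ≈ X @ B_i.

    Two-step least squares: the rotation R_X is found from the rotation
    axes of the paired motions (axis(A_i) = R_X @ axis(B_i)) via a
    Kabsch/SVD fit; the translation t_X then solves the stacked linear
    system (R_Ai - I) t_X = R_X t_Bi - t_Ai.
    """
    # Rotation axes (rotation vectors: axis scaled by angle) of each motion.
    a = np.array([Rotation.from_matrix(A[:3, :3]).as_rotvec() for A in A_list])
    b = np.array([Rotation.from_matrix(B[:3, :3]).as_rotvec() for B in B_list])

    # Kabsch: find R_X minimizing sum ||a_i - R_X b_i||^2.
    M = a.T @ b
    U, _, Vt = np.linalg.svd(M)
    d = np.sign(np.linalg.det(U @ Vt))          # guard against reflections
    R_X = U @ np.diag([1.0, 1.0, d]) @ Vt

    # Translation: stack (R_Ai - I) t_X = R_X t_Bi - t_Ai and solve by lstsq.
    C = np.vstack([A[:3, :3] - np.eye(3) for A in A_list])
    rhs = np.concatenate([R_X @ B[:3, 3] - A[:3, 3]
                          for A, B in zip(A_list, B_list)])
    t_X, *_ = np.linalg.lstsq(C, rhs, rcond=None)

    X = np.eye(4)
    X[:3, :3] = R_X
    X[:3, 3] = t_X
    return X
```

At least two motion pairs with non-parallel rotation axes are needed for a unique solution; with noisy real-world pose pairs the residual shrinks as more pairs are stacked, which is consistent with the convergence behavior reported above for 50 or more pose pairs.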