Literature DB >> 35286586

Augmented reality navigation with real-time tracking for facial repair surgery.

Long Shao, Tianyu Fu, Zhao Zheng, Zehua Zhao, Lele Ding, Jingfan Fan, Hong Song, Tao Zhang, Jian Yang.

Abstract

PURPOSE: Facial repair surgeries (FRS) demand high accuracy to navigate critical anatomy safely and quickly. The purpose of this paper is to develop a method that directly tracks the patient's position using video data acquired from a single camera, achieving noninvasive, real-time tracking with high positioning accuracy in FRS.
METHODS: Our method first performs camera calibration and registers the surface segmented from computed tomography to the patient. Then, a two-step constraint algorithm, combining a feature local constraint and a distance standard deviation constraint, is used to quickly find optimal feature-matching pairs. Finally, the camera and patient movements decomposed from the image motion matrix are used to track the camera and the patient, respectively.
RESULTS: The proposed method achieved RMS fusion errors of 1.44 ± 0.35 mm, 1.50 ± 0.15 mm, and 1.63 ± 0.03 mm in skull phantom, cadaver mandible, and human experiments, respectively; these errors were lower than those of an optical tracking system-based method. Additionally, the proposed method processes video streams at up to 24 frames per second, meeting the real-time requirements of FRS.
CONCLUSIONS: The proposed method does not rely on tracking markers attached to the patient; it runs automatically to maintain a correct augmented reality scene and to counteract the loss of positioning accuracy caused by patient movement during surgery.
© 2022. CARS.
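The abstract's "distance standard deviation constraint" is not specified in code in this record. As one plausible reading, it rejects candidate feature-match pairs whose displacement deviates too far from the consensus motion. The sketch below is an illustrative assumption, not the authors' formulation; the function name, the per-pair displacement statistic, and the threshold `k` are all hypothetical:

```python
import numpy as np

def filter_matches_by_distance_std(src_pts, dst_pts, k=1.5):
    """Keep feature-match pairs whose displacement lies within k standard
    deviations of the mean displacement.

    Illustrative motion-consistency filter loosely inspired by the paper's
    "distance standard deviation constraint"; the exact formulation used
    by the authors is not reproduced here.
    """
    src = np.asarray(src_pts, dtype=float)
    dst = np.asarray(dst_pts, dtype=float)
    disp = np.linalg.norm(dst - src, axis=1)      # per-pair displacement
    mean, std = disp.mean(), disp.std()
    # Small epsilon so that a perfectly consistent set (std == 0) keeps all pairs.
    return np.abs(disp - mean) <= k * std + 1e-9

# Four consistent matches (unit shift in x) plus one spurious pair.
src = [(0, 0), (10, 0), (0, 10), (10, 10), (5, 5)]
dst = [(1, 0), (11, 0), (1, 10), (11, 10), (40, 40)]
mask = filter_matches_by_distance_std(src, dst)
print(mask)  # the spurious final pair is rejected
```

In practice such a filter would run after descriptor matching (e.g. after a local feature constraint has pruned implausible correspondences), discarding pairs inconsistent with the dominant image motion before the motion matrix is estimated.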

Keywords:  Augmented reality; Facial repair surgeries; Motion estimation; Pose tracking

Year:  2022        PMID: 35286586     DOI: 10.1007/s11548-022-02589-0

Source DB:  PubMed          Journal:  Int J Comput Assist Radiol Surg        ISSN: 1861-6410            Impact factor:   2.924


  5 in total

1.  Development and comparison of new hybrid motion tracking for bronchoscopic navigation.

Authors:  Xióngbiāo Luó; Marco Feuerstein; Daisuke Deguchi; Takayuki Kitasaka; Hirotsugu Takabatake; Kensaku Mori
Journal:  Med Image Anal       Date:  2010-12-13       Impact factor: 8.545

2.  Point Cloud Saliency Detection by Local and Global Feature Fusion.

Authors:  Xiaoying Ding; Weisi Lin; Zhenzhong Chen; Xinfeng Zhang
Journal:  IEEE Trans Image Process       Date:  2019-05-30       Impact factor: 10.856

3.  Perception enhancement using importance-driven hybrid rendering for augmented reality based endoscopic surgical navigation.

Authors:  Yakui Chu; Xu Li; Xilin Yang; Danni Ai; Yong Huang; Hong Song; Yurong Jiang; Yongtian Wang; Xiaohong Chen; Jian Yang
Journal:  Biomed Opt Express       Date:  2018-10-04       Impact factor: 3.732

4.  Augmented reality surgical navigation with accurate CBCT-patient registration for dental implant placement.

Authors:  Longfei Ma; Weipeng Jiang; Boyu Zhang; Xiaofeng Qu; Guochen Ning; Xinran Zhang; Hongen Liao
Journal:  Med Biol Eng Comput       Date:  2018-07-02       Impact factor: 2.602

5.  A fast and accurate feature-matching algorithm for minimally-invasive endoscopic images.

Authors:  Gustavo A Puerto-Souza; Gian-Luca Mariottini
Journal:  IEEE Trans Med Imaging       Date:  2013-01-14       Impact factor: 10.048


Beijing Coyote Bioscience Co., Ltd. (北京卡尤迪生物科技股份有限公司) © 2022-2023.