
Plane-Aided Visual-Inertial Odometry for 6-DOF Pose Estimation of a Robotic Navigation Aid.

He Zhang, Cang Ye.

Abstract

The classic visual-inertial odometry (VIO) method estimates the 6-DOF pose of a moving camera by fusing the camera's ego-motion estimated by visual odometry (VO) with the motion measured by an inertial measurement unit (IMU). VIO attempts to update the estimates of the IMU's biases at each step by using the VO's output, thereby improving the accuracy of the IMU measurements. This approach works only if an accurate VO output can be identified and used. However, no reliable method exists for evaluating the accuracy of the VO online. In this paper, a new VIO method is introduced for pose estimation of a robotic navigation aid (RNA) that uses a 3D time-of-flight camera for assistive navigation. The method, called plane-aided visual-inertial odometry (PAVIO), extracts planes from the 3D point cloud of the current camera view and tracks them into the next camera view by using the IMU's measurements. The covariance matrix of each tracked plane's parameters is computed and used to perform a plane consistency check, based on a chi-square test, to evaluate the accuracy of the VO's output. PAVIO accepts a VO output only if it is accurate. The accepted VO outputs, the information of the extracted planes, and the IMU's measurements over time are used to build a factor graph. By optimizing the graph, the method improves the accuracy of the IMU bias estimates and reduces the camera's pose error. Experimental results with the RNA validate the effectiveness of the proposed method. PAVIO can be used to estimate the 6-DOF pose of any 3D-camera-based visual-inertial navigation system.
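The plane consistency check described in the abstract can be sketched as a Mahalanobis-distance gate against a chi-square threshold. The sketch below is illustrative only and is not the paper's implementation: the function name, the 3-vector plane parameterization (scaled normal n·d), and the 95% significance level are assumptions.

```python
import numpy as np

# Chi-square 95% critical value for 3 degrees of freedom
# (a plane has 3 free parameters: unit normal direction plus distance).
CHI2_95_3DOF = 7.815

def plane_consistency_check(pred_plane, obs_plane, cov, threshold=CHI2_95_3DOF):
    """Gate a VO output by checking whether a plane predicted from the
    IMU-propagated pose agrees with the plane observed in the next view.

    pred_plane, obs_plane: 3-vectors of plane parameters (scaled-normal
        parameterization assumed here); cov: 3x3 covariance of the residual.
    Returns (is_consistent, squared_mahalanobis_distance).
    """
    r = obs_plane - pred_plane                   # plane-parameter residual
    d2 = float(r @ np.linalg.solve(cov, r))     # squared Mahalanobis distance
    return d2 <= threshold, d2
```

A VO output whose tracked planes pass this gate would be accepted into the factor graph; a large Mahalanobis distance flags an inconsistent (likely inaccurate) VO estimate.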

Keywords:  VO/SLAM; visual-inertial odometry; pose estimation; robotic navigation aid

Year:  2020        PMID: 33747673      PMCID: PMC7977623          DOI: 10.1109/access.2020.2994299

Source DB:  PubMed          Journal:  IEEE Access        ISSN: 2169-3536            Impact factor:   3.367


Related articles (4 in total)

1.  Direct Sparse Odometry.

Authors:  Jakob Engel; Vladlen Koltun; Daniel Cremers
Journal:  IEEE Trans Pattern Anal Mach Intell       Date:  2017-04-12       Impact factor: 6.226

2.  6-DOF Pose Estimation of a Robotic Navigation Aid by Tracking Visual and Geometric Features.

Authors:  Cang Ye; Soonhac Hong; Amirhossein Tamjidi
Journal:  IEEE Trans Autom Sci Eng       Date:  2015-10-05       Impact factor: 5.083

3.  An Indoor Wayfinding System Based on Geometric Features Aided Graph SLAM for the Visually Impaired.

Authors:  He Zhang; Cang Ye
Journal:  IEEE Trans Neural Syst Rehabil Eng       Date:  2017-03-15       Impact factor: 3.802

4.  Co-Robotic Cane: A New Robotic Navigation Aid for the Visually Impaired.

Authors:  Cang Ye; Soonhac Hong; Xiangfei Qian; Wei Wu
Journal:  IEEE Syst Man Cybern Mag       Date:  2016-08-24
