| Literature DB >> 26039422 |
Xianglong Kong, Wenqi Wu, Lilian Zhang, Yujie Wang.
Abstract
This paper presents a novel approach for estimating the ego-motion of a vehicle in dynamic and unknown environments using tightly-coupled inertial and visual sensors. To improve accuracy and robustness, we exploit the combination of point and line features to aid navigation. The mathematical framework is based on trifocal geometry among image triplets, which is simple and unified for point and line features. For the fusion algorithm design, we employ the Extended Kalman Filter (EKF) for error-state prediction and covariance propagation, and the Sigma Point Kalman Filter (SPKF) for robust measurement updates in the presence of high nonlinearities. Outdoor and indoor experiments show that the combination of point and line features improves estimation accuracy and robustness compared to an algorithm using point features alone.
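The abstract's fusion scheme pairs an EKF-style covariance propagation with a sigma-point (unscented) measurement update to handle nonlinear measurements. The following is a minimal, self-contained sketch of that hybrid on a toy 1D position/velocity state with a nonlinear range measurement; it is illustrative only and not the authors' implementation (the paper's filter propagates a full INS error state with trifocal-geometry measurements, and all names and numbers below are assumptions):

```python
import numpy as np

def sigma_points(x, P, kappa=1.0):
    """Symmetric sigma points and weights for the unscented transform."""
    n = x.size
    S = np.linalg.cholesky((n + kappa) * P)  # matrix square root of scaled covariance
    pts = np.vstack([x, x + S.T, x - S.T])   # rows: x, x +/- each column of S
    w = np.full(2 * n + 1, 1.0 / (2.0 * (n + kappa)))
    w[0] = kappa / (n + kappa)
    return pts, w

def ekf_predict(x, P, F, Q):
    """EKF-style propagation with a (linearized) transition matrix F."""
    return F @ x, F @ P @ F.T + Q

def spkf_update(x, P, z, h, R):
    """Sigma-point measurement update for a scalar nonlinear measurement z = h(x)."""
    pts, w = sigma_points(x, P)
    zs = np.array([h(p) for p in pts])  # propagate sigma points through h
    z_hat = w @ zs                      # predicted measurement
    dz = zs - z_hat
    Pzz = w @ (dz * dz) + R             # innovation variance
    Pxz = (pts - x).T @ (w * dz)        # state-measurement cross covariance
    K = Pxz / Pzz                       # Kalman gain (vector, since z is scalar)
    return x + K * (z - z_hat), P - np.outer(K, K) * Pzz

# Toy usage: range to a beacon 1 m off the track, h(x) = sqrt(p^2 + 1).
x = np.array([1.5, 1.0])              # prior: position 1.5 m, velocity 1 m/s
P = np.diag([1.0, 0.25])
h = lambda s: np.sqrt(s[0] ** 2 + 1.0)
z = np.sqrt(2.0 ** 2 + 1.0)           # range measured from true position 2.0 m
x_post, P_post = spkf_update(x, P, z, h, R=0.01)
```

The sigma-point update avoids linearizing `h` analytically, which is the robustness-to-nonlinearity property the abstract attributes to the SPKF stage.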
Keywords: point and line features; tightly-coupled; trifocal geometry; vision-aided inertial navigation
Year: 2015 PMID: 26039422 PMCID: PMC4507590 DOI: 10.3390/s150612816
Source DB: PubMed Journal: Sensors (Basel) ISSN: 1424-8220 Impact factor: 3.576
Figure 1. (a) The point-line-point correspondence among three views; (b) Stereo geometry for two views and the line-line-line configuration.
Figure 2. Sample image with extracted point (red) and line (green) features.
Figure 3. The motion trajectory plotted on Google Maps. The initial position is denoted by a red square.
Figure 4. 3D position errors of different solutions.
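For reference, the two configurations in Figure 1 correspond to the standard trifocal-tensor transfer relations (written here in Hartley-Zisserman index notation as background, not reproduced from the paper itself):

```latex
% Point-line-point transfer: a point x in view 1 and a line l' through
% its match in view 2 determine the corresponding point x'' in view 3.
\[ x''^{\,k} = x^{i}\, l'_{j}\, \mathcal{T}_i^{\,jk} \]

% Line-line-line transfer: lines l' and l'' in views 2 and 3 map back
% to the matching line l in view 1.
\[ l_{i} = l'_{j}\, l''_{k}\, \mathcal{T}_i^{\,jk} \]
```

Both relations use the same tensor \(\mathcal{T}_i^{\,jk}\), which is why a trifocal formulation gives a unified measurement model for point and line features.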
The overall RMSE of the outdoor experiment.
| Methods | Position RMSE (m) | Orientation RMSE (deg) |
|---|---|---|
| VINS (points and lines) | 10.6338 | 0.8313 |
| VINS (points only) | 16.4150 | 0.9126 |
| Pure INS | 2149.9 | 2.0034 |
| Pure stereo odometry | 72.6399 | 8.1809 |
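The RMSE figures above are the usual root-mean-square of the per-epoch error norms against ground truth. As a minimal sketch of that metric (not the authors' evaluation code):

```python
import numpy as np

def rmse(estimated, ground_truth):
    """RMSE of an N x d trajectory: sqrt of the mean squared error norm."""
    err = np.asarray(estimated, dtype=float) - np.asarray(ground_truth, dtype=float)
    return float(np.sqrt(np.mean(np.sum(err ** 2, axis=1))))
```

Applied to 3D positions in metres (or Euler angles in degrees), this yields the corresponding column of the table.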
Figure 5. The velocity estimation errors and 3σ bounds (the large deviations around the 100th second are due to ground-truth errors).
Figure 6. The attitude estimation errors and 3σ bounds.
Figure 7. Estimated gyroscope and accelerometer biases.
Figure 8. Performance in a low-textured indoor environment: (a) Experimental setup and scene; (b) Top view of the estimated trajectories; (c) The number of point and line inliers used to estimate the motion.
The accuracy specifications and sampling rates of the sensors.
| Sensors | Accuracies | Sampling Rates |
|---|---|---|
| IMU | Gyro bias stability (1σ) | 100 Hz |
| Stereo Camera | Resolution: 640 × 480 pixels; focal length: 3.8 mm; field of view: 70°; baseline: 12 cm | 12 Hz |