Mingjing Gao, Min Yu, Hang Guo, Yuan Xu.
Abstract
Multi-sensor integrated navigation technology has been applied to the indoor navigation and positioning of robots. To address the low navigation accuracy and error accumulation of mobile robots that rely on a single sensor, this paper presents an indoor mobile robot positioning method based on a combination of visual and inertial sensors. First, the visual sensor (Kinect) is used to obtain the color image and the depth image, and feature matching is performed by an improved scale-invariant feature transform (SIFT) algorithm. Then, the absolute orientation algorithm is used to calculate the rotation matrix and translation vector of the robot between two consecutive image frames. An inertial measurement unit (IMU) offers high-frequency updates and rapid, accurate positioning, and can compensate for the Kinect's lower update rate and limited precision. Three-dimensional data, such as acceleration, angular velocity, magnetic field strength, and temperature, can be obtained in real time with an IMU. The data obtained by the visual sensor are loosely coupled with those obtained by the IMU; that is, the differences between the positions and attitudes output by the two sensors are optimally combined by an adaptive fade-out extended Kalman filter to estimate the errors. Finally, several experiments show that this method significantly improves the indoor positioning accuracy of mobile robots based on visual and inertial sensors.
Keywords: SIFT algorithm; adaptive fade-out extended Kalman filter; inertial sensor; robot positioning; visual sensor
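The absolute orientation step in the abstract recovers the frame-to-frame rigid motion from 3D feature points matched across two consecutive Kinect frames. The sketch below shows the standard SVD-based closed-form solution to this least-squares problem; the paper does not spell out which variant it uses, so the function name and interface here are illustrative assumptions.

```python
import numpy as np

def absolute_orientation(P, Q):
    """Estimate R, t such that Q ≈ R @ P + t.

    P, Q: (3, N) arrays of matched 3D points from two consecutive
    frames (e.g., SIFT matches back-projected with the depth image).
    """
    # Centroids of both point sets
    p_bar = P.mean(axis=1, keepdims=True)
    q_bar = Q.mean(axis=1, keepdims=True)

    # Cross-covariance of the centered points
    H = (P - p_bar) @ (Q - q_bar).T

    # SVD-based least-squares rotation, with a reflection guard
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T

    # Translation follows from the centroids
    t = q_bar - R @ p_bar
    return R, t
```

Chaining the per-frame (R, t) estimates yields the visual odometry trajectory whose drift the IMU fusion is then meant to correct.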
Year: 2019 PMID: 31013897 PMCID: PMC6515221 DOI: 10.3390/s19081773
Source DB: PubMed Journal: Sensors (Basel) ISSN: 1424-8220 Impact factor: 3.576
Figure 1. Kinect coordinate system.
Figure 2. Feature description area.
Figure 3. Flow chart of the visual positioning method.
Figure 4. Loose combination and filtering system.
Figure 5. Comparison of matching results (example 1).
Figure 6. Comparison of matching results (example 2; the image dataset used in the experiment is from the Technical University of Munich, Germany).
Table 1. Comparison of matching results (example 1).
| | SIFT Algorithm | Improved SIFT Algorithm | Reduction |
|---|---|---|---|
| Feature points | 158 | 116 | 26.58% |
| Time (s) | 1.273 | 1.036 | 18.62% |
Table 2. Comparison of matching results (example 2).
| | SIFT Algorithm | Improved SIFT Algorithm | Reduction |
|---|---|---|---|
| Feature points | 114 | 83 | 27.20% |
| Time (s) | 0.943 | 0.775 | 17.82% |
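For context on the matching comparisons above, the sketch below shows the baseline SIFT pipeline with Lowe's ratio test using OpenCV (cv2.SIFT_create requires OpenCV 4.4 or newer). It reproduces only the standard algorithm in the left column of Tables 1 and 2; the paper's improved feature description area is not implemented here, and the 0.7 ratio threshold is an assumed conventional value.

```python
import cv2

def match_sift(img1, img2, ratio=0.7):
    """Baseline SIFT matching between two grayscale images."""
    sift = cv2.SIFT_create()
    kp1, des1 = sift.detectAndCompute(img1, None)
    kp2, des2 = sift.detectAndCompute(img2, None)

    # Brute-force matching on L2 distance, two nearest neighbors each
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    knn = matcher.knnMatch(des1, des2, k=2)

    # Lowe's ratio test: keep a match only if it clearly beats the runner-up
    good = []
    for pair in knn:
        if len(pair) == 2 and pair[0].distance < ratio * pair[1].distance:
            good.append(pair[0])
    return kp1, kp2, good

# Example usage with two consecutive frames (hypothetical file names):
# f1 = cv2.imread("frame1.png", cv2.IMREAD_GRAYSCALE)
# f2 = cv2.imread("frame2.png", cv2.IMREAD_GRAYSCALE)
# kp1, kp2, matches = match_sift(f1, f2)
```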
Figure 7. Simulation map.
Figure 8. EKF algorithm simulation.
Figure 9. AFEKF algorithm simulation.
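The fade-out mechanism separating Figures 8 and 9 can be illustrated with a single predict/update cycle in which the predicted covariance is inflated by a fading factor, down-weighting old information when the model drifts. This is a minimal sketch of the generic fading-memory EKF form; the paper derives its fading factor adaptively from the innovation sequence, which is not reproduced here, so `lam` is passed in as a plain parameter and all names are illustrative.

```python
import numpy as np

def afekf_step(x, P, z, F, H, Q, R, lam=1.0):
    """One predict/update cycle of a fading-memory EKF (sketch).

    x, P : prior state estimate and covariance
    z    : measurement; in the loose combination this is the difference
           between the Kinect and IMU position/attitude outputs
    F, H : state-transition and measurement Jacobians
    Q, R : process and measurement noise covariances
    lam  : fading factor >= 1 (lam == 1 recovers the ordinary EKF);
           here it is a fixed input, not the paper's adaptive estimate
    """
    # Prediction, with the covariance inflated by the fading factor
    x_pred = F @ x
    P_pred = lam * (F @ P @ F.T) + Q

    # Standard Kalman measurement update
    S = H @ P_pred @ H.T + R             # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)  # Kalman gain
    x_new = x_pred + K @ (z - H @ x_pred)
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new
```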
Table 3. Five selected control points (in meters).
| Number | 1 | 2 | 3 | 4 | 5 |
|---|---|---|---|---|---|
| Control Points | (0, −25) | (75, 60) | (18, 35) | (−93, −80) | (−20, −80) |
| EKF-SLAM | (0, −25) | (71, 60) | (10, 33) | (−82, −90) | (−5, −80) |
| AFEKF-SLAM | (0, −25) | (75, 60) | (18, 35) | (−100, −79) | (−22, −81) |
Table 4. Error values of the points in Table 3 (in meters).
| Number | 1 | 2 | 3 | 4 | 5 | Average |
|---|---|---|---|---|---|---|
| EKF-SLAM(X) | 0 | 4 | 8 | 11 | 15 | 7.6 |
| EKF-SLAM(Y) | 0 | 0 | 2 | 10 | 0 | 2.4 |
| AFEKF-SLAM(X) | 0 | 0 | 0 | 7 | 2 | 1.8 |
| AFEKF-SLAM(Y) | 0 | 0 | 0 | 1 | 1 | 0.4 |
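The averages in Table 4 are simply the per-axis mean absolute deviations of each filter's points from the true control points in Table 3, which can be checked directly:

```python
import numpy as np

# Coordinates from Table 3
true_pts  = np.array([(0, -25), (75, 60), (18, 35), (-93, -80), (-20, -80)])
ekf_pts   = np.array([(0, -25), (71, 60), (10, 33), (-82, -90), (-5, -80)])
afekf_pts = np.array([(0, -25), (75, 60), (18, 35), (-100, -79), (-22, -81)])

# Per-axis mean absolute error over the five points
print(np.abs(ekf_pts - true_pts).mean(axis=0))    # [7.6 2.4]
print(np.abs(afekf_pts - true_pts).mean(axis=0))  # [1.8 0.4]
```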
Figure 10. Straight track.
Table 5. Control points’ coordinates in the straight-line experiment (in meters).
| Waypoints | 1 | 2 | 3 | 4 | 5 | 6 |
|---|---|---|---|---|---|---|
| True Values | (0, 0) | (0, 0.2) | (0, 0.4) | (0, 0.6) | (0, 0.7) | (0, 1) |
| Kinect | (0, 0) | (0.04, 0.18) | (0.06, 0.38) | (0.08, 0.64) | (0.062, 0.72) | (0.09, 0.98) |
| IMU/Kinect | (0, 0) | (0.024, 0.19) | (0.021, 0.4) | (0.015, 0.58) | (0.021, 0.69) | (0.038, 1) |
Table 6. Absolute values of the coordinate errors (in meters).
| | Kinect (x) | Kinect (y) | IMU/Kinect (x) | IMU/Kinect (y) |
|---|---|---|---|---|
| Average | 0.06 | 0.02 | 0.02 | 0.01 |
Figure 11. Elliptical track.
Table 7. Control points’ coordinates of the elliptical motion experiment (in meters).
| Waypoints | 1 | 2 | 3 | 4 | 5 | 6 | 7 |
|---|---|---|---|---|---|---|---|
| True Values | (0, 0) | (−2, −1) | (−3.5, −4) | (−3, −6) | (−1, −7) | (1, −4) | (1.4, −2) |
| Kinect | (0, 0) | (−2, −0.8) | (−4.3, −4) | (−3.3, −5.8) | (−0.7, −7.1) | (0.92, −4.2) | (−0.1, 0) |
| IMU/Kinect | (0, 0) | (−1.7, −1) | (−3.5, −4) | (−3, −6) | (−1, −6.6) | (1, −4.4) | (1.3, −2) |
Table 8. Absolute values of the coordinate errors (in meters).
| | Kinect (x) | Kinect (y) | IMU/Kinect (x) | IMU/Kinect (y) |
|---|---|---|---|---|
| Average | 0.55 | 0.39 | 0.06 | 0.12 |
Figure 12. Polygon track.
Table 9. Control points’ coordinates of the polygon test (in meters).
| Waypoints | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 |
|---|---|---|---|---|---|---|---|---|---|
| True Values | (0, 0) | (2.5, 0) | (3.7, 1) | (3.7, 2.3) | (3, 3) | (2, 3) | (1.3, 2.2) | (0, 1.5) | (0, 0) |
| Kinect | (0, 0) | (2.5, −0.16) | (3.9, 1.5) | (3.85, 2.7) | (3.1, 3.2) | (2, 2.9) | (1.3, 2.5) | (−0.1, 1.4) | (0, −0.2) |
| IMU/Kinect | (0, 0) | (2.5, 0.016) | (3.3, 1.5) | (3.7, 2.3) | (2.9, 3.1) | (2, 2.6) | (1.3, 2.3) | (0.1, 1.3) | (0, −0.05) |
Table 10. Absolute values of the coordinate errors (in meters).
| | Kinect (x) | Kinect (y) | IMU/Kinect (x) | IMU/Kinect (y) |
|---|---|---|---|---|
| Average | 0.11 | 0.22 | 0.06 | 0.15 |