| Literature DB >> 35957295 |
Ahmed M M Almassri1,2, Natsuki Shirasawa1, Amarbold Purev1, Kaito Uehara1, Wataru Oshiumi1, Satoru Mishima1, Hiroaki Wagatsuma1.
Abstract
This study presents an effective artificial neural network (ANN) approach that combines measurements from inertial measurement units (IMUs) and time-of-flight (TOF) measurements from an ultra-wideband (UWB) system with OptiTrack Motion Capture System (OptiT-MCS) data to guarantee the positioning accuracy of motion tracking in indoor environments. The proposed fusion approach unifies the advantages of both technologies: high data rates from the MCS, and global translational precision from the IMU/UWB localization system. Consequently, it yields accurate position estimates when compared with data from the IMU/UWB system relative to the OptiT-MCS reference system. Both the IMU/UWB and MCS positioning systems were calibrated and evaluated in real-time movement on a diverse set of motion recordings using a mobile robot. The proposed neural network (NN) approach experimentally yielded accurate position estimates, improving the average mean absolute percentage error (MAPE) by 17.56% and 7.48% in the X and Y coordinates, respectively, with a coefficient of correlation R greater than 99%. Moreover, the experimental results show that the proposed NN fusion maintains high accuracy in position estimates while preventing drift errors from growing unboundedly, implying that the proposed approach is more effective than the compared approaches.
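As a sanity check on the abstract's figures, the reported MAPE improvement can be read as the difference between the raw IMU/UWB error and the NN-fused error given in the results table; this interpretation is an assumption, and the values below are the paper's reported total averages.

```python
# Assumed interpretation: "enhancement" = raw IMU/UWB MAPE minus NN MAPE,
# using the total-average MAPE values (%) from the results table.
mape_imu_uwb = {"X": 25.62, "Y": 17.03}  # total average MAPE before fusion
mape_nn = {"X": 8.06, "Y": 9.55}         # total average MAPE after NN fusion

enhancement = {axis: round(mape_imu_uwb[axis] - mape_nn[axis], 2)
               for axis in ("X", "Y")}
print(enhancement)  # → {'X': 17.56, 'Y': 7.48}, matching the abstract
```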
Keywords: indoor positioning system; inertial measurement units; motion capture system; neural network; robot position measurement; sensor fusion; ultra-wideband
Year: 2022 PMID: 35957295 PMCID: PMC9371076 DOI: 10.3390/s22155737
Source DB: PubMed Journal: Sensors (Basel) ISSN: 1424-8220 Impact factor: 3.847
Figure 1Block diagram of the neural network in training and application stages.
Figure 2Proposed NN fusion position estimation using the integration of IMU/UWB with MCS data fusion.
Figure 3Position measurement system of IMU/UWB and MCS using a mobile robot.
Configuration parameters of the UWB system.
| UWB Setting | Values |
|---|---|
| Channel | 2 |
| Positioning protocols | Time-Of-Flight (TOF) |
| Update rate | 11 Hz |
| Bandwidth | 499.2 MHz |
| Data bitrate | 850 kb/s |
| Pulse repetition frequency (PRF) | 64 MHz |
| Preamble length | 2048 |
| Transmit power Tx gain | 15.5 dB |
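The positioning protocol listed above is time of flight. A minimal sketch of TOF ranging is below; the function name and the example flight time are illustrative, not from the paper.

```python
# Minimal TOF ranging sketch: a measured flight time times the speed of
# light gives the tag-anchor distance (illustrative, not the paper's code).
C = 299_792_458.0  # speed of light in m/s

def tof_to_distance(tof_seconds: float) -> float:
    """Convert a one-way time of flight to a tag-anchor distance in metres."""
    return C * tof_seconds

# A 10 ns flight time corresponds to roughly 3 m.
d = tof_to_distance(10e-9)
```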
The anchors’ coordinates of the UWB system.
| Anchor No. | X (mm) | Y (mm) | Z (mm) |
|---|---|---|---|
| 1 | −1000 | 0 | 1500 |
| 2 | 4500 | 0 | 600 |
| 3 | 4180 | 4230 | 1000 |
| 4 | −1000 | 4200 | 400 |
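Given ranges from the tag to these four anchors, a position can be recovered by multilateration. The paper does not publish its solver, so the linearized least-squares formulation below is a standard illustration only, using the anchor coordinates from the table above.

```python
import numpy as np

# Anchor (x, y, z) coordinates in mm, from the table above.
anchors = np.array([
    [-1000,    0, 1500],
    [ 4500,    0,  600],
    [ 4180, 4230, 1000],
    [-1000, 4200,  400],
], dtype=float)

def multilaterate(ranges: np.ndarray) -> np.ndarray:
    """Estimate (x, y, z) from ranges to each anchor (same units as anchors).

    Subtracting the first anchor's sphere equation from the others turns the
    quadratic system into a linear one, solved here by least squares.
    """
    a0, r0 = anchors[0], ranges[0]
    A = 2.0 * (anchors[1:] - a0)
    b = (r0**2 - ranges[1:]**2
         + np.sum(anchors[1:]**2, axis=1) - np.sum(a0**2))
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos

# Round-trip check with a synthetic tag position.
true_pos = np.array([2000.0, 2000.0, 500.0])
ranges = np.linalg.norm(anchors - true_pos, axis=1)
est = multilaterate(ranges)
```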
Figure 4Test area used in three scenarios: (a) circular motion, (b) random motion on Y-axis, and (c) random motion on X-axis.
Figure 5Omni wheels mobile robot attached with Tag/IMU and passive retroreflective marker.
Figure 6Schematic of four-layer artificial neural network fusion data.
Specifications and parameters of the ANN model and development environment.
| Training Parameters/Component | Values | Note |
|---|---|---|
| Neural network model used | Feed forward | |
| Input nodes | 2 | X, Y Position of IMU/UWB Sensor |
| Hidden layer | 2 | |
| Hidden layer neurons | 20 | |
| Output layer neurons | 2 | |
| Output nodes | 2 | |
| Training network algorithm | LMBP | |
| Training percentage | 70 | |
| Testing percentage | 15 | |
| Validation percentage | 15 | |
| Transfer function hidden layers | Tansig | |
| Transfer function output layer | Purelin | |
| Data division | Random | |
| No. of epochs | 1000 | |
| Validation checks (iterations) | 6 | |
| Performance | Mean squared error (MSE) | |
| IDE | MATLAB R2019a | |
| Operating System | Windows 10 | |
| CPU | Intel(R) Core(TM) i9-9900K CPU @ 3.60 GHz | |
| Memory | 64 GB |
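The network in the table (2 inputs, two hidden layers of 20 tansig neurons, 2 purelin outputs) can be sketched as a forward pass as follows. The weights here are random placeholders; in the paper they are trained with Levenberg-Marquardt backpropagation (LMBP) in MATLAB, which this Python sketch does not reproduce.

```python
import numpy as np

# Placeholder weights for a 2-20-20-2 feed-forward network matching the
# table above (tansig = tanh hidden layers, purelin = linear output).
rng = np.random.default_rng(0)
W1, b1 = rng.standard_normal((20, 2)), np.zeros(20)
W2, b2 = rng.standard_normal((20, 20)), np.zeros(20)
W3, b3 = rng.standard_normal((2, 20)), np.zeros(2)

def forward(xy: np.ndarray) -> np.ndarray:
    """Map an IMU/UWB (x, y) estimate to a corrected (x, y) position."""
    h1 = np.tanh(W1 @ xy + b1)   # hidden layer 1, tansig
    h2 = np.tanh(W2 @ h1 + b2)   # hidden layer 2, tansig
    return W3 @ h2 + b3          # output layer, purelin

out = forward(np.array([1.0, 2.0]))  # two outputs: corrected X and Y
```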
Data points of the tracked positions in the three experiments for the different algorithms (sampling rates in parentheses).
| Experiment Type | IMU/UWB (11 Hz) | MCS (50 Hz) | IMU/UWB (10 Hz) | OptiT-MCS (10 Hz) | Fusion (10 Hz) |
|---|---|---|---|---|---|
| Circular Motion | 63,568 | 294,405 | 57,772 | 59,033 | 58,797 |
| Random 1 Motion | 13,253 | 62,855 | 12,380 | 12,589 | 12,393 |
| Random 2 Motion | 13,619 | 61,202 | 12,194 | 12,402 | 12,318 |
| Total | 90,440 | 418,462 | 82,346 | 84,024 | 83,508 |
Figure 7Trajectory position tracking of the mobile robot in X-Y coordinates by IMU/UWB system compared to OptiT-MCS for 150 trials of three types of experiments.
Figure 8Trajectory position tracking of mobile robot in X-Y coordinates by IMU/UWB system and proposed NN compared to OptiT-MCS for one trial of three types of experiments.
Prediction error metrics (MAD, MAPE, and RMSE) along the X and Y axes of the three experiments for the different algorithms. MAD and RMSE are in mm; MAPE is in %.
| Experiment Type | X: IMU/UWB (MAD/MAPE/RMSE) | X: NN (MAD/MAPE/RMSE) | Y: IMU/UWB (MAD/MAPE/RMSE) | Y: NN (MAD/MAPE/RMSE) |
|---|---|---|---|---|
| Circular Motion | 120.07/56.88/142.3 | 57.49/10.96/74.93 | 76.51/26.34/96.55 | 58.94/10.78/77.99 |
| Random 1 Motion | 142.08/10.61/166.4 | 75.34/6.08/98.81 | 186.39/16.42/228.86 | 146.44/12.11/185.15 |
| Random 2 Motion | 132.85/9.36/158.59 | 106.30/7.15/135.7 | 66.91/8.39/82.11 | 64.54/5.77/80.60 |
| Total Average error | 131.67/25.62/155.76 | 79.71/8.06/103.14 | 109.94/17.03/135.84 | 89.97/9.55/114.58 |
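For reference, the three metrics in the table can be computed as follows, assuming the conventional formulas (here MAD is taken as the mean absolute error); the sample arrays are illustrative, not the paper's data.

```python
import numpy as np

def mad(y_true: np.ndarray, y_pred: np.ndarray) -> float:
    """Mean absolute deviation (error) between reference and estimate."""
    return float(np.mean(np.abs(y_true - y_pred)))

def mape(y_true: np.ndarray, y_pred: np.ndarray) -> float:
    """Mean absolute percentage error, in %."""
    return float(np.mean(np.abs((y_true - y_pred) / y_true)) * 100.0)

def rmse(y_true: np.ndarray, y_pred: np.ndarray) -> float:
    """Root-mean-square error."""
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))

# Illustrative reference vs. estimated positions (mm).
y_true = np.array([100.0, 200.0, 400.0])
y_pred = np.array([110.0, 190.0, 420.0])
print(mad(y_true, y_pred), mape(y_true, y_pred), rmse(y_true, y_pred))
```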
Figure 9Metric errors prediction results using MAD in X and Y coordinates of 150 experimental trials.