| Literature DB >> 35214294 |
Pingping Peng1, Chao Yu1,2,3, Qihao Xia1, Zhengqi Zheng1,2, Kun Zhao1,2, Wen Chen1,2,3.
Abstract
Continuous positioning and tracking of multiple pedestrian targets is a common concern in large indoor spaces for security, emergency evacuation, location-based services, and other applications. Among the sensors used for positioning, ultra-wideband (UWB) is a key technology for achieving high-precision indoor positioning. However, because of indoor Non-Line-of-Sight (NLOS) error, a single positioning system can no longer meet the requirements for positioning accuracy. This research aimed to design a high-precision, stable fusion positioning system based on UWB and vision. The method uses the Hungarian algorithm to match the identities of the UWB and vision localization results; after a successful match, fusion localization is performed by a federated Kalman filtering algorithm. In addition, because colored noise is present in indoor positioning data, this paper also proposes a Kalman filtering algorithm based on principal component analysis (PCA). The advantage of this new filtering algorithm is that it does not require a dynamics model built on distributional assumptions and needs less computation. The PCA algorithm is first used to minimize the correlation of the observables, providing a more reasonable Kalman gain through energy estimation; the denoised data are then substituted into the Kalman prediction equations. Experimental results show that the average accuracy of the UWB-vision fusion method is 25.3% higher than that of UWB alone. The proposed method can effectively suppress the influence of NLOS error on positioning accuracy because of the high stability and continuity of visual positioning. Furthermore, compared with traditional Kalman filtering, the mean square error of the new filtering algorithm is reduced by 31.8%. After PCA-Kalman filtering, the colored noise is reduced and the Kalman gain becomes more reasonable, allowing the filter to estimate the state accurately.
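The matching-then-fusion pipeline described in the abstract can be sketched as follows. This is not the authors' code: the variance values are illustrative assumptions, and a simple inverse-variance weighting stands in for the federated Kalman filter. Only the use of the Hungarian algorithm (via `scipy.optimize.linear_sum_assignment`) for identity matching follows the paper directly.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def match_and_fuse(uwb_xy, vis_xy, uwb_var=0.33**2, vis_var=0.30**2):
    """Match UWB fixes to vision tracks, then fuse each matched pair.

    uwb_xy: (N, 2) array of UWB position fixes in metres.
    vis_xy: (M, 2) array of vision position fixes in metres.
    uwb_var, vis_var: assumed per-sensor variances (hypothetical values).
    """
    # Cost matrix: Euclidean distance between every UWB fix and vision fix.
    cost = np.linalg.norm(uwb_xy[:, None, :] - vis_xy[None, :, :], axis=2)
    # Hungarian algorithm: minimum-cost one-to-one identity assignment.
    rows, cols = linear_sum_assignment(cost)
    # Inverse-variance weighting as a stand-in for federated Kalman fusion.
    w_uwb = vis_var / (uwb_var + vis_var)
    fused = w_uwb * uwb_xy[rows] + (1.0 - w_uwb) * vis_xy[cols]
    return list(zip(rows, cols)), fused

# Two pedestrians whose UWB and vision fixes arrive in different orders.
pairs, fused = match_and_fuse(
    np.array([[0.0, 0.0], [5.0, 5.0]]),
    np.array([[5.1, 4.9], [0.1, -0.1]]))
```

Here `pairs` recovers the correct identity assignment (UWB track 0 with vision track 1, and vice versa) before fusion, which is the role the Hungarian step plays ahead of the federated filter in the paper.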
Keywords: PCA; UWB; fusion; indoor positioning; vision
Year: 2022 PMID: 35214294 PMCID: PMC8963045 DOI: 10.3390/s22041394
Source DB: PubMed Journal: Sensors (Basel) ISSN: 1424-8220 Impact factor: 3.576
Figure 1. TDOA positioning schematic diagram.
Figure 2. PCA–Kalman algorithm block diagram.
Figure 3. Visual target tracking results.
Figure 4. Results of continuous visual positioning.
Figure 5. Flowchart of the fusion localization algorithm model.
Figure 6. Experimental environment.
Figure 7. Experimental Scenario 1.
Figure 8. Experimental Scenario 2.
Figure 9. Positioning results.
Figure 10. Error of the X-axis.
Figure 11. Error of the Y-axis.
Figure 12. Error probability distribution.
Figure 13. Experimental scenario with three obstacles.
Figure 14. Effect of the DeepSort algorithm with obstacles.
Figure 15. Fusion positioning results with different numbers of obstacles.
Figure 16. Fusion positioning results for different durations of human-target loss.
Positioning results under different experimental conditions.
| Scheme | Number of Human Targets | Duration of Human Target Loss (s) | Number of Obstacles | Mean Error of Visual Positioning (cm) | Mean Error of UWB Positioning (cm) | Mean Error of Fusion Positioning (cm) |
|---|---|---|---|---|---|---|
| ① | 2 | 3 | 0 | 30.16 | 33.25 | 22.43 |
| ② | 1 | 3 | 0 | 29.72 | 30.89 | 20.94 |
| ③ | 1 | 5 | 0 | 29.45 | 32.23 | 21.28 |
| ④ | 1 | 1 | 0 | 28.58 | 33.99 | 21.99 |
| ⑤ | 1 | 0 | 2 | 27.73 | 36.58 | 23.45 |
| ⑥ | 1 | 0 | 3 | 28.25 | 38.56 | 25.08 |
Figure 17. Positioning results. The experimenter starts from the red star.
Figure 18. Error probability distribution.