Phong Ha Nguyen, Ki Wan Kim, Young Won Lee, Kang Ryoung Park.
Abstract
Unmanned aerial vehicles (UAVs), commonly known as drones, have proved to be useful not only on battlefields, where manned flight is considered too risky or difficult, but also in everyday applications such as surveillance, monitoring, rescue, unmanned cargo delivery, aerial video, and photography. More advanced drones use global positioning system (GPS) receivers in the navigation and control loop, which enables smart GPS features for drone navigation. However, problems arise when drones operate in heterogeneous areas with no GPS signal, so it is important to research the development of UAVs with autonomous navigation and landing guidance using computer vision. In this research, we determined how to safely land a drone in the absence of GPS signals using our remote marker-based tracking algorithm based on a visible light camera sensor. The proposed method uses a uniquely designed marker as the tracking target during landing procedures. Experimental results show that our method significantly outperforms state-of-the-art object trackers in terms of both accuracy and processing time, and we performed tests on an embedded system in various environments.
Keywords: UAV landing; remote marker-based tracking; unmanned aerial vehicle (UAV); visible light camera sensor
Year: 2017 PMID: 28867775 PMCID: PMC5621353 DOI: 10.3390/s17091987
Source DB: PubMed Journal: Sensors (Basel) ISSN: 1424-8220 Impact factor: 3.576
Summary of comparisons between the proposed and previous studies.
| Categories | Sub-Categories | Type of Camera | Time for Drone Landing | Descriptions | Strength | Weakness |
|---|---|---|---|---|---|---|
| Passive methods | | A trinocular system with three visible light FireWire cameras | Daytime [ | Color landmarks on the UAV are used as key features for position estimation by a ground station using the CamShift algorithm. | This research not only achieved good results for the landing and positioning tasks, but is also practical for real-time applications. | The feature extraction algorithm is not guaranteed to work under low-light conditions. |
| | | A pan-tilt unit (PTU) with stereo infrared cameras | Daytime and nighttime with various weather conditions [ | Several target tracking algorithms were evaluated and tested on a quadrotor and a fixed-wing aircraft. | The target can be tracked early thanks to the enlarged field of view provided by the PTU. | Low accuracy in the case of fixed-wing touchdown points and high-temperature objects in the background. |
| | | Two-NIR-camera array system with an NIR laser lamp | Daytime and nighttime with different light conditions [ | An NIR laser lamp is fixed on the nose of the UAV for easy detection. | A wide-baseline camera-array-based method was proposed to achieve high precision in calibration and localization. | It is not practical for use in narrow landing areas. |
| Active methods | Without marker | Visible light camera | Daytime and nighttime with guiding lamp [ | Image binarization and the Hough transform are used to extract the runway border line for landing in the FlightGear flight simulator. | A simple algorithm is used for border-line detection of the runway. | Guiding lamps are required at nighttime, which makes it difficult to use in various places. |
| | | | Daytime [ | A two-stage processing procedure finds all possible landing areas and selects the best one using a naive Bayesian classifier. | Without a marker, the drone can find a landing site in emergency cases. | Experiments were not performed in various places and times. In addition, the system was evaluated on a Mac Pro laptop computer. |
| | With marker | Thermal camera | Daytime and nighttime [ | Using a letter-based marker emitting FIR light, feature points can be extracted from the marker so that the drone can perform translation or rotation movements in order to land safely at the desired location. | Using the thermal image, marker detection is less affected by illumination, time, and environmental changes. | A costly thermal camera must be mounted on the drone, so this approach cannot be used in conventional drone systems equipped with only a visible light camera. |
| | | Visible light camera | Daytime [ | The marker is detected by contours, a circle detector, or keypoint descriptors based on SURF, etc. | Marker detection is possible with the conventional visible light camera of a drone. | The marker is detected only during daytime. |
| | | | Daytime and nighttime (proposed method) | A real-time marker-based tracking algorithm is tested on an onboard system with low processing power. | The marker detection method operates both in daytime and at nighttime. | A specific marker is required for the proposed method. |
Figure 1. The two coordinate systems of the drone and the world: (a) before changing the yaw of the drone; (b) after changing the yaw of the drone.
Figure 2. Our proposed marker design for drone landing.
Figure 3. Flowchart of the proposed marker-based tracking algorithm during daytime.
Figure 4. Predicted center and direction using our proposed method. The dark blue and red points represent the positions detected by the ATM and profile checker algorithms, respectively.
Figure 5. Binarized profile extracted from the measured circle of Figure 4.
Figure 6. Our profile checker algorithm used to find the accurate center of the marker. The dark blue point is the incorrect center detected by the ATM algorithm, and the red point is the correct one detected by the profile checker algorithm.
Figure 7. Strategy to predict the marker center when the drone is close to the marker. Using the detected center (dark blue point) obtained by the ATM algorithm as the input to the Kalman filter when the profile checker algorithm fails, we obtain the corrected center (green point) as the final prediction. The direction of the marker is then re-estimated using the profile checker algorithm with the final center.
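For readers who want to reproduce the center-correction step sketched in Figure 7, the following is a minimal constant-velocity Kalman filter over the 2D marker center using OpenCV. It assumes the ATM detection feeds the filter as the measurement; the state model and noise covariances are illustrative placeholders, not the values used in the paper.

```python
import numpy as np
import cv2

# Constant-velocity Kalman filter for smoothing the 2D marker center.
# State: [x, y, vx, vy]; measurement: [x, y] from the ATM / profile checker stage.
kf = cv2.KalmanFilter(4, 2)
kf.transitionMatrix = np.array([[1, 0, 1, 0],
                                [0, 1, 0, 1],
                                [0, 0, 1, 0],
                                [0, 0, 0, 1]], dtype=np.float32)
kf.measurementMatrix = np.array([[1, 0, 0, 0],
                                 [0, 1, 0, 0]], dtype=np.float32)
kf.processNoiseCov = np.eye(4, dtype=np.float32) * 1e-2      # illustrative values
kf.measurementNoiseCov = np.eye(2, dtype=np.float32) * 1e-1  # illustrative values

def correct_center(detected_xy):
    """Predict the next marker center and, if a detection is available, correct it.
    detected_xy: (x, y) from the ATM stage, or None when detection fails."""
    predicted = kf.predict()
    if detected_xy is not None:
        measurement = np.array([[detected_xy[0]], [detected_xy[1]]], dtype=np.float32)
        corrected = kf.correct(measurement)
        return float(corrected[0, 0]), float(corrected[1, 0])
    # No measurement: fall back to the pure prediction.
    return float(predicted[0, 0]), float(predicted[1, 0])
```

The corrected center can then be handed back to the profile checker to re-estimate the marker direction, mirroring the flow described in the caption above.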
Figure 8. Flowchart of the proposed marker-based tracking algorithm during nighttime.
Figure 9. Our proposed marker segmentation results: (a) original image; (b) after applying adaptive thresholding; (c) after applying hit-and-miss morphology; (d) final result after dilation.
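A minimal sketch of the nighttime segmentation pipeline shown in Figure 9 is given below, assuming OpenCV. The block size, threshold offset, hit-and-miss kernel, and dilation parameters are illustrative placeholders rather than the paper's tuned values.

```python
import numpy as np
import cv2

def segment_marker_night(gray):
    """Rough nighttime marker segmentation: adaptive thresholding,
    hit-and-miss filtering, then dilation (parameter values are illustrative)."""
    # Adaptive thresholding copes with uneven illumination at night.
    binary = cv2.adaptiveThreshold(gray, 255, cv2.ADAPTIVE_THRESH_GAUSSIAN_C,
                                   cv2.THRESH_BINARY, 21, 5)
    # Hit-and-miss morphology keeps pixels whose neighborhood matches the kernel
    # (1 = must be foreground, -1 = must be background, 0 = don't care).
    hitmiss_kernel = np.array([[0, 1, 0],
                               [1, 1, 1],
                               [0, 1, 0]], dtype="int")
    filtered = cv2.morphologyEx(binary, cv2.MORPH_HITMISS, hitmiss_kernel)
    # Dilation restores the marker blob after the filtering step.
    dilated = cv2.dilate(filtered, np.ones((5, 5), np.uint8), iterations=2)
    return dilated

# Example usage on a single frame:
# frame = cv2.imread("night_frame.png")
# mask = segment_marker_night(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY))
```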
Figure 10. Detected center and direction of the marker: (a) result image using our proposed method; (b) profile visualization from the circle of Figure 10a.
Figure 11. Our custom-built embedded system.
Description of DDroneC-DB1.
| Kinds of Sub-Database | Time | Condition | Description |
|---|---|---|---|
| Sub-database 1 (drone landing) | Morning | Humidity: 41.5%, wind speed: 1.4 m/s, temperature: 8.6 °C, spring, sunny | A sunny day with clear sky, which affected the illumination on the marker |
| | Afternoon | Humidity: 73.8%, wind speed: 2 m/s, temperature: −2.5 °C, winter, cloudy | Low level of illumination observed in wintertime, which affected the intensity of the background area |
| | Evening | Humidity: 38.4%, wind speed: 3.5 m/s, temperature: 3.5 °C, winter, windy | The marker's position changes due to strong wind |
| | Night | Humidity: 37.5%, wind speed: 3.2 m/s, temperature: 6.9 °C, spring, foggy | The marker cannot be seen owing to the low level of light at dark night |
| Sub-database 2 (drone hovering) | Morning | Humidity: 41.6%, wind speed: 2.5 m/s, temperature: 11 °C, spring, foggy | The drone hovers above the marker, and the marker is manually moved and rotated while capturing videos |
| | Afternoon | Humidity: 43.5%, wind speed: 2.8 m/s, temperature: 13 °C, spring, sunny | |
| | Evening | Humidity: 42.9%, wind speed: 2.9 m/s, temperature: 10 °C, spring | |
| | Night | Humidity: 41.5%, wind speed: 3.1 m/s, temperature: 6 °C, spring, dark night | |
Comparisons of average center location error (CLE) obtained by our method with those obtained by other methods (pixels).
| Categories | Sequence | Ours without KF | Ours with KF | ATM | MIL [ | TLD [ | Median Flow [ | KCF [ |
|---|---|---|---|---|---|---|---|---|
| Sub-database 1 | Morning | 3.32 | 3.26 | 63.98 | 4.29 | 103.3 | 83.45 | 31.03 |
| | Afternoon | 2.91 | 2.86 | 15.69 | 5.58 | 58.01 | 92.21 | 13.86 |
| | Evening | 3.89 | 3.54 | 28.11 | 8.85 | 75.13 | 95.84 | 7.42 |
| | Night | 8.36 | 12.2 | 65.22 | 28.15 | 48.1 | 23.56 | 31.65 |
| Sub-database 2 | Morning | 1.98 | 1.94 | 38.19 | 1.92 | 32.34 | 6.11 | 5.58 |
| | Afternoon | 2.32 | 2.05 | 32.1 | 5.04 | 25.66 | 4.19 | 3.14 |
| | Evening | 1.74 | 1.68 | 37.99 | 7.8 | 2.98 | 8.08 | 1.73 |
| | Night | 7.12 | 9.75 | 49.78 | 6.33 | 15.94 | 12.37 | 7.45 |
Comparisons of average PDE obtained by our method with those obtained by other methods (degrees).
| Categories | Sequence | Ours without KF | Ours with KF | ATM |
|---|---|---|---|---|
| Sub-database 1 | Morning | 0.72 | 0.51 | 50.94 |
| | Afternoon | 0.88 | 0.47 | 2.64 |
| | Evening | 1.33 | 1.01 | 17.37 |
| | Night | 4.39 | 5.9 | 80.56 |
| Sub-database 2 | Morning | 2.45 | 1.97 | 26.84 |
| | Afternoon | 2.79 | 2.17 | 43.65 |
| | Evening | 2.28 | 1.83 | 27.16 |
| | Night | 3.35 | 4.12 | 68.17 |
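For reference, the two reported metrics can be computed as sketched below, assuming CLE is the Euclidean pixel distance between the predicted and ground-truth marker centers and PDE is the absolute angular difference (in degrees) between the predicted and ground-truth marker directions; this interpretation is an assumption based on the table units, and the names below are hypothetical helpers.

```python
import numpy as np

def center_location_error(pred_center, gt_center):
    """CLE: Euclidean distance (pixels) between predicted and ground-truth centers."""
    return float(np.linalg.norm(np.asarray(pred_center) - np.asarray(gt_center)))

def direction_error_deg(pred_angle_deg, gt_angle_deg):
    """Absolute angular difference (degrees), wrapped into [0, 180]."""
    diff = abs(pred_angle_deg - gt_angle_deg) % 360.0
    return min(diff, 360.0 - diff)

# Average errors over a sequence of frames:
# avg_cle = np.mean([center_location_error(p, g) for p, g in zip(pred_centers, gt_centers)])
# avg_pde = np.mean([direction_error_deg(p, g) for p, g in zip(pred_angles, gt_angles)])
```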
Figure 12. Comparisons of CLE and PDE obtained by our method with those obtained by other methods on sub-database 1, including the videos captured (a) in the morning, (b) in the afternoon, (c) in the evening, and (d) at night.
Figure 13. Comparisons of CLE and PDE obtained by our method with those obtained by other methods on sub-database 2, including the videos captured (a) in the morning, (b) in the afternoon, (c) in the evening, and (d) at night.
Figure 14. Marker detection examples obtained by our method and previous methods on sub-database 1, including the videos captured (a,b) in the morning, (c,d) in the afternoon, (e,f) in the evening, and (g) at night.
Figure 15. Marker detection examples obtained by our method and previous methods on sub-database 2, including the videos captured (a,b) in the morning, (c,d) in the afternoon, (e,f) in the evening, and (g) at night.
Comparisons of average processing time achieved by our proposed method with those obtained by other methods (ms).
| Categories | Sequence | Ours without KF | Ours with KF | MIL [ | TLD [ | Median Flow [ | KCF [ |
|---|---|---|---|---|---|---|---|
| Sub-database 1 | Morning | 22 | 23 | 367 | 2971 | 43 | 359 |
| | Afternoon | 22 | 22 | 371 | 2921 | 44 | 258 |
| | Evening | 22 | 23 | 369 | 2192 | 43 | 223 |
| | Night | 24 | 25 | 740 | 3993 | 88 | 180 |
| Sub-database 2 | Morning | 20 | 20 | 754 | 3129 | 92 | 165 |
| | Afternoon | 20 | 22 | 768 | 3427 | 74 | 145 |
| | Evening | 21 | 21 | 762 | 3419 | 72 | 104 |
| | Night | 23 | 25 | 730 | 4530 | 86 | 151 |
Figure 16. Some examples of our camera calibration process using a chessboard pattern.
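A standard chessboard calibration of the kind illustrated in Figure 16 can be sketched with OpenCV as follows; the board size and image folder are hypothetical placeholders, and the resulting intrinsics feed the pose-estimation comparison further below.

```python
import glob
import numpy as np
import cv2

# Inner-corner grid of the chessboard (placeholder size; use the actual board).
BOARD_COLS, BOARD_ROWS = 9, 6

# 3D object points of the board corners on the Z = 0 plane (unit: one square).
objp = np.zeros((BOARD_ROWS * BOARD_COLS, 3), np.float32)
objp[:, :2] = np.mgrid[0:BOARD_COLS, 0:BOARD_ROWS].T.reshape(-1, 2)

obj_points, img_points = [], []
image_size = None
for path in glob.glob("calib/*.png"):  # hypothetical image folder
    gray = cv2.cvtColor(cv2.imread(path), cv2.COLOR_BGR2GRAY)
    found, corners = cv2.findChessboardCorners(gray, (BOARD_COLS, BOARD_ROWS))
    if not found:
        continue
    # Refine corner locations to sub-pixel accuracy.
    corners = cv2.cornerSubPix(
        gray, corners, (11, 11), (-1, -1),
        (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 0.001))
    obj_points.append(objp)
    img_points.append(corners)
    image_size = gray.shape[::-1]

# Intrinsic matrix and distortion coefficients used later for pose estimation.
rms, camera_matrix, dist_coeffs, rvecs, tvecs = cv2.calibrateCamera(
    obj_points, img_points, image_size, None, None)
print("RMS re-projection error:", rms)
```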
Figure 17. Experimental setup of our marker and the ArUco marker for estimating the full pose.
3D coordinates of all detected key points of the ArUco marker and our proposed marker.
| Methods | Key Points | X | Y | Z |
|---|---|---|---|---|
| Our marker | P2 | | | 0 |
| | P4 | | | 0 |
| | P9 | | | 0 |
| | P13 | | | 0 |
| ArUco marker | A | | | 0 |
| | B | | | 0 |
| | C | | | 0 |
| | D | | | 0 |
Figure 18. Examples of the computed pose estimation of our proposed marker compared with the ArUco marker in the case of free-style flying.
Figure 19. Comparison between the estimated X, Y, Z translations of our marker and the ArUco marker.
Figure 20. Comparison between the estimated yaw, pitch, and roll rotations of our marker and the ArUco marker.
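The comparison against the ArUco marker suggests a pose-estimation setup along the following lines. This is only a sketch, assuming the legacy cv2.aruco module from opencv-contrib-python (version 4.6 or earlier; newer releases changed this API) and cv2.solvePnP for the proposed marker's key points (e.g., P2, P4, P9, P13 with the planar 3D coordinates from the table above); it is not the authors' exact implementation.

```python
import numpy as np
import cv2
import cv2.aruco as aruco  # requires opencv-contrib-python (legacy aruco API)

def aruco_pose(gray, camera_matrix, dist_coeffs, marker_length):
    """Pose of a single ArUco marker; marker_length in the same unit as the output translation."""
    dictionary = aruco.getPredefinedDictionary(aruco.DICT_4X4_50)
    corners, ids, _ = aruco.detectMarkers(gray, dictionary)
    if ids is None:
        return None
    rvecs, tvecs, _ = aruco.estimatePoseSingleMarkers(
        corners, marker_length, camera_matrix, dist_coeffs)
    return rvecs[0], tvecs[0]

def custom_marker_pose(image_points, object_points, camera_matrix, dist_coeffs):
    """Pose of the proposed marker from at least four 2D-3D key-point correspondences
    (e.g., detected P2, P4, P9, P13 and their planar 3D coordinates)."""
    ok, rvec, tvec = cv2.solvePnP(object_points.astype(np.float32),
                                  image_points.astype(np.float32),
                                  camera_matrix, dist_coeffs)
    return (rvec, tvec) if ok else None
```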
Average error of pose estimation in a free-flying scenario.
| Category | Average Error between Our Marker and ArUco Marker-Based Methods |
|---|---|
| X | 0.076 |
| Y | 0.014 |
| Z | 0.095 |
| Yaw | 1.8° |
| Pitch | 1.15° |
| Roll | 2.09° |
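The rotational errors in the table above can be reproduced in spirit by converting each estimated rotation vector to Euler angles and averaging the absolute per-frame differences between the two markers, as sketched below; the Z-Y-X (yaw-pitch-roll) Euler convention is an assumption and may differ from the paper's.

```python
import numpy as np
import cv2

def rvec_to_euler_deg(rvec):
    """Convert a Rodrigues rotation vector to (yaw, pitch, roll) in degrees.
    Assumes a Z-Y-X convention; the paper's exact convention may differ."""
    R, _ = cv2.Rodrigues(np.asarray(rvec, dtype=np.float64))
    sy = np.hypot(R[0, 0], R[1, 0])
    if sy > 1e-6:
        roll = np.arctan2(R[2, 1], R[2, 2])
        pitch = np.arctan2(-R[2, 0], sy)
        yaw = np.arctan2(R[1, 0], R[0, 0])
    else:  # near gimbal lock
        roll = np.arctan2(-R[1, 2], R[1, 1])
        pitch = np.arctan2(-R[2, 0], sy)
        yaw = 0.0
    return np.degrees([yaw, pitch, roll])

def average_abs_error(estimates_a, estimates_b):
    """Mean absolute difference between two per-frame sequences
    (e.g., translations or Euler angles from our marker vs. the ArUco marker)."""
    return np.mean(np.abs(np.asarray(estimates_a) - np.asarray(estimates_b)), axis=0)
```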