| Literature DB >> 30423879 |
Juan J González-Quiñones, Juan F Reinoso-Gordo, Carlos A León-Robles, José L García-Balboa, Francisco J Ariza-López.
Abstract
Point cloud (PC) generation from photogrammetry-remotely piloted aircraft systems (RPAS) at high spatial and temporal resolution and accuracy is of increasing importance for many applications. For several years, photogrammetry-RPAS has been used to derive products for civil engineering works such as digital elevation models (DEMs), triangulated irregular networks (TINs), contour levels, orthophotographs, etc. This study analyzes the influence of the variables involved in the accuracy of PC generation over asphalt surfaces and determines the most influential variable by developing an artificial neural network (ANN) on patterns identified in the test flights. The inputs were the flight and camera variables involved, and the output was the three-dimensional root mean square error (3D-RMSE) of the PC at each ground control point (GCP). The results show that the variable with the greatest influence on PC accuracy is the modulation transfer function 50 (MTF50). In addition, the study obtained an average 3D-RMSE of 1 cm. The results can be used by the scientific and civil engineering communities to take MTF50 into account when obtaining images from RPAS cameras and to predict the accuracy of a PC over asphalt with the ANN developed. This ANN could also be the beginning of a large database containing patterns from the many cameras and lenses on the world market.
Keywords: accuracy model; camera; modulation transfer function; neural network; photogrammetry
Year: 2018 PMID: 30423879 PMCID: PMC6263902 DOI: 10.3390/s18113880
Source DB: PubMed Journal: Sensors (Basel) ISSN: 1424-8220 Impact factor: 3.576
Figure 1. Research workflow. RPAS, remotely piloted aircraft systems; GCP, ground control point; MTF50, modulation transfer function 50.
Figure 2. RPAS.
Figure 3. Flight test zone.
Set of flight missions. GSD, ground sample distance.
| Flight Mission | Focal Length (mm) | GSD (mm) | Overlap (%) |
|---|---|---|---|
| 1 | 4 | 7 | 90 |
| 2 | 4 | 7 | 80 |
| 3 | 4 | 7 | 70 |
| 4 | 6 | 5 | 90 |
| 5 | 6 | 5 | 80 |
| 6 | 6 | 5 | 70 |
| 7 | 4 | 5 | 90 |
| 8 | 4 | 5 | 80 |
| 9 | 4 | 5 | 70 |
| 10 | 6 | 7 | 90 |
| 11 | 6 | 7 | 80 |
| 12 | 6 | 7 | 70 |
| 13 | 4 | 9.5 | 90 |
| 14 | 4 | 9.5 | 80 |
| 15 | 4 | 9.5 | 70 |
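The GSD column follows from the standard photogrammetric relation GSD = pixel pitch × flight altitude / focal length. A minimal sketch of that relation; the pixel pitch and altitude used in the usage note are illustrative assumptions, not values reported in the paper:

```python
def ground_sample_distance(pixel_pitch_um, focal_length_mm, altitude_m):
    """GSD in mm/pixel. With pitch in micrometres, focal length in mm,
    and altitude in metres, the unit conversion factors cancel out,
    leaving this simple ratio."""
    return pixel_pitch_um * altitude_m / focal_length_mm
```

For example, an assumed 2.4 µm pixel pitch at 4 mm focal length and roughly 11.7 m altitude yields a GSD of about 7 mm/pixel, matching the table's 4 mm / 7 mm rows.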
Figure 4. Development of the point cloud (PC). The first column shows the GCP number, the next three show the real GCP coordinates (ground truth), and the last shows the square error between the PC generated by photogrammetry and the ground truth.
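The per-GCP square errors shown in Figure 4 aggregate into the 3D-RMSE that serves as the ANN's output. A minimal sketch of that computation, assuming both point sets are given as (x, y, z) tuples in matching order:

```python
import math

def rmse_3d(predicted, truth):
    """3D root mean square error over a set of ground control points.
    predicted, truth: equal-length lists of (x, y, z) tuples."""
    if len(predicted) != len(truth):
        raise ValueError("point lists must have equal length")
    # Squared Euclidean distance between each PC point and its GCP
    sq = [(px - tx) ** 2 + (py - ty) ** 2 + (pz - tz) ** 2
          for (px, py, pz), (tx, ty, tz) in zip(predicted, truth)]
    return math.sqrt(sum(sq) / len(sq))
```

A PC point offset from its GCP by 1 cm in z alone, for instance, contributes a 3D error of exactly 0.01 m.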
Figure 5. Workflow to determine the MTF50 of each photograph on the vertical and horizontal axes.
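MTF50 is the spatial frequency at which a camera's modulation transfer function drops to 50% of its zero-frequency value. Given an already-measured MTF curve (the sample values below are hypothetical, not from the paper), the crossing can be found by linear interpolation; a sketch:

```python
def mtf50(frequencies, mtf_values):
    """Frequency (e.g. cycles/pixel) where a sampled, decreasing MTF
    curve first crosses 0.5, via linear interpolation between samples.
    Returns None if the curve never falls to 0.5."""
    pairs = list(zip(frequencies, mtf_values))
    for (f0, m0), (f1, m1) in zip(pairs, pairs[1:]):
        if m0 >= 0.5 >= m1:
            if m0 == m1:
                return f0
            # Interpolate the exact crossing point within this segment
            return f0 + (m0 - 0.5) * (f1 - f0) / (m0 - m1)
    return None
```

Note this assumes the MTF curve itself is already available; measuring it from photographs (e.g. with a slanted-edge target) is a separate step in the Figure 5 workflow.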
Figure 6. Artificial neural network inputs–output.
Figure 7. (a) Design of the ANN scheme; (b) mean square error (MSE); (c) error histogram showing the errors of each phase of ANN setting (training, validation, and testing); (d) training parameters of the ANN.
Figure 8. Influence of each variable on model accuracy.
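The ANN maps flight/camera variables to the 3D-RMSE at each GCP. A minimal sketch of such a regression network, one tanh hidden layer trained by stochastic gradient descent; the layer size, learning rate, and toy data are placeholders, not the authors' configuration:

```python
import math
import random

def train_mlp(patterns, targets, hidden=4, lr=0.05, epochs=2000, seed=0):
    """One-hidden-layer MLP (tanh hidden units, linear output) trained
    by per-sample gradient descent on squared error.
    patterns: list of input vectors; targets: list of scalars."""
    rng = random.Random(seed)
    n_in = len(patterns[0])
    w1 = [[rng.uniform(-0.5, 0.5) for _ in range(n_in)] for _ in range(hidden)]
    b1 = [0.0] * hidden
    w2 = [rng.uniform(-0.5, 0.5) for _ in range(hidden)]
    b2 = 0.0
    for _ in range(epochs):
        for x, t in zip(patterns, targets):
            # Forward pass
            h = [math.tanh(sum(w * xi for w, xi in zip(row, x)) + b)
                 for row, b in zip(w1, b1)]
            y = sum(w, 0.0) if False else sum(w * hi for w, hi in zip(w2, h)) + b2
            err = y - t
            # Backward pass: output layer, then hidden layer
            for j in range(hidden):
                grad_h = err * w2[j] * (1.0 - h[j] ** 2)
                w2[j] -= lr * err * h[j]
                for i in range(n_in):
                    w1[j][i] -= lr * grad_h * x[i]
                b1[j] -= lr * grad_h
            b2 -= lr * err

    def predict(x):
        h = [math.tanh(sum(w * xi for w, xi in zip(row, x)) + b)
             for row, b in zip(w1, b1)]
        return sum(w * hi for w, hi in zip(w2, h)) + b2
    return predict
```

In the study's setting, each pattern would hold the flight variables (focal length, GSD, overlap, MTF50, ...) and each target the measured 3D-RMSE at a GCP; a toy linear mapping suffices to exercise the sketch.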