Nan Su, Yiming Yan, Chunhui Zhao, Liguo Wang.
Abstract
In this paper, we propose a novel object-oriented hierarchy radiation consistency method for the dense matching of different temporal and different sensor data in 3D reconstruction. For different temporal images, we propose an illumination consistency method that addresses both the illumination uniformity within a single image and the relative illumination normalization between image pairs. In the relative illumination normalization step in particular, singular value equalization and the linear relationship of invariant pixels are used in combination: the former for the initial global illumination normalization and the latter for the object-oriented refined illumination normalization. For different sensor images, we propose the union group sparse method, which improves on the original group sparse model. The different sensor images are brought to a similar smoothness level by applying the same singular value threshold to the union group matrix. Our method comprehensively considers the factors that influence the dense matching of different temporal and different sensor stereoscopic image pairs, improving the illumination consistency and the smoothness consistency simultaneously. The radiation consistency experiments verify the effectiveness and superiority of the proposed method in comparison with two other methods. Moreover, in the dense matching experiments on the mixed stereoscopic image pairs, our method shows clear advantages for objects in urban areas.
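The initial global illumination normalization relies on singular value equalization. As a rough illustration only (not the paper's exact formulation), two same-size grayscale images can be rebuilt with a shared singular-value spectrum; the function name and the choice of averaging the two spectra are assumptions of this sketch:

```python
import numpy as np

def singular_value_equalization(img_a, img_b):
    """Illustrative global normalization: rebuild two same-size
    grayscale images with a shared (averaged) singular-value spectrum."""
    Ua, sa, Vta = np.linalg.svd(img_a, full_matrices=False)
    Ub, sb, Vtb = np.linalg.svd(img_b, full_matrices=False)
    s_mean = (sa + sb) / 2.0  # shared spectrum for the pair
    # Rebuild each image with its own singular vectors but the shared
    # spectrum, so the pair's global brightness/contrast energy matches.
    return (Ua * s_mean) @ Vta, (Ub * s_mean) @ Vtb
```

After this step, the two images have identical singular-value spectra, which serves as a coarse global match before any per-object refinement.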
Keywords: dense matching in 3D reconstruction; different temporal and different sensor images; illumination consistency; smoothness consistency
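The object-oriented refined normalization uses a linear relationship over invariant pixels. A minimal least-squares sketch, assuming the invariant pixels have already been identified and collected into 1-D arrays; the function name and the gain/offset model are illustrative, not taken from the paper:

```python
import numpy as np

def invariant_pixel_fit(src_vals, ref_vals):
    """Fit a linear gain/offset mapping the source image's invariant
    pixels onto the reference image's, by least squares."""
    A = np.stack([src_vals, np.ones_like(src_vals)], axis=1)
    (gain, offset), *_ = np.linalg.lstsq(A, ref_vals, rcond=None)
    return gain, offset
```

The fitted pair would then be applied per object as `gain * src_object + offset` to pull the source object's radiometry toward the reference.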
Year: 2018 PMID: 29495333 PMCID: PMC5876752 DOI: 10.3390/s18030682
Source DB: PubMed Journal: Sensors (Basel) ISSN: 1424-8220 Impact factor: 3.576
Figure 1. The workflow of the proposed method.
Figure 2. The proposed smoothness consistency method based on the union group sparse model.
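The union-group-sparse smoothing applies one singular-value threshold to a matrix formed from patch groups of both sensors, so both images are driven to a similar smoothness level. A simplified sketch using hard thresholding (the paper's grouping and optimization details are omitted; the function name and the hard-threshold rule are assumptions):

```python
import numpy as np

def union_group_svt(group_a, group_b, tau):
    """Hard singular-value thresholding on the union of two patch groups.

    group_a, group_b: 2-D arrays (patches as columns) with equal row counts
    tau: shared threshold below which singular values are discarded
    """
    # Stack the two sensors' groups side by side into one union matrix.
    union = np.hstack([group_a, group_b])
    U, s, Vt = np.linalg.svd(union, full_matrices=False)
    # One shared threshold drives both groups to a similar smoothness level.
    s_kept = np.where(s >= tau, s, 0.0)
    smoothed = (U * s_kept) @ Vt
    # Split the union matrix back into the two groups.
    n = group_a.shape[1]
    return smoothed[:, :n], smoothed[:, n:]
```

Because both groups share the thresholded spectrum of the union matrix, the weaker detail components are suppressed consistently in the two images rather than independently per sensor.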
Figure 3. Comparative experimental results of radiation consistency on the mixed stereoscopic image pairs. (a) The different temporal QuickBird&WorldView-2 image pairs. (b) The different temporal QuickBird&WorldView-2 image pairs. (c) The different temporal BJ-2 image pairs. (Note: (1) the original images, (2) images by Zhang's method, (3) images by Zhong's method, (4) images by our method.)
Figure 4. Object number.
The quantitative analysis of radiation consistency based on objects.

| Similarity | The Original Images | Zhang's Method | Zhong's Method | Our Method |
|---|---|---|---|---|
| Object 1 | 0.748 | 0.750 | 0.763 | |
| Object 2 | 0.915 | 0.944 | 0.930 | |
| Object 3 | 0.468 | 0.656 | 0.643 | |
| Object 4 | 0.907 | 0.973 | 0.966 | |
| Object 5 | 0.888 | 0.777 | 0.855 | |
| Object 6 | 0.851 | 0.765 | 0.856 | |
| Object 7 | 0.814 | 0.769 | 0.835 | |
| Object 8 | 0.837 | 0.857 | 0.811 | |
| Object 9 | 0.838 | 0.804 | 0.859 | |
Figure 5. Comparative experiments of dense matching for the mixed stereoscopic image pairs. (a) The different temporal QuickBird&WorldView-2 image pairs. (b) The different temporal QuickBird&WorldView-2 image pairs. (c) The different temporal BJ-2 image pairs. (Note: (1) dense matching results by the original images; (2), (3), and (4) dense matching results by Zhang's method, Zhong's method, and our method, respectively.)
Figure 6. The ground-truth disparity image in the object areas (produced manually).
The correct matching rate in object areas.
| Object Number | Original Images | Zhang's Method | Zhong's Method | Our Method |
|---|---|---|---|---|
| 1 | 0.074 | 0.111 | 0.062 | 0.449 |
| 2 | 0.000 | 0.064 | 0.063 | 0.737 |
| 3 | 0.060 | 0.176 | 0.240 | 0.821 |
| 4 | 0.597 | 0.577 | 0.602 | 0.619 |
| 5 | 0.271 | 0.033 | 0.293 | 0.376 |
| 6 | 0.124 | 0.013 | 0.131 | 0.327 |
| 7 | 0.345 | 0.126 | 0.345 | 0.505 |
| 8 | 0.602 | 0.607 | 0.612 | 0.613 |
| 9 | 0.510 | 0.473 | 0.682 | 0.699 |
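The correct matching rate in the table above can be understood as the fraction of pixels whose estimated disparity falls within a tolerance of the manual ground truth of Figure 6. A sketch, where the one-pixel default tolerance and the function name are assumptions, not values stated in the record:

```python
import numpy as np

def correct_matching_rate(disp, disp_gt, tol=1.0, valid=None):
    """Fraction of (valid) pixels whose estimated disparity lies
    within `tol` of the manually produced ground-truth disparity."""
    if valid is None:
        # By default, evaluate wherever the ground truth is defined.
        valid = np.isfinite(disp_gt)
    err = np.abs(disp - disp_gt)
    return float(np.mean(err[valid] <= tol))
```

A higher rate means more pixels of the object were matched to within the tolerance, which is how the four methods are ranked per object in the table.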