| Literature DB >> 22303164 |
Liya Duan, Tao Guan, Bo Yang.
Abstract
Augmented reality (AR) is a field of computer research that deals with combining the real world and computer-generated data. Registration is one of the most difficult problems currently limiting the usability of AR systems. In this paper, we propose a novel natural-feature-tracking-based registration method for AR applications. The proposed method has the following advantages: (1) It is simple and efficient, as no man-made markers are needed for either indoor or outdoor AR applications; moreover, it can work with arbitrary geometric shapes, including planar, near-planar, and non-planar structures, which greatly enhances the usability of AR systems. (2) Thanks to the reduced-SIFT-based augmented optical flow tracker, the virtual scene can still be augmented on the specified areas even under occlusion and large changes in viewpoint throughout the entire process. (3) It is easy to use, because the adaptive classification tree based matching strategy gives fast and accurate initialization, even when the initial camera view differs greatly from the reference image. Experimental evaluations validate the performance of the proposed method for online pose tracking and augmentation.
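The optical-flow tracking stage the abstract refers to builds on Lucas-Kanade style feature tracking. As a rough, hypothetical sketch (not the authors' implementation; the window size and synthetic images are illustrative only), a single Lucas-Kanade step estimating the displacement of one feature point between two frames can be written as:

```python
import numpy as np

def lucas_kanade_point(I, J, x, y, win=9):
    """Estimate the (dx, dy) displacement of the patch centred at (x, y)
    between frames I and J with one Lucas-Kanade least-squares step."""
    h = win // 2
    # Spatial gradients of the first frame (central differences).
    Iy, Ix = np.gradient(I.astype(float))
    # Temporal gradient between the two frames.
    It = J.astype(float) - I.astype(float)
    sl = (slice(y - h, y + h + 1), slice(x - h, x + h + 1))
    ix, iy, it = Ix[sl].ravel(), Iy[sl].ravel(), It[sl].ravel()
    A = np.stack([ix, iy], axis=1)
    # Solve Ix*dx + Iy*dy = -It over the window in the least-squares sense.
    d, *_ = np.linalg.lstsq(A, -it, rcond=None)
    return d  # (dx, dy)

# Synthetic example: a smooth Gaussian blob shifted by one pixel in x.
yy, xx = np.mgrid[0:64, 0:64]
I = np.exp(-((xx - 32) ** 2 + (yy - 32) ** 2) / 50.0)
J = np.exp(-((xx - 33) ** 2 + (yy - 32) ** 2) / 50.0)
dx, dy = lucas_kanade_point(I, J, 32, 32)
```

In a full system this step would be applied to each SIFT keypoint per frame, with the recovered correspondences feeding the pose-estimation (registration) stage.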
Keywords: augmented reality; narrow baseline; natural features; registration; scale invariant feature transform; wide baseline
Year: 2009 PMID: 22303164 PMCID: PMC3267212 DOI: 10.3390/s91210097
Source DB: PubMed Journal: Sensors (Basel) ISSN: 1424-8220 Impact factor: 3.576
Figure 1. Update of the patch number of a leaf node. The figure is for illustration purposes; a usable system contains larger numbers of features and training patches.
Figure 2. Influence of the proposed method.
Figure 3. Flowchart of the proposed registration method.
Figure 4. Results of the first indoor experiment.
Figure 5. Results of the second indoor experiment.
Figure 6. Results with a planar scene.
Figure 7. Results of the outdoor experiment.
Figure 8. Computation time.
Figure 9. Feature recovery performance.
Figure 10. Reprojection errors.
Figure 11. Comparison with the KLT and projective reconstruction based methods.
Figure 12. Error comparison between our method and other methods.