
Fusion of range and stereo data for high-resolution scene-modeling.

Georgios D Evangelidis, Miles Hansard, Radu Horaud.   

Abstract

This paper addresses the problem of range-stereo fusion, for the construction of high-resolution depth maps. In particular, we combine low-resolution depth data with high-resolution stereo data, in a maximum a posteriori (MAP) formulation. Unlike existing schemes that build on MRF optimizers, we infer the disparity map from a series of local energy minimization problems that are solved hierarchically, by growing sparse initial disparities obtained from the depth data. The accuracy of the method is not compromised, owing to three properties of the data term in the energy function. First, it incorporates a new correlation function that is capable of providing refined correlations and disparities, via subpixel correction. Second, the correlation scores rely on an adaptive cost aggregation step, based on the depth data. Third, the stereo and depth likelihoods are adaptively fused, based on the scene texture and camera geometry. These properties lead to a more selective growing process which, unlike previous seed-growing methods, avoids the tendency to propagate incorrect disparities. The proposed method gives rise to an intrinsically efficient algorithm, which runs at 3 FPS on 2.0 MP images on a standard desktop computer. The strong performance of the new method is established both by quantitative comparisons with state-of-the-art methods, and by qualitative comparisons using real depth-stereo datasets.
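The adaptive fusion of stereo and depth likelihoods described in the abstract can be illustrated as a per-pixel MAP estimate that weights a stereo cost volume against a depth-sensor disparity prior by local texture. This is a minimal sketch under assumed inputs, not the authors' algorithm; the function name, the quadratic depth penalty, and the texture-based weighting rule are all illustrative choices.

```python
import numpy as np

def fuse_disparity(cost_volume, depth_prior, texture, sigma_d=1.0):
    """Illustrative per-pixel MAP fusion of stereo and depth cues.

    cost_volume : (H, W, D) stereo matching costs, lower is better
    depth_prior : (H, W) disparities upsampled from the range sensor
    texture     : (H, W) local texture measure in [0, 1]; high texture
                  means the stereo term is trusted more
    sigma_d     : assumed noise scale of the depth-derived disparities
    """
    H, W, D = cost_volume.shape
    disparities = np.arange(D, dtype=np.float64)
    # Depth likelihood as a quadratic penalty around the sensor disparity.
    depth_cost = ((disparities[None, None, :] - depth_prior[..., None]) ** 2
                  / (2.0 * sigma_d ** 2))
    # Adaptive fusion: weight the stereo term by texture, the depth
    # term by (1 - texture), then take the per-pixel minimizer.
    w = texture[..., None]
    energy = w * cost_volume + (1.0 - w) * depth_cost
    return energy.argmin(axis=2)
```

With zero texture the estimate collapses to the rounded depth prior; with full texture it reduces to winner-takes-all on the stereo cost volume. The paper's actual method additionally grows disparities hierarchically from sparse seeds rather than minimizing every pixel independently.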

Year: 2015        PMID: 26440260        DOI: 10.1109/TPAMI.2015.2400465

Source DB: PubMed        Journal: IEEE Trans Pattern Anal Mach Intell        ISSN: 0162-8828        Impact factor: 6.226


  1 in total

1.  A SIFT-Based DEM Extraction Approach Using GEOEYE-1 Satellite Stereo Pairs.

Authors:  Ioannis N Daliakopoulos; Ioannis K Tsanis
Journal:  Sensors (Basel)       Date:  2019-03-05       Impact factor: 3.576

