
Efficient minimum error bounded particle resampling L1 tracker with occlusion detection.

Xue Mei, Haibin Ling, Yi Wu, Erik P. Blasch, Li Bai.

Abstract

Recently, sparse representation has been applied to visual tracking to find the target with the minimum reconstruction error from a target template subspace. Though effective, these L1 trackers incur high computational costs because an l1 minimization must be solved for every sample. In addition, the inherent occlusion insensitivity of the l1 minimization has not been fully characterized. In this paper, we propose an efficient L1 tracker, named the bounded particle resampling (BPR)-L1 tracker, with a minimum error bound and occlusion detection. First, the minimum error bound is calculated from a linear least-squares equation and serves as a guide for particle resampling in a particle filter (PF) framework. Most of the insignificant samples are removed before solving the computationally expensive l1 minimization, via a two-step test. The first step, named τ testing, compares the sample observation likelihood to an ordered set of thresholds to remove insignificant samples without any loss of resampling precision. The second step, named max testing, identifies the largest sample probability relative to the target to remove further insignificant samples without altering the tracking result of the current frame. At the cost of a minimal loss of resampling precision, max testing achieves a significant speedup on top of τ testing. The BPR-L1 technique can also benefit other trackers that admit minimum error bounds in a PF framework, especially trackers based on sparse representations. After the error-bound calculation, BPR-L1 performs occlusion detection by examining the trivial coefficients of the l1 minimization. These coefficients, by design, carry rich information about image corruptions, including occlusion. Detected occlusions are then used to improve template updating.
For evaluation, we conduct experiments on three video applications: biometrics (head movement, hand holding an object, singers on stage), pedestrians (urban travel, hallway monitoring), and cars in traffic (wide-area motion imagery, ground-mounted perspectives). The proposed BPR-L1 method demonstrates excellent performance compared with nine state-of-the-art trackers on eleven challenging benchmark sequences.
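The pruning idea in the abstract can be sketched in a few lines: the unconstrained least-squares residual over the template subspace lower-bounds the reconstruction error of any l1-regularized solution, so samples whose bound is already poor can be discarded before the expensive l1 solve. The sketch below is an illustration only, not the authors' implementation: it uses a plain ISTA loop as a stand-in l1 solver and a simple keep-top-k rule in place of the paper's ordered τ-threshold scheme; all function names and parameters (`lam`, `keep`, `n_iter`) are hypothetical choices.

```python
import numpy as np

def least_squares_bound(y, T):
    # Unconstrained least squares attains the minimum reconstruction
    # error over the template subspace, so it lower-bounds the error
    # of any l1-regularized coefficient vector.
    a, *_ = np.linalg.lstsq(T, y, rcond=None)
    r = y - T @ a
    return float(r @ r)

def l1_error(y, T, lam=0.01, n_iter=200):
    # Stand-in l1 solver (ISTA) for min_a 0.5*||y - T a||^2 + lam*||a||_1.
    L = np.linalg.norm(T, 2) ** 2          # Lipschitz constant of the gradient
    a = np.zeros(T.shape[1])
    for _ in range(n_iter):
        g = T.T @ (T @ a - y)              # gradient of the smooth term
        a = a - g / L
        a = np.sign(a) * np.maximum(np.abs(a) - lam / L, 0.0)  # soft threshold
    r = y - T @ a
    return float(r @ r)

def bounded_resampling(samples, T, keep=5):
    # Rank candidate samples by their cheap least-squares bound and
    # pay for the expensive l1 minimization only on the survivors.
    bounds = [least_squares_bound(y, T) for y in samples]
    survivors = np.argsort(bounds)[:keep]  # smallest bound = most promising
    errors = {int(i): l1_error(samples[i], T) for i in survivors}
    best = min(errors, key=errors.get)     # tracking result for this frame
    return best, errors
```

Because the bound never overestimates the l1 reconstruction error, pruning by the bound alone (the τ-testing analogue) cannot discard the true best sample; only the top-k truncation here trades a little precision for speed, mirroring the role of max testing.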


Year:  2013        PMID: 23549892     DOI: 10.1109/TIP.2013.2255301

Source DB:  PubMed          Journal:  IEEE Trans Image Process        ISSN: 1057-7149            Impact factor:   10.856


Similar articles (5 in total)

1.  Object Tracking Based On Huber Loss Function.

Authors:  Yong Wang; Shiqiang Hu; Shandong Wu
Journal:  Vis Comput       Date:  2018-05-24       Impact factor: 2.601

2.  Online Hierarchical Sparse Representation of Multifeature for Robust Object Tracking.

Authors:  Honghong Yang; Shiru Qu
Journal:  Comput Intell Neurosci       Date:  2016-08-18

3.  EVtracker: An Event-Driven Spatiotemporal Method for Dynamic Object Tracking.

Authors:  Shixiong Zhang; Wenmin Wang; Honglei Li; Shenyong Zhang
Journal:  Sensors (Basel)       Date:  2022-08-15       Impact factor: 3.847

4.  Visual tracking based on extreme learning machine and sparse representation.

Authors:  Baoxian Wang; Linbo Tang; Jinglin Yang; Baojun Zhao; Shuigen Wang
Journal:  Sensors (Basel)       Date:  2015-10-22       Impact factor: 3.576

5.  Computationally Efficient Automatic Coast Mode Target Tracking Based on Occlusion Awareness in Infrared Images.

Authors:  Sohyun Kim; Gwang-Il Jang; Sungho Kim; Junmo Kim
Journal:  Sensors (Basel)       Date:  2018-03-27       Impact factor: 3.576

