Learning dual-margin model for visual tracking.

Nana Fan, Xin Li, Zikun Zhou, Qiao Liu, Zhenyu He.

Abstract

Existing trackers usually exploit robust features or online updating mechanisms to deal with target variations, a key challenge in visual tracking. However, features that are robust to variations retain little spatial information, and existing online updating methods are prone to overfitting. In this paper, we propose a dual-margin model for robust and accurate visual tracking. The dual-margin model comprises an intra-object margin between different appearances of the target and an inter-object margin between the target and the background. The proposed method can not only distinguish the target from the background but also perceive target changes, which allows it to follow appearance variations and facilitates accurate target state estimation. In addition, to exploit rich off-line video data and learn general rules of target appearance variation, we train the dual-margin model on a large off-line video dataset. Tracking is performed under a Siamese framework using the constructed appearance set as templates. The proposed method achieves accurate and robust tracking on five public datasets while running in real time. Its favorable performance against state-of-the-art methods demonstrates the effectiveness of the proposed algorithm.
Copyright © 2021 Elsevier Ltd. All rights reserved.
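The dual-margin idea in the abstract can be sketched as a pair of hinge-style constraints: a small intra-object margin that tolerates appearance change between views of the same target, and a larger inter-object margin that separates the target from the background. The following is a minimal illustration under those assumptions, not the authors' implementation; the function name, similarity range of [-1, 1], and margin values are all hypothetical.

```python
def dual_margin_loss(sim_intra, sim_inter, m_intra=0.2, m_inter=0.5):
    """Hinge-style sketch of a dual-margin objective (illustrative only).

    sim_intra: similarity between two appearances of the same target;
               encouraged to stay above (1 - m_intra), so the small
               intra-object margin tolerates moderate appearance change.
    sim_inter: similarity between the target and a background patch;
               pushed below (1 - m_inter) by the larger inter-object margin.
    Margin values are placeholders, not taken from the paper.
    """
    intra_term = max(0.0, (1.0 - m_intra) - sim_intra)  # keep same-target pairs close
    inter_term = max(0.0, sim_inter - (1.0 - m_inter))  # push background away
    return intra_term + inter_term


# A confident same-target pair and a dissimilar background patch
# satisfy both margins, so the loss is zero.
print(dual_margin_loss(sim_intra=0.9, sim_inter=0.3))  # -> 0.0
```

Using two margins of different sizes is what lets a single similarity model both track the changing target appearance (loose intra-object constraint) and reject distractors (tight inter-object constraint).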

Keywords:  Dual margin; Siamese network; Visual tracking

Year:  2021        PMID: 33930720     DOI: 10.1016/j.neunet.2021.04.004

Source DB:  PubMed          Journal:  Neural Netw        ISSN: 0893-6080


  1 in total

1.  Antiocclusion Visual Tracking Algorithm Combining Fully Convolutional Siamese Network and Correlation Filtering.

Authors:  Xiaomiao Tao; Kaijun Wu; Yongshun Wang; Panfeng Li; Tao Huang; Chenshuai Bai
Journal:  Comput Intell Neurosci       Date:  2022-08-09
