
Object Tracking Benchmark.

Yi Wu, Jongwoo Lim, Ming-Hsuan Yang.   

Abstract

Object tracking has been one of the most important and active research areas in the field of computer vision. A large number of tracking algorithms have been proposed in recent years with demonstrated success. However, the set of sequences used for evaluation is often not sufficient or is sometimes biased toward certain types of algorithms. Many datasets do not have common ground-truth object positions or extents, which makes comparisons among the reported quantitative results difficult. In addition, the initial conditions or parameters of the evaluated tracking algorithms are not the same, and thus the quantitative results reported in the literature are incomparable or sometimes contradictory. To address these issues, we carry out an extensive evaluation of the state-of-the-art online object-tracking algorithms with various evaluation criteria to understand how these methods perform within the same framework. In this work, we first construct a large dataset with ground-truth object positions and extents for tracking and introduce the sequence attributes for the performance analysis. Second, we integrate most of the publicly available trackers into one code library with uniform input and output formats to facilitate large-scale performance evaluation. Third, we extensively evaluate the performance of 31 algorithms on 100 sequences with different initialization settings. By analyzing the quantitative results, we identify effective approaches for robust tracking and provide potential future research directions in this field.
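The benchmark's success metric is based on bounding-box overlap between a tracker's output and the ground-truth annotation. A minimal sketch of this overlap (IoU) computation and a per-threshold success rate, assuming boxes in a hypothetical `[x, y, w, h]` format (the exact annotation format and helper names here are illustrative, not the benchmark's own code):

```python
def iou(box_a, box_b):
    """Intersection-over-union of two axis-aligned boxes given as [x, y, w, h]."""
    ax, ay, aw, ah = box_a
    bx, by, bw, bh = box_b
    # Intersection rectangle (clamped to zero if the boxes do not overlap)
    iw = max(0.0, min(ax + aw, bx + bw) - max(ax, bx))
    ih = max(0.0, min(ay + ah, by + bh) - max(ay, by))
    inter = iw * ih
    union = aw * ah + bw * bh - inter
    return inter / union if union > 0 else 0.0

def success_rates(pred_boxes, gt_boxes, thresholds):
    """Fraction of frames whose overlap exceeds each threshold (one point per threshold)."""
    overlaps = [iou(p, g) for p, g in zip(pred_boxes, gt_boxes)]
    return [sum(o > t for o in overlaps) / len(overlaps) for t in thresholds]
```

Sweeping the threshold from 0 to 1 and plotting the resulting rates yields a success plot, whose area under the curve is a common single-number summary for ranking trackers.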

Year:  2015        PMID: 26353130     DOI: 10.1109/TPAMI.2014.2388226

Source DB:  PubMed          Journal:  IEEE Trans Pattern Anal Mach Intell        ISSN: 0162-8828            Impact factor:   6.226


Citing articles:  40 in total (10 shown below)

1.  Learning Enhanced Feature Responses for Visual Object Tracking.

Authors:  Runqing Zhang; Chunxiao Fan; Yue Ming
Journal:  Comput Intell Neurosci       Date:  2022-02-08

2.  A practical evaluation of correlation filter-based object trackers with new features.

Authors:  Islam Mohamed; Ibrahim Elhenawy; Ahmed W Sallam; Andrew Gatt; Ahmad Salah
Journal:  PLoS One       Date:  2022-08-25       Impact factor: 3.752

3.  Pixel-Level and Robust Vibration Source Sensing in High-Frame-Rate Video Analysis.

Authors:  Mingjun Jiang; Tadayoshi Aoyama; Takeshi Takaki; Idaku Ishii
Journal:  Sensors (Basel)       Date:  2016-11-02       Impact factor: 3.576

4.  Visual Detection and Tracking System for a Spherical Amphibious Robot.

Authors:  Shuxiang Guo; Shaowu Pan; Liwei Shi; Ping Guo; Yanlin He; Kun Tang
Journal:  Sensors (Basel)       Date:  2017-04-15       Impact factor: 3.576

5.  Multi-Complementary Model for Long-Term Tracking.

Authors:  Deng Zhang; Junchang Zhang; Chenyang Xia
Journal:  Sensors (Basel)       Date:  2018-02-09       Impact factor: 3.576

6.  Alleviate Similar Object in Visual Tracking via Online Learning Interference-Target Spatial Structure.

Authors:  Guokai Shi; Tingfa Xu; Jiqiang Luo; Jie Guo; Zishu Zhao
Journal:  Sensors (Basel)       Date:  2017-10-19       Impact factor: 3.576

7.  Visual object tracking challenges revisited: VOT vs. OTB.

Authors:  Sun Bei; Zuo Zhen; Luo Wusheng; Du Liebo; Lu Qin
Journal:  PLoS One       Date:  2018-09-27       Impact factor: 3.240

8.  Online Model Updating and Dynamic Learning Rate-Based Robust Object Tracking.

Authors:  Md Mojahidul Islam; Guoqing Hu; Qianbo Liu
Journal:  Sensors (Basel)       Date:  2018-06-26       Impact factor: 3.576

9.  Adaptive Correlation Model for Visual Tracking Using Keypoints Matching and Deep Convolutional Feature.

Authors:  Yuankun Li; Tingfa Xu; Honggao Deng; Guokai Shi; Jie Guo
Journal:  Sensors (Basel)       Date:  2018-02-23       Impact factor: 3.576

10.  Visual Tracking via Deep Feature Fusion and Correlation Filters.

Authors:  Haoran Xia; Yuanping Zhang; Ming Yang; Yufang Zhao
Journal:  Sensors (Basel)       Date:  2020-06-14       Impact factor: 3.576
