CLC number: TP391
On-line Access: 2024-08-27
Received: 2023-10-17
Revision Accepted: 2024-05-08
Crosschecked: 2020-09-02
Shui-wang Li, Qian-bo Jiang, Qi-jun Zhao, Li Lu, Zi-liang Feng. Asymmetric discriminative correlation filters for visual tracking[J]. Frontiers of Information Technology & Electronic Engineering, 2020, 21(10): 1467-1484.
@article{title="Asymmetric discriminative correlation filters for visual tracking",
author="Shui-wang Li, Qian-bo Jiang, Qi-jun Zhao, Li Lu, Zi-liang Feng",
journal="Frontiers of Information Technology & Electronic Engineering",
volume="21",
number="10",
pages="1467-1484",
year="2020",
publisher="Zhejiang University Press & Springer",
doi="10.1631/FITEE.1900507"
}
%0 Journal Article
%T Asymmetric discriminative correlation filters for visual tracking
%A Shui-wang Li
%A Qian-bo Jiang
%A Qi-jun Zhao
%A Li Lu
%A Zi-liang Feng
%J Frontiers of Information Technology & Electronic Engineering
%V 21
%N 10
%P 1467-1484
%@ 2095-9184
%D 2020
%I Zhejiang University Press & Springer
%DOI 10.1631/FITEE.1900507
TY - JOUR
T1 - Asymmetric discriminative correlation filters for visual tracking
A1 - Shui-wang Li
A1 - Qian-bo Jiang
A1 - Qi-jun Zhao
A1 - Li Lu
A1 - Zi-liang Feng
JO - Frontiers of Information Technology & Electronic Engineering
VL - 21
IS - 10
SP - 1467
EP - 1484
SN - 2095-9184
Y1 - 2020
PB - Zhejiang University Press & Springer
DO - 10.1631/FITEE.1900507
ER -
Abstract: Discriminative correlation filters (DCFs) are efficient in visual tracking and have advanced the field significantly. However, the symmetry of the correlation (or convolution) operator causes computational problems and harms generalized translation equivariance. The former problem has been approached in many ways, whereas the latter has not been well recognized. In this paper, we analyze the problems caused by the symmetry of circular convolution and propose an asymmetric operator which, as a generalization of the former, has a weak generalized translation equivariance property. With this operator, we propose a tracker called the asymmetric discriminative correlation filter (ADCF), which is more sensitive to translations of targets. Its asymmetry allows the filter and the samples to have different sizes, a flexibility that makes the computational complexity of ADCF more controllable in the sense that the number of filter parameters does not grow with the sample size. Moreover, the normal matrix of ADCF is a block matrix whose blocks are two-level block Toeplitz matrices. Exploiting this structure, we design an algorithm for multiplying an N×N two-level block Toeplitz matrix by a vector with time complexity O(N log N) and space complexity O(N), instead of O(N²). Unlike DCF-based trackers, introducing spatial or temporal regularization does not increase the essential computational complexity of ADCF. Comparative experiments on a synthetic dataset and four benchmarks (OTB-2013, OTB-2015, VOT-2016, and Temple-Color) show that our method achieves state-of-the-art visual tracking performance.
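The O(N log N) Toeplitz matrix–vector product mentioned in the abstract rests on a standard circulant-embedding trick: a Toeplitz matrix can be embedded in a circulant matrix of twice the size, and a circulant matvec diagonalizes under the FFT. The sketch below illustrates only the one-level case (the paper's algorithm nests this idea for two-level block Toeplitz matrices); the helper name `toeplitz_matvec` is our own and is not taken from the paper.

```python
import numpy as np

def toeplitz_matvec(c, r, x):
    """Multiply the N x N Toeplitz matrix T (first column c, first row r,
    with c[0] == r[0]) by vector x in O(N log N) time and O(N) space.

    T is embedded in a 2N x 2N circulant matrix, whose action on the
    zero-padded vector is a pointwise product in the Fourier domain.
    """
    n = len(x)
    # First column of the circulant embedding: [c_0..c_{n-1}, 0, r_{n-1}..r_1].
    col = np.concatenate([c, [0.0], r[1:][::-1]])
    pad = np.concatenate([x, np.zeros(n)])
    y = np.fft.ifft(np.fft.fft(col) * np.fft.fft(pad))
    return y[:n].real  # first n entries of the circulant product equal T @ x

# Sanity check against a dense multiplication.
rng = np.random.default_rng(0)
n = 8
c = rng.standard_normal(n)                                 # first column of T
r = np.concatenate([[c[0]], rng.standard_normal(n - 1)])   # first row of T
T = np.array([[c[i - j] if i >= j else r[j - i] for j in range(n)]
              for i in range(n)])
x = rng.standard_normal(n)
assert np.allclose(T @ x, toeplitz_matvec(c, r, x))
```

Applying this recursively at the block level (treating each block as an entry) gives the two-level algorithm, which is why regularized variants of ADCF keep the same essential complexity.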