Learning Spatial-Temporal Regularized Correlation Filters for Visual Tracking

  Feng Li, Cheng Tian, Wangmeng Zuo, Lei Zhang, Ming-Hsuan Yang
  2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)
Discriminative Correlation Filters (DCF) are efficient in visual tracking but suffer from unwanted boundary effects. Spatially Regularized DCF (SRDCF) has been proposed to resolve this issue by enforcing a spatial penalty on the DCF coefficients, which inevitably improves tracking performance at the price of increased complexity. To handle online updating, SRDCF formulates its model over multiple training images, further complicating efforts to improve efficiency. In this work, by introducing… 
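The boundary effects discussed above stem from the circulant structure of the standard DCF, which admits a closed-form filter in the Fourier domain. A minimal single-channel sketch (NumPy-based; the patch size, Gaussian label, and regularizer `lam` are illustrative assumptions, not the paper's settings):

```python
import numpy as np

def train_dcf(x, y, lam=1e-2):
    """Closed-form single-channel DCF in the Fourier domain.

    x   : 2-D training patch (feature map)
    y   : desired response (typically a Gaussian centered on the target)
    lam : ridge regularizer
    Returns the filter's Fourier-domain coefficients.
    """
    X = np.fft.fft2(x)
    Y = np.fft.fft2(y)
    # h_hat = conj(X) * Y / (conj(X) * X + lam), element-wise
    return np.conj(X) * Y / (np.conj(X) * X + lam)

def detect(h_hat, z):
    """Correlation response of the learned filter on a search patch z."""
    Z = np.fft.fft2(z)
    return np.real(np.fft.ifft2(h_hat * Z))
```

The peak of the response map locates the target; spatial regularization (SRDCF) replaces the scalar `lam` with a spatially varying penalty that suppresses coefficients near the patch boundary.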


Correlation Tracking via Spatial-Temporal Constraints and Structured Sparse Regularization

This paper proposes a novel correlation tracking method via spatial-temporal constraints and structured sparse regularization and introduces the background-aware selection strategy to extract real negative examples, and penalizes the filter coefficients close to the boundary locations for spatial protection.

Fast Learning of Spatially Regularized and Content Aware Correlation Filter for Visual Tracking

A new fast learning approach to content-aware spatial regularization, namely weighted sample based CF tracking (WSCF), which is used to enhance two state-of-the-art CF trackers to significantly boost their tracking accuracy, with little sacrifice on the tracking speed.

Learning an Orientation and Scale Adaptive Tracker With Regularized Correlation Filters

A novel Orientation and Scale adaptive tracker with Regularized Correlation Filters (OSRCF) for visual tracking that outperforms top-ranked methods with handcrafted features in VOT2017 and achieves outstanding performance compared to some state-of-the-art methods in OTB2015.

Learning Multi-feature Based Spatially Regularized and Scale Adaptive Correlation Filters for Visual Tracking

A single-sample-integrated training scheme which utilizes information of the previous frames and the current frame to improve the robustness of training samples is suggested and an alternating direction method of multipliers (ADMM) algorithm is developed to optimize the translation filter.
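ADMM, used above to optimize the translation filter, alternates closed-form subproblem solves with a dual update. A generic sketch on a simple l1-regularized least-squares problem (not the paper's filter objective; `A`, `b`, `lam`, and `rho` are illustrative):

```python
import numpy as np

def soft_threshold(v, k):
    """Proximal operator of the l1 norm."""
    return np.sign(v) * np.maximum(np.abs(v) - k, 0.0)

def admm_lasso(A, b, lam=0.1, rho=1.0, iters=200):
    """ADMM for min_x 0.5*||Ax - b||^2 + lam*||x||_1, via the split x = z."""
    m, n = A.shape
    x = np.zeros(n); z = np.zeros(n); u = np.zeros(n)
    Atb = A.T @ b
    # Factor (A^T A + rho I) once; it is reused every iteration.
    L = np.linalg.cholesky(A.T @ A + rho * np.eye(n))
    for _ in range(iters):
        # x-update: solve (A^T A + rho I) x = A^T b + rho (z - u)
        x = np.linalg.solve(L.T, np.linalg.solve(L, Atb + rho * (z - u)))
        # z-update: proximal step on the l1 term
        z = soft_threshold(x + u, lam / rho)
        # dual (scaled multiplier) update
        u += x - z
    return z
```

In CF trackers the same alternation applies, but each subproblem is solved element-wise in the Fourier domain, which keeps the per-frame cost low.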

Learning Spatial-Corrected Regularized Correlation Filters for Visual Tracking

  • Zhaobing Yang, Jing Wu, C. Long
  • Computer Science
    2019 IEEE 31st International Conference on Tools with Artificial Intelligence (ICTAI)
  • 2019
Spatial-Corrected Regularized Correlation Filters (SCRCF) is a DCF-based tracker with a correction mechanism that is more robust to handle some complicated tracking scenes, such as occlusion and motion blur.

Learning Aberrance Repressed and Temporal Regularized Correlation Filters for Visual Tracking

A novel tracking algorithm is presented by introducing a temporal regularization into the ARCF tracker, which not only lets the filter template retain historical information for filter learning, but also maintains a longer-lived, higher-precision model than ARCF under large and complex appearance variations.

Learning adaptive spatial-temporal regularized correlation filters for visual tracking

An effective tracking method based on the popular adaptive spatially regularized correlation filter (ASRCF) tracker, which not only preserves the strong performance of the ASRCF tracker, but also exploits the relation between the correlation filters of the previous frame and the current frame to address complex cases.

Learning Spatial–Temporal Background-Aware Based Tracking

A spatial–temporal regularization module based on the BACF (background-aware correlation filter) framework is proposed, which introduces a temporal regularization to deal effectively with the boundary-effect issue and improves the accuracy of target recognition.

Learning Spatially Regularized Correlation Filters for Visual Tracking

The proposed SRDCF formulation allows the correlation filters to be learned on a significantly larger set of negative training samples, without corrupting the positive samples, and an optimization strategy is proposed, based on the iterative Gauss-Seidel method, for efficient online learning.

Learning Background-Aware Correlation Filters for Visual Tracking

This work proposes a Background-Aware CF based on hand-crafted features (HOG) that can efficiently model how both the foreground and background of the object vary over time, and demonstrates superior accuracy and real-time performance compared to state-of-the-art trackers.

ECO: Efficient Convolution Operators for Tracking

This work revisits the core DCF formulation and introduces a factorized convolution operator, which drastically reduces the number of parameters in the model, and a compact generative model of the training sample distribution that significantly reduces memory and time complexity while providing better sample diversity.

Robust and real-time deep tracking via multi-scale domain adaptation

This paper proposes to transfer features learned for image classification to the visual tracking domain via convolutional channel reductions, and demonstrates state-of-the-art accuracy on two widely adopted benchmarks with more than 100 test videos.

Structural Correlation Filter for Robust Visual Tracking

The proposed SCF model takes part-based tracking strategies into account in a correlation filter tracker, and exploits circular shifts of all parts for their motion modeling to preserve target object structure.

Integrating Boundary and Center Correlation Filters for Visual Tracking with Aspect Ratio Variation

This paper presents a novel tracking model to integrate 1D Boundary and 2D Center CFs (IBCCF) where boundary and center filters are enforced by a near-orthogonality regularization term and develops an alternating direction method of multipliers to optimize the model.

Target Response Adaptation for Correlation Filter Tracking

This work proposes a generic framework that can adaptively change the target response from frame to frame, so that the tracker is less sensitive to the cases where circular shifts do not reliably approximate translations.

Learning Support Correlation Filters for Visual Tracking

This paper derives an equivalent formulation of a SVM model with the circulant matrix expression and presents an efficient alternating optimization method for visual tracking and extends the SCF-based tracking algorithm with multi-channel features, kernel functions, and scale-adaptive approaches to further improve the tracking performance.

Staple: Complementary Learners for Real-Time Tracking

It is shown that a simple tracker combining complementary cues in a ridge regression framework can operate faster than 80 FPS and outperform not only all entries in the popular VOT14 competition, but also recent and far more sophisticated trackers according to multiple benchmarks.
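The ridge-regression component of such a complementary tracker has a simple closed form, and the two cues are merged as a convex combination of response maps. A minimal sketch (the merge weight `alpha` and the toy data are assumptions, not Staple's tuned values):

```python
import numpy as np

def ridge_fit(X, y, lam=1.0):
    """Closed-form ridge regression: w = (X^T X + lam I)^-1 X^T y."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

def fuse_responses(r_template, r_hist, alpha=0.7):
    """Staple-style convex combination of the template (ridge/CF)
    response and the color-histogram response; alpha is an assumed
    merge weight, not the paper's value."""
    return alpha * r_template + (1 - alpha) * r_hist
```

Because both components are cheap to evaluate (the template response via FFT, the histogram response via a lookup), the combined tracker stays comfortably above real-time rates.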

Large Margin Object Tracking with Circulant Feature Maps

A novel large margin object tracking method which absorbs the strong discriminative ability from structured output SVM and speeds up by the correlation filter algorithm significantly and a multimodal target detection technique is proposed to improve the target localization precision and prevent model drift introduced by similar objects or background noise.