A Graduated Filter Method for Large Scale Robust Estimation

@article{Le2020AGF,
  title={A Graduated Filter Method for Large Scale Robust Estimation},
  author={Huu Le and Christopher Zach},
  journal={2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
  year={2020},
  pages={5558-5567}
}
Due to the highly non-convex nature of large-scale robust parameter estimation, avoiding poor local minima is challenging in real-world applications where input data is contaminated by a large or unknown fraction of outliers. In this paper, we introduce a novel solver for robust estimation that possesses a strong ability to escape poor local minima. Our algorithm is built upon the class of traditional graduated optimization techniques, which are considered state-of-the-art local methods to…
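The abstract's core idea, graduated optimization, anneals a smoothing parameter so that early, nearly convex surrogate problems warm-start later, less smoothed ones. Below is a minimal Python sketch of that general strategy only, not the paper's filter-based solver: a 1-D robust line fit with a Welsch-style kernel whose scale is annealed, with IRLS as the inner solver (all function names here are illustrative).

import numpy as np

def welsch_weight(r, sigma):
    # IRLS weight, proportional to psi'(r)/r for the Welsch-style kernel
    # psi(r) = 1 - exp(-r^2 / (2 sigma^2)); constant factors cancel in the
    # weighted normal equations below.
    return np.exp(-r ** 2 / (2.0 * sigma ** 2))

def graduated_fit(x, y, sigmas=(8.0, 4.0, 2.0, 1.0), iters=20):
    # Fit y ~ a*x + b robustly by annealing the kernel scale: each stage
    # solves a smoother (large-sigma) surrogate and warm-starts the next,
    # less smoothed one, which helps skip poor local minima.
    A = np.stack([x, np.ones_like(x)], axis=1)
    theta = np.zeros(2)  # [a, b]
    for sigma in sigmas:          # coarse-to-fine schedule
        for _ in range(iters):    # IRLS on the current smoothed cost
            r = A @ theta - y
            w = welsch_weight(r, sigma)
            # weighted normal equations: (A^T W A) theta = A^T W y
            theta = np.linalg.solve(A.T @ (w[:, None] * A), A.T @ (w * y))
    return theta

# Toy data: a line with 40% gross outliers.
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 10.0, 200)
y = 2.0 * x + 1.0 + 0.1 * rng.normal(size=200)
y[:80] = rng.uniform(-50.0, 50.0, 80)  # outliers
print(graduated_fit(x, y))  # expect roughly [2.0, 1.0]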
1 Citation
Escaping Poor Local Minima in Large Scale Robust Estimation
Two novel approaches for robust parameter estimation are introduced that utilize the filter method and a generalized majorization-minimization framework with the half-quadratic lifting formulation to obtain a simple yet efficient solver for robust estimation.
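The half-quadratic lifting mentioned here has a standard multiplicative form. As a sketch in generic notation (psi for the robust kernel, kappa for the weight penalty, tau for the kernel scale; these symbols are not taken from the cited paper), each residual gets a confidence weight:

\psi(r) \;=\; \min_{w \in [0,1]} \Bigl[ \tfrac{w}{2}\, r^2 + \kappa(w) \Bigr],
\qquad
\kappa(w) = \tfrac{\tau^2}{2}\bigl(\sqrt{w} - 1\bigr)^2
\;\Longrightarrow\;
\psi(r) = \frac{\tau^2}{2}\,\frac{r^2}{r^2 + \tau^2}.

The example kappa recovers the Geman-McClure kernel; alternating minimization over parameters and weights is then exactly a majorization-minimization / IRLS scheme.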

References

Showing 1-10 of 35 references
Pareto Meets Huber: Efficiently Avoiding Poor Minima in Robust Estimation
This paper proposes a novel algorithm relying on multi-objective optimization whose ability to escape poor local minima is on par with the best-performing algorithms, while decreasing the target objective faster.
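The multi-objective (Pareto) view treats the robust objective and an auxiliary quantity as competing goals. In classical filter methods (Fletcher-Leyffer style), to which this line of work is related, acceptance is governed by Pareto dominance; a schematic sketch in generic notation, not the cited paper's exact rule:

% A filter F stores non-dominated pairs (f_j, h_j) of objective value and
% auxiliary measure; a trial point x^+ is accepted, with margin gamma in (0,1),
% if it is not dominated by any filter entry:
f(x^+) \le f_j - \gamma\, h_j
\quad\text{or}\quad
h(x^+) \le (1 - \gamma)\, h_j
\qquad \text{for all } (f_j, h_j) \in \mathcal{F}.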
An Exact Penalty Method for Locally Convergent Maximum Consensus
  • Huu Le, Tat-Jun Chin, D. Suter
  • Computer Science, Mathematics
  • 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR)
  • 2017
This paper develops a Frank-Wolfe algorithm that can deterministically solve the maximum consensus problem and is much more practical on realistic input sizes.
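For reference, the generic Frank-Wolfe (conditional gradient) iteration that such a method builds on, over a convex feasible set C and with the standard step size (a textbook sketch, not the paper's exact penalty construction):

s_k \;=\; \arg\min_{s \in C} \; \langle \nabla f(x_k),\, s \rangle,
\qquad
x_{k+1} \;=\; x_k + \gamma_k \,(s_k - x_k),
\quad \gamma_k = \tfrac{2}{k+2}.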
Descending, Lifting or Smoothing: Secrets of Robust Cost Optimization
This work identifies three classes of deterministic second-order algorithms able to tackle robust cost optimization: direct approaches that aim to optimize the robust cost directly with a second-order method, lifting-based approaches that add so-called lifting variables to embed the given robust cost function in a higher-dimensional space, and graduated optimization methods that solve a sequence of smoothed cost functions.
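Schematically, and under one common parameterization (psi a robust kernel, kappa a weight penalty, sigma_t a decreasing smoothing scale; a sketch rather than the survey's own notation), the three classes optimize:

\text{direct:}\;\; \min_\theta \sum_i \psi\bigl(\|r_i(\theta)\|\bigr),
\qquad
\text{lifted:}\;\; \min_{\theta,\,w} \sum_i \Bigl[ \tfrac{w_i^2}{2}\, \|r_i(\theta)\|^2 + \kappa(w_i^2) \Bigr],
\qquad
\text{graduated:}\;\; \min_\theta \sum_i \psi_{\sigma_t}\bigl(\|r_i(\theta)\|\bigr),
\;\; \sigma_1 > \sigma_2 > \dots \to 0.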
Nonlinear image recovery with half-quadratic regularization
This approach is based on an auxiliary array and an extended objective function in which the original variables appear quadratically and the auxiliary variables are decoupled; minimizing over the auxiliary variables recovers the original objective, so the original image estimate can be obtained by joint minimization.
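The construction described here is the additive half-quadratic form. As a sketch with a generic auxiliary penalty phi (not the paper's notation):

\psi(t) \;=\; \min_{b} \Bigl[ \tfrac{1}{2}\,(t - b)^2 + \varphi(b) \Bigr],
\qquad
\min_{x,\,b} \; \sum_i \Bigl[ \tfrac{1}{2}\,\bigl(r_i(x) - b_i\bigr)^2 + \varphi(b_i) \Bigr].

Given b, the variable x appears only quadratically; given x, the b_i decouple into independent 1-D problems, which is what makes alternating minimization cheap.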
Robust Bundle Adjustment Revisited
  • C. Zach
  • Mathematics, Computer Science
  • ECCV
  • 2014
This work addresses robust estimation in the bundle adjustment procedure with a method based on lifting a robust cost function into a higher-dimensional representation, and shows how the lifted formulation is efficiently implemented in a Gauss-Newton framework.
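One common way to realize such a lifted formulation as plain nonlinear least squares, so that a standard Gauss-Newton or Levenberg-Marquardt solver applies, is to stack an augmented residual per term (a sketch in the generic psi/kappa notation used above, not necessarily the paper's exact construction):

\hat r_i(\theta, w_i) \;=\;
\begin{pmatrix} w_i\, r_i(\theta) \\ \sqrt{2\,\kappa(w_i^2)} \end{pmatrix},
\qquad
\min_{\theta,\,w} \; \tfrac{1}{2} \sum_i \bigl\| \hat r_i(\theta, w_i) \bigr\|^2
\;=\; \min_\theta \sum_i \psi\bigl(\|r_i(\theta)\|\bigr).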
On the Link between Gaussian Homotopy Continuation and Convex Envelopes
It is proved that Gaussian smoothing emerges from the best affine approximation to Vese’s nonlinear PDE, hence providing the optimal convexification.
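Gaussian homotopy continuation refers to the smoothed family obtained by convolving the objective with a Gaussian and shrinking its bandwidth; in standard notation:

f_\sigma(x) \;=\; (f * G_\sigma)(x) \;=\; \int_{\mathbb{R}^d} f(x - u)\, G_\sigma(u)\, \mathrm{d}u,
\qquad
G_\sigma(u) = (2\pi\sigma^2)^{-d/2}\, e^{-\|u\|^2/(2\sigma^2)},

with minimizers tracked as sigma decreases toward 0.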
Deterministic consensus maximization with biconvex programming
This paper proposes an efficient deterministic optimization algorithm for consensus maximization that conducts a search which forcibly increases the consensus of the initial solution.
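For context, the consensus maximization objective in its usual form, with residuals r_i and inlier threshold epsilon: find the parameters agreeing with as many measurements as possible,

\max_{\theta} \;\; \Bigl| \bigl\{\, i \;:\; \|r_i(\theta)\| \le \epsilon \,\bigr\} \Bigr|.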
Constrained Restoration and the Recovery of Discontinuities
The authors examine prior smoothness constraints of a different form, which permit the recovery of discontinuities without introducing auxiliary variables for marking the location of jumps and suspending the constraints in their vicinity.
Bundle Adjustment in the Large
The experiments show that truncated Newton methods, when paired with relatively simple preconditioners, offer state-of-the-art performance for large-scale bundle adjustment.
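A minimal sketch of the inexact-Newton idea in this entry: solve the Gauss-Newton normal equations with conjugate gradients and a simple Jacobi (diagonal) preconditioner. This is an illustrative toy, not the Schur-complement / block-Jacobi machinery of production bundle adjustment solvers, and the function names are hypothetical.

import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

def pcg_gauss_newton_step(J, r):
    # One inexact (truncated-Newton style) Gauss-Newton step: solve the
    # normal equations (J^T J) delta = -J^T r with conjugate gradients
    # and a Jacobi (diagonal) preconditioner.
    JtJ = (J.T @ J).tocsr()
    g = -(J.T @ r)
    d = JtJ.diagonal()
    M = spla.LinearOperator(JtJ.shape, matvec=lambda v: v / d)
    delta, info = spla.cg(JtJ, g, M=M, maxiter=100)
    # info > 0 means maxiter was reached before full tolerance; the
    # partial solution is still usable as an inexact (truncated) step.
    return delta

# Toy example with a random sparse Jacobian.
rng = np.random.default_rng(0)
J = sp.random(500, 60, density=0.05, random_state=0, format="csr")
J = J + sp.eye(500, 60, format="csr")  # ensure every column is nonzero
r = rng.normal(size=500)
print(pcg_gauss_newton_step(J, r)[:5])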
Efficient globally optimal consensus maximisation with tree search
It is shown how consensus maximisation for a wide variety of vision tasks can be posed as a tree search problem, which leads to a novel algorithm based on A* search that is several orders of magnitude faster than previous exact methods.