Corpus ID: 211204922

Second-order Conditional Gradients

@article{Carderera2020SecondorderCG,
  title={Second-order Conditional Gradients},
  author={Alejandro Carderera and Sebastian Pokutta},
  journal={ArXiv},
  year={2020},
  volume={abs/2002.08907}
}
Constrained second-order convex optimization algorithms are the method of choice when a high-accuracy solution to a problem is needed, due to the quadratic convergence rates these methods enjoy when close to the optimum. These algorithms require the solution of a constrained quadratic subproblem at every iteration. In the case where the feasible region can only be accessed efficiently through a linear optimization oracle, and computing first-order information about the function, although…
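The projection-free primitive the abstract alludes to is the conditional gradient (Frank-Wolfe) step, which touches the feasible region only through a linear optimization oracle. Below is a minimal sketch of that primitive in Python, not the paper's own algorithm; the names frank_wolfe and lmo are illustrative, and the simplex example is an assumption chosen because its oracle is trivial to write down.

import numpy as np

def frank_wolfe(grad, lmo, x0, steps=100):
    # Vanilla conditional gradient loop: the feasible region C is
    # accessed only through `lmo`, a linear minimization oracle
    # returning argmin over v in C of <c, v>.
    x = x0
    for t in range(steps):
        g = grad(x)                        # first-order information at the iterate
        v = lmo(g)                         # single oracle call per iteration
        gamma = 2.0 / (t + 2.0)            # standard open-loop step size
        x = (1.0 - gamma) * x + gamma * v  # convex combination stays feasible
    return x

# Usage (hypothetical example): minimize ||x - b||^2 over the probability
# simplex, whose oracle returns the vertex with the smallest gradient entry.
b = np.array([0.2, 0.5, 0.9])
x_star = frank_wolfe(lambda x: 2.0 * (x - b),
                     lambda c: np.eye(len(c))[np.argmin(c)],
                     x0=np.ones(3) / 3.0)

Each iteration costs one gradient evaluation and one oracle call, which is why conditional gradient methods are attractive whenever projections onto the feasible region are expensive but linear optimization over it is cheap.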