
Convergence Rate of Frank-Wolfe for Non-Convex Objectives

@article{LacosteJulien2016ConvergenceRO,
  title={Convergence Rate of Frank-Wolfe for Non-Convex Objectives},
  author={S. Lacoste-Julien},
  journal={ArXiv},
  year={2016},
  volume={abs/1607.00345}
}
Abstract

We give a simple proof that the Frank-Wolfe algorithm obtains a stationary point at a rate of $O(1/\sqrt{t})$ on non-convex objectives with a Lipschitz continuous gradient. Our analysis is affine invariant and is, to the best of our knowledge, the first to give a rate similar to what was already proven for projected gradient methods (though on slightly different measures of stationarity).
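The result concerns the Frank-Wolfe (conditional gradient) algorithm, with stationarity measured by the Frank-Wolfe gap $g_t = \max_{s \in \mathcal{M}} \langle x_t - s, \nabla f(x_t) \rangle$, which is nonnegative and zero exactly at stationary points. Below is a minimal NumPy sketch of the method on an indefinite (hence non-convex) quadratic over the probability simplex; the test problem, the step-size rule, and all names (`frank_wolfe`, `lmo`, etc.) are illustrative assumptions, not code from the paper.

```python
import numpy as np

def frank_wolfe(grad, lmo, x0, num_iters=1000):
    """Frank-Wolfe with the FW gap as the stationarity measure.

    grad(x) returns the gradient of the objective at x; lmo(g) is a
    linear minimization oracle returning argmin_{s in M} <s, g>.
    """
    x = x0.copy()
    best_gap = np.inf
    for t in range(num_iters):
        g = grad(x)
        s = lmo(g)                       # best vertex for the linear model
        gap = g @ (x - s)                # FW gap: zero iff x is stationary
        best_gap = min(best_gap, gap)
        gamma = 1.0 / np.sqrt(t + 1)     # simple step size compatible with
                                         # an O(1/sqrt(t)) gap guarantee
        x = (1 - gamma) * x + gamma * s  # convex combination stays feasible
    return x, best_gap

# Illustrative test problem (an assumption, not from the paper):
# f(x) = 0.5 * x^T A x with a symmetric indefinite A, minimized over
# the probability simplex, where the LMO is a coordinate argmin.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 20))
A = (A + A.T) / 2                        # indefinite => f is non-convex

grad = lambda x: A @ x

def lmo(g):
    s = np.zeros_like(g)
    s[np.argmin(g)] = 1.0                # put all mass on the best coordinate
    return s

x0 = np.ones(20) / 20
x, gap = frank_wolfe(grad, lmo, x0, num_iters=5000)
print(f"smallest FW gap seen: {gap:.3e}")
```

The printed quantity is $\min_{k \le t} g_k$, which the paper shows decreases at rate $O(1/\sqrt{t})$; tracking the minimum (rather than the last) gap matches how non-convex rates for projected gradient methods are usually stated.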

Citations of this paper

    • Stochastic Frank-Wolfe methods for nonconvex optimization (61 citations; highly influenced)
    • Structured nonconvex and nonsmooth optimization: algorithms and iteration complexity analysis (51 citations)
    • Decentralized Frank–Wolfe Algorithm for Convex and Nonconvex Problems (34 citations)
    • Complexities in Projection-Free Stochastic Non-convex Minimization (15 citations)
    • Escaping Saddle Points in Constrained Optimization (22 citations; highly influenced)
    • Continuous DR-submodular Maximization: Structure and Algorithms (29 citations; highly influenced)
    • Non-convex Conditional Gradient Sliding (10 citations; highly influenced)
    • One Sample Stochastic Frank-Wolfe (11 citations)

    References

    • Revisiting Frank-Wolfe: Projection-Free Sparse Convex Optimization (757 citations; highly influential)
    • On the Global Linear Convergence of Frank-Wolfe Optimization Variants (231 citations)
    • Introductory Lectures on Convex Optimization - A Basic Course (3,968 citations; highly influential)
    • Accelerated gradient methods for nonconvex nonlinear and stochastic programming (335 citations)
    • On the Complexity of Steepest Descent, Newton's and Regularized Newton's Methods for Nonconvex Unconstrained Optimization Problems (164 citations; highly influential)
    • The Complexity of Large-scale Convex Programming under a Linear Optimization Oracle (87 citations)
    • Mini-batch stochastic approximation methods for nonconvex stochastic composite optimization (213 citations; highly influential)