Corpus ID: 11610834

Catalyst Acceleration for Gradient-Based Non-Convex Optimization

@inproceedings{Paquette2017CatalystAF,
  title={Catalyst Acceleration for Gradient-Based Non-Convex Optimization},
  author={Courtney Paquette and Hongzhou Lin and Dmitriy Drusvyatskiy and Julien Mairal and Zaid Harchaoui},
  year={2017}
}
Abstract

We introduce a generic scheme to solve nonconvex optimization problems using gradient-based algorithms originally designed for minimizing convex functions. When the objective is convex, the proposed approach enjoys the same properties as the Catalyst approach of Lin et al. (2015). When the objective is nonconvex, it achieves the best known convergence rate to stationary points for first-order methods. Specifically, the proposed algorithm does not require knowledge about the convexity of the…
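The abstract describes wrapping a convex first-order solver inside an inexact proximal-point outer loop. The sketch below illustrates that general mechanism only; function names, constants, and the plain-gradient-descent inner solver are illustrative assumptions, and the paper's actual algorithm (with its acceleration and adaptation steps) is not reproduced here.

```python
import numpy as np

def catalyst_style_loop(f_grad, x0, kappa=2.0, outer_iters=50,
                        inner_iters=100, inner_lr=0.1):
    """Illustrative Catalyst-style outer loop (hypothetical names).

    Each outer step approximately minimizes the regularized subproblem
        h(z) = f(z) + (kappa/2) * ||z - y||^2
    with plain gradient descent. If f is smooth and kappa exceeds the
    magnitude of f's worst negative curvature, h is convex, so a solver
    designed for convex problems applies even though f is nonconvex.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(outer_iters):
        y = x.copy()                  # proximal center for this outer step
        z = x.copy()
        for _ in range(inner_iters):  # inner convex solver: gradient descent on h
            g = f_grad(z) + kappa * (z - y)
            z = z - inner_lr * g
        x = z
    return x

# Toy usage: f(x) = x^4/4 - x^2/2 is smooth and nonconvex, with
# stationary points at x = -1, 0, 1 and min f'' = -1, so kappa = 2
# makes every subproblem convex.
grad = lambda x: x**3 - x
x_star = catalyst_style_loop(grad, x0=np.array([0.5]))
```

Starting from 0.5, the iterates drift toward the stationary point at 1, where the gradient vanishes; this is the "convergence to stationary points" behavior the abstract refers to, in its simplest possible form.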



Citations

Publications citing this paper (26 in total; a selection is shown below):

    Catalyst for Gradient-based Nonconvex Optimization

    — cites background, methods, and results

    A Generic Acceleration Framework for Stochastic Composite Optimization

    — cites methods

    Proximally Guided Stochastic Subgradient Method for Nonsmooth, Nonconvex Problems

    — cites background

References

Publications referenced by this paper (39 in total; a selection is shown below):

    Generalized Uniformly Optimal Methods for Nonlinear Programming

    — highly influential

    Accelerating Stochastic Gradient Descent using Predictive Variance Reduction

    — highly influential

    Introductory Lectures on Convex Optimization - A Basic Course

    — highly influential

    An optimal randomized incremental gradient method