Corpus ID: 174798380

A Generic Acceleration Framework for Stochastic Composite Optimization

@inproceedings{Kulunchakov2019AGA,
  title={A Generic Acceleration Framework for Stochastic Composite Optimization},
  author={Andrei Kulunchakov and Julien Mairal},
  booktitle={NeurIPS},
  year={2019}
}
  • Computer Science, Mathematics
  • In this paper, we introduce various mechanisms to obtain accelerated first-order stochastic optimization algorithms when the objective function is convex or strongly convex. Specifically, we extend the Catalyst approach originally designed for deterministic objectives to the stochastic setting. Given an optimization method with mild convergence guarantees for strongly convex problems, the challenge is to accelerate convergence to a noise-dominated region, and then achieve convergence with an…
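The Catalyst mechanism the abstract refers to wraps a simple inner solver in an outer acceleration loop: each outer iteration approximately minimizes the objective plus a quadratic proximal term centered at an extrapolated point. The sketch below is a minimal illustration of that idea on a toy ridge-regression problem, using plain SGD as the inner solver; the smoothing parameter `kappa`, the step size, and the iteration counts are illustrative assumptions, not the authors' exact algorithm or constants.

```python
import numpy as np

# Toy strongly convex problem: f(x) = (1/2n)||Ax - b||^2 + (lam/2)||x||^2.
rng = np.random.default_rng(0)
n, d = 200, 10
A = rng.standard_normal((n, d))
x_true = rng.standard_normal(d)
b = A @ x_true + 0.1 * rng.standard_normal(n)
lam = 0.1  # strong-convexity parameter from the l2 regularizer

def stoch_grad(x, i):
    """Unbiased stochastic gradient of f using a single sample i."""
    return A[i] * (A[i] @ x - b[i]) + lam * x

def full_obj(x):
    return 0.5 * np.mean((A @ x - b) ** 2) + 0.5 * lam * (x @ x)

# Catalyst-style outer loop (illustrative choices, not the paper's tuning).
kappa = 1.0                                  # assumed smoothing parameter
q = lam / (lam + kappa)
beta = (1 - np.sqrt(q)) / (1 + np.sqrt(q))   # extrapolation weight

# Conservative SGD step: 1 / (max per-sample smoothness + lam + kappa).
step = 1.0 / (np.max(np.sum(A ** 2, axis=1)) + lam + kappa)

x = np.zeros(d)
y = x.copy()
for k in range(30):            # outer (acceleration) iterations
    x_prev = x.copy()
    z = x.copy()               # warm start the inner solver
    for t in range(200):       # inner SGD on f(.) + (kappa/2)||. - y||^2
        i = rng.integers(n)
        g = stoch_grad(z, i) + kappa * (z - y)
        z -= step * g
    x = z
    y = x + beta * (x - x_prev)  # Nesterov-style extrapolation step

print(full_obj(x))
```

As the abstract notes, a constant-step stochastic inner solver like this only reaches a noise-dominated region; the paper's contribution concerns how to schedule such inner problems to obtain accelerated rates despite that noise.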


    References

    Publications referenced by this paper (selected from 58 references).

    Introductory Lectures on Convex Optimization: A Basic Course

    • Yurii Nesterov
    • 2004

    Incremental Majorization-Minimization Optimization with Application to Large-Scale Machine Learning

    • Julien Mairal
    • SIAM Journal on Optimization
    • 2015

    Katyusha: The First Direct Acceleration of Stochastic Gradient Methods

    • Zeyuan Allen-Zhu
    • STOC
    • 2017

    Accelerated Proximal Stochastic Dual Coordinate Ascent for Regularized Loss Minimization

    • Shai Shalev-Shwartz, Tong Zhang
    • ICML
    • 2014