Catalyst Acceleration for First-order Convex Optimization: from Theory to Practice

@article{Lin2017CatalystAF,
  title={Catalyst Acceleration for First-order Convex Optimization: from Theory to Practice},
  author={Hongzhou Lin and Julien Mairal and Za{\"i}d Harchaoui},
  journal={Journal of Machine Learning Research},
  year={2017},
  volume={18},
  pages={212:1--212:54}
}
We introduce a generic scheme for accelerating gradient-based optimization methods in the sense of Nesterov. The approach, called Catalyst, builds upon the inexact accelerated proximal point algorithm for minimizing a convex objective function, and consists of approximately solving a sequence of well-chosen auxiliary problems, leading to faster convergence. One of the keys to achieving acceleration in theory and in practice is to solve these sub-problems with appropriate accuracy by using the…
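The scheme the abstract describes can be sketched as an outer loop that approximately minimizes a regularized auxiliary problem h_k(x) = f(x) + (κ/2)‖x − y_{k−1}‖², then applies Nesterov-style extrapolation. The sketch below is a minimal illustration under simplifying assumptions, not the paper's exact method: the inner solver is plain gradient descent with a fixed number of steps (Catalyst allows any first-order method with a principled stopping criterion), and the momentum schedule β_k = k/(k+3) is the standard choice for the non-strongly-convex case.

```python
import numpy as np

def catalyst_sketch(grad_f, x0, kappa=1.0, outer_iters=50,
                    inner_iters=100, inner_lr=0.1):
    """Toy sketch of a Catalyst-style outer loop.

    Each outer step approximately minimizes the auxiliary problem
        h_k(x) = f(x) + (kappa / 2) * ||x - y_{k-1}||^2
    (here with a fixed budget of gradient-descent steps, a placeholder
    for the paper's adaptively stopped inner solver), then performs a
    Nesterov-style extrapolation step.
    """
    x_prev = x = np.asarray(x0, dtype=float)
    y = x.copy()
    for k in range(outer_iters):
        # Inner loop: gradient descent on h_k; its gradient is
        # grad_f(z) + kappa * (z - y).
        z = x.copy()
        for _ in range(inner_iters):
            z = z - inner_lr * (grad_f(z) + kappa * (z - y))
        x_prev, x = x, z
        # Extrapolation with the standard convex-case momentum schedule.
        beta = k / (k + 3.0)
        y = x + beta * (x - x_prev)
    return x

# Toy usage: minimize the smooth convex objective f(x) = 0.5 * ||A x - b||^2.
A = np.array([[2.0, 0.5], [0.5, 1.0]])
b = np.array([1.0, -1.0])
grad_f = lambda x: A.T @ (A @ x - b)
x_hat = catalyst_sketch(grad_f, np.zeros(2))
x_star = np.linalg.solve(A.T @ A, A.T @ b)  # exact minimizer for reference
```

The regularization weight κ trades off how well-conditioned the sub-problems are against how much the outer sequence is slowed down; choosing it well is one of the practical points the paper addresses.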
