Catalyst Acceleration for First-order Convex Optimization: from Theory to Practice

@article{Lin2017CatalystAF,
title={Catalyst Acceleration for First-order Convex Optimization: from Theory to Practice},
author={Hongzhou Lin and Julien Mairal and Za{\"i}d Harchaoui},
journal={Journal of Machine Learning Research},
year={2017},
volume={18},
pages={212:1--212:54}
}


We introduce a generic scheme for accelerating gradient-based optimization methods in the sense of Nesterov. The approach, called Catalyst, builds upon the inexact accelerated proximal point algorithm for minimizing a convex objective function, and consists of approximately solving a sequence of well-chosen auxiliary problems, leading to faster convergence. One of the keys to achieve acceleration in theory and in practice is to solve these sub-problems with appropriate accuracy by using the…
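The outer loop described in the abstract can be sketched as follows. This is a hypothetical, simplified illustration of the Catalyst idea, not the paper's exact algorithm: each auxiliary problem f(x) + (κ/2)‖x − y‖² is approximately solved with plain gradient steps, followed by a Nesterov-style extrapolation. The step sizes, iteration counts, and the `catalyst` function itself are illustrative choices.

```python
import numpy as np

def catalyst(f_grad, x0, kappa, n_outer=50, inner_steps=100, inner_lr=0.01):
    """Illustrative sketch of a Catalyst-style outer loop.

    Each outer iteration approximately minimizes the auxiliary problem
        h(x) = f(x) + (kappa / 2) * ||x - y||^2
    with a fixed number of inner gradient steps, then applies a
    Nesterov-style extrapolation. The schedule below uses the classical
    recursion alpha_k^2 = (1 - alpha_k) * alpha_{k-1}^2 for the
    non-strongly-convex case; it is a simplification of the paper's scheme.
    """
    x_prev = x = np.asarray(x0, dtype=float)
    y = x.copy()
    alpha_prev = 1.0
    for _ in range(n_outer):
        # Inner loop: approximately solve the kappa-regularized sub-problem,
        # warm-started from the current iterate.
        z = x.copy()
        for _ in range(inner_steps):
            z = z - inner_lr * (f_grad(z) + kappa * (z - y))
        x_prev, x = x, z
        # Solve alpha^2 + a2 * alpha - a2 = 0 for the new momentum parameter.
        a2 = alpha_prev ** 2
        alpha = (-a2 + np.sqrt(a2 ** 2 + 4.0 * a2)) / 2.0
        beta = alpha_prev * (1.0 - alpha_prev) / (alpha_prev ** 2 + alpha)
        # Extrapolation step.
        y = x + beta * (x - x_prev)
        alpha_prev = alpha
    return x

# Usage: minimize a simple convex quadratic f(x) = 0.5 x^T A x - b^T x,
# whose gradient is A x - b and whose minimizer is A^{-1} b.
A = np.array([[3.0, 0.5], [0.5, 1.0]])
b = np.array([1.0, -2.0])
x_star = np.linalg.solve(A, b)
x_hat = catalyst(lambda x: A @ x - b, np.zeros(2), kappa=1.0)
```

In this toy setting the inner solver is ordinary gradient descent; the paper's point is that any linearly-convergent first-order method (SAG, SAGA, SVRG, etc.) can play this role, inheriting Nesterov-type acceleration from the outer scheme.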