We introduce a generic scheme for accelerating first-order optimization methods in the sense of Nesterov, which builds upon a new analysis of the accelerated proximal point algorithm. Our approach consists of minimizing a convex objective by approximately solving a sequence of well-chosen auxiliary problems, leading to faster convergence. This strategy …
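A minimal sketch of such an accelerated proximal-point outer loop, assuming a toy least-squares objective, plain gradient descent as the inner solver, and illustrative choices of `kappa` and the iteration counts; this is not the paper's implementation, only the general pattern of approximately solving regularized auxiliary problems and extrapolating between their solutions.

```python
import numpy as np

# Toy smooth convex objective: f(x) = 0.5 * ||A x - b||^2 (assumption for illustration)
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 10))
b = rng.standard_normal(50)

def f_grad(x):
    return A.T @ (A @ x - b)

L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
kappa = 1.0                            # regularization of the auxiliary problems (tuning choice)

def approx_prox(y, x0, n_inner=50):
    """Approximately minimize the auxiliary problem f(x) + kappa/2 * ||x - y||^2
    with plain gradient descent (any first-order method could play this role)."""
    x = x0.copy()
    step = 1.0 / (L + kappa)
    for _ in range(n_inner):
        x -= step * (f_grad(x) + kappa * (x - y))
    return x

# Outer loop: solve a sequence of auxiliary problems and extrapolate (Nesterov-style).
x = np.zeros(10)
y = x.copy()
alpha = 1.0                            # extrapolation sequence for the non-strongly-convex case
for k in range(30):
    x_new = approx_prox(y, x)
    alpha_new = 0.5 * (np.sqrt(alpha**4 + 4 * alpha**2) - alpha**2)
    beta = alpha * (1 - alpha) / (alpha**2 + alpha_new)
    y = x_new + beta * (x_new - x)
    x, alpha = x_new, alpha_new

print("objective:", 0.5 * np.linalg.norm(A @ x - b) ** 2)
```

The outer loop never touches f directly; it only asks the inner solver for approximate solutions of the regularized subproblems, which is why the scheme can wrap a wide range of first-order methods.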
We propose an approach to accelerate gradient-based optimization algorithms by giving them the ability to exploit curvature information using quasi-Newton update rules. The proposed scheme, called QuickeNing, is generic and can be applied to a large class of first-order methods such as incremental and block-coordinate algorithms; it is also compatible with …
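A minimal sketch of the underlying idea under stated assumptions: an outer quasi-Newton method is run on an approximate Moreau envelope of the objective, whose gradient only requires an approximate proximal computation by a first-order inner solver. Here the quasi-Newton step is delegated to SciPy's generic L-BFGS-B routine rather than QuickeNing's own update rules, the objective is the same toy least-squares problem as above, and `approx_prox`, `kappa`, and the inner iteration count are illustrative.

```python
import numpy as np
from scipy.optimize import minimize

# Toy smooth convex objective: f(x) = 0.5 * ||A x - b||^2 (assumption for illustration)
rng = np.random.default_rng(1)
A = rng.standard_normal((50, 10))
b = rng.standard_normal(50)
L = np.linalg.norm(A, 2) ** 2
kappa = 1.0                                   # smoothing parameter (tuning choice)

def f(x):
    return 0.5 * np.linalg.norm(A @ x - b) ** 2

def f_grad(x):
    return A.T @ (A @ x - b)

def approx_prox(y, n_inner=100):
    """Inner first-order solver: approximately minimize f(x) + kappa/2 * ||x - y||^2."""
    x = y.copy()
    step = 1.0 / (L + kappa)
    for _ in range(n_inner):
        x -= step * (f_grad(x) + kappa * (x - y))
    return x

def moreau_value_and_grad(y):
    """Approximate value and gradient of the Moreau envelope
    F(y) = min_x f(x) + kappa/2 ||x - y||^2, with grad F(y) = kappa * (y - prox(y))."""
    p = approx_prox(y)
    val = f(p) + 0.5 * kappa * np.linalg.norm(p - y) ** 2
    grad = kappa * (y - p)
    return val, grad

# Outer quasi-Newton loop on the smoothed objective (stand-in for QuickeNing's rules).
res = minimize(moreau_value_and_grad, np.zeros(10), jac=True, method="L-BFGS-B")
print("objective at solution:", f(approx_prox(res.x)))
```

The point of the construction is that curvature information is gathered on the smooth envelope, so the scheme composes with whatever first-order method implements the inner proximal step.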