Contracting Proximal Methods for Smooth Convex Optimization

@article{Doikov2020ContractingPM,
  title={Contracting Proximal Methods for Smooth Convex Optimization},
  author={Nikita Doikov and Yurii Nesterov},
  journal={SIAM J. Optim.},
  year={2020},
  volume={30},
  pages={3146--3169}
}
In this paper, we propose new accelerated methods for smooth Convex Optimization, called Contracting Proximal Methods. At every step of these methods, we need to minimize a contracted version of the objective function augmented by a regularization term in the form of a Bregman divergence. We provide a global convergence analysis for a general scheme admitting inexactness in solving the auxiliary subproblem. When high-order Tensor Methods are used for this purpose, we demonstrate an acceleration effect for both convex and uniformly convex composite objective functions. Thus, our construction explains acceleration for methods of any order, starting from the first one. The increase in the number of oracle calls caused by computing the contracted proximal steps is limited by a logarithmic factor in the worst-case complexity bound.
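
In the simplest unconstrained setting, one outer iteration of such a scheme takes the form: given scaling coefficients A_{k+1} = A_k + a_{k+1} and a prox function d with Bregman divergence beta_d, inexactly solve

  v_{k+1} ≈ argmin_v { A_{k+1} f((a_{k+1} v + A_k x_k) / A_{k+1}) + beta_d(v_k; v) },   then set   x_{k+1} = (a_{k+1} v_{k+1} + A_k x_k) / A_{k+1}.

The Python sketch below is a minimal, hypothetical instantiation rather than the paper's full method: it assumes the Euclidean prox function d(x) = ||x||^2 / 2, no composite term, the first-order coefficient choice A_k = k(k+1)/2, and plain gradient descent as the inexact subproblem solver. The names contracting_proximal, grad_f, n_outer, and n_inner are illustrative, not from the paper.

```python
import numpy as np

def contracting_proximal(grad_f, x0, L, n_outer=50, n_inner=100):
    # Outer loop of a contracting proximal scheme (sketch, assuming
    # Euclidean Bregman divergence 0.5*||v - v_k||^2 and no composite
    # term); A_k = k(k+1)/2 is one admissible first-order choice.
    x = np.asarray(x0, dtype=float).copy()  # main sequence x_k
    v = x.copy()                            # prox centers v_k
    A = 0.0
    for k in range(n_outer):
        A_next = (k + 1) * (k + 2) / 2.0
        a = A_next - A
        # Subproblem: minimize over w
        #   h(w) = A_next * f((a*w + A*x)/A_next) + 0.5*||w - v||^2.
        # h is 1-strongly convex with (1 + a^2*L/A_next)-Lipschitz
        # gradient, so gradient descent with the step below converges
        # linearly to the exact contracted proximal step.
        step = 1.0 / (1.0 + a * a * L / A_next)
        w = v.copy()
        for _ in range(n_inner):
            y = (a * w + A * x) / A_next   # contracted point
            g = a * grad_f(y) + (w - v)    # gradient of h at w
            w = w - step * g
        v = w
        x = (a * v + A * x) / A_next       # x_{k+1}
        A = A_next
    return x

if __name__ == "__main__":
    # Toy usage: minimize an ill-conditioned quadratic f(x) = 0.5*x'Qx.
    Q = np.diag([1.0, 10.0, 100.0])
    x = contracting_proximal(lambda z: Q @ z, np.ones(3), L=100.0)
    print(x)  # should be close to the minimizer at the origin
```

Because each subproblem is 1-strongly convex, the inner solver converges linearly, which is the mechanism behind the logarithmic overhead in oracle calls mentioned in the abstract.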

Citations

Affine-invariant contracting-point methods for Convex Optimization
Optimization Methods for Fully Composite Problems
Inexact Tensor Methods and Their Application to Stochastic Convex Optimization
Accelerated meta-algorithm for convex optimization
On the Computational Efficiency of Catalyst Accelerated Coordinate Descent
