Regularized Newton Methods for Minimizing Functions with Hölder Continuous Hessians

@article{Grapiglia2017RegularizedNM,
  title={Regularized Newton Methods for Minimizing Functions with H{\"o}lder Continuous Hessians},
  author={G. N. Grapiglia and Y. Nesterov},
  journal={SIAM J. Optim.},
  year={2017},
  volume={27},
  pages={478-506}
}
In this paper, we study the regularized second-order methods for unconstrained minimization of a twice-differentiable (convex or nonconvex) objective function. For the current function, these methods automatically achieve the best possible global complexity estimates among different Hölder classes containing the Hessian of the objective. We show that such methods for functional residual and for the norm of the gradient must be different. For development of the latter methods, we introduced two…
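
The abstract describes adaptive (universal) regularized Newton schemes whose regularization parameter is tuned at run time, so the method need not know which Hölder class the Hessian belongs to. As a rough illustration only, here is a minimal sketch of one such scheme: a cubically regularized Newton iteration in which the regularization constant M is adjusted by backtracking. The routine names (cubic_step, adaptive_cubic_newton), the acceptance test (accept the trial point when the cubic model overestimates f), and the update rule for M are illustrative assumptions, not the criteria introduced in the paper; in particular, the paper's line-search acceptance conditions for gradient-norm guarantees are not reproduced here, and the subproblem solver ignores the so-called hard case for simplicity.

# Hedged sketch of an adaptive cubically regularized Newton method.
# Not the paper's exact algorithm; see the caveats in the text above.
import numpy as np

def cubic_step(g, H, M):
    """Minimize  s -> g^T s + 0.5 s^T H s + (M/6)||s||^3  via an
    eigendecomposition of H and bisection on r = ||s|| (hard case ignored)."""
    d, Q = np.linalg.eigh(H)
    gt = Q.T @ g
    r_min = max(0.0, -2.0 * d.min() / M)

    def norm_s(r):
        return np.linalg.norm(gt / (d + 0.5 * M * r))

    lo, hi = r_min + 1e-12, max(2.0 * r_min, 1.0)
    while norm_s(hi) > hi:          # grow the upper bracket until ||s(r)|| <= r
        hi *= 2.0
    for _ in range(100):            # bisection on the secular equation ||s(r)|| = r
        mid = 0.5 * (lo + hi)
        if norm_s(mid) > mid:
            lo = mid
        else:
            hi = mid
    r = 0.5 * (lo + hi)
    return Q @ (-gt / (d + 0.5 * M * r))

def adaptive_cubic_newton(f, grad, hess, x0, M0=1.0, tol=1e-8, max_iter=200):
    x, M = np.asarray(x0, dtype=float), M0
    for _ in range(max_iter):
        g, H = grad(x), hess(x)
        if np.linalg.norm(g) <= tol:
            break
        while True:                 # backtracking on the regularization parameter M
            s = cubic_step(g, H, M)
            model = f(x) + g @ s + 0.5 * s @ H @ s + (M / 6.0) * np.linalg.norm(s) ** 3
            if f(x + s) <= model:   # accept when the cubic model overestimates f
                break
            M *= 2.0                # model too optimistic: increase regularization
        x = x + s
        M = max(M / 2.0, 1e-8)      # try a smaller regularization next time
    return x

The point of the backtracking loop is that M adapts automatically: neither a Hölder exponent nor a Hölder constant of the Hessian enters the method, which is the sense in which such schemes are called universal.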
37 Citations
On inexact solution of auxiliary problems in tensor methods for convex optimization
Minimizing Uniformly Convex Functions by Cubic Regularization of Newton Method
On Regularization and Active-set Methods with Complexity for Constrained Optimization
On the complexity of solving feasibility problems ∗
Implementable tensor methods in unconstrained convex optimization (Y. Nesterov, Math. Program., 2021)
Convergence and evaluation-complexity analysis of a regularized tensor-Newton method for solving nonlinear least-squares problems
Accelerated Regularized Newton Methods for Minimizing Composite Convex Functions
