Oracle complexity of second-order methods for smooth convex optimization

@article{Arjevani2019OracleCO,
  title={Oracle complexity of second-order methods for smooth convex optimization},
  author={Yossi Arjevani and Ohad Shamir and Ron Shiff},
  journal={Mathematical Programming},
  year={2019},
  pages={1-34}
}
Second-order methods, which utilize gradients as well as Hessians to optimize a given function, are of major importance in mathematical optimization. In this work, we prove tight bounds on the oracle complexity of such methods for smooth convex functions, or equivalently, the worst-case number of iterations required to optimize such functions to a given accuracy. In particular, these bounds indicate when such methods can or cannot improve on gradient-based methods, whose oracle complexity is…
