Tensor Methods for Minimizing Convex Functions with Hölder Continuous Higher-Order Derivatives

@article{Grapiglia2020TensorMF,
  title={Tensor Methods for Minimizing Convex Functions with H{\"o}lder Continuous Higher-Order Derivatives},
  author={G. N. Grapiglia and Y. Nesterov},
  journal={SIAM J. Optim.},
  year={2020},
  volume={30},
  pages={2750-2779}
}
In this paper we study p-order methods for unconstrained minimization of convex functions that are p-times differentiable (p ≥ 2) with ν-Hölder continuous pth derivatives. We propose tensor schemes with and without acceleration. For the schemes without acceleration, we establish iteration complexity bounds of $O(\epsilon^{-1/(p+\nu-1)})$ for reducing the functional residual below a given $\epsilon \in (0,1)$. Assuming that ν is known, we obtain an improved complexity bound of $O(\epsilon^{-1/(p+\nu)})$ for the corresponding accelerated scheme. …
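As a reading aid, the smoothness assumption behind these rates can be written out explicitly; the notation below (in particular the Hölder constant $H_\nu$) is illustrative rather than quoted from the paper:

\[
\| D^{p} f(x) - D^{p} f(y) \| \;\le\; H_{\nu}\, \| x - y \|^{\nu} \quad \text{for all } x, y,
\]

so that the non-accelerated schemes need $O(\epsilon^{-1/(p+\nu-1)})$ iterations and, when ν is known, the accelerated scheme needs $O(\epsilon^{-1/(p+\nu)})$ iterations to bring the functional residual $f(x_k) - f^{*}$ below $\epsilon$.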
4 Citations
Optimal Combination of Tensor Optimization Methods
TLDR
A general framework is proposed that allows obtaining near-optimal oracle complexity for each function in the sum separately, meaning, in particular, that the oracle for a function with a lower Lipschitz constant is called a smaller number of times.
Tensor methods for finding approximate stationary points of convex functions
In this paper we consider the problem of finding ε-approximate stationary points of convex functions that are p-times differentiable with ν-Hölder continuous pth derivatives. …
Optimization Methods for Fully Composite Problems
In this paper, we propose a new Fully Composite Formulation of convex optimization problems. It includes, as a particular case, problems with functional constraints, max-type minimization, …
Smoothness Parameter of Power of Euclidean Norm
TLDR
This paper proves the Hölder continuity of derivatives of powers of the Euclidean norm, establishes explicit expressions for the corresponding constants, and shows that these constants are optimal for odd derivatives and at most two times suboptimal for the even ones.
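The relevance of this reference to the main paper is that powers of the Euclidean norm are the standard examples of functions with Hölder continuous higher-order derivatives; a sketch of the connection, with illustrative notation, is

\[
f_{q}(x) \;=\; \| x \|^{q}, \qquad q = p + \nu,\ \nu \in (0,1],
\]

whose pth derivative is ν-Hölder continuous, and the explicit value of the corresponding Hölder constant is what this reference establishes.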

References

SHOWING 1-10 OF 19 REFERENCES
Implementable tensor methods in unconstrained convex optimization
  • Y. Nesterov
  • Computer Science, Mathematics
  • Math. Program.
  • 2021
TLDR
New tensor methods for unconstrained convex optimization are developed, which solve at each iteration an auxiliary problem of minimizing a convex multivariate polynomial, together with an efficient technique for solving this auxiliary problem based on the recently developed relative smoothness condition.
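As a reading aid, the auxiliary problem referred to above is the minimization of a regularized Taylor model; the notation below is the commonly used form and is illustrative rather than quoted from the reference:

\[
x_{k+1} \;\in\; \arg\min_{y}\; \Big\{ \sum_{i=0}^{p} \tfrac{1}{i!}\, D^{i} f(x_k)[y - x_k]^{i} \;+\; \tfrac{H}{(p+1)!}\, \| y - x_k \|^{p+1} \Big\},
\]

which is a convex problem once the regularization parameter $H$ is chosen sufficiently large relative to the Lipschitz constant of $D^{p} f$.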
On inexact solution of auxiliary problems in tensor methods for convex optimization
TLDR
This paper studies the auxiliary problems that appear in p-order tensor methods for unconstrained minimization of convex functions with ν-Hölder continuous pth derivatives and bounds the number of iterations the referred methods take to find either a suitable approximate stationary point of the tensor model or an ε-approximate stationary point of the original objective function.
Partially separable convexly-constrained optimization with non-Lipschitzian singularities and its complexity
An adaptive regularization algorithm using high-order models is proposed for partially separable convexly constrained nonlinear optimization problems whose objective function contains …
Accelerating the cubic regularization of Newton’s method on convex problems
  • Y. Nesterov
  • Mathematics, Computer Science
  • Math. Program.
  • 2008
TLDR
An accelerated version of the cubic regularization of Newton's method is presented that converges for the same problem class with a higher order of convergence, keeping the complexity of each iteration unchanged, and it is argued that for second-order schemes the class of non-degenerate problems is different from the standard class.
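For context, the basic step being accelerated here is the cubically regularized Newton step (notation illustrative):

\[
x_{k+1} \;=\; \arg\min_{y}\; \Big\{ \langle \nabla f(x_k),\, y - x_k \rangle \;+\; \tfrac{1}{2} \langle \nabla^{2} f(x_k)(y - x_k),\, y - x_k \rangle \;+\; \tfrac{M}{6}\, \| y - x_k \|^{3} \Big\},
\]

and the acceleration improves the convergence rate on convex problems from $O(1/k^{2})$ for the basic scheme to $O(1/k^{3})$, which is the higher order of convergence mentioned above.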
Tensor Methods for Unconstrained Optimization Using Second Derivatives
TLDR
It is shown that the costs of forming, storing, and solving the tensor model are not significantly more than these costs for a standard method based upon a quadratic Taylor series model.
Accelerated Regularized Newton Methods for Minimizing Composite Convex Functions
In this paper, we study accelerated Regularized Newton Methods for minimizing objectives formed as a sum of two functions: one is convex and twice differentiable with Hölder-continuous Hessian, and …
Regularized Newton Methods for Minimizing Functions with Hölder Continuous Hessians
TLDR
This paper studies regularized second-order methods for unconstrained minimization of a twice-differentiable (convex or nonconvex) objective function and introduces two new line-search acceptance criteria, which can be seen as generalizations of the Armijo condition.
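This is essentially the p = 2 instance of the setting of the main paper; schematically (illustrative notation, including the placement of the regularization term), each step minimizes a regularized second-order model:

\[
x_{k+1} \;=\; \arg\min_{y}\; \Big\{ \langle \nabla f(x_k),\, y - x_k \rangle \;+\; \tfrac{1}{2} \langle \nabla^{2} f(x_k)(y - x_k),\, y - x_k \rangle \;+\; \tfrac{M}{2+\nu}\, \| y - x_k \|^{2+\nu} \Big\},
\]

where ν is the Hölder exponent of the Hessian and $M$ is a regularization parameter, with the line-search acceptance criteria mentioned above used to decide whether a trial step is accepted.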
Oracle complexity of second-order methods for smooth convex optimization
TLDR
This work proves tight bounds on the oracle complexity of second-order methods for smooth convex functions, or equivalently, the worst-case number of iterations required to optimize such functions to a given accuracy.
Worst-case evaluation complexity for unconstrained nonlinear optimization using high-order regularized models
TLDR
The worst-case evaluation complexity for smooth (possibly nonconvex) unconstrained optimization is considered, and it is shown that an ε-approximate first-order critical point can be computed in at most $O(\epsilon^{-(p+1)/p})$ evaluations of the problem's objective function and its derivatives.
Tensor Methods for Large, Sparse Unconstrained Optimization
TLDR
Test results show that tensor methods are significantly more efficient and more reliable than standard methods based on Newton's method for large, sparse unconstrained optimization problems.