# Smooth strongly convex interpolation and exact worst-case performance of first-order methods

@article{Taylor2017SmoothSC, title={Smooth strongly convex interpolation and exact worst-case performance of first-order methods}, author={Adrien B. Taylor and Julien M. Hendrickx and François Glineur}, journal={Mathematical Programming}, year={2017}, volume={161}, pages={307-345} }

We show that the exact worst-case performance of fixed-step first-order methods for unconstrained optimization of smooth (possibly strongly) convex functions can be obtained by solving convex programs. Finding the worst-case performance of a black-box first-order method is formulated as an optimization problem over a set of smooth (strongly) convex functions and initial conditions. We develop closed-form necessary and sufficient conditions for smooth (strongly) convex interpolation, which…


#### 98 Citations

Convex interpolation and performance estimation of first-order methods for convex optimization

- Computer Science, Mathematics
- 2017

This thesis formulates a generic optimization problem that seeks the worst-case scenarios of first-order methods in convex optimization, and transforms PEPs into solvable finite-dimensional semidefinite programs, from which one obtains worst-case guarantees and worst-case functions, along with the corresponding explicit proofs.

Exact Worst-Case Performance of First-Order Methods for Composite Convex Optimization

- Mathematics, Computer Science
- SIAM J. Optim.
- 2017

A new analytical worst-case guarantee is presented for the proximal point algorithm that is twice as strong as the previously known one, and the standard worst-case guarantee for the conditional gradient method is improved by more than a factor of two.

Exact Worst-Case Convergence Rates of the Proximal Gradient Method for Composite Convex Minimization

- Mathematics, Computer Science
- J. Optim. Theory Appl.
- 2018

We study the worst-case convergence rates of the proximal gradient method for minimizing the sum of a smooth strongly convex function and a non-smooth convex function, whose proximal operator is…

An optimal gradient method for smooth (possibly strongly) convex minimization

- Computer Science
- ArXiv
- 2021

We present an optimal gradient method for smooth (possibly strongly) convex optimization. The method is optimal in the sense that its worst-case bound exactly matches the lower bound on the oracle…

On the worst-case complexity of the gradient method with exact line search for smooth strongly convex functions

- Mathematics, Computer Science
- Optim. Lett.
- 2017

The tight worst-case complexity bound is given for a noisy variant of the gradient descent method, in which exact line search is performed along a search direction that differs from the negative gradient by at most a prescribed relative tolerance.

Efficient first-order methods for convex minimization: a constructive approach

- Mathematics, Computer Science
- Math. Program.
- 2020

We describe a novel constructive technique for devising efficient first-order methods for a wide range of large-scale convex minimization settings, including smooth, non-smooth, and strongly convex…

Performance estimation toolbox (PESTO): Automated worst-case analysis of first-order optimization methods

- Computer Science
- 2017 IEEE 56th Annual Conference on Decision and Control (CDC)
- 2017

A Matlab toolbox that automatically computes tight worst-case performance guarantees for a broad class of first-order methods for convex optimization, which includes those performing explicit, projected, proximal, conditional and inexact (sub)gradient steps.

Analysis of Optimization Algorithms via Integral Quadratic Constraints: Nonstrongly Convex Problems

- Mathematics, Computer Science
- SIAM J. Optim.
- 2018

A unified framework that certifies both exponential and subexponential convergence rates for a wide range of iterative first-order optimization algorithms, and constructs a family of parameter-dependent nonquadratic Lyapunov functions that can generate convergence rates in addition to proving asymptotic convergence.

Practical Schemes for Finding Near-Stationary Points of Convex Finite-Sums

- Computer Science, Mathematics
- ArXiv
- 2021

This work conducts a systematic study of the algorithmic techniques in finding near-stationary points of convex finite-sums and proposes an adaptively regularized accelerated SVRG variant, which does not require the knowledge of some unknown initial constants and achieves near-optimal complexities.

The Speed-Robustness Trade-Off for First-Order Methods with Additive Gradient Noise

- Mathematics
- 2021

We study the trade-off between convergence rate and sensitivity to stochastic additive gradient noise for first-order optimization methods. Ordinary Gradient Descent (GD) can be made…

#### References

Showing 1-10 of 28 references

Performance of first-order methods for smooth convex minimization: a novel approach

- Computer Science, Mathematics
- Math. Program.
- 2014

A novel approach for analyzing the worst-case performance of first-order black-box optimization methods, which focuses on smooth unconstrained convex minimization over the Euclidean space and derives a new and tight analytical bound on its performance.

Double Smoothing Technique for Large-Scale Linearly Constrained Convex Optimization

- Mathematics, Computer Science
- SIAM J. Optim.
- 2012

This paper dualizes the linear constraints, solves the resulting dual problem with a purely dual gradient-type method and shows how to reconstruct an approximate primal solution.

Convex Optimization Theory

- Computer Science
- 2009

An insightful, concise, and rigorous treatment of the basic theory of convex sets and functions in finite dimensions, with an emphasis on convexity, duality, and their geometric interpretation.

On the Convergence Analysis of the Optimized Gradient Method

- Mathematics
- J. Optim. Theory Appl.
- 2017

This paper provides an analytic convergence bound for the primary sequence generated by the optimized gradient method, including the interesting fact that the method has two types of worst-case functions: a piecewise affine-quadratic function and a quadratic function.

Optimized first-order methods for smooth convex minimization

- Computer Science, Mathematics
- Math. Program.
- 2016

We introduce new optimized first-order methods for smooth unconstrained convex minimization. Drori and Teboulle (Math Program 145(1–2):451–482, 2014. doi:10.1007/s10107-013-0653-0) recently described…

Analysis and Design of Optimization Algorithms via Integral Quadratic Constraints

- Computer Science, Mathematics
- SIAM J. Optim.
- 2016

A new framework to analyze and design iterative optimization algorithms built on the notion of Integral Quadratic Constraints (IQC) from robust control theory is developed, proving new inequalities about convex functions and providing a version of IQC theory adapted for use by optimization researchers.

Lectures on modern convex optimization - analysis, algorithms, and engineering applications

- Computer Science, Mathematics
- MPS-SIAM series on optimization
- 2001

The authors present the basic theory of state-of-the-art polynomial time interior point methods for linear, conic quadratic, and semidefinite programming as well as their numerous applications in engineering.

Semidefinite Programming

- Computer Science
- SIAM Rev.
- 1996

A survey of the theory and applications of semidefinite programs and an introduction to primal-dual interior-point methods for their solution are given.

Finite Convex Integration

- Mathematics
- 2004

Given a solution f of (Int), any function of the form f + K, where K is a constant, is a solution as well. Thus, we consider an additional initial-type condition: fix a couple (x₀, x₀*) in the family…

A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems

- Mathematics, Computer Science
- SIAM J. Imaging Sci.
- 2009

A new fast iterative shrinkage-thresholding algorithm (FISTA) that preserves the computational simplicity of ISTA but has a global rate of convergence proven to be significantly better, both theoretically and practically.