# Performance of first-order methods for smooth convex minimization: a novel approach

@article{Drori2012PerformanceOF, title={Performance of first-order methods for smooth convex minimization: a novel approach}, author={Yoel Drori and Marc Teboulle}, journal={Mathematical Programming}, year={2012}, volume={145}, pages={451-482} }

We introduce a novel approach for analyzing the worst-case performance of first-order black-box optimization methods. We focus on smooth unconstrained convex minimization over the Euclidean space. Our approach relies on the observation that by definition, the worst-case behavior of a black-box optimization method is by itself an optimization problem, which we call the performance estimation problem (PEP). We formulate and analyze the PEP for two classes of first-order algorithms. We first apply…
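A headline result obtained from the PEP framework is the exact worst-case bound L·R²/(4N+2) for N steps of the gradient method with step size 1/L, where R = ‖x₀ − x*‖. The sketch below checks this numerically on the Huber-type function that attains it; the function and the kink value δ = R/(2N+1) follow the standard worst-case construction and are stated here as assumptions, not code from the paper.

```python
import numpy as np

# Worst-case check for the gradient method with step size 1/L (N steps).
# Drori & Teboulle show the exact worst-case value of f(x_N) - f* over
# L-smooth convex functions with ||x_0 - x*|| <= R is L*R^2 / (4N + 2).
L, R, N = 1.0, 1.0, 10
delta = R / (2 * N + 1)  # kink of the Huber-type function attaining the bound

def f(x):
    # Huber-type: quadratic near 0, linear (slope L*delta) away from 0.
    if abs(x) <= delta:
        return 0.5 * L * x ** 2
    return L * delta * abs(x) - 0.5 * L * delta ** 2

def grad(x):
    return L * x if abs(x) <= delta else L * delta * np.sign(x)

x = R  # start at distance R from the minimizer x* = 0 (where f* = 0)
for _ in range(N):
    x -= (1.0 / L) * grad(x)

bound = L * R ** 2 / (4 * N + 2)
print(f(x), bound)  # the two values coincide: the bound is tight
```

The iterates never leave the linear region, so each step shrinks x by exactly δ and the final objective value matches the bound to machine precision.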

## 189 Citations

### Efficient first-order methods for convex minimization: a constructive approach

- Computer Science, Mathematical Programming
- 2019

We describe a novel constructive technique for devising efficient first-order methods for a wide range of large-scale convex minimization settings, including smooth, non-smooth, and strongly convex…

### Convex interpolation and performance estimation of first-order methods for convex optimization

- Computer Science, Mathematics
- 2017

This thesis formulates a generic optimization problem that searches for the worst-case scenarios of first-order methods in convex optimization, and transforms these PEPs into solvable finite-dimensional semidefinite programs, from which one obtains worst-case guarantees and worst-case functions, along with the corresponding explicit proofs.
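
The interpolation idea above hinges on checkable conditions for when data triples (x_i, g_i, f_i) can be extended to an L-smooth convex function. A minimal numpy sketch of that feasibility test (the inequality is the standard smooth convex interpolation condition; function and variable names are illustrative):

```python
import numpy as np

def is_smooth_convex_interpolable(X, G, F, L):
    """Check the L-smooth convex interpolation conditions:
    f_i >= f_j + <g_j, x_i - x_j> + ||g_i - g_j||^2 / (2L)  for all i, j.
    X, G: arrays of shape (n, d); F: array of shape (n,)."""
    n = len(F)
    for i in range(n):
        for j in range(n):
            lower = (F[j] + G[j] @ (X[i] - X[j])
                     + np.sum((G[i] - G[j]) ** 2) / (2 * L))
            if F[i] < lower - 1e-12:  # tolerance for floating point
                return False
    return True

# Data sampled from f(x) = x^2/2 (1-smooth, convex): g = x, f = x^2/2.
xs = np.array([[-1.0], [0.0], [2.0]])
print(is_smooth_convex_interpolable(xs, xs, 0.5 * xs[:, 0] ** 2, L=1.0))
```

When the test passes, some L-smooth convex function agrees with all the triples; when it fails, no such function exists, which is what turns finite-dimensional PEPs into exact semidefinite programs.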

### Optimal first-order methods for convex functions with a quadratic upper bound

- Computer Science, Mathematics
- 2022

This study analyzes worst-case convergence guarantees of first-order optimization methods over a function class extending that of smooth and convex functions, and shows how the analysis can be leveraged to obtain convergence guarantees over more complex classes of functions.

### Exact Worst-Case Performance of First-Order Methods for Composite Convex Optimization

- Computer Science, SIAM J. Optim.
- 2017

A new analytical worst-case guarantee is presented for the proximal point algorithm that improves on the previously known guarantee by a factor of two, and the standard worst-case guarantee for the conditional gradient method is improved by more than a factor of two.

### Smooth strongly convex interpolation and exact worst-case performance of first-order methods

- Mathematics, Computer Science, Math. Program.
- 2017

We show that the exact worst-case performance of fixed-step first-order methods for unconstrained optimization of smooth (possibly strongly) convex functions can be obtained by solving convex…

### An optimal gradient method for smooth (possibly strongly) convex minimization

- Computer Science, Mathematics, ArXiv
- 2021

We present an optimal gradient method for smooth (possibly strongly) convex optimization. The method is optimal in the sense that its worst-case bound exactly matches the lower bound on the oracle…

### Analysis of Optimization Algorithms via Sum-of-Squares

- Computer Science, Journal of Optimization Theory and Applications
- 2021

The SOS framework provides a promising new approach for certifying improved rates of convergence by means of higher-order SOS certificates and is used to derive new results for noisy gradient descent with inexact line search methods.

### Performance estimation toolbox (PESTO): Automated worst-case analysis of first-order optimization methods

- Computer Science, 2017 IEEE 56th Annual Conference on Decision and Control (CDC)
- 2017

A Matlab toolbox that automatically computes tight worst-case performance guarantees for a broad class of first-order methods for convex optimization, which includes those performing explicit, projected, proximal, conditional and inexact (sub)gradient steps.

### Exact Worst-Case Convergence Rates of the Proximal Gradient Method for Composite Convex Minimization

- Computer Science, J. Optim. Theory Appl.
- 2018

We study the worst-case convergence rates of the proximal gradient method for minimizing the sum of a smooth strongly convex function and a non-smooth convex function, whose proximal operator is…

## References

Showing 1–10 of 25 references

### Primal-dual first-order methods with $\mathcal{O}(1/\epsilon)$ iteration-complexity for cone programming

- Computer Science, Mathematics, Math. Program.
- 2011

This paper discusses first-order methods suitable for solving primal-dual convex and nonsmooth minimization reformulations of the cone programming problem, and proposes a variant of Nesterov’s optimal method which has outperformed the latter one in the authors' computational experiments.

### Fine tuning Nesterov’s steepest descent algorithm for differentiable convex programming

- Computer Science, Math. Program.
- 2013

The first order algorithm for convex programming described by Nesterov in his book is modified, and an adaptive procedure for estimating a strong convexity constant for the function is developed.

### Lectures on modern convex optimization - analysis, algorithms, and engineering applications

- Computer Science, MPS-SIAM series on optimization
- 2001

The authors present the basic theory of state-of-the-art polynomial time interior point methods for linear, conic quadratic, and semidefinite programming as well as their numerous applications in engineering.

### A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems

- Computer Science, Mathematics, SIAM J. Imaging Sci.
- 2009

A new fast iterative shrinkage-thresholding algorithm (FISTA) which preserves the computational simplicity of ISTA but with a global rate of convergence which is proven to be significantly better, both theoretically and practically.
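
FISTA keeps the per-iteration cost of ISTA (one gradient and one shrinkage step) but adds a momentum extrapolation that yields the faster O(1/k²) rate. A minimal numpy sketch for the lasso problem min_x ½‖Ax − b‖² + λ‖x‖₁ (the problem data and λ below are illustrative, not from the paper):

```python
import numpy as np

def fista_lasso(A, b, lam, n_iter=200):
    """FISTA for min 0.5*||Ax - b||^2 + lam*||x||_1."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    y, t = x.copy(), 1.0
    for _ in range(n_iter):
        g = A.T @ (A @ y - b)              # gradient of the smooth part at y
        z = y - g / L
        x_new = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # shrinkage
        t_new = (1 + np.sqrt(1 + 4 * t * t)) / 2
        y = x_new + ((t - 1) / t_new) * (x_new - x)  # momentum extrapolation
        x, t = x_new, t_new
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
x_true = np.array([1.0, 0.0, -2.0, 0.0, 0.0])
x_hat = fista_lasso(A, A @ x_true, lam=0.1)
print(x_hat)  # close to x_true, up to a small shrinkage bias
```

Setting the momentum coefficient to zero (y = x_new) recovers plain ISTA, which makes the extrapolation step easy to isolate when experimenting.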

### Quadratic Matrix Programming

- Mathematics, SIAM J. Optim.
- 2007

This work constructs a specially devised semidefinite relaxation (SDR) and dual for the QMP problem and shows that under some mild conditions strong duality holds for QMP problems with at most $r$ constraints.

### Improved Algorithms for Convex Minimization in Relative Scale

- Computer Science, SIAM J. Optim.
- 2011

Two modifications to Nesterov's algorithms for minimizing convex functions in relative scale are proposed: the first is based on a bisection technique and leads to improved theoretical iteration complexity, and the second is a heuristic for avoiding restarting behavior.

### The heavy ball with friction method, I. The continuous dynamical system: global exploration of the local minima of a real-valued function by asymptotic analysis of a dissipative dynamical system

- Mathematics
- 2000

Let H be a real Hilbert space and Φ : H → R a continuously differentiable function whose gradient is Lipschitz continuous on bounded sets. We study the nonlinear dissipative dynamical system $\ddot{x}(t) + \lambda \dot{x}(t) + \nabla \Phi(x(t)) = 0$, where $\lambda > 0$ is a friction (damping) parameter.
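
An explicit discretization of this ODE gives Polyak's heavy-ball iteration, in which the friction term becomes a momentum coefficient. A minimal sketch on a strongly convex quadratic (the step and momentum values below are illustrative):

```python
import numpy as np

def heavy_ball(grad, x0, alpha, beta, n_iter=100):
    """Polyak's heavy-ball method, a discretization of the ODE
    x'' + lambda * x' + grad(Phi(x)) = 0 (friction -> momentum)."""
    x_prev, x = x0.copy(), x0.copy()
    for _ in range(n_iter):
        # gradient step plus momentum in the previous displacement direction
        x, x_prev = x - alpha * grad(x) + beta * (x - x_prev), x
    return x

# Phi(x) = 0.5 * x^T diag(1, 10) x, minimized at the origin.
d = np.array([1.0, 10.0])
x = heavy_ball(lambda x: d * x, np.array([5.0, 5.0]), alpha=0.1, beta=0.5)
print(np.linalg.norm(x))  # near 0: the iterates settle at the minimizer
```

The momentum coefficient beta plays the role of the (discretized) inverse friction: larger beta means less damping and more oscillatory trajectories, mirroring the continuous-time analysis.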