Performance of first-order methods for smooth convex minimization: a novel approach

@article{Drori2014PerformanceOF,
  title={Performance of first-order methods for smooth convex minimization: a novel approach},
  author={Yoel Drori and Marc Teboulle},
  journal={Mathematical Programming},
  year={2014},
  volume={145},
  pages={451-482}
}
  • Y. Drori, M. Teboulle
  • Published 14 June 2012
  • Computer Science, Mathematics
  • Mathematical Programming
We introduce a novel approach for analyzing the worst-case performance of first-order black-box optimization methods. We focus on smooth unconstrained convex minimization over the Euclidean space. Our approach relies on the observation that, by definition, the worst-case behavior of a black-box optimization method is by itself an optimization problem, which we call the performance estimation problem (PEP). We formulate and analyze the PEP for two classes of first-order algorithms. We first apply…
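The PEP idea is concrete enough to reproduce numerically with modern SDP tooling. The sketch below is ours, not the authors': it is written with cvxpy (which postdates the paper) and uses the full smooth convex interpolation conditions of Taylor, Hendrickx, and Glineur (cited below) rather than the paper's original relaxation. For the gradient method with step size 1/L it recovers the paper's tight bound f(x_N) - f* <= L R^2 / (4N + 2).

```python
# Minimal numerical PEP for N steps of gradient descent on an L-smooth convex
# function: maximize f(x_N) - f* over all admissible functions, encoded as an
# SDP in the Gram matrix of (x_0 - x*, g_0, ..., g_N). Illustrative sketch only.
import numpy as np
import cvxpy as cp

def gradient_method_pep(N, L=1.0, R=1.0):
    d = N + 2                                  # size of the Gram basis

    def x(i):                                  # coordinates of x_i - x*
        v = np.zeros(d)
        if i < 0:                              # i = -1 encodes the optimum x*
            return v
        v[0] = 1.0                             # x_0 - x* is the first basis vector
        v[1:i + 1] -= 1.0 / L                  # x_i = x_0 - (1/L) sum_{k<i} g_k
        return v

    def g(i):                                  # coordinates of grad f(x_i)
        v = np.zeros(d)
        if i >= 0:
            v[i + 1] = 1.0                     # g_* = 0 at the optimum
        return v

    G = cp.Variable((d, d), symmetric=True)    # Gram matrix of the basis vectors
    F = cp.Variable(N + 1)                     # F[i] = f(x_i) - f*
    fv = lambda i: 0.0 if i < 0 else F[i]

    cons = [G >> 0, x(0) @ G @ x(0) <= R**2]   # PSD + initial distance bound
    pts = range(-1, N + 1)
    for i in pts:
        for j in pts:
            if i == j:
                continue
            # Smooth convex interpolation condition:
            # f_i >= f_j + <g_j, x_i - x_j> + (1/(2L)) ||g_i - g_j||^2
            dx, dg = x(i) - x(j), g(i) - g(j)
            cons.append(fv(i) - fv(j) - g(j) @ G @ dx
                        - dg @ G @ dg / (2.0 * L) >= 0)

    prob = cp.Problem(cp.Maximize(F[N]), cons)
    prob.solve()
    return prob.value

N = 5
print(gradient_method_pep(N))                  # ~0.04545
print(1.0 / (4 * N + 2))                       # tight bound L R^2 / (4N + 2)
```

Note that the paper itself proceeds analytically: it relaxes the PEP to a tractable SDP and derives closed-form bounds from feasible dual solutions, rather than solving each instance numerically as above.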
Citations

Efficient first-order methods for convex minimization: a constructive approach
We describe a novel constructive technique for devising efficient first-order methods for a wide range of large-scale convex minimization settings, including smooth, non-smooth, and strongly convex…
Convex interpolation and performance estimation of first-order methods for convex optimization
This thesis formulates a generic optimization problem that looks for the worst-case scenarios of first-order methods in convex optimization, and transforms PEPs into solvable finite-dimensional semidefinite programs, from which one obtains worst-case guarantees and worst-case functions, along with the corresponding explicit proofs.
Exact Worst-Case Performance of First-Order Methods for Composite Convex Optimization
A new analytical worst-case guarantee is presented for the proximal point algorithm that is twice as strong as the previously known one, and the standard worst-case guarantee for the conditional gradient method is improved by more than a factor of two.
Smooth strongly convex interpolation and exact worst-case performance of first-order methods
We show that the exact worst-case performance of fixed-step first-order methods for unconstrained optimization of smooth (possibly strongly) convex functions can be obtained by solving convex…
An optimal gradient method for smooth (possibly strongly) convex minimization
We present an optimal gradient method for smooth (possibly strongly) convex optimization. The method is optimal in the sense that its worst-case bound exactly matches the lower bound on the oracle…
Performance estimation toolbox (PESTO): Automated worst-case analysis of first-order optimization methods
A Matlab toolbox that automatically computes tight worst-case performance guarantees for a broad class of first-order methods for convex optimization, which includes those performing explicit, projected, proximal, conditional and inexact (sub)gradient steps.
Analysis of Optimization Algorithms via Sum-of-Squares
A new framework for unifying and systematizing the performance analysis of first-order black-box optimization algorithms for unconstrained convex minimization is introduced; based on sum-of-squares optimization, it yields a hierarchy of semidefinite programs (SDPs) that give increasingly better convergence bounds at higher levels of the hierarchy.
Worst-case convergence analysis of gradient and Newton methods through semidefinite programming performance estimation
We provide new tools for worst-case performance analysis of the gradient (or steepest descent) method of Cauchy for smooth strongly convex functions, and Newton's method for self-concordant functions…
Exact Worst-Case Convergence Rates of the Proximal Gradient Method for Composite Convex Minimization
We study the worst-case convergence rates of the proximal gradient method for minimizing the sum of a smooth strongly convex function and a non-smooth convex function, whose proximal operator is…
Potential-based analyses of first-order methods for constrained and composite optimization
We propose potential-based analyses for first-order algorithms applied to constrained and composite minimization problems. We first propose "idealized" frameworks for algorithms in the strongly and…

References

Showing 1–10 of 27 references
Primal-dual first-order methods with O(1/ε) iteration-complexity for cone programming
In this paper we consider the general cone programming problem, and propose primal-dual convex (smooth and/or nonsmooth) minimization reformulations for it. We then discuss first-order methods suitable for solving these reformulations, including a variant of Nesterov's optimal method that outperformed the original in the authors' computational experiments.
Fine tuning Nesterov's steepest descent algorithm for differentiable convex programming
The first-order algorithm for convex programming described by Nesterov in his book is modified, and an adaptive procedure for estimating a strong convexity constant of the function is developed.
Lectures on modern convex optimization - analysis, algorithms, and engineering applications
The authors present the basic theory of state-of-the-art polynomial time interior point methods for linear, conic quadratic, and semidefinite programming as well as their numerous applications in engineering.
Introductory Lectures on Convex Optimization - A Basic Course
It was in the middle of the 1980s that the seminal paper by Karmarkar opened a new epoch in nonlinear optimization; it became more and more common for new methods to be accompanied by a complexity analysis, which was considered a better justification of their efficiency than computational experiments.
A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
A new fast iterative shrinkage-thresholding algorithm (FISTA) that preserves the computational simplicity of ISTA but has a global rate of convergence proven to be significantly better, both theoretically and practically.
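The FISTA update is easy to state concretely. Below is a minimal sketch, ours rather than the paper's, for the l1-regularized least-squares instance of a linear inverse problem; the function names are illustrative:

```python
# FISTA for min_x 0.5*||Ax - b||^2 + lam*||x||_1 (Beck & Teboulle, 2009).
# Illustrative sketch; names and defaults are ours.
import numpy as np

def soft_threshold(v, tau):
    """Proximal operator of tau * ||.||_1 (component-wise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def fista(A, b, lam, iters=500):
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of x -> A^T(Ax - b)
    x = np.zeros(A.shape[1])
    y, t = x.copy(), 1.0
    for _ in range(iters):
        # proximal gradient step at the extrapolated point y
        x_next = soft_threshold(y - A.T @ (A @ y - b) / L, lam / L)
        # momentum weight update: t_{k+1} = (1 + sqrt(1 + 4 t_k^2)) / 2
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_next + ((t - 1.0) / t_next) * (x_next - x)
        x, t = x_next, t_next
    return x
```

With the same per-iteration cost as ISTA, this scheme attains the O(1/k^2) rate in function values established in the paper.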
Some methods of speeding up the convergence of iteration methods
For the solution of the functional equation P(x) = 0, where P is an operator, usually linear, from B into B, and B is a Banach space, iteration methods are generally used. These consist…
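This is Polyak's classical paper, which introduced the heavy-ball (momentum) iteration. A minimal sketch, assuming a differentiable objective supplied as a gradient oracle (names and parameters are illustrative):

```python
# Polyak's heavy-ball method:
#   x_{k+1} = x_k - alpha * grad(x_k) + beta * (x_k - x_{k-1})
# Illustrative sketch; not code from the paper.
import numpy as np

def heavy_ball(grad, x0, alpha, beta, iters=1000):
    x_prev = x0.copy()
    x = x0.copy()
    for _ in range(iters):
        x_next = x - alpha * grad(x) + beta * (x - x_prev)
        x_prev, x = x, x_next
    return x

# Classical tuning for an L-smooth, mu-strongly convex quadratic:
#   alpha = 4 / (sqrt(L) + sqrt(mu))**2
#   beta  = ((sqrt(L) - sqrt(mu)) / (sqrt(L) + sqrt(mu)))**2
```

The Drori–Teboulle paper above analyzes a broader class of first-order black-box methods that, among others, includes the heavy-ball method.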
Quadratic Matrix Programming
  • A. Beck
  • Mathematics, Computer Science
  • SIAM J. Optim.
  • 2007
This work constructs a specially devised semidefinite relaxation (SDR) and its dual for the QMP problem, and shows that under some mild conditions strong duality holds for QMP problems with at most r constraints.
Improved Algorithms for Convex Minimization in Relative Scale
Two modifications to Nesterov's algorithms for minimizing convex functions in relative scale are proposed: the first is based on a bisection technique and leads to improved theoretical iteration complexity, and the second is a heuristic for avoiding restarting behavior.