Smooth strongly convex interpolation and exact worst-case performance of first-order methods

@article{Taylor2017SmoothSC,
  title={Smooth strongly convex interpolation and exact worst-case performance of first-order methods},
  author={Adrien B. Taylor and Julien M. Hendrickx and François Glineur},
  journal={Mathematical Programming},
  year={2017},
  volume={161},
  pages={307-345}
}
We show that the exact worst-case performance of fixed-step first-order methods for unconstrained optimization of smooth (possibly strongly) convex functions can be obtained by solving convex programs. Finding the worst-case performance of a black-box first-order method is formulated as an optimization problem over a set of smooth (strongly) convex functions and initial conditions. We develop closed-form necessary and sufficient conditions for smooth (strongly) convex interpolation, which…
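To illustrate the abstract's central claim, that the exact worst case of a fixed-step first-order method can be computed by a convex program, the sketch below poses a small performance estimation problem as a semidefinite program. It is an illustration written for this page, not the authors' code: it assumes CVXPY is available, uses the smooth convex (μ = 0) case of the interpolation conditions, and measures the decrease f(x1) − f(x*) after a single gradient step with step size 1/L.

```python
# Minimal sketch (assumptions noted above, not the authors' code) of a
# performance estimation problem (PEP) as a semidefinite program:
# worst case of f(x1) - f(x*) after ONE gradient step x1 = x0 - (1/L) f'(x0)
# on an L-smooth convex function, subject to ||x0 - x*||^2 <= R^2.
import numpy as np
import cvxpy as cp

L, R = 1.0, 1.0  # smoothness constant and initial distance (assumed normalization)

# Represent x0, g0 = f'(x0), g1 = f'(x1) by coordinate vectors; without loss of
# generality x* = 0, f'(x*) = 0 and f(x*) = 0. Their pairwise inner products are
# encoded in a Gram matrix G, which is the SDP variable.
x0 = np.array([1.0, 0.0, 0.0])
g0 = np.array([0.0, 1.0, 0.0])
g1 = np.array([0.0, 0.0, 1.0])
x1 = x0 - (1.0 / L) * g0          # one fixed-step gradient step
xs = np.zeros(3)                  # the minimizer x*
gs = np.zeros(3)                  # gradient at x*

G = cp.Variable((3, 3), PSD=True)        # Gram matrix of {x0, g0, g1}
f0, f1 = cp.Variable(), cp.Variable()    # function values at x0, x1
fs = 0.0                                 # f(x*) = 0

def ip(u, v):
    """Inner product <u, v>, written as a linear function of the Gram matrix G."""
    return cp.sum(cp.multiply(np.outer(u, v), G))

# Smooth convex interpolation conditions (mu = 0 case):
#   f_i >= f_j + <g_j, x_i - x_j> + (1/(2L)) ||g_i - g_j||^2  for all pairs (i, j).
points = [(x0, g0, f0), (x1, g1, f1), (xs, gs, fs)]
constraints = [ip(x0 - xs, x0 - xs) <= R**2]   # initial condition
for i, (xi, gi, fi) in enumerate(points):
    for j, (xj, gj, fj) in enumerate(points):
        if i != j:
            constraints.append(
                fi >= fj + ip(gj, xi - xj) + ip(gi - gj, gi - gj) / (2 * L)
            )

# Maximizing over all admissible functions and iterates gives the worst case.
prob = cp.Problem(cp.Maximize(f1 - fs), constraints)
prob.solve(solver=cp.SCS)
print(prob.value)
```

For this simple instance the optimal value should come out close to L·R²/6, the tight one-step bound of Drori and Teboulle for step size 1/L; larger numbers of steps and strongly convex settings only enlarge the Gram matrix and the set of interpolation constraints.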
Citations

Convex interpolation and performance estimation of first-order methods for convex optimization
This thesis formulates a generic optimization problem that looks for the worst-case scenarios of first-order methods in convex optimization, and transforms PEPs into solvable finite-dimensional semidefinite programs, from which one obtains worst-case guarantees and worst-case functions, along with the corresponding explicit proofs.
Exact Worst-Case Performance of First-Order Methods for Composite Convex Optimization
A new analytical worst-case guarantee is presented for the proximal point algorithm that improves on the previously known guarantee by a factor of two, and the standard worst-case guarantee for the conditional gradient method is improved by more than a factor of two.
Exact Worst-Case Convergence Rates of the Proximal Gradient Method for Composite Convex Minimization
We study the worst-case convergence rates of the proximal gradient method for minimizing the sum of a smooth strongly convex function and a non-smooth convex function, whose proximal operator is…
An optimal gradient method for smooth (possibly strongly) convex minimization
We present an optimal gradient method for smooth (possibly strongly) convex optimization. The method is optimal in the sense that its worst-case bound exactly matches the lower bound on the oracle…
On the worst-case complexity of the gradient method with exact line search for smooth strongly convex functions
The tight worst-case complexity bound is given for a noisy variant of the gradient descent method, in which exact line search is performed along a search direction that differs from the negative gradient by at most a prescribed relative tolerance.
Efficient first-order methods for convex minimization: a constructive approach
We describe a novel constructive technique for devising efficient first-order methods for a wide range of large-scale convex minimization settings, including smooth, non-smooth, and strongly convex…
Performance estimation toolbox (PESTO): Automated worst-case analysis of first-order optimization methods
A Matlab toolbox that automatically computes tight worst-case performance guarantees for a broad class of first-order methods for convex optimization, which includes those performing explicit, projected, proximal, conditional and inexact (sub)gradient steps.
Analysis of Optimization Algorithms via Integral Quadratic Constraints: Nonstrongly Convex Problems
A unified framework that certifies both exponential and subexponential convergence rates for a wide range of iterative first-order optimization algorithms, constructing a family of parameter-dependent nonquadratic Lyapunov functions that yield convergence rates in addition to proving asymptotic convergence.
Practical Schemes for Finding Near-Stationary Points of Convex Finite-Sums
This work conducts a systematic study of algorithmic techniques for finding near-stationary points of convex finite-sums and proposes an adaptively regularized accelerated SVRG variant, which does not require knowledge of certain unknown initial constants and achieves near-optimal complexities.
The Speed-Robustness Trade-Off for First-Order Methods with Additive Gradient Noise
We study the trade-off between convergence rate and sensitivity to stochastic additive gradient noise for first-order optimization methods. Ordinary Gradient Descent (GD) can be made…

References

Showing 1–10 of 28 references
Performance of first-order methods for smooth convex minimization: a novel approach
A novel approach for analyzing the worst-case performance of first-order black-box optimization methods, which focuses on smooth unconstrained convex minimization over the Euclidean space and derives a new and tight analytical bound on its performance.
Double Smoothing Technique for Large-Scale Linearly Constrained Convex Optimization
This paper dualizes the linear constraints, solves the resulting dual problem with a purely dual gradient-type method and shows how to reconstruct an approximate primal solution.
Convex Optimization Theory
An insightful, concise, and rigorous treatment of the basic theory of convex sets and functions in finite dimensions, and of the dual problem…
On the Convergence Analysis of the Optimized Gradient Method
This paper provides an analytic convergence bound for the primary sequence generated by the optimized gradient method, including the interesting fact that the optimized gradient method has two types of worst-case functions: a piecewise affine-quadratic function and a quadratic function.
Optimized first-order methods for smooth convex minimization
We introduce new optimized first-order methods for smooth unconstrained convex minimization. Drori and Teboulle (Math Program 145(1–2):451–482, 2014. doi:10.1007/s10107-013-0653-0) recently described…
Analysis and Design of Optimization Algorithms via Integral Quadratic Constraints
A new framework to analyze and design iterative optimization algorithms built on the notion of Integral Quadratic Constraints (IQC) from robust control theory is developed, proving new inequalities about convex functions and providing a version of IQC theory adapted for use by optimization researchers.
Lectures on modern convex optimization - analysis, algorithms, and engineering applications
The authors present the basic theory of state-of-the-art polynomial time interior point methods for linear, conic quadratic, and semidefinite programming as well as their numerous applications in engineering.
Semidefinite Programming
A survey of the theory and applications of semidefinite programs and an introduction to primal-dual interior-point methods for their solution are given.
Finite Convex Integration
Given a solution f of (Int), any function of the form f + K, where K is a constant, is a solution as well. Thus, we consider an additional initial-type condition: fix a couple (x_0, x_0^*) in the family…
A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
A new fast iterative shrinkage-thresholding algorithm (FISTA) which preserves the computational simplicity of ISTA but with a global rate of convergence which is proven to be significantly better, both theoretically and practically.