Optimized first-order methods for smooth convex minimization

@article{Kim2016OptimizedFM,
  title={Optimized first-order methods for smooth convex minimization},
  author={Donghwan Kim and Jeffrey A. Fessler},
  journal={Mathematical Programming},
  year={2016},
  volume={159},
  pages={81-107}
}
We introduce new optimized first-order methods for smooth unconstrained convex minimization. Drori and Teboulle (Math Program 145(1–2):451–482, 2014. doi:10.1007/s10107-013-0653-0) recently described a numerical method for computing the N-iteration optimal step coefficients in a class of first-order algorithms that includes gradient methods, heavy-ball methods (Polyak in USSR Comput Math Math Phys 4(5):1–17, 1964. doi:10.1016/0041-5553(64)90137-5), and Nesterov’s fast gradient methods (Nesterov …
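The abstract is cut off above, but the optimized gradient method (OGM) that this paper proposes is widely reproduced, so a sketch in its commonly cited form may help. The sketch below assumes an L-smooth convex objective accessed through a gradient oracle grad_f; the function name and signature are illustrative, and the exact coefficients should be checked against the paper itself.

```python
import numpy as np

def ogm(grad_f, x0, L, N):
    """Optimized gradient method (OGM), in its commonly cited form.

    grad_f : gradient oracle of an L-smooth convex function
    x0     : starting point; L : smoothness constant; N : iteration count
    """
    x = y = np.asarray(x0, dtype=float)
    theta = 1.0
    for i in range(N):
        y_new = x - grad_f(x) / L          # plain gradient step
        # theta recursion: the final iteration uses sqrt(1 + 8*theta^2)
        # instead of sqrt(1 + 4*theta^2); this last-step tweak is OGM-specific.
        factor = 8.0 if i == N - 1 else 4.0
        theta_new = (1 + np.sqrt(1 + factor * theta**2)) / 2
        # Two momentum terms: the first is Nesterov-style, the second
        # (theta/theta_new)*(y_new - x) is the extra OGM correction.
        x = (y_new
             + (theta - 1) / theta_new * (y_new - y)
             + theta / theta_new * (y_new - x))
        y, theta = y_new, theta_new
    return x
```

The paper's headline result is that this iteration's worst-case bound on f(x_N) - f* is roughly half that of Nesterov's fast gradient method; the citing entry below on "An optimal gradient method for smooth (possibly strongly) convex minimization" notes that this worst-case bound exactly matches the oracle lower bound.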

Citations

Efficient first-order methods for convex minimization: a constructive approach
We describe a novel constructive technique for devising efficient first-order methods for a wide range of large-scale convex minimization settings, including smooth, non-smooth, and strongly convex …
An optimal gradient method for smooth (possibly strongly) convex minimization
We present an optimal gradient method for smooth (possibly strongly) convex optimization. The method is optimal in the sense that its worst-case bound exactly matches the lower bound on the oracle …
Smooth strongly convex interpolation and exact worst-case performance of first-order methods
We show that the exact worst-case performance of fixed-step first-order methods for unconstrained optimization of smooth (possibly strongly) convex functions can be obtained by solving convex …
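For reference, the convex reformulation in that paper rests on an interpolation condition; as it is usually cited (stated here from memory, for the L-smooth convex, non-strongly-convex case), a set of triples (x_i, g_i, f_i) can be interpolated by some L-smooth convex f with f(x_i) = f_i and ∇f(x_i) = g_i exactly when, for every pair i, j,

    f_i >= f_j + <g_j, x_i - x_j> + (1/(2L)) ||g_i - g_j||^2.

The strongly convex variant adds a μ-dependent term to the right-hand side.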
An optimized first-order method for image restoration
  • D. Kim, J. Fessler
  • 2015 IEEE International Conference on Image Processing (ICIP), 2015
The convergence analysis of OGM is discussed, its fast convergence on an image restoration problem using a smoothed total variation (TV) regularizer is explored, and its extension to nonsmooth convex minimization for image restoration with l1-sparsity regularization is investigated.
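The ICIP paper's exact smoothing is not reproduced in this summary; as one hedged illustration of the idea, the 1D total-variation penalty sum_i |x[i+1] - x[i]| can be replaced with a Huber-smoothed version so that a gradient method such as OGM applies. The helper name smoothed_tv_grad and the Huber choice below are assumptions for illustration, not the paper's construction.

```python
import numpy as np

def smoothed_tv_grad(x, mu):
    """Gradient of a Huber-smoothed 1D total-variation penalty
    TV_mu(x) = sum_i h_mu(x[i+1] - x[i]), where h_mu is the Huber
    function with derivative h_mu'(t) = t / max(mu, |t|).  Illustrative only."""
    d = np.diff(x)                        # finite differences x[i+1] - x[i]
    s = d / np.maximum(mu, np.abs(d))     # Huber derivative, clipped to [-1, 1]
    g = np.zeros_like(x)
    g[:-1] -= s                           # chain rule through the difference
    g[1:] += s                            # operator: -1 on x[i], +1 on x[i+1]
    return g
```

Since the Huber function is (1/mu)-smooth, a data-fit term plus this smoothed TV penalty remains a smooth convex objective, so OGM-type methods such as the sketch near the top of the page apply directly.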
Analysis of Optimization Algorithms via Sum-of-Squares
A new framework for unifying and systematizing the performance analysis of first-order black-box optimization algorithms for unconstrained convex minimization is introduced, based on sum-of-squares optimization, which yields a hierarchy of semidefinite programs (SDPs) that give increasingly better convergence bounds at higher levels of the hierarchy.
Part 2. Gradient and Subgradient Methods for Unconstrained Convex Optimization
This note studies (sub)gradient methods for unconstrained convex optimization. Many parts of this note are based on the chapters [1, Chapter 4], [2, Chapters 3, 5, 8, 10], [5, Chapter 9], [14, Chapters 2, 3] …
Convex interpolation and performance estimation of first-order methods for convex optimization
This thesis formulates a generic optimization problem that looks for the worst-case scenarios of first-order methods in convex optimization, and transforms these performance estimation problems (PEPs) into solvable finite-dimensional semidefinite programs, from which one obtains worst-case guarantees and worst-case functions, along with the corresponding explicit proofs.
Adaptive Restart of the Optimized Gradient Method for Convex Optimization
A new first-order method is derived that resembles the optimized gradient method for strongly convex quadratic problems with known function parameters, yielding a linear convergence rate that is faster than that of the analogous version of the fast gradient method.
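The restart condition used in that paper is not spelled out in this summary; the sketch below shows the generic function-value restart heuristic (drop the momentum whenever the objective increases), in the style of O'Donoghue and Candès, wrapped around a Nesterov-type iteration. It is an illustration, not the paper's exact OGM-tailored rule.

```python
import numpy as np

def fgm_with_restart(f, grad_f, x0, L, N):
    """Nesterov-type fast gradient iteration with a function-value restart:
    whenever the objective increases, discard the momentum and retry.
    Generic heuristic; the cited paper derives a scheme specific to OGM."""
    x = y = np.asarray(x0, dtype=float)
    theta = 1.0
    for _ in range(N):
        y_new = x - grad_f(x) / L                 # gradient step
        if f(y_new) > f(y):                       # objective went up
            theta, x = 1.0, y                     # restart: drop momentum
            continue
        theta_new = (1 + np.sqrt(1 + 4 * theta**2)) / 2
        x = y_new + (theta - 1) / theta_new * (y_new - y)  # momentum step
        y, theta = y_new, theta_new
    return y
```

After a restart the next gradient step is taken from the last accepted point, so with step size 1/L the objective cannot stall indefinitely.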
A Unifying Framework of Accelerated First-Order Approach to Strongly Monotone Variational Inequalities
In this paper, we propose a unifying framework incorporating several momentum-related search directions for solving strongly monotone variational inequalities. The specific combinations of the search …
On the Convergence Analysis of the Optimized Gradient Method
This paper provides an analytic convergence bound for the primary sequence generated by the optimized gradient method, including the interesting fact that the method has two types of worst-case functions: a piecewise affine-quadratic function and a quadratic function.

References

Showing 1–10 of 30 references.
Performance of first-order methods for smooth convex minimization: a novel approach
A novel approach for analyzing the worst-case performance of first-order black-box optimization methods, which focuses on smooth unconstrained convex minimization over the Euclidean space and derives a new and tight analytical bound on its performance.
Smooth strongly convex interpolation and exact worst-case performance of first-order methods
We show that the exact worst-case performance of fixed-step first-order methods for unconstrained optimization of smooth (possibly strongly) convex functions can be obtained by solving convex …
An optimized first-order method for image restoration
  • D. Kim, J. Fessler
  • 2015 IEEE International Conference on Image Processing (ICIP), 2015
The convergence analysis of OGM is discussed, its fast convergence on an image restoration problem using a smoothed total variation (TV) regularizer is explored, and its extension to nonsmooth convex minimization for image restoration with l1-sparsity regularization is investigated.
Gradient methods for minimizing composite objective function
In this paper we analyze several new methods for solving optimization problems with the objective function formed as a sum of two convex terms: one is smooth and given by a black-box oracle, and …
Optimized first-order methods for smooth convex minimization
We introduce new optimized first-order methods for smooth unconstrained convex minimization. Drori and Teboulle (Math Program 145(1–2):451–482, 2014. doi:10.1007/s10107-013-0653-0) recently described …
Gradient methods for minimizing composite functions
  • Y. Nesterov
  • Math. Program., 2013
In this paper we analyze several new methods for solving optimization problems with the objective function formed as a sum of two terms: one is smooth and given by a black-box oracle, and another is …
Some methods of speeding up the convergence of iteration methods
For the solution of the functional equation P(x) = 0, where P is an operator, usually linear, from B into B, and B is a Banach space, iteration methods are generally used. These consist …
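Polyak's heavy-ball recursion referenced in the main abstract above is short enough to state; this is a minimal sketch with the step size alpha and momentum beta left to the caller (for strongly convex quadratics there are classical optimal choices in terms of the extreme eigenvalues).

```python
import numpy as np

def heavy_ball(grad_f, x0, alpha, beta, N):
    """Polyak's heavy-ball method:
        x_{k+1} = x_k - alpha * grad_f(x_k) + beta * (x_k - x_{k-1})
    alpha (step size) and beta (momentum) must be tuned to the problem."""
    x_prev = x = np.asarray(x0, dtype=float)
    for _ in range(N):
        # simultaneous update: new iterate from (x, x_prev), then shift
        x, x_prev = x - alpha * grad_f(x) + beta * (x - x_prev), x
    return x
```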
A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
A new fast iterative shrinkage-thresholding algorithm (FISTA) which preserves the computational simplicity of ISTA but with a global rate of convergence which is proven to be significantly better, both theoretically and practically.
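Since FISTA is standard, a minimal sketch for the l1-regularized least-squares problem min_x 0.5*||Ax - b||^2 + lam*||x||_1 may be useful; the Beck-Teboulle iteration alternates a proximal (soft-thresholding) gradient step with a momentum extrapolation.

```python
import numpy as np

def soft_threshold(v, tau):
    """Proximal operator of tau * ||.||_1 (componentwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def fista(A, b, lam, N):
    """FISTA for min_x 0.5*||A x - b||^2 + lam*||x||_1 (standard form)."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of A^T(Ax - b)
    x = y = np.zeros(A.shape[1])
    t = 1.0
    for _ in range(N):
        # proximal gradient step taken from the extrapolated point y
        x_new = soft_threshold(y - A.T @ (A @ y - b) / L, lam / L)
        t_new = (1 + np.sqrt(1 + 4 * t**2)) / 2
        y = x_new + (t - 1) / t_new * (x_new - x)   # momentum extrapolation
        x, t = x_new, t_new
    return x
```

The momentum sequence t follows the same recursion that appears in the OGM sketch near the top of the page; OGM's modification is the extra momentum term and the adjusted final-step coefficient.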
Introductory Lectures on Convex Optimization - A Basic Course
In the middle of the 1980s, the seminal paper by Karmarkar opened a new epoch in nonlinear optimization; it then became more and more common for new methods to be provided with a complexity analysis, which was considered a better justification of their efficiency than computational experiments.
Lectures on modern convex optimization - analysis, algorithms, and engineering applications
The authors present the basic theory of state-of-the-art polynomial-time interior point methods for linear, conic quadratic, and semidefinite programming, as well as their numerous applications in engineering.