On the Convergence Analysis of the Optimized Gradient Method

@article{Kim2017OnTC,
  title={On the Convergence Analysis of the Optimized Gradient Method},
  author={Donghwan Kim and Jeffrey A. Fessler},
  journal={Journal of Optimization Theory and Applications},
  year={2017},
  volume={172},
  pages={187-205}
}
  • D. Kim, J. Fessler
  • Published 29 October 2015
  • Mathematics, Medicine, Computer Science
  • Journal of Optimization Theory and Applications
This paper considers the problem of unconstrained minimization of smooth convex functions having Lipschitz continuous gradients with known Lipschitz constant. We recently proposed the optimized gradient method for this problem and showed that it has a worst-case convergence bound for the cost function decrease that is twice as small as that of Nesterov’s fast gradient method, yet has a similarly efficient practical implementation. Drori showed recently that the optimized gradient method has…
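For concreteness, here is a minimal NumPy sketch of the OGM iteration as given in Kim and Fessler's earlier paper on optimized first-order methods; the quadratic test problem and all variable names are illustrative assumptions, not the authors' reference code. The worst-case bound for OGM, L*||x0 - x*||^2 / (2*theta_N^2), is roughly half of the classical 2L*||x0 - x*||^2 / (N+1)^2 bound for the fast gradient method, which is the factor-of-two improvement referred to above.

```python
import numpy as np

def ogm(grad, x0, L, N):
    """Sketch of the optimized gradient method (OGM) for an L-smooth convex f.

    grad: callable returning the gradient of f; x0: starting point;
    L: Lipschitz constant of grad; N: number of iterations.
    Per the OGM papers, f(x_N) - f(x*) <= L*||x0 - x*||^2 / (2*theta_N^2),
    roughly half of FGM's 2L*||x0 - x*||^2 / (N+1)^2.
    """
    x = y = x0.copy()
    theta = 1.0
    for k in range(N):
        y_new = x - grad(x) / L                        # plain gradient step
        # momentum parameter; the final iteration uses a larger factor
        a = 8.0 if k == N - 1 else 4.0
        theta_new = (1 + np.sqrt(1 + a * theta**2)) / 2
        # OGM = Nesterov-style momentum plus an extra gradient-correction term
        x = (y_new
             + (theta - 1) / theta_new * (y_new - y)   # momentum term
             + theta / theta_new * (y_new - x))        # OGM's extra term
        y, theta = y_new, theta_new
    return x

# illustrative use on a convex quadratic f(x) = 0.5 * x^T A x (minimizer x* = 0)
rng = np.random.default_rng(0)
M = rng.standard_normal((30, 20))
A = M.T @ M
x_N = ogm(lambda x: A @ x, rng.standard_normal(20),
          L=np.linalg.eigvalsh(A).max(), N=100)
print(np.linalg.norm(x_N))   # distance to the minimizer
```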
Citations

Adaptive Restart of the Optimized Gradient Method for Convex Optimization
TLDR: A new first-order method is derived that resembles the optimized gradient method for strongly convex quadratic problems with known function parameters, yielding a linear convergence rate that is faster than that of the analogous version of the fast gradient method.
First-order methods with momentum, such as Nesterov’s fast gradient method, are very useful for convex optimization problems, but can exhibit undesirable oscillations yielding slow convergence rates…
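The oscillation issue described above is often illustrated with the generic function-value restart heuristic of O'Donoghue and Candès: reset the momentum whenever the objective increases. The sketch below applies that heuristic to a Nesterov-style fast gradient loop; it is an assumed, simplified illustration of the restart idea, not the OGM-specific restart schemes derived in the cited paper.

```python
import numpy as np

def fgm_with_restart(f, grad, x0, L, n_iter):
    """Nesterov-style fast gradient method with function-value adaptive restart.

    Whenever the objective increases, the momentum is reset (t -> 1), which
    suppresses the oscillations that plain momentum can exhibit. This is the
    generic restart heuristic, not the rule from the cited paper.
    """
    x = x0.copy()
    y, t = x.copy(), 1.0
    f_prev = f(x)
    for _ in range(n_iter):
        x_new = y - grad(y) / L                      # gradient step at y
        t_new = (1 + np.sqrt(1 + 4 * t**2)) / 2      # momentum parameter
        y = x_new + (t - 1) / t_new * (x_new - x)    # momentum extrapolation
        f_new = f(x_new)
        if f_new > f_prev:                           # objective went up:
            t_new, y = 1.0, x_new.copy()             # restart the momentum
        x, t, f_prev = x_new, t_new, f_new
    return x

# ill-conditioned quadratic where plain momentum tends to oscillate
d = np.logspace(0, 3, 50)                            # eigenvalues 1 .. 1000
f = lambda x: 0.5 * x @ (d * x)
grad = lambda x: d * x
x = fgm_with_restart(f, grad, np.ones(50), L=d.max(), n_iter=300)
print(f(x))
```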
Optimizing the Efficiency of First-Order Methods for Decreasing the Gradient of Smooth Convex Functions
TLDR: This paper optimizes the step coefficients of first-order methods for smooth convex minimization in terms of the worst-case convergence bound of the decrease in the gradient norm, and illustrates that the proposed method has a computationally efficient form that is similar to the optimized gradient method.
Efficient first-order methods for convex minimization: a constructive approach
We describe a novel constructive technique for devising efficient first-order methods for a wide range of large-scale convex minimization settings, including smooth, non-smooth, and strongly convex…
Exact Worst-Case Performance of First-Order Methods for Composite Convex Optimization
TLDR: A new analytical worst-case guarantee is presented for the proximal point algorithm that is twice as good as the previously known one, and the standard worst-case guarantee for the conditional gradient method is improved by more than a factor of two.
Convex interpolation and performance estimation of first-order methods for convex optimization
TLDR: This thesis formulates a generic optimization problem that looks for the worst-case scenarios of first-order methods in convex optimization, and transforms PEPs into solvable finite-dimensional semidefinite programs, from which one obtains worst-case guarantees and worst-case functions, along with the corresponding explicit proofs.
On the Optimal Ergodic Sublinear Convergence Rate of the Relaxed Proximal Point Algorithm for Variational Inequalities.
This paper investigates the optimal ergodic sublinear convergence rate of the relaxed proximal point algorithm for solving monotone variational inequality problems. The exact worst-case convergence…
Smooth strongly convex interpolation and exact worst-case performance of first-order methods
We show that the exact worst-case performance of fixed-step first-order methods for unconstrained optimization of smooth (possibly strongly) convex functions can be obtained by solving convex…
An optimized first-order method for image restoration
  • D. Kim, J. Fessler
  • Mathematics, Computer Science
  • 2015 IEEE International Conference on Image Processing (ICIP)
  • 2015
TLDR: The convergence analysis of OGM is discussed, its fast convergence on an image restoration problem with a smoothed total variation (TV) regularizer is explored, and its extension to nonsmooth convex minimization for image restoration with l1-sparsity regularization is investigated.
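For context on the smoothed-TV setting mentioned in the summary: replacing TV with a Huber-smoothed version makes the restoration cost smooth, so fixed-step methods such as OGM or FGM apply directly. The sketch below (assuming a 1-D signal, a hypothetical Huber parameter `delta`, and regularization weight `lam`) shows one standard way to form the smooth gradient and a usable Lipschitz constant; it is not code from the paper, and the l1-sparsity extension would instead require a proximal variant.

```python
import numpy as np

def smoothed_tv_grad(x, delta):
    """Gradient of a Huber-smoothed 1-D total-variation term
    R(x) = sum_i huber(x[i+1] - x[i]), where huber'(t) = t/delta for
    |t| <= delta and sign(t) otherwise. The gradient is D^T psi(D x) for the
    finite-difference operator D, and is Lipschitz with constant <= 4/delta."""
    d = np.diff(x)                                   # D x
    psi = np.clip(d / delta, -1.0, 1.0)              # huber'(D x)
    g = np.zeros_like(x)
    g[:-1] -= psi                                    # accumulate D^T psi
    g[1:] += psi
    return g

def restoration_grad(x, A, b, lam, delta):
    """Gradient of the smooth cost 0.5*||A x - b||^2 + lam * smoothed TV(x)."""
    return A.T @ (A @ x - b) + lam * smoothed_tv_grad(x, delta)

def restoration_lipschitz(A, lam, delta):
    """A valid Lipschitz constant for the gradient: ||A||^2 + lam * 4 / delta."""
    return np.linalg.norm(A, 2) ** 2 + lam * 4.0 / delta
```

These two callables can be plugged into a fixed-step loop such as the OGM sketch given after the abstract above.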
A frequency-domain analysis of inexact gradient methods
TLDR: Improved analytic bounds for the convergence rate of Nesterov’s accelerated method on strongly convex functions are derived, based on frequency-domain criteria for the stability of nonlinear systems.

References

Showing 1-10 of 18 references.
Exact Worst-Case Performance of First-Order Methods for Composite Convex Optimization
TLDR: A new analytical worst-case guarantee is presented for the proximal point algorithm that is twice as good as the previously known one, and the standard worst-case guarantee for the conditional gradient method is improved by more than a factor of two.
Performance of first-order methods for smooth convex minimization: a novel approach
TLDR: A novel approach is presented for analyzing the worst-case performance of first-order black-box optimization methods, focusing on smooth unconstrained convex minimization over the Euclidean space, and a new and tight analytical bound on performance is derived.
Gradient methods for minimizing composite functions
  • Y. Nesterov
  • Mathematics, Computer Science
  • Math. Program.
  • 2013
In this paper we analyze several new methods for solving optimization problems with the objective function formed as a sum of two terms: one is smooth and given by a black-box oracle, and another is…
Smooth strongly convex interpolation and exact worst-case performance of first-order methods
We show that the exact worst-case performance of fixed-step first-order methods for unconstrained optimization of smooth (possibly strongly) convex functions can be obtained by solving convex…
Analysis and Design of Optimization Algorithms via Integral Quadratic Constraints
TLDR: A new framework to analyze and design iterative optimization algorithms, built on the notion of Integral Quadratic Constraints (IQC) from robust control theory, is developed, proving new inequalities about convex functions and providing a version of IQC theory adapted for use by optimization researchers.
Optimized first-order methods for smooth convex minimization
We introduce new optimized first-order methods for smooth unconstrained convex minimization. Drori and Teboulle (Math Program 145(1–2):451–482, 2014. doi:10.1007/s10107-013-0653-0) recently described…
Introductory Lectures on Convex Optimization - A Basic Course
TLDR: In the middle of the 1980s, the seminal paper by Karmarkar opened a new epoch in nonlinear optimization, and it became more and more common for new methods to be provided with a complexity analysis, which was considered a better justification of their efficiency than computational experiments.
Some methods of speeding up the convergence of iteration methods
For the solution of the functional equation P(x) = 0, where P is an operator (usually linear) from B into B and B is a Banach space, iteration methods are generally used. These consist…
A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
TLDR: A new fast iterative shrinkage-thresholding algorithm (FISTA) is presented which preserves the computational simplicity of ISTA but has a global rate of convergence that is proven to be significantly better, both theoretically and practically.
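As a reference point for the ISTA-versus-FISTA comparison, here is a minimal sketch of the classical FISTA update for l1-regularized least squares; the problem instance, names, and parameters are illustrative assumptions rather than code from the paper.

```python
import numpy as np

def fista_lasso(A, b, lam, n_iter):
    """FISTA sketch for min_x 0.5*||A x - b||^2 + lam*||x||_1.

    Same per-iteration cost as ISTA (one gradient step plus soft-thresholding),
    but with the momentum extrapolation that yields the O(1/k^2) rate."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the smooth part
    x = x_prev = np.zeros(A.shape[1])
    y, t = x.copy(), 1.0
    for _ in range(n_iter):
        g = A.T @ (A @ y - b)              # gradient of the smooth term at y
        z = y - g / L
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)   # soft-threshold
        t_new = (1 + np.sqrt(1 + 4 * t**2)) / 2
        y = x + (t - 1) / t_new * (x - x_prev)                  # momentum step
        x_prev, t = x, t_new
    return x

# illustrative sparse-recovery instance
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100))
b = A @ (rng.standard_normal(100) * (rng.random(100) < 0.1))
print(np.count_nonzero(fista_lasso(A, b, lam=0.1, n_iter=200)))
```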
The exact information-based complexity of smooth convex minimization
  • Y. Drori
  • Mathematics, Computer Science
  • J. Complex.
  • 2017
We obtain a new lower bound on the information-based complexity of first-order minimization of smooth and convex functions. We show that the bound matches the worst-case performance of the recently…