On the Convergence of Block Coordinate Descent Type Methods

@article{Beck2013OnTC,
  title={On the Convergence of Block Coordinate Descent Type Methods},
  author={Amir Beck and Luba Tetruashvili},
  journal={SIAM J. Optim.},
  year={2013},
  volume={23},
  pages={2037-2060}
}
In this paper we study smooth convex programming problems where the decision variables vector is split into several blocks of variables. We analyze the block coordinate gradient projection method, in which each iteration consists of performing a gradient projection step with respect to a certain block taken in a cyclic order. A global sublinear rate of convergence of this method is established, and it is shown that it can be accelerated when the problem is unconstrained. In the unconstrained…
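For a concrete picture of the method described in the abstract, below is a minimal NumPy sketch of a cyclic block coordinate gradient projection iteration. The function names (`block_coordinate_gradient_projection`, `project`), the box-constrained least-squares example, and the constant per-block stepsize 1/L_i are illustrative assumptions, not the paper's notation or exact algorithm.

```python
import numpy as np

def block_coordinate_gradient_projection(grad, project, blocks, L, x0, n_cycles=100):
    """Cyclic block coordinate gradient projection (illustrative sketch).

    grad(x)       -- full gradient of the smooth convex objective at x
    project(i, v) -- Euclidean projection of candidate block value v onto the
                     i-th block's feasible set (identity when unconstrained)
    blocks        -- list of index arrays partitioning the variables
    L             -- per-block Lipschitz constants of the block gradients
    """
    x = x0.copy()
    for _ in range(n_cycles):
        for i, idx in enumerate(blocks):                 # blocks taken in a fixed cyclic order
            g = grad(x)[idx]                             # partial gradient w.r.t. the i-th block
            x[idx] = project(i, x[idx] - g / L[i])       # gradient projection step, stepsize 1/L_i
    return x

# Example (assumed for illustration): minimize ||A x - b||^2 over the box [0, 1]^n with two blocks
rng = np.random.default_rng(0)
A, b = rng.standard_normal((30, 10)), rng.standard_normal(30)
blocks = [np.arange(0, 5), np.arange(5, 10)]
L = [2 * np.linalg.norm(A[:, idx], 2) ** 2 for idx in blocks]
grad = lambda x: 2 * A.T @ (A @ x - b)
project = lambda i, v: np.clip(v, 0.0, 1.0)
x_box = block_coordinate_gradient_projection(grad, project, blocks, L, np.zeros(10))
```

When `project` is the identity, the scheme reduces to cyclic block gradient descent, the unconstrained case in which the abstract notes the method can be accelerated.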


The Cyclic Block Conditional Gradient Method for Convex Optimization Problems
TLDR
Numerical comparisons of the proposed method to both the classical conditional gradient algorithm and its random block version demonstrate the effectiveness of the cyclic block update rule.
On the Convergence of a Regularized Jacobi Algorithm for Convex Optimization
TLDR
The convergence analysis of the regularized Jacobi algorithm is revisited, and it is shown that the algorithm also converges in the iterates under very mild conditions on the objective function and achieves a linear convergence rate.
Parallel coordinate descent methods for composite minimization: convergence analysis and error bounds
TLDR
It is shown that the theoretical estimates of the convergence rate depend on the number of blocks chosen randomly and on a natural measure of separability of the objective function, and it is proved that the new class of generalized error bound functions encompasses both global/local error bound functions and smooth strongly convex functions.
On the convergence of a Block-Coordinate Incremental Gradient method
TLDR
It is proved that the block-coordinate incremental gradient method can be seen as a gradient method with errors, and convergence follows by showing that the error at each iteration satisfies some standard conditions.
Iteration complexity analysis of block coordinate descent methods
TLDR
This paper unifies these algorithms under the so-called block successive upper-bound minimization (BSUM) framework and shows that, for a broad class of multi-block nonsmooth convex problems, all of them achieve a global sublinear iteration complexity of O(1/r), where r is the iteration index.
The 2-Coordinate Descent Method for Solving Double-Sided Simplex Constrained Minimization Problems
  • A. Beck
  • Mathematics
    J. Optim. Theory Appl.
  • 2014
This paper considers the problem of minimizing a continuously differentiable function with a Lipschitz continuous gradient subject to a single linear equality constraint and additional bound constraints on the decision variables.
The Analysis of Alternating Minimization Method for Double Sparsity Constrained Optimization Problem
TLDR
This work analyzes the alternating minimization (AM) method for solving the double sparsity constrained minimization problem, in which the decision variable vector is split into two blocks, and establishes a non-asymptotic sublinear rate of convergence under the assumption of convexity.
Parallel Random Coordinate Descent Method for Composite Minimization: Convergence Analysis and Error Bounds
TLDR
A parallel version of a randomized (block) coordinate descent method for minimizing the sum of a partially separable smooth convex function and a fully separable nonsmooth convex function is shown to have a sublinear convergence rate.
A block coordinate descent method of multipliers: Convergence analysis and applications
TLDR
This work proposes a new class of algorithms called the block coordinate descent method of multipliers (BCDMM) to solve a nonsmooth convex problem with linear coupling constraints and shows that under certain regularity conditions, the BCDMM converges to the set of optimal solutions.
A proximal block minimization method of multipliers with a substitution procedure
TLDR
This paper proposes a new algorithm, the proximal block minimization method of multipliers with a substitution, to solve this family of problems, proves its convergence via the analytic framework of contractive-type methods, and derives a worst-case convergence rate in an ergodic sense.

References

Showing 1–10 of 24 references
Globally convergent block-coordinate techniques for unconstrained optimization
TLDR
New classes of globally convergent block-coordinate techniques are defined for the unconstrained minimization of a continuously differentiable function, including line-search-based schemes that may also perform partial global minimizations with respect to some components.
On the convergence of the coordinate descent method for convex differentiable minimization
The coordinate descent method enjoys a long history in convex differentiable minimization. Surprisingly, very little is known about the convergence of the iterates generated by this method.
On the Nonasymptotic Convergence of Cyclic Coordinate Descent Methods
TLDR
This work proves O(1/k) convergence rates for two variants of cyclic coordinate descent under an isotonicity assumption by comparing the objective values attained by the two variants with each other, as well as with the gradient descent algorithm.
Gradient methods for minimizing composite objective function
In this paper we analyze several new methods for solving optimization problems with the objective function formed as a sum of two convex terms: one is smooth and given by a black-box oracle, and another is general but simple and its structure is known.
Efficiency of Coordinate Descent Methods on Huge-Scale Optimization Problems
  • Y. Nesterov
  • Computer Science, Mathematics
    SIAM J. Optim.
  • 2012
TLDR
Surprisingly enough, for certain classes of objective functions the provable efficiency estimates of the proposed coordinate descent methods for huge-scale optimization problems are better than the standard worst-case bounds for deterministic algorithms.
Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function
TLDR
A randomized block-coordinate descent method for minimizing the sum of a smooth and a simple nonsmooth block-separable convex function is developed, and it is proved that the method obtains an ε-accurate solution with probability at least 1 − ρ in at most O(n/ε) iterations, thus achieving the first true iteration complexity bounds.
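As a rough illustration of the method summarized above, here is a compact sketch of a uniformly randomized block-coordinate proximal gradient step; the uniform sampling rule, the ℓ1 block term in the example, and the helper names are assumptions made for the sketch and not the paper's exact scheme or notation.

```python
import numpy as np

def randomized_block_prox_gradient(grad, prox, blocks, L, x0, n_iters=2000, seed=0):
    """Randomized block-coordinate proximal gradient (illustrative sketch).

    At each iteration one block is drawn uniformly at random and updated by a
    proximal gradient step on that block only; prox(i, v, t) applies the proximal
    operator of the i-th block's simple nonsmooth term with parameter t.
    """
    rng = np.random.default_rng(seed)
    x = x0.copy()
    for _ in range(n_iters):
        i = rng.integers(len(blocks))                        # uniform random block choice
        idx = blocks[i]
        g = grad(x)[idx]                                     # partial gradient of the smooth part
        x[idx] = prox(i, x[idx] - g / L[i], 1.0 / L[i])      # prox step with stepsize 1/L_i
    return x

# Example (assumed for illustration): lasso-type objective ||A x - b||^2 + lam * ||x||_1, two blocks
rng = np.random.default_rng(1)
A, b, lam = rng.standard_normal((40, 10)), rng.standard_normal(40), 0.1
blocks = [np.arange(0, 5), np.arange(5, 10)]
L = [2 * np.linalg.norm(A[:, idx], 2) ** 2 for idx in blocks]
grad = lambda x: 2 * A.T @ (A @ x - b)
prox = lambda i, v, t: np.sign(v) * np.maximum(np.abs(v) - lam * t, 0.0)
x_hat = randomized_block_prox_gradient(grad, prox, blocks, L, np.zeros(10))
```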
A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
TLDR
A new fast iterative shrinkage-thresholding algorithm (FISTA) which preserves the computational simplicity of ISTA but with a global rate of convergence which is proven to be significantly better, both theoretically and practically.
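Since FISTA is the accelerated proximal gradient baseline referenced here, a short sketch may help; the lasso example, the constant stepsize 1/L, and the variable names are illustrative assumptions rather than the original paper's exact setup.

```python
import numpy as np

def fista(grad, prox, L, x0, n_iters=200):
    """FISTA-style accelerated proximal gradient (illustrative sketch).

    grad -- gradient of the smooth part, with Lipschitz constant L
    prox -- proximal operator of the nonsmooth part, prox(v, t)
    """
    x, y, t = x0.copy(), x0.copy(), 1.0
    for _ in range(n_iters):
        x_new = prox(y - grad(y) / L, 1.0 / L)               # proximal gradient step at y
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0     # momentum parameter update
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)        # extrapolation step
        x, t = x_new, t_new
    return x

# Example (assumed for illustration): lasso  min ||A x - b||^2 + lam * ||x||_1
rng = np.random.default_rng(2)
A, b, lam = rng.standard_normal((40, 20)), rng.standard_normal(40), 0.1
L = 2 * np.linalg.norm(A, 2) ** 2
grad = lambda x: 2 * A.T @ (A @ x - b)
prox = lambda v, t: np.sign(v) * np.maximum(np.abs(v) - lam * t, 0.0)
x_fista = fista(grad, prox, L, np.zeros(20))
```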
Error bounds and convergence analysis of feasible descent methods: a general approach
TLDR
A general approach to analyzing the convergence and the rate of convergence of feasible descent methods that does not require any nondegeneracy assumption on the problem is surveyed and extended.
Iterative solution of nonlinear equations in several variables
An Introduction to Optimization
  • E. Chong, S. Żak
  • Computer Science
    IEEE Antennas and Propagation Magazine
  • 1996
TLDR
This review of the textbook covers its treatment of methods of proof and notation, set-constrained and unconstrained optimization, linear programming, and problems with equality constraints.