On the Convergence of Alternating Minimization for Convex Programming with Applications to Iteratively Reweighted Least Squares and Decomposition Schemes

@article{Beck2015OnTC,
  title={On the Convergence of Alternating Minimization for Convex Programming with Applications to Iteratively Reweighted Least Squares and Decomposition Schemes},
  author={Amir Beck},
  journal={SIAM J. Optim.},
  year={2015},
  volume={25},
  pages={185--209}
}
  • A. Beck
  • Published 15 January 2015
  • Mathematics, Computer Science
  • SIAM J. Optim.
This paper is concerned with the alternating minimization (AM) method for solving convex minimization problems where the decision variable vector is split into two blocks. The objective function is a sum of a differentiable convex function and a separable, possibly nonsmooth, extended real-valued convex function, so constraints can be incorporated. We analyze the convergence rate of the method and establish a nonasymptotic sublinear rate of convergence where the multiplicative…
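As a toy illustration of the two-block setting, the sketch below applies exact alternating minimization to the smooth convex problem min over (x, y) of ||Ax + By − c||²; the random instance and iteration count are illustrative assumptions, and the separable nonsmooth term discussed in the paper is omitted.

```python
import numpy as np

# Two-block alternating minimization on f(x, y) = ||Ax + By - c||^2:
# minimize exactly over one block while holding the other fixed.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
B = rng.standard_normal((20, 5))
c = rng.standard_normal(20)

x = np.zeros(5)
y = np.zeros(5)
for _ in range(200):
    # x-step: least squares in x with y fixed
    x, *_ = np.linalg.lstsq(A, c - B @ y, rcond=None)
    # y-step: least squares in y with x fixed
    y, *_ = np.linalg.lstsq(B, c - A @ x, rcond=None)

residual = np.linalg.norm(A @ x + B @ y - c)
print(residual)
```

Since each block update is an exact minimization, the objective is nonincreasing along the iterates; the residual never exceeds its starting value ||c||.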
Alternating minimization methods for strongly convex optimization
TLDR
Under the strong convexity assumption in the many-blocks setting, an accelerated alternating minimization procedure is provided with a linear convergence rate that depends on the square root of the condition number, as opposed to the condition number itself for the non-accelerated method.
The Analysis of Alternating Minimization Method for Double Sparsity Constrained Optimization Problem
TLDR
This work analyzes the alternating minimization (AM) method for solving the double sparsity constrained minimization problem, where the decision variable vector is split into two blocks, and establishes a non-asymptotic sublinear rate of convergence under the assumption of convexity.
On a Combination of Alternating Minimization and Nesterov's Momentum
TLDR
This paper combines AM and Nesterov’s acceleration to propose an accelerated alternating minimization algorithm that is adaptive to convexity and smoothness and is uniformly optimal for smooth convex and non-convex problems.
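For intuition about the momentum ingredient in the two entries above, here is a minimal sketch of Nesterov's momentum applied as plain accelerated gradient descent on a strongly convex quadratic (not the accelerated alternating minimization schemes themselves); the random instance and iteration budget are illustrative assumptions.

```python
import numpy as np

# Nesterov's accelerated gradient on f(x) = 0.5 x^T Q x - b^T x:
# gradient step from an extrapolated point, then momentum extrapolation.
rng = np.random.default_rng(2)
M = rng.standard_normal((40, 10))
Q = M.T @ M + 0.1 * np.eye(10)          # positive definite Hessian
b = rng.standard_normal(10)

evals = np.linalg.eigvalsh(Q)
L, mu = evals[-1], evals[0]             # smoothness / strong convexity constants
beta = (np.sqrt(L) - np.sqrt(mu)) / (np.sqrt(L) + np.sqrt(mu))

x = np.zeros(10)
y = np.zeros(10)
for _ in range(300):
    x_new = y - (Q @ y - b) / L         # gradient step at extrapolated point
    y = x_new + beta * (x_new - x)      # momentum extrapolation
    x = x_new

x_star = np.linalg.solve(Q, b)
print(np.linalg.norm(x - x_star))
```

The constant momentum coefficient beta built from sqrt(L/mu) is what yields the square-root dependence on the condition number mentioned above.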
On the rate of convergence of the proximal alternating linearized minimization algorithm for convex problems
We analyze the proximal alternating linearized minimization algorithm (PALM) for solving non-smooth convex minimization problems where the objective function is a sum of a smooth convex function and…
On the Convergence of Multi-Block Alternating Direction Method of Multipliers and Block Coordinate Descent Method
TLDR
The paper analyzes the convergence of the 2-block ADMM for solving linearly constrained convex optimization with a coupled quadratic objective, and shows that the classical ADMM converges pointwise to a primal-dual solution pair of this problem.
Convergence Analysis of Alternating Nonconvex Projections
TLDR
A new convergence analysis framework is established to show that if one set satisfies the three-point property and the other obeys the local contraction property, then the iterates generated by alternating projections form a convergent sequence whose limit is a critical point.
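A toy sketch of alternating projections between a nonconvex set (the unit circle) and a line, where both projections happen to be exact and cheap; the sets and starting point are illustrative choices, far simpler than the general semi-algebraic setting these papers analyze.

```python
import numpy as np

def proj_sphere(z):
    """Nearest point on the unit circle (nonconvex set)."""
    return z / np.linalg.norm(z)

def proj_line(z, d, p):
    """Nearest point on the line {p + t*d : t real}."""
    d = d / np.linalg.norm(d)
    return p + d * np.dot(d, z - p)

# Alternate projections; this line crosses the circle transversally,
# so the iterates converge linearly to an intersection point.
z = np.array([3.0, 1.0])
d, p = np.array([1.0, 1.0]), np.array([0.0, 0.5])
for _ in range(100):
    z = proj_line(proj_sphere(z), d, p)

gap = np.linalg.norm(proj_sphere(z) - z)   # distance between the two iterates
print(gap)
```

At a transversal intersection both the three-point and local contraction properties hold locally, which is the regime in which these analyses guarantee convergence.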
Convergence Analysis of Alternating Projection Method for Nonconvex Sets
TLDR
This paper formalizes two properties of proper, lower semicontinuous, and semi-algebraic sets: the three-point property for all possible iterates, and the local contraction property, which serves as the nonexpansiveness property of the projector but only for iterates that are close enough to each other.
Alternating minimization and alternating descent over nonconvex sets
TLDR
This work analyzes the performance of alternating minimization for loss functions optimized over two variables, where each variable may be restricted to lie in some potentially nonconvex constraint set, and relies on the notion of local concavity coefficients, proposed by Barber and Ha to measure and quantify the concavity of a general nonconvex set.
On the rate of convergence of alternating minimization for non-smooth non-strongly convex optimization in Banach spaces
In this paper, the convergence of the fundamental alternating minimization is established for non-smooth non-strongly convex optimization problems in Banach spaces, and novel rates of convergence are…
A proximal block minimization method of multipliers with a substitution procedure
TLDR
This paper proposes a new algorithm, called the proximal block minimization method of multipliers with a substitution, to solve this family of problems, proves its convergence via the analytic framework of contractive-type methods, and derives a worst-case convergence rate in the ergodic sense.
...

References

Showing 1–10 of 39 references
Convergence Analysis of Generalized Iteratively Reweighted Least Squares Algorithms on Convex Function Spaces
TLDR
This paper discusses a general technique for iteratively computing minimizers for a large class of convex functionals; the technique is closely related to majorization-minimization algorithms and includes the iteratively reweighted least squares algorithm as a special case.
Gradient methods for minimizing composite objective function
In this paper we analyze several new methods for solving optimization problems with the objective function formed as a sum of two convex terms: one is smooth and given by a black-box oracle, and…
Gradient methods for minimizing composite functions
  • Y. Nesterov
  • Mathematics, Computer Science
    Math. Program.
  • 2013
In this paper we analyze several new methods for solving optimization problems with the objective function formed as a sum of two terms: one is smooth and given by a black-box oracle, and another is…
Iteration-Complexity of Block-Decomposition Algorithms and the Alternating Direction Method of Multipliers
TLDR
A framework of block-decomposition prox-type algorithms for solving the monotone inclusion problem is presented, and it is shown that any method in this framework is a special instance of the hybrid proximal extragradient (HPE) method introduced by Solodov and Svaiter.
Rate of Convergence Analysis of Decomposition Methods Based on the Proximal Method of Multipliers for Convex Minimization
TLDR
Various sublinear global convergence rate results are proved for the two classes of PMM-based decomposition algorithms, in terms of function values and constraint violation, together with convergence of the sequences produced by the two algorithm classes to optimal primal-dual solutions.
On the Convergence of Block Coordinate Descent Type Methods
TLDR
This paper analyzes the block coordinate gradient projection method, in which each iteration performs a gradient projection step with respect to a certain block taken in cyclic order, and establishes a global sublinear rate of convergence.
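As a sketch of block coordinate gradient projection with cyclically chosen blocks, the following applies it to nonnegative least squares; the two-block split, the random instance, and the single shared stepsize bound are illustrative simplifications.

```python
import numpy as np

# Cyclic block coordinate gradient projection for
#     min ||Ax - b||^2  subject to  x >= 0,
# with two blocks of coordinates updated in cyclic order.
rng = np.random.default_rng(3)
A = rng.standard_normal((30, 8))
b = rng.standard_normal(30)
L_blk = np.linalg.norm(A, 2) ** 2      # spectral-norm bound; valid for each block

x = np.zeros(8)
blocks = [np.arange(0, 4), np.arange(4, 8)]
for _ in range(500):
    for blk in blocks:
        g = A.T @ (A @ x - b)          # gradient at current point; use its block
        x[blk] = np.maximum(x[blk] - g[blk] / L_blk, 0.0)   # projected step

final_residual = np.linalg.norm(A @ x - b)
print(final_residual)
```

With stepsize 1/L each block update cannot increase the objective, so the residual stays at or below its starting value ||b||.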
Applications of splitting algorithm to decomposition in convex programming and variational inequalities
Recently, Han and Lou proposed a highly parallelizable decomposition algorithm for minimizing a strongly convex cost over the intersection of closed convex sets. It is shown that their algorithm is in…
Globally convergent block-coordinate techniques for unconstrained optimization
TLDR
New classes of globally convergent block-coordinate techniques are defined for the unconstrained minimization of a continuously differentiable function, including line-search-based schemes that may also include partial global minimizations with respect to some components.
A General Framework for a Class of First Order Primal-Dual Algorithms for Convex Optimization in Imaging Science
TLDR
This work generalizes the primal-dual hybrid gradient (PDHG) algorithm to a broader class of convex optimization problems, and surveys several closely related methods and explains the connections to PDHG.
Iteratively reweighted least squares minimization for sparse recovery
TLDR
It is proved that when Φ satisfies the RIP conditions, the sequence x(n) converges for all y, regardless of whether Φ⁻¹(y) contains a sparse vector.
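A hedged sketch of the IRLS iteration for sparse recovery: each step solves a weighted minimum-norm problem subject to Φx = y, with weights built from the current iterate. The fixed smoothing parameter eps is a simplification of the adaptive rule in the paper, and the instance sizes are illustrative assumptions.

```python
import numpy as np

def irls(Phi, y, n_iter=50, eps=1e-3):
    """IRLS sketch: repeatedly solve min x^T W x s.t. Phi x = y,
    with W = diag(1 / sqrt(x_i^2 + eps^2)) from the current iterate."""
    m, n = Phi.shape
    x = np.zeros(n)
    for _ in range(n_iter):
        D = np.diag(np.sqrt(x**2 + eps**2))      # D = W^{-1}
        # closed-form weighted minimum-norm solution:
        #   x = D Phi^T (Phi D Phi^T)^{-1} y
        x = D @ Phi.T @ np.linalg.solve(Phi @ D @ Phi.T, y)
    return x

# Recover a 3-sparse vector in R^60 from 30 random measurements.
rng = np.random.default_rng(1)
Phi = rng.standard_normal((30, 60))
x_true = np.zeros(60)
x_true[[3, 17, 42]] = [1.5, -2.0, 0.8]
x_hat = irls(Phi, Phi @ x_true)
print(np.max(np.abs(x_hat - x_true)))
```

The reweighting makes the quadratic surrogate mimic the ℓ1 norm, which is why the iteration promotes sparse solutions in this regime.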
...