On the Convergence of Alternating Minimization for Convex Programming with Applications to Iteratively Reweighted Least Squares and Decomposition Schemes
@article{Beck2015OnTC,
  title={On the Convergence of Alternating Minimization for Convex Programming with Applications to Iteratively Reweighted Least Squares and Decomposition Schemes},
  author={Amir Beck},
  journal={SIAM J. Optim.},
  year={2015},
  volume={25},
  pages={185-209}
}
This paper is concerned with the alternating minimization (AM) method for solving convex minimization problems in which the vector of decision variables is split into two blocks. The objective function is the sum of a differentiable convex function and a separable (possibly nonsmooth) extended real-valued convex function; constraints can consequently be incorporated. We analyze the convergence rate of the method and establish a nonasymptotic sublinear rate of convergence where the multiplicative…
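To make the two-block scheme concrete, here is a minimal Python sketch on an invented toy instance in which the separable nonsmooth terms are indicator functions of the nonnegative orthant, so each block update is an exactly solvable nonnegative least-squares problem. The matrices A, B, c and all sizes are assumptions for illustration only, not data from the paper.

```python
import numpy as np
from scipy.optimize import nnls

# Toy instance of the two-block model: minimize
#     f(x, y) = 0.5 * ||A x + B y - c||^2  subject to  x >= 0, y >= 0,
# i.e. a smooth coupling term plus separable indicator functions.
rng = np.random.default_rng(0)
A = rng.standard_normal((30, 8))
B = rng.standard_normal((30, 8))
c = rng.standard_normal(30)

def objective(x, y):
    r = A @ x + B @ y - c
    return 0.5 * r @ r

x, y = np.zeros(8), np.zeros(8)
for k in range(50):
    # AM step 1: exact minimization over x with y held fixed
    # (a nonnegative least-squares subproblem).
    x, _ = nnls(A, c - B @ y)
    # AM step 2: exact minimization over y with x held fixed.
    y, _ = nnls(B, c - A @ x)

print("objective after 50 AM sweeps:", objective(x, y))
```

Each sweep cannot increase the objective, since every half-step is an exact minimization over its block; the paper's contribution is quantifying how fast such function values approach the optimum.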
157 Citations
Alternating minimization methods for strongly convex optimization
- Mathematics, Computer Science · Journal of Inverse and Ill-posed Problems
- 2019
Under a strong convexity assumption in the many-blocks setting, an accelerated alternating minimization procedure is provided whose linear convergence rate depends on the square root of the condition number, as opposed to the condition number itself for the non-accelerated method.
The Analysis of Alternating Minimization Method for Double Sparsity Constrained Optimization Problem
- Mathematics, Computer Science · Asia Pac. J. Oper. Res.
- 2020
This work analyzes the alternating minimization (AM) method for solving a double sparsity constrained minimization problem, where the decision variable vector is split into two blocks, and establishes a non-asymptotic sublinear rate of convergence under a convexity assumption.
On a Combination of Alternating Minimization and Nesterov's Momentum
- Computer Science, Mathematics · ICML
- 2021
This paper combines AM and Nesterov’s acceleration to propose an accelerated alternating minimization algorithm that is adaptive to convexity and smoothness and is uniformly optimal for smooth convex and non-convex problems.
On the rate of convergence of the proximal alternating linearized minimization algorithm for convex problems
- Mathematics, Computer Science · EURO J. Comput. Optim.
- 2016
We analyze the proximal alternating linearized minimization algorithm (PALM) for solving non-smooth convex minimization problems where the objective function is a sum of a smooth convex function and…
On the Convergence of Multi-Block Alternating Direction Method of Multipliers and Block Coordinate Descent Method
- Computer Science, Mathematics
- 2015
The paper analyzes the convergence of the 2-block ADMM for solving linearly constrained convex optimization with a coupled quadratic objective, and shows that the classical ADMM converges pointwise to a primal-dual solution pair of this problem.
Convergence Analysis of Alternating Nonconvex Projections
- Mathematics · ArXiv
- 2018
A new convergence analysis framework is established to show that if one set satisfies the three-point property and the other obeys the local contraction property, the iterates generated by alternating projections form a convergent sequence whose limit is a critical point.
Convergence Analysis of Alternating Projection Method for Nonconvex Sets
- Mathematics
- 2018
This paper formalizes two properties of proper, lower semicontinuous, and semi-algebraic sets: the three-point property for all possible iterates, and the local contraction property, which serves as a nonexpansiveness property of the projector but only for iterates that are close enough to each other.
Alternating minimization and alternating descent over nonconvex sets
- Computer Science
- 2017
This work analyzes the performance of alternating minimization for loss functions optimized over two variables, where each variable may be restricted to lie in some potentially nonconvex constraint set, and relies on the notion of local concavity coefficients, proposed by Barber and Ha to measure and quantify the concavity of a general nonconvex set.
On the rate of convergence of alternating minimization for non-smooth non-strongly convex optimization in Banach spaces
- Mathematics · Optim. Lett.
- 2022
In this paper, the convergence of the fundamental alternating minimization is established for non-smooth non-strongly convex optimization problems in Banach spaces, and novel rates of convergence are…
A proximal block minimization method of multipliers with a substitution procedure
- Computer Science, Mathematics · Optim. Methods Softw.
- 2015
This paper proposes a new algorithm, called the proximal block minimization method of multipliers with a substitution, to solve this family of problems, proves its convergence via the analytic framework of contractive-type methods, and derives a worst-case convergence rate in an ergodic sense.
References (showing 1-10 of 39)
Convergence Analysis of Generalized Iteratively Reweighted Least Squares Algorithms on Convex Function Spaces
- Mathematics, Computer Science · SIAM J. Optim.
- 2009
This paper discusses a general technique for iteratively computing the minimizers of a large class of convex functionals; the technique is closely related to majorization-minimization algorithms and includes the iteratively reweighted least squares algorithm as a special case.
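As a reminder of what IRLS looks like in its textbook form, here is a minimal sketch for the ℓ1 (least absolute deviations) loss; this is the generic scheme rather than the generalized algorithm of this reference, and the data, the smoothing floor eps, and the iteration count are invented for illustration.

```python
import numpy as np

# Toy least-absolute-deviations problem: minimize ||A x - b||_1.
rng = np.random.default_rng(1)
A = rng.standard_normal((50, 5))
x_true = rng.standard_normal(5)
b = A @ x_true + 0.1 * rng.standard_normal(50)
b[::10] += 5.0  # a few gross outliers; the l1 loss is robust to these

x = np.linalg.lstsq(A, b, rcond=None)[0]  # ordinary LS initialization
eps = 1e-8  # smoothing floor to keep the weights finite
for _ in range(50):
    r = A @ x - b
    w = 1.0 / np.maximum(np.abs(r), eps)  # reweighting for the l1 loss
    # Weighted least-squares step: solve A^T W A x = A^T W b.
    Aw = A * w[:, None]
    x = np.linalg.solve(A.T @ Aw, Aw.T @ b)

print("IRLS estimate:", x)
```

Each iteration majorizes the ℓ1 loss by a quadratic built from the current residuals and minimizes that quadratic, which is the majorization-minimization view mentioned in the abstract above.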
Gradient methods for minimizing composite objective function
- Computer Science, Mathematics
- 2007
In this paper we analyze several new methods for solving optimization problems with the objective function formed as a sum of two convex terms: one is smooth and given by a black-box oracle, and…
Gradient methods for minimizing composite functions
- Mathematics, Computer ScienceMath. Program.
- 2013
In this paper we analyze several new methods for solving optimization problems with the objective function formed as a sum of two terms: one is smooth and given by a black-box oracle, and another is…
Iteration-Complexity of Block-Decomposition Algorithms and the Alternating Direction Method of Multipliers
- Mathematics, Computer Science · SIAM J. Optim.
- 2013
A framework of block-decomposition prox-type algorithms for solving the monotone inclusion problem is presented, and it is shown that any method in this framework is a special instance of the hybrid proximal extragradient (HPE) method introduced by Solodov and Svaiter.
Rate of Convergence Analysis of Decomposition Methods Based on the Proximal Method of Multipliers for Convex Minimization
- Computer Science, Mathematics · SIAM J. Optim.
- 2014
Various sublinear global convergence rate results are proved for the two classes of PMM-based decomposition algorithms, both for function values and for constraint violation, together with convergence of the sequences produced by the two algorithm classes to optimal primal-dual solutions.
On the Convergence of Block Coordinate Descent Type Methods
- Mathematics, Computer Science · SIAM J. Optim.
- 2013
This paper analyzes the block coordinate gradient projection method, in which each iteration performs a gradient projection step with respect to a certain block taken in cyclic order, and establishes a global sublinear rate of convergence.
Applications of splitting algorithm to decomposition in convex programming and variational inequalities
- Mathematics
- 1991
Recently Han and Lou proposed a highly parallelizable decomposition algorithm for minimizing a strongly convex cost over the intersection of closed convex sets. It is shown that their algorithm is in…
Globally convergent block-coordinate techniques for unconstrained optimization
- Mathematics, Computer Science
- 1999
New classes of globally convergent block-coordinate techniques are defined for the unconstrained minimization of a continuously differentiable function, together with line-search-based schemes that may also include partial global minimizations with respect to some components.
A General Framework for a Class of First Order Primal-Dual Algorithms for Convex Optimization in Imaging Science
- Mathematics · SIAM J. Imaging Sci.
- 2010
This work generalizes the primal-dual hybrid gradient (PDHG) algorithm to a broader class of convex optimization problems, surveys several closely related methods, and explains their connections to PDHG.
Iteratively reweighted least squares minimization for sparse recovery
- Computer Science, Mathematics
- 2008
It is proved that when Φ satisfies the RIP conditions, the sequence x(n) converges for all y, regardless of whether Φ⁻¹(y) contains a sparse vector.