Corpus ID: 13861512

Large-scale randomized-coordinate descent methods with non-separable linear constraints

@inproceedings{Reddi2015LargescaleRD,
  title={Large-scale randomized-coordinate descent methods with non-separable linear constraints},
  author={Sashank J. Reddi and Ahmed S. Hefny and Carlton Downey and Kumar Avinava Dubey and Suvrit Sra},
  booktitle={UAI},
  year={2015}
}
We develop randomized (block) coordinate descent (CD) methods for linearly constrained convex optimization. Unlike most CD methods, we do not assume the constraints to be separable, but let them be coupled linearly. To our knowledge, ours is the first CD method that allows linear coupling constraints, without making the global iteration complexity have an exponential dependence on the number of constraints. We present algorithms and analysis for four key problem scenarios: (i) smooth; (ii… 
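To make the idea of coordinate descent under a coupling constraint concrete, here is a minimal sketch, not the algorithm analyzed in the paper: it handles only a single linear constraint c^T x = b on a quadratic objective, whereas the paper covers multiple non-separable constraints and the smooth, composite, asynchronous, and stochastic settings. The function name cd_with_linear_constraint and the exact-line-search step are illustrative choices, not taken from the paper. Picking two coordinates and moving along a direction in the null space of the constraint keeps every iterate feasible, which is the basic mechanism behind feasibility-preserving CD updates.

```python
# A minimal sketch (assumed setup, not the paper's method): randomized
# 2-coordinate descent on f(x) = 0.5 * ||Ax - y||^2 subject to c^T x = b,
# where c has no zero entries. The direction d = e_i/c_i - e_j/c_j satisfies
# c^T d = 0, so the constraint holds at every iteration.
import numpy as np

def cd_with_linear_constraint(A, y, c, x0, iters=5000, rng=None):
    """Randomized 2-coordinate descent; x0 must satisfy the constraint c^T x0 = b."""
    rng = np.random.default_rng(rng)
    x = x0.astype(float)
    n = x.size
    for _ in range(iters):
        i, j = rng.choice(n, size=2, replace=False)
        # Feasible direction: lies in the null space of the constraint row c.
        d = np.zeros(n)
        d[i], d[j] = 1.0 / c[i], -1.0 / c[j]
        g = A.T @ (A @ x - y)              # gradient of the quadratic objective
        Ad = A @ d
        denom = Ad @ Ad
        if denom > 0:
            x += -(g @ d) / denom * d      # exact line search along d
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((40, 10))
    y = rng.standard_normal(40)
    c = np.ones(10)                        # coupling constraint: sum(x) = 1
    x0 = np.full(10, 0.1)                  # feasible starting point
    x = cd_with_linear_constraint(A, y, c, x0)
    print("constraint residual:", c @ x - 1.0)
```

The paper's contribution is to extend this kind of update beyond one constraint and one pair of coordinates, with step sizes and sampling schemes that give non-exponential dependence on the number of constraints; the exact line search above is only a convenience for the quadratic example.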

Citations

Randomized sketch descent methods for non-separable linearly constrained optimization
TLDR
This is the first convergence analysis of random sketch descent algorithms for optimization problems with multiple non-separable linear constraints; it shows that sketching the coordinate directions randomly produces better results than a fixed selection rule.
Random Block Coordinate Descent Methods for Linearly Constrained Optimization over Networks
TLDR
This paper develops random block coordinate descent methods for minimizing large-scale linearly constrained convex problems over networks by devising an algorithm that, at each iteration, updates in parallel at least two random components of the solution, chosen according to a given probability distribution.
Randomized Subspace Descent
TLDR
A generalization of randomized coordinate descent for smooth convex problems, where the coordinates specify arbitrary subspaces, is developed, and a convergence rate on a given graph in terms of its algebraic connectivity is derived.
An almost cyclic 2-coordinate descent method for singly linearly constrained problems
TLDR
A block decomposition method is proposed for minimizing a (possibly non-convex) continuously differentiable function subject to one linear equality constraint and simple bounds on the variables, which avoids computing the whole gradient of the objective function during the algorithm.
Accelerated Stochastic Block Coordinate Descent with Optimal Sampling
TLDR
This work proposes an accelerated stochastic block coordinate descent (ASBCD) algorithm, which incorporates the incrementally averaged partial derivative into the stochastic partial derivative and exploits optimal sampling, and proves that ASBCD attains a linear rate of convergence.
Stochastic Coordinate Minimization with Progressive Precision for Stochastic Convex Optimization
TLDR
An interesting finding is that the optimal progression of precision across iterations is independent of the low-dimensional CM routine employed, suggesting a general framework for extending low-dimensional optimization routines to high-dimensional problems.
Accelerated Stochastic Block Coordinate Gradient Descent for Sparsity Constrained Nonconvex Optimization
TLDR
An accelerated stochastic block coordinate descent algorithm for nonconvex optimization under a sparsity constraint in the high-dimensional regime is proposed; it converges to the unknown true parameter at a linear rate.
Randomized Block Subgradient Methods for Convex Nonsmooth and Stochastic Optimization
TLDR
Stochastic block dual averaging (SBDA) is presented, a novel class of block subgradient methods for convex nonsmooth and stochastic optimization; it introduces randomized stepsize rules and block sampling schemes that are adaptive to the block structure and significantly improve the convergence rate with respect to the problem parameters.
On Variance Reduction in Stochastic Gradient Descent and its Asynchronous Variants
TLDR
This work proposes an asynchronous algorithm grounded in a unifying framework for many variance reduction techniques, and proves its fast convergence in sparse settings common to machine learning.
A flexible sequential Monte Carlo algorithm for shape-constrained regression
We propose an algorithm that is capable of imposing shape constraints on regression curves, without requiring the constraints to be written as closed-form expressions or assuming the functional form …

References

Showing 1-10 of 52 references
A Random Coordinate Descent Method on Large-Scale Optimization Problems with Linear Constraints
TLDR
A random block coordinate descent method for minimizing large-scale convex problems with linearly coupled constraints is developed, and it is proved to obtain in expectation an ε-accurate solution in at most O(1/ε) iterations.
A random coordinate descent algorithm for optimization problems with composite objective function and linear coupled constraints
TLDR
If the smooth part of the objective function has Lipschitz continuous gradient, then it is proved that the random coordinate descent method obtains an ϵ-optimal solution in $\mathcal{O}(n^{2}/\epsilon)$ iterations, where n is the number of blocks.
An asynchronous parallel stochastic coordinate descent algorithm
We describe an asynchronous parallel stochastic coordinate descent algorithm for minimizing smooth unconstrained or separably constrained functions. The method achieves a linear convergence rate on …
Block-Coordinate Gradient Descent Method for Linearly Constrained Nonsmooth Separable Optimization
TLDR
A block-coordinate gradient descent method is proposed for solving the problem of minimizing the weighted sum of a smooth function f and a convex function P of n real variables subject to m linear equality constraints, with the coordinate block chosen by a Gauss-Southwell-q rule based on sufficient predicted descent.
Iteration complexity analysis of block coordinate descent methods
TLDR
This paper unifies these algorithms under the so-called block successive upper-bound minimization (BSUM) framework and shows that, for a broad class of multi-block nonsmooth convex problems, all of them achieve a global sublinear iteration complexity of O(1/r), where r is the iteration index.
On the Convergence of Block Coordinate Descent Type Methods
TLDR
This paper analyzes the block coordinate gradient projection method, in which each iteration consists of performing a gradient projection step with respect to a certain block taken in a cyclic order, and establishes a global sublinear rate of convergence.
Parallel coordinate descent methods for big data optimization
In this work we show that randomized (block) coordinate descent methods can be accelerated by parallelization when applied to the problem of minimizing the sum of a partially separable smooth convex …
Efficiency of Coordinate Descent Methods on Huge-Scale Optimization Problems
Y. Nesterov, SIAM J. Optim., 2012
TLDR
Surprisingly enough, for certain classes of objective functions, the complexity bounds of the proposed methods for huge-scale optimization problems are better than the standard worst-case bounds for deterministic algorithms.
Inexact Coordinate Descent: Complexity and Preconditioning
TLDR
This work allows the subproblem to be solved inexactly, leading to an inexact block coordinate descent method that recovers the best known results for exact updates as a special case.
Distributed Coordinate Descent Method for Learning with Big Data
TLDR
This paper develops and analyzes Hydra: HYbriD cooRdinAte descent method for solving loss minimization problems with big data, and gives bounds on the number of iterations sufficient to approximately solve the problem with high probability.