Fast distributed coordinate descent for non-strongly convex losses

@article{Fercoq2014FastDC,
  title={Fast distributed coordinate descent for non-strongly convex losses},
  author={Olivier Fercoq and Zheng Qu and Peter Richt{\'a}rik and Martin Tak{\'a}{\v{c}}},
  journal={2014 IEEE International Workshop on Machine Learning for Signal Processing (MLSP)},
  year={2014},
  pages={1-6}
}
We propose an efficient distributed randomized coordinate descent method for minimizing regularized non-strongly convex loss functions. The method attains the optimal O(1/k^2) convergence rate, where k is the iteration counter. The core of the work is the theoretical study of the stepsize parameters. We have implemented the method on ARCHER, the largest supercomputer in the UK, and show that it is capable of solving a (synthetic) LASSO optimization problem with 50 billion variables.
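
To make the setting concrete, below is a minimal, illustrative sketch of randomized proximal coordinate descent applied to a LASSO problem, min_x 0.5*||Ax - b||^2 + lam*||x||_1. This is a simplified serial, non-accelerated variant written for intuition only; it is not the paper's distributed, accelerated method, and all names and parameters (rcd_lasso, iters, seed) are hypothetical choices for this sketch.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t*|.|: shrink v toward zero by t."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def rcd_lasso(A, b, lam, iters=10000, seed=0):
    """Serial randomized proximal coordinate descent for LASSO (sketch)."""
    rng = np.random.default_rng(seed)
    n = A.shape[1]
    x = np.zeros(n)
    residual = A @ x - b                 # A x - b, maintained incrementally
    col_norms = (A ** 2).sum(axis=0)     # coordinate-wise Lipschitz constants
    for _ in range(iters):
        i = rng.integers(n)              # sample one coordinate uniformly
        if col_norms[i] == 0.0:
            continue
        grad_i = A[:, i] @ residual      # partial derivative of the smooth part
        # Exact coordinate-wise prox step: gradient step + soft-thresholding.
        x_new = soft_threshold(x[i] - grad_i / col_norms[i], lam / col_norms[i])
        residual += A[:, i] * (x_new - x[i])
        x[i] = x_new
    return x

# Example usage on a small synthetic instance:
A = np.random.default_rng(1).standard_normal((100, 50))
b = A @ np.ones(50) + 0.1 * np.random.default_rng(2).standard_normal(100)
x_hat = rcd_lasso(A, b, lam=0.1)
```

The paper's contribution goes beyond this sketch: the coordinates are partitioned across compute nodes, many coordinates are updated in parallel per iteration with provably safe stepsizes, and Nesterov-style acceleration yields the O(1/k^2) rate on non-strongly convex losses.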
