Fast distributed coordinate descent for non-strongly convex losses

@inproceedings{Fercoq2014FastDC,
  title={Fast distributed coordinate descent for non-strongly convex losses},
  author={Olivier Fercoq and Zheng Qu and Peter Richt{\'a}rik and Martin Tak{\'a}{\v{c}}},
  booktitle={2014 IEEE International Workshop on Machine Learning for Signal Processing (MLSP)},
  year={2014},
  pages={1--6}
}
We propose an efficient distributed randomized coordinate descent method for minimizing regularized non-strongly convex loss functions. The method attains the optimal O(1/k^2) convergence rate, where k is the iteration counter. The core of the work is the theoretical study of the stepsize parameters. We have implemented the method on ARCHER (the largest supercomputer in the UK) and show that it is capable of solving a (synthetic) LASSO optimization problem with 50 billion variables.
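The paper's method is an accelerated, distributed coordinate descent; as a rough illustration of the coordinate-descent building block it generalizes, here is a minimal single-machine, non-accelerated randomized coordinate descent for the LASSO. The function names and the update rule shown are a standard textbook sketch, not the paper's distributed algorithm:

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * |.| (shrinks v toward zero by t).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def random_cd_lasso(A, b, lam, iters=5000, seed=0):
    """Plain randomized coordinate descent for the LASSO:
    min_x 0.5 * ||A x - b||^2 + lam * ||x||_1."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    x = np.zeros(n)
    r = A @ x - b              # residual, updated incrementally below
    L = (A ** 2).sum(axis=0)   # per-coordinate Lipschitz constants ||A_j||^2
    for _ in range(iters):
        j = rng.integers(n)
        g = A[:, j] @ r        # partial derivative of the smooth part
        x_new = soft_threshold(x[j] - g / L[j], lam / L[j])
        r += A[:, j] * (x_new - x[j])  # keep residual consistent with x
        x[j] = x_new
    return x
```

The distributed variant in the paper partitions coordinates across machines, updates many coordinates in parallel per iteration, and adds acceleration to reach the O(1/k^2) rate; the per-coordinate stepsizes (1/L_j above) are exactly the parameters whose theoretical study the abstract highlights.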