Communication-Efficient Distributed Dual Coordinate Ascent

Abstract

Communication remains the most significant bottleneck in the performance of distributed optimization algorithms for large-scale machine learning. In this paper, we propose a communication-efficient framework, COCOA, that uses local computation in a primal-dual setting to dramatically reduce the amount of necessary communication. We provide a strong convergence rate analysis for this class of algorithms, as well as experiments on real-world distributed datasets with implementations in Spark. In our experiments, we find that, compared to state-of-the-art mini-batch versions of SGD and SDCA, COCOA converges to the same .001-accurate solution quality on average 25× as quickly.
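To make the framework concrete, below is a minimal single-process sketch of the COCOA outer loop, assuming a hinge-loss SVM with SDCA as the local solver and simple averaging of the workers' primal updates. The function names, the local-iteration count H, and the simulated partitioning are illustrative assumptions, not the paper's reference implementation.

```python
import numpy as np

def local_sdca(X, y, alpha, w, lam, n_total, H, rng):
    """Run H steps of dual coordinate ascent (SDCA, hinge loss) on one
    worker's local partition; mutates the local alpha block in place and
    returns the worker's accumulated primal update delta_w."""
    delta_w = np.zeros_like(w)
    for _ in range(H):
        i = rng.integers(len(y))
        x_i = X[i]
        # Closed-form single-coordinate maximization of the SVM dual:
        # step = lam * n * (1 - y_i <w, x_i>) / ||x_i||^2, then clip to [0, 1].
        margin_gap = 1.0 - y[i] * x_i @ (w + delta_w)
        step = margin_gap / (x_i @ x_i / (lam * n_total) + 1e-12)
        new_alpha = np.clip(alpha[i] + step, 0.0, 1.0)
        delta_w += ((new_alpha - alpha[i]) / (lam * n_total)) * y[i] * x_i
        alpha[i] = new_alpha
    return delta_w

def cocoa(parts, lam, n_total, outer_rounds=50, H=100, seed=0):
    """Outer COCOA loop: each round, every worker improves its own block
    of dual variables locally, then the K primal updates are averaged.
    Only one vector per worker is communicated per round."""
    rng = np.random.default_rng(seed)
    d = parts[0][0].shape[1]
    w = np.zeros(d)
    alphas = [np.zeros(len(y_k)) for _, y_k in parts]
    K = len(parts)
    for _ in range(outer_rounds):
        deltas = [local_sdca(X_k, y_k, a_k, w, lam, n_total, H, rng)
                  for (X_k, y_k), a_k in zip(parts, alphas)]
        w += sum(deltas) / K  # averaging combiner
    return w

# Toy demo with 4 simulated workers (illustrative only).
rng = np.random.default_rng(1)
X = rng.normal(size=(1000, 20))
y = np.sign(X @ rng.normal(size=20))
parts = [(X[k::4], y[k::4]) for k in range(4)]
w = cocoa(parts, lam=0.01, n_total=len(y))
```

Averaging the K updates is the conservative combination strategy; the framework also admits other scalings of the aggregated local updates, trading per-round progress against stability.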

