Communication-Efficient Distributed Dual Coordinate Ascent

Abstract

Communication remains the most significant bottleneck in the performance of distributed optimization algorithms for large-scale machine learning. In this paper, we propose a communication-efficient framework, COCOA, that uses local computation in a primal-dual setting to dramatically reduce the amount of necessary communication. We provide a strong convergence rate analysis for this class of algorithms, as well as experiments on real-world distributed datasets with implementations in Spark. In our experiments, we find that as compared to state-of-the-art mini-batch versions of SGD and SDCA algorithms, COCOA converges to the same .001-accurate solution quality on average 25× as quickly.
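The mechanism the abstract describes, cheap local dual coordinate ascent steps on each partition interleaved with a single round of communication, can be made concrete. Below is a minimal single-process sketch of such an outer loop for the hinge-loss (SVM) dual, simulating K partitions with NumPy. The function names local_sdca and cocoa, the step counts, and the regularizer value are illustrative assumptions rather than the paper's reference implementation (which ran on Spark); the 1/K averaging factor corresponds to the safe "averaging" aggregation variant (beta_K = 1).

    import numpy as np

    def local_sdca(X_k, y_k, alpha_k, w, lam, n, H, rng):
        """Run H steps of dual coordinate ascent (SDCA, hinge loss) on one
        partition, against a frozen copy of the shared primal vector w.
        Returns the local dual and primal increments (delta_alpha, delta_w)."""
        w_local = w.copy()
        delta_alpha = np.zeros_like(alpha_k)
        for _ in range(H):
            i = rng.integers(len(y_k))
            x, yi = X_k[i], y_k[i]
            a = alpha_k[i] + delta_alpha[i]
            grad = 1.0 - yi * (w_local @ x)            # hinge-loss dual residual
            d = np.clip(a + lam * n * grad / (x @ x), 0.0, 1.0) - a
            delta_alpha[i] += d
            w_local += d * yi * x / (lam * n)          # keep local primal in sync
        return delta_alpha, w_local - w

    def cocoa(X, y, K=4, T=50, H=100, lam=1e-2, seed=0):
        """Outer loop sketch: each round, every partition runs local_sdca,
        and only the primal increments are communicated and averaged."""
        rng = np.random.default_rng(seed)
        n, d = X.shape
        parts = np.array_split(rng.permutation(n), K)  # fixed data partitioning
        alpha = np.zeros(n)                            # dual variables, one per example
        w = np.zeros(d)                                # shared primal vector
        beta = 1.0                                     # safe averaging parameter
        for _ in range(T):
            updates = [local_sdca(X[p], y[p], alpha[p], w, lam, n, H, rng)
                       for p in parts]
            for p, (da, _) in zip(parts, updates):
                alpha[p] += (beta / K) * da            # dual updates stay local
            w += (beta / K) * sum(dw for _, dw in updates)  # one vector per round
        return w, alpha

Note that communication per outer round is a single d-dimensional vector per machine, independent of how many local SDCA steps H are taken; this trade-off between local computation and communication is the quantity the abstract's speedup claim is measuring.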



Cite this paper

@inproceedings{Jaggi2014CommunicationEfficientDD,
  title     = {Communication-Efficient Distributed Dual Coordinate Ascent},
  author    = {Martin Jaggi and Virginia Smith and Martin Tak{\'a}{\v{c}} and Jonathan Terhorst and Sanjay Krishnan and Thomas Hofmann and Michael I. Jordan},
  booktitle = {NIPS},
  year      = {2014}
}