High-Performance Distributed ML at Scale through Parameter Server Consistency Models

@inproceedings{Dai2015HighPerformanceDM,
  title={High-Performance Distributed ML at Scale through Parameter Server Consistency Models},
  author={Wei Dai and Abhimanu Kumar and Jinliang Wei and Qirong Ho and Garth A. Gibson and Eric P. Xing},
  booktitle={AAAI},
  year={2015}
}
As Machine Learning (ML) applications embrace greater data size and model complexity, practitioners turn to distributed clusters to satisfy the increased computational and memory demands. Effective use of clusters for ML programs requires considerable expertise in writing distributed code, but existing highly-abstracted frameworks like Hadoop that pose low barriers to distributed programming have not, in practice, matched the performance seen in highly…
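The consistency models the title refers to include bounded-staleness schemes such as Stale Synchronous Parallel (SSP), in which a fast worker may run ahead of the slowest worker by at most a fixed number of clocks. The following is a minimal illustrative sketch of that rule, not the paper's implementation; the class and method names (`SSPClock`, `can_advance`, `tick`) are invented for this example.

```python
# Illustrative sketch of a bounded-staleness (SSP-style) clock:
# a worker may begin its next iteration only while its lead over
# the slowest worker stays below the staleness bound s.

class SSPClock:
    def __init__(self, num_workers, staleness):
        self.clocks = [0] * num_workers  # per-worker iteration counters
        self.staleness = staleness       # maximum allowed clock gap s

    def can_advance(self, worker):
        # Advancing is allowed only if this worker's lead over the
        # slowest worker is strictly less than s.
        return self.clocks[worker] - min(self.clocks) < self.staleness

    def tick(self, worker):
        if not self.can_advance(worker):
            return False  # would exceed the staleness bound; must wait
        self.clocks[worker] += 1
        return True

ssp = SSPClock(num_workers=3, staleness=2)
assert ssp.tick(0) and ssp.tick(0)   # worker 0 reaches clock 2
assert not ssp.tick(0)               # blocked: lead over slowest is 2
assert ssp.tick(1)                   # slower workers may still advance
```

Under a bound of s = 0 this degenerates to fully synchronous (BSP) execution; larger s trades parameter freshness for less time spent waiting on stragglers.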
This paper has highly influenced 10 other papers, has 70 citations, and has been referenced on Twitter 11 times.


Citations

Publications citing this paper (showing 5 of 45 extracted citations):

Data Science and Big Data Computing. Springer International Publishing, 2016.

Distributed Frank-Wolfe under Pipelined Stale Synchronous Parallelism. 2015 IEEE International Conference on Big Data (Big Data), 2015.

Latent Space Inference of Internet-Scale Networks. Journal of Machine Learning Research, 2016.

More Effective Synchronization Scheme in ML Using Stale Parameters. 2016 IEEE 18th International Conference on High Performance Computing and Communications; IEEE 14th International Conference on Smart City; IEEE 2nd International Conference on Data Science and Systems (HPCC/SmartCity/DSS), 2016.

Petuum: A New Platform for Distributed Machine Learning on Big Data. IEEE Transactions on Big Data, 2015.

Citations per Year

[Chart removed: citations per year, 2014–2019.] Semantic Scholar estimates that this publication has 70 citations based on the available data.

References

Publications referenced by this paper (showing 2 of 20 references):

Exploiting Bounded Staleness to Speed Up Big Data Analytics. USENIX Annual Technical Conference, 2014.

Solving the Straggler Problem with Bounded Staleness. HotOS, 2013.