Speeding up distributed machine learning using codes

@inproceedings{Lee2016SpeedingUD,
  title={Speeding up distributed machine learning using codes},
  author={Kangwook Lee and Maximilian Lam and Ramtin Pedarsani and Dimitris S. Papailiopoulos and Kannan Ramchandran},
  booktitle={ISIT},
  year={2016}
}
Codes are widely used in many engineering applications to offer some form of reliability and fault tolerance. The high-level idea of coding is to exploit resource redundancy to deliver higher robustness against system noise. In distributed systems, there are several types of “noise” that can affect ML algorithms: straggler nodes, system failures, communication bottlenecks, etc. Moreover, redundancy is abundant: a plethora of nodes, a lot of spare storage, etc. However, there has been little…
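To make the coded-computation idea above concrete, here is a minimal sketch of how an erasure code can mask a straggler in distributed matrix-vector multiplication, in the spirit of this paper. The (3, 2) code, the simulated workers, and the NumPy usage are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

# Illustrative (3, 2) coded matrix-vector multiplication.
# The data matrix A is split into two row blocks A1, A2. Three workers
# compute A1 @ x, A2 @ x, and (A1 + A2) @ x. The results from ANY two
# workers suffice to recover A @ x, so one straggler can be ignored.

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 3))
x = rng.standard_normal(3)

A1, A2 = A[:2], A[2:]
worker_tasks = {
    "w1": A1,        # systematic block 1
    "w2": A2,        # systematic block 2
    "w3": A1 + A2,   # parity block
}

# Pretend worker "w2" is a straggler: only w1 and w3 return in time.
finished = {w: block @ x for w, block in worker_tasks.items() if w != "w2"}

# Decode: recover the missing systematic result from the parity result.
y1 = finished["w1"]
y2 = finished["w3"] - y1          # (A1 + A2) x - A1 x = A2 x
y = np.concatenate([y1, y2])

assert np.allclose(y, A @ x)      # same answer despite the straggler
```

The redundant parity task is the "spare storage/compute" being traded for robustness: the master never has to wait for the slowest worker, at the cost of one extra subtask.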