
SPARQ-SGD: Event-Triggered and Compressed Communication in Decentralized Stochastic Optimization

@article{Singh2019SPARQSGDEA,
  title={SPARQ-SGD: Event-Triggered and Compressed Communication in Decentralized Stochastic Optimization},
  author={Navjot Singh and Deepesh Data and Jemin George and S. Diggavi},
  journal={ArXiv},
  year={2019},
  volume={abs/1910.14280}
}
In this paper, we propose and analyze SPARQ-SGD, an event-triggered and compressed algorithm for decentralized training of large-scale machine learning models. Each node locally evaluates a condition (event) that triggers a communication, in which quantized and sparsified local model parameters are sent. In SPARQ-SGD, each node takes at least a fixed number ($H$) of local gradient steps and then checks whether its model parameters have changed significantly since its last update; it …
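
The triggering rule described in the abstract can be made concrete with a short sketch. Below is a minimal, illustrative Python rendering of one node's round, assuming a top-k sparsifier, a uniform scalar quantizer, and a squared-norm trigger; the function and parameter names (`sparq_sgd_node_step`, `threshold`, `k`) and the exact compression operators are assumptions for illustration, not the paper's precise construction.

```python
import numpy as np

def top_k(v, k):
    """Sparsify: keep the k largest-magnitude entries of v, zero the rest."""
    out = np.zeros_like(v)
    keep = np.argsort(np.abs(v))[-k:]
    out[keep] = v[keep]
    return out

def quantize(v, levels=256):
    """Uniform scalar quantization to a fixed number of levels (illustrative)."""
    scale = np.max(np.abs(v)) + 1e-12           # avoid division by zero
    half = levels // 2
    return np.round(v / scale * half) / half * scale

def sparq_sgd_node_step(x, x_last_sent, grad_fn, lr, H, threshold, k):
    """One round at a single node (hypothetical sketch of the SPARQ-SGD loop).

    Takes at least H local SGD steps, then communicates a compressed update
    only if the model has drifted far enough from the last communicated copy.
    Returns the updated model and the compressed message (None if silent).
    """
    for _ in range(H):                          # at least H local gradient steps
        x = x - lr * grad_fn(x)
    delta = x - x_last_sent
    if np.linalg.norm(delta) ** 2 > threshold:  # event trigger fires
        return x, quantize(top_k(delta, k))     # sparsify, then quantize
    return x, None                              # trigger not met: stay silent

# Toy usage on a noisy quadratic, f(x) = 0.5 * ||x||^2.
rng = np.random.default_rng(0)
grad_fn = lambda z: z + 0.01 * rng.standard_normal(z.shape)
x, msg = sparq_sgd_node_step(
    x=rng.standard_normal(10), x_last_sent=np.zeros(10),
    grad_fn=grad_fn, lr=0.1, H=5, threshold=1e-3, k=3,
)
```

The silent rounds are where the savings come from: when the trigger does not fire, nothing is transmitted and neighbors keep using the last copy they received, so event-triggering compounds with quantization and sparsification to reduce the communicated bits.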
Citations

SPARQ-SGD: Event-Triggered and Compressed Communication in Decentralized Optimization
Decentralized Federated Learning via SGD over Wireless D2D Networks
  • Hong Xing, O. Simeone, S. Bi
  • Computer Science, Engineering
  • 2020 IEEE 21st International Workshop on Signal Processing Advances in Wireless Communications (SPAWC)
  • 2020
Shuffled Model of Federated Learning: Privacy, Accuracy and Communication Trade-Offs
Federated Learning over Wireless Device-to-Device Networks: Algorithms and Convergence Analysis
Randomized Reactive Redundancy for Byzantine Fault-Tolerance in Parallelized Learning
A Decentralized Approach to Bayesian Learning
Communication Efficient Distributed Learning with Censored, Quantized, and Generalized Group ADMM
