Corpus ID: 220633009

Privacy-Preserving Distributed Learning in the Analog Domain

@article{Soleymani2020PrivacyPreservingDL,
  title={Privacy-Preserving Distributed Learning in the Analog Domain},
  author={Mahdi Soleymani and Hessam Mahdavifar and A. Salman Avestimehr},
  journal={ArXiv},
  year={2020},
  volume={abs/2007.08803}
}
We consider the critical problem of distributed learning over data while keeping it private from the computational servers. The state-of-the-art approaches to this problem rely on quantizing the data into a finite field, so that the cryptographic approaches for secure multiparty computing can then be employed. These approaches, however, can result in substantial accuracy losses due to fixed-point representation of the data and computation overflows. To address these critical issues, we propose…
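To make the quantization issue concrete, the sketch below contrasts the two encodings on a single multiplication. It is an illustrative toy, not the paper's scheme: the field size p, the fixed-point scale, and the Gaussian masking noise are all assumed parameters, and the real-valued additive masking is a simplified stand-in for analog secret sharing.

```python
import numpy as np

# Toy parameters (illustrative assumptions, not from the paper).
p = 2**13 - 1        # small prime field; small on purpose so overflow is visible
scale = 2**6         # fixed-point scaling factor

def to_field(x):
    """Quantize a real number into Z_p via fixed-point representation."""
    return int(round(x * scale)) % p

def from_field(v):
    """Decode a field element back to a real (upper half of Z_p = negatives)."""
    if v > p // 2:
        v -= p
    return v / scale

x, y = 5.3, 4.7

# Finite-field route: multiplying two fixed-point values squares the scale,
# and x*y*scale^2 exceeds p/2 here, so the product wraps around mod p.
prod = (to_field(x) * to_field(y)) % p
print(from_field(prod) / scale)  # ~0.91, not 24.91 -- the overflow error
print(x * y)                     # 24.91

# Analog route: additive masking directly over the reals. Each share alone
# is dominated by the noise r; their sum recovers x up to float roundoff,
# with no modular wraparound.
rng = np.random.default_rng(0)
r = float(rng.normal(0.0, 1e6))  # large masking noise (assumed Gaussian)
share1, share2 = x - r, r
print(share1 + share2)           # ~5.3, limited only by floating-point precision
```

Enlarging p avoids this particular wraparound but forces larger fields and costlier arithmetic; working directly in the analog domain is the paper's way of sidestepping that tension.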
Citations

CodedPrivateML: A Fast and Privacy-Preserving Framework for Distributed Machine Learning
List-Decodable Coded Computing: Breaking the Adversarial Toleration Barrier
Analog Lagrange Coded Computing
Coded Computing via Binary Linear Codes: Designs and Performance Limits
Coded Machine Unlearning

References

Showing 1-10 of 48 references
Private and Secure Distributed Matrix Multiplication With Flexible Communication Load
A Cryptographic Treatment of the Wiretap Channel
CodedPrivateML: A Fast and Privacy-Preserving Framework for Distributed Machine Learning
Secure Coded Multi-Party Computation for Massive Matrix Operations
Distributed Multi-User Secret Sharing
Entangled Polynomial Codes for Secure, Private, and Batch Distributed Matrix Multiplication: Breaking the "Cubic" Barrier (Qian Yu and A. Avestimehr, 2020 IEEE International Symposium on Information Theory (ISIT))
Straggler Mitigation in Distributed Matrix Multiplication: Fundamental Limits and Optimal Coding