Stochastic gradient descent with differentially private updates

@inproceedings{Song2013StochasticGD,
  title={Stochastic gradient descent with differentially private updates},
  author={Shuang Song and Kamalika Chaudhuri and Anand D. Sarwate},
  booktitle={2013 IEEE Global Conference on Signal and Information Processing (GlobalSIP)},
  year={2013},
  pages={245--248}
}
Differential privacy is a recent framework for computation on sensitive data, which has shown considerable promise in the regime of large datasets. Stochastic gradient methods are a popular approach for learning in the data-rich regime because they are computationally tractable and scalable. In this paper, we derive differentially private versions of stochastic gradient descent, and test them empirically. Our results show that standard SGD experiences high variability due to differential privacy, but a moderate increase in the batch size can improve performance significantly.
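The mechanism the abstract refers to is simple to state: each SGD step perturbs the (mini-batch) gradient with random noise whose scale is calibrated to the gradient's sensitivity, so that each update is epsilon-differentially private. Below is a minimal NumPy sketch of this idea for L2-regularized logistic regression, assuming each example is scaled so that ||x|| <= 1 (which bounds the per-example gradient norm by 1). The noise density proportional to exp(-(eps/2)||z||) matches the mechanism analyzed in the paper, but the function names, step-size schedule, and hyperparameters are illustrative choices, not the authors' code.

import numpy as np

def sample_gradient_noise(d, eps, rng):
    # Draw z with density proportional to exp(-(eps/2) * ||z||_2):
    # for this spherically symmetric density, the norm follows
    # Gamma(shape=d, scale=2/eps) and the direction is uniform on the sphere.
    norm = rng.gamma(shape=d, scale=2.0 / eps)
    direction = rng.standard_normal(d)
    return norm * direction / np.linalg.norm(direction)

def private_sgd(X, y, eps, lam=1e-3, batch_size=10, n_epochs=5, eta0=1.0, seed=0):
    # Mini-batch differentially private SGD for L2-regularized logistic
    # regression. Assumes rows of X satisfy ||x|| <= 1 and y is in {-1, +1},
    # so each per-example loss gradient has norm at most 1 (the sensitivity
    # bound the noise is calibrated to).
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    t = 0
    for _ in range(n_epochs):
        for start in range(0, n, batch_size):
            Xb = X[start:start + batch_size]
            yb = y[start:start + batch_size]
            b = Xb.shape[0]
            t += 1
            eta = eta0 / np.sqrt(t)  # decaying step size (a common choice)
            # Average of per-example logistic-loss gradients over the batch.
            sigma = 1.0 / (1.0 + np.exp(yb * (Xb @ w)))
            grad = -(Xb * (yb * sigma)[:, None]).mean(axis=0)
            # Noise calibrated to the unit sensitivity bound; averaging over
            # b examples scales the required noise by 1/b.
            noise = sample_gradient_noise(d, eps, rng) / b
            w -= eta * (lam * w + grad + noise)
    return w

The batch-size effect reported in the abstract is visible in this sketch: the injected noise enters the update divided by b, so moderately larger mini-batches directly damp the variance that privacy noise adds to each step.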

Citations

Publications citing this paper.
A selection of the 76 extracted citations:

Differentially private optimization algorithms for deep neural networks

2017 International Conference on Advanced Computer Science and Information Systems (ICACSIS) • 2017

A Survey on Collaborative Deep Learning and Privacy-Preserving

2018 IEEE Third International Conference on Data Science in Cyberspace (DSC) • 2018

ABY3: A Mixed Protocol Framework for Machine Learning

ACM Conference on Computer and Communications Security • 2018

103 Citations

Citations per Year
[Citations-per-year chart, 2014–2019, omitted]
Semantic Scholar estimates that this publication has 103 citations based on the available data.

