Privacy Risk in Machine Learning: Analyzing the Connection to Overfitting

@inproceedings{yeom2018privacy,
  title={Privacy Risk in Machine Learning: Analyzing the Connection to Overfitting},
  author={Samuel Yeom and Irene Giacomelli and Matt Fredrikson and Somesh Jha},
  booktitle={2018 IEEE 31st Computer Security Foundations Symposium (CSF)},
  year={2018}
}
Machine learning algorithms, when applied to sensitive data, pose a distinct threat to privacy. A growing body of prior work demonstrates that models produced by these algorithms may leak specific private information in the training data to an attacker, either through the models' structure or their observable behavior. However, the underlying cause of this privacy risk is not well understood beyond a handful of anecdotal accounts that suggest overfitting and influence might play a role. This…
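The membership inference setting the abstract alludes to can be illustrated with a toy sketch: an attacker who can query a model's loss on a labeled point guesses "training member" when that loss is small, and an overfit model makes this guess accurate. The "model" below is a deliberately extreme overfitter (it memorizes its training labels); the data, threshold, and helper names are all hypothetical illustrations, not the paper's actual attack or experiments.

```python
import random

random.seed(0)

# Toy "model": memorizes training labels exactly -- an extreme
# form of overfitting that makes the membership gap visible.
def train(data):
    return dict(data)  # feature -> label lookup table

def loss(model, x, y):
    # 0/1 loss, with a default guess of label 0 for unseen inputs
    return 0.0 if model.get(x, 0) == y else 1.0

# Labeled points: label is the parity of x, flipped with 20% noise.
def sample(n):
    pts = []
    for _ in range(n):
        x = random.randrange(1000)
        y = x % 2 if random.random() > 0.2 else 1 - (x % 2)
        pts.append((x, y))
    return pts

train_set = sample(200)
test_set = sample(200)
model = train(train_set)

# Membership inference: guess "member" when the loss falls below
# a threshold (here 0.5, i.e. the model classifies the point correctly).
def infer_member(model, x, y, threshold=0.5):
    return loss(model, x, y) < threshold

train_rate = sum(infer_member(model, x, y) for x, y in train_set) / len(train_set)
test_rate = sum(infer_member(model, x, y) for x, y in test_set) / len(test_set)
print(f'"member" guesses on training points: {train_rate:.2f}')
print(f'"member" guesses on held-out points: {test_rate:.2f}')
```

Because the memorizing model has near-zero loss on its own training points but roughly chance-level loss elsewhere, the attacker's guess rate is much higher on members than non-members; shrinking the generalization gap shrinks this advantage, which is the connection the paper's title refers to.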


Publications referenced by this paper (showing 1-10 of 54 references)

The MNIST Database of Handwritten Digit Images for Machine Learning Research [Best of the Web]. IEEE Signal Processing Magazine, 2012.

Membership Inference Attacks Against Machine Learning Models. 2017 IEEE Symposium on Security and Privacy (SP), 2017.

Membership privacy: a unifying framework for privacy definitions. ACM Conference on Computer and Communications Security, 2013.

F. Chollet. Keras: Deep learning library for Theano and TensorFlow. 2017.

Machine Learning Models that Remember Too Much. ACM Conference on Computer and Communications Security, 2017.

S. Yeom, I. Giacomelli, M. Fredrikson, S. Jha. Privacy risk in machine learning: Analyzing the connection to overfitting. CoRR, abs/1709.01604, 2017.

A Methodology for Formalizing Model-Inversion Attacks. 2016 IEEE 29th Computer Security Foundations Symposium (CSF), 2016.
