Corpus ID: 215768661

Unifying Privacy Loss Composition for Data Analytics

@article{Cesar2020UnifyingPL,
  title={Unifying Privacy Loss Composition for Data Analytics},
  author={Mark Cesar and Ryan M. Rogers},
  journal={ArXiv},
  year={2020},
  volume={abs/2004.07223}
}
Differential privacy (DP) provides rigorous privacy guarantees for individuals' data while still allowing accurate statistics to be computed over the sensitive dataset as a whole. To design a private system, one must first design private algorithms that can quantify the privacy loss of each released outcome. However, private algorithms that inject noise into the computation are not by themselves sufficient to ensure individuals' data is protected, because many noisy results ultimately concentrating to… 
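The composition question the abstract raises can be made concrete with a small sketch (an illustrative example, not code from the paper): under basic composition, k sequential ε-DP releases cost k·ε in total, while the standard advanced composition theorem gives a bound that grows only like √k.

```python
import math

def basic_composition(epsilon, k):
    """k sequential eps-DP releases are (k * eps)-DP (basic composition)."""
    return k * epsilon

def advanced_composition(epsilon, k, delta_prime):
    """Standard advanced composition bound: k eps-DP releases are
    (eps_total, delta_prime)-DP for the eps_total returned below."""
    return (math.sqrt(2 * k * math.log(1 / delta_prime)) * epsilon
            + k * epsilon * (math.exp(epsilon) - 1))

# For many small releases, the sqrt(k)-style bound beats the linear one.
k, eps = 100, 0.1
basic = basic_composition(eps, k)              # grows linearly in k
advanced = advanced_composition(eps, k, 1e-6)  # grows roughly like sqrt(k)
```

The paper's subject — unifying such composition analyses — is precisely about choosing the tightest total privacy loss across interleaved mechanisms; this sketch shows only the two textbook baselines.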
1 Citation
Iterative Methods for Private Synthetic Data: Unifying Framework and New Methods
TLDR
Generative networks with the exponential mechanism (GEM) circumvent computational bottlenecks in algorithms such as MWEM and PEP by optimizing over generative models parameterized by neural networks, which capture a rich family of distributions while enabling fast gradient-based optimization.

References

SHOWING 1-10 OF 21 REFERENCES
The Composition Theorem for Differential Privacy
TLDR
This paper proves an upper bound on the overall privacy level and constructs a sequence of privatization mechanisms that achieves this bound, by introducing an operational interpretation of differential privacy and using a data processing inequality.
Calibrating Noise to Sensitivity in Private Data Analysis
TLDR
The study is extended to general functions f, proving that privacy can be preserved by calibrating the standard deviation of the noise according to the sensitivity of the function f, which is the amount by which any single argument to f can change its output.
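The calibration idea can be sketched in a few lines (an illustrative implementation, not code from the paper): for a function with sensitivity Δ, adding Laplace noise with scale Δ/ε yields ε-differential privacy.

```python
import random

def laplace_mechanism(true_value, sensitivity, epsilon):
    """Release true_value plus Laplace(0, sensitivity/epsilon) noise;
    this satisfies epsilon-differential privacy."""
    scale = sensitivity / epsilon
    # Laplace(0, scale) = random sign times an Exponential with mean scale.
    noise = random.choice((-1.0, 1.0)) * random.expovariate(1.0 / scale)
    return true_value + noise

# A counting query changes by at most 1 when one record changes, so its
# sensitivity is 1 and the noise scale is simply 1 / epsilon.
noisy_count = laplace_mechanism(42.0, sensitivity=1.0, epsilon=0.5)
```

Smaller ε means a larger noise scale and stronger privacy; the sensitivity is a property of the query, not the data.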
Mechanism Design via Differential Privacy
TLDR
It is shown that the recent notion of differential privacy, in addition to its own intrinsic virtue, can ensure that participants have limited effect on the outcome of the mechanism, and as a consequence have limited incentive to lie.
Privacy Odometers and Filters: Pay-as-you-Go Composition
TLDR
This paper initiates the study of adaptive composition in differential privacy when the length of the composition and the privacy parameters themselves can be chosen adaptively, as a function of the outcomes of previously run analyses.
Rényi Differential Privacy
  • Ilya Mironov
  • Computer Science
    2017 IEEE 30th Computer Security Foundations Symposium (CSF)
  • 2017
TLDR
This work argues that the Rényi divergence, a useful analytical tool, can itself serve as a privacy definition, compactly and accurately representing guarantees on the tails of the privacy loss, and demonstrates that the new definition shares many important properties with the standard definition of differential privacy.
Our Data, Ourselves: Privacy Via Distributed Noise Generation
TLDR
This work provides efficient distributed protocols for generating shares of random noise, secure against malicious participants, and introduces a technique for distributing shares of many unbiased coins with fewer executions of verifiable secret sharing than would be needed using previous approaches.
Concentrated Differential Privacy: Simplifications, Extensions, and Lower Bounds
TLDR
This work presents an alternative formulation of the concept of concentrated differential privacy in terms of the Rényi divergence between the distributions obtained by running an algorithm on neighboring inputs, which proves sharper quantitative results, establishes lower bounds, and raises a few new questions.
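The Rényi-divergence formulation has a simple closed form in the Gaussian case (an illustrative sketch using standard definitions, not code from the paper): for two equal-variance Gaussians, D_α grows linearly in α, which is exactly the ρ-zCDP shape with ρ = 1/(2σ²) for a sensitivity-1 Gaussian mechanism.

```python
def renyi_divergence_gaussians(alpha, mu1, mu2, sigma):
    """Closed-form Renyi divergence of order alpha between N(mu1, sigma^2)
    and N(mu2, sigma^2): alpha * (mu1 - mu2)^2 / (2 * sigma^2)."""
    return alpha * (mu1 - mu2) ** 2 / (2 * sigma ** 2)

# A Gaussian mechanism with sensitivity 1 and noise std sigma satisfies
# rho-zCDP with rho = 1 / (2 sigma^2): D_alpha = rho * alpha for all alpha.
sigma = 2.0
rho = 1.0 / (2.0 * sigma ** 2)
```

The linear-in-α divergence is what makes these guarantees compose by simple addition across mechanisms.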
Practical Differentially Private Top-k Selection with Pay-what-you-get Composition
TLDR
This work designs algorithms that ensure (approximate) $(\epsilon, \delta>0)$-differential privacy and need access only to the true top-$\bar{k}$ elements from the data, for any chosen $\bar{k} \geq k$.
Discovering frequent patterns in sensitive data
TLDR
This paper shows how one can accurately discover and release the most significant patterns along with their frequencies in a data set containing sensitive information, while providing rigorous guarantees of privacy for the individuals whose information is stored there.
Improving the Gaussian Mechanism for Differential Privacy: Analytical Calibration and Optimal Denoising
TLDR
An optimal Gaussian mechanism is developed whose variance is calibrated directly using the Gaussian cumulative distribution function instead of a tail-bound approximation, and which is equipped with a post-processing step based on adaptive estimation techniques, leveraging the fact that the distribution of the perturbation is known.
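For contrast, the classical tail-bound calibration that the analytic mechanism improves on can be sketched as follows (the standard textbook formula, not the paper's CDF-based method): σ = Δ·√(2 ln(1.25/δ))/ε, valid for ε < 1.

```python
import math
import random

def classical_gaussian_sigma(sensitivity, epsilon, delta):
    """Classical tail-bound calibration for (epsilon, delta)-DP, valid for
    epsilon < 1; the analytic Gaussian mechanism replaces this approximation
    with an exact computation based on the Gaussian CDF."""
    return sensitivity * math.sqrt(2.0 * math.log(1.25 / delta)) / epsilon

def gaussian_mechanism(true_value, sensitivity, epsilon, delta):
    """Release true_value with Gaussian noise at the classical scale."""
    sigma = classical_gaussian_sigma(sensitivity, epsilon, delta)
    return true_value + random.gauss(0.0, sigma)

sigma = classical_gaussian_sigma(sensitivity=1.0, epsilon=0.5, delta=1e-5)
```

Because the tail bound is loose, this σ is larger than necessary; the paper's analytic calibration yields a strictly smaller variance for the same (ε, δ) guarantee.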