Corpus ID: 238743806

Offset-Symmetric Gaussians for Differential Privacy

@article{Sadeghi2021OffsetSymmetricGF,
  title={Offset-Symmetric Gaussians for Differential Privacy},
  author={Parastoo Sadeghi and Mehdi Korki},
  journal={ArXiv},
  year={2021},
  volume={abs/2110.06412}
}
The Gaussian distribution is widely used in mechanism design for differential privacy (DP). Thanks to its sub-Gaussian tail, it significantly reduces the chance of outliers when responding to queries. However, it can only provide approximate (ε, δ)-DP. In practice, δ must be much smaller than the inverse of the dataset size, which may limit the use of the Gaussian mechanism for large datasets with strong privacy requirements. In this paper, we introduce and analyze a new distribution for use in DP… 
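For context on the mechanism the abstract discusses, here is a minimal sketch of the standard Gaussian mechanism with the classical tail-bound calibration. This is background only, not the paper's offset-symmetric construction; the function names are illustrative.

```python
import numpy as np

def classical_sigma(sensitivity, epsilon, delta):
    """Classical tail-bound calibration (valid for 0 < epsilon < 1):
    sigma = sqrt(2 * ln(1.25 / delta)) * sensitivity / epsilon."""
    return np.sqrt(2.0 * np.log(1.25 / delta)) * sensitivity / epsilon

def gaussian_mechanism(true_answer, sensitivity, epsilon, delta, rng=None):
    """Release true_answer + N(0, sigma^2) noise for (epsilon, delta)-DP."""
    rng = rng or np.random.default_rng()
    sigma = classical_sigma(sensitivity, epsilon, delta)
    return true_answer + rng.normal(0.0, sigma)
```

Note the role δ plays here: it enters only through ln(1.25/δ), so demanding a very small δ (as the abstract says is necessary in practice) inflates σ and hence the noise.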


References

Showing 1-10 of 29 references
Improving the Gaussian Mechanism for Differential Privacy: Analytical Calibration and Optimal Denoising
TLDR: An optimal Gaussian mechanism is developed whose variance is calibrated directly using the Gaussian cumulative distribution function instead of a tail-bound approximation, and which is equipped with a post-processing step based on adaptive estimation techniques, leveraging that the distribution of the perturbation is known.
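The analytical-calibration idea from this reference can be sketched numerically: evaluate the exact δ achieved by a given σ via the Gaussian CDF, then bisect for the smallest sufficient σ. The formula below is my transcription of the standard Gaussian-CDF characterization; the helper names are illustrative, not the paper's.

```python
import math
from statistics import NormalDist

_PHI = NormalDist().cdf  # standard normal CDF

def analytic_delta(sigma, sensitivity, epsilon):
    # Exact delta achieved by N(0, sigma^2) noise for a query with the given
    # L2 sensitivity: Phi(a - b) - e^eps * Phi(-a - b),
    # with a = sens/(2*sigma) and b = eps*sigma/sens.
    a = sensitivity / (2.0 * sigma)
    b = epsilon * sigma / sensitivity
    return _PHI(a - b) - math.exp(epsilon) * _PHI(-a - b)

def calibrate_sigma(sensitivity, epsilon, delta, lo=1e-6, hi=1e6, iters=200):
    # Bisection on sigma: analytic_delta decreases as sigma grows.
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if analytic_delta(mid, sensitivity, epsilon) > delta:
            lo = mid
        else:
            hi = mid
    return hi
```

Because the exact δ is used instead of a tail bound, the resulting σ is smaller than the classical sqrt(2 ln(1.25/δ))·Δ/ε calibration for the same (ε, δ).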
Generalized Gaussian Mechanism for Differential Privacy
  • F. Liu
  • Computer Science, Mathematics
    IEEE Transactions on Knowledge and Data Engineering
  • 2019
TLDR: This paper generalizes the widely used Laplace mechanism to the family of generalized Gaussian (GG) mechanisms, and presents a lower bound on the scale parameter of the Gaussian mechanism for (ε, δ)-probabilistic DP.
Preserving differential privacy under finite-precision semantics
TLDR: It is shown that in general there are violations of the differential privacy property under finite-precision semantics, and conditions are given under which a limited (but arguably acceptable) variant of the property can still be guaranteed, with only a minor degradation of the privacy level.
Calibrating Noise to Sensitivity in Private Data Analysis
TLDR: The study is extended to general functions f, proving that privacy can be preserved by calibrating the standard deviation of the noise to the sensitivity of f, i.e., the maximum amount by which a change in any single argument of f can change its output.
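The calibration principle this reference introduces is the classical Laplace mechanism: noise scale proportional to sensitivity over ε. A minimal sketch (function name illustrative):

```python
import numpy as np

def laplace_mechanism(true_answer, sensitivity, epsilon, rng=None):
    """Pure epsilon-DP release: add Laplace(0, b) noise with
    scale b = sensitivity / epsilon."""
    rng = rng or np.random.default_rng()
    return true_answer + rng.laplace(0.0, sensitivity / epsilon)
```

Unlike the Gaussian mechanism, this gives pure ε-DP with no δ, but its heavier exponential tails make large-outlier responses more likely, which is the trade-off the abstract above is concerned with.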
Gaussian Differential Privacy
TLDR: A new relaxation of privacy is proposed that has a number of appealing properties and, in particular, avoids difficulties associated with divergence-based relaxations; it is introduced as "Gaussian differential privacy" (GDP), defined based on testing two shifted Gaussians.
Universally utility-maximizing privacy mechanisms
TLDR: Every potential user u, no matter what its side information and preferences, derives as much utility from M* as from interacting with a differentially private mechanism Mu that is optimally tailored to u.
Smooth sensitivity and sampling in private data analysis
TLDR: This is the first formal analysis of the effect of instance-based noise in the context of data privacy; it shows how to compute such noise efficiently for several functions, including the median and the cost of the minimum spanning tree.
cpSGD: Communication-efficient and differentially-private distributed SGD
TLDR: This work extends and improves previous analyses of the binomial mechanism, showing that it achieves nearly the same utility as the Gaussian mechanism while requiring fewer representation bits, which may be of independent interest.
Our Data, Ourselves: Privacy Via Distributed Noise Generation
TLDR: This work provides efficient distributed protocols for generating shares of random noise, secure against malicious participants, and introduces a technique for distributing shares of many unbiased coins with fewer executions of verifiable secret sharing than previous approaches require.
A Better Bound Gives a Hundred Rounds: Enhanced Privacy Guarantees via f-Divergences
We derive the optimal differential privacy (DP) parameters of a mechanism that satisfies a given level of Rényi differential privacy (RDP). Our result is based on the joint range of two…