The Laplace Mechanism has optimal utility for differential privacy over continuous queries

@inproceedings{Fernandes2021TheLM,
  title={The Laplace Mechanism has optimal utility for differential privacy over continuous queries},
  author={Natasha Fernandes and Annabelle McIver and Carroll Morgan},
  booktitle={2021 36th Annual ACM/IEEE Symposium on Logic in Computer Science (LICS)},
  year={2021},
  pages={1--12}
}
Differential Privacy protects individuals' data when statistical queries are published from aggregated databases: applying "obfuscating" mechanisms to the query results makes the released information less specific but, unavoidably, also decreases its utility. Yet it has been shown that for discrete data (e.g. counting queries), given a mandated degree of privacy and a reasonable interpretation of loss of utility, the Geometric obfuscating mechanism is optimal: it loses as little utility as possible…
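As a minimal sketch of the two mechanisms the abstract contrasts (this is the textbook construction, not the paper's own analysis; the function names are illustrative):

```python
import numpy as np

def laplace_mechanism(true_value, sensitivity, epsilon, rng=None):
    """Continuous case: add Laplace noise with scale sensitivity/epsilon."""
    rng = np.random.default_rng() if rng is None else rng
    return true_value + rng.laplace(loc=0.0, scale=sensitivity / epsilon)

def geometric_mechanism(true_count, epsilon, rng=None):
    """Discrete case: add two-sided geometric noise to an integer count."""
    rng = np.random.default_rng() if rng is None else rng
    alpha = np.exp(-epsilon)
    # The difference of two i.i.d. geometric variables is two-sided
    # geometric, so P(noise = k) is proportional to alpha**abs(k).
    noise = rng.geometric(1 - alpha) - rng.geometric(1 - alpha)
    return true_count + noise
```

Both releases degrade gracefully: larger epsilon (weaker privacy) concentrates the noise near zero, which is the privacy/utility trade-off the abstract describes.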


Three-way optimization of privacy and utility of location data
TLDR
This work proposes a method to produce a geo-indistinguishable location-privacy mechanism that simultaneously optimizes the level of privacy attained, the QoS, and the statistical utility of the obfuscated data.
Group privacy for personalized federated learning
TLDR
A method to provide group privacy guarantees, exploiting key properties of 𝑑-privacy, which enables personalized models under the framework of FL, addressing the need to protect clients' privacy while allowing personalized model training to enhance the fairness and utility of the system.
A privacy preserving querying mechanism with high utility for electric vehicles
TLDR
This paper proposes a novel method to protect the privacy of EVs, both for individual query locations and against the threat of their journey trajectories being traced by adversarial third parties, by applying AGeoI, which enables users to substantially preserve their quality of service (QoS).
Universal Optimality and Robust Utility Bounds for Metric Differential Privacy
TLDR
It is suggested that universally optimal mechanisms are indeed rare within privacy types, and a weaker universal benchmark of utility called privacy-type capacities is proposed that can be computed using a convex optimisation algorithm.
How to Develop an Intuition for Risk... and Other Invisible Phenomena (Invited Talk)
TLDR
A faithful geometrical setting for the channel model of quantitative information flow (QIF) is described and it is demonstrated how it can facilitate “proofs without words” for problems in the QIF setting.
Offset-Symmetric Gaussians for Differential Privacy
  • P. Sadeghi, M. Korki
  • Computer Science
    IEEE Transactions on Information Forensics and Security
  • 2022
TLDR
A new distribution for use in DP that is based on the Gaussian distribution but has improved privacy performance, and a method for post-processing the output of the OSGT mechanism to approximate the query based on the minimum mean square error (MMSE) estimation technique.

References

SHOWING 1-10 OF 31 REFERENCES
Broadening the Scope of Differential Privacy Using Metrics
TLDR
Differential Privacy is one of the most prominent frameworks used to deal with disclosure prevention in statistical databases, ensuring that sensitive information about individuals cannot easily be inferred from the disclosed answers to aggregate queries.
Universally utility-maximising privacy mechanisms
  • SIAM J. Comput., vol. 41, no. 6, pp. 1673–1693, 2012.
The Science of Quantitative Information Flow
Near Instance-Optimality in Differential Privacy
TLDR
Two notions of instance optimality in differential privacy are developed, one by defining a local minimax risk and the other by considering unbiased mechanisms and analogizing the Cramér–Rao bound; it is shown that the local modulus of continuity of the estimand of interest completely determines these quantities.
Comparing Systems: Max-Case Refinement Orders and Application to Differential Privacy
TLDR
A variety of refinement orders, inspired by those of QIF, are explored, providing precise guarantees for max-case leakage; it is shown that, while refinement often holds for mechanisms of the same "family" (geometric, randomised response, etc.), it rarely holds across different families.
First and Second Laws of Error
Journal of the American Statistical Association, vol. 18, no. 143, pp. 841–851, 1923.
Mass transportation problems
Modifications of the Monge-Kantorovich Problems: Transportation Problems with Relaxed or Additional Constraints.- Application of Kantorovich-Type Metrics to Various Probabilistic-Type Limit…
On the information leakage of differentially-private mechanisms
Differential privacy aims at protecting the privacy of participants in statistical databases. Roughly, a mechanism satisfies differential privacy if the presence or value of a single individual in…
A Mathematical Theory of Communication
TLDR
It is proved that positive data rates can be achieved with arbitrarily small error probability, and that there is an upper bound on the data rate beyond which no encoding scheme can achieve a small enough error probability.
Differential Privacy Preserving Spectral Graph Analysis
TLDR
It is proved that the sampling procedure achieves differential privacy, and two approaches are given for computing the ε-differentially private eigendecomposition of the graph's adjacency matrix under the same differential privacy threshold.
...