Composition of Differential Privacy & Privacy Amplification by Subsampling

  • Author: Thomas Steinke
  • Published: 2 October 2022
  • Venue: ArXiv (Computer Science)
This chapter is meant to be part of the book “Differential Privacy for Artificial Intelligence Applications.” We give an introduction to the most important property of differential privacy – composition: running multiple independent analyses on the data of a set of people will still be differentially private as long as each of the analyses is private on its own – as well as the related topic of privacy amplification by subsampling. This chapter introduces the basic concepts and gives proofs of… 
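The two notions can be illustrated with a minimal numerical sketch. The formulas below are the standard basic-composition and Poisson-subsampling bounds for pure ε-DP, not results specific to this chapter:

```python
import math

def basic_composition(epsilons):
    """Basic composition: running several independent analyses, where
    analysis i is eps_i-DP on its own, is (sum of eps_i)-DP overall."""
    return sum(epsilons)

def subsampled_epsilon(eps, q):
    """Privacy amplification by subsampling: running an eps-DP mechanism
    on a Poisson-subsampled q-fraction of the data satisfies
    log(1 + q * (e^eps - 1))-DP, which is roughly q * eps for small eps."""
    return math.log(1.0 + q * (math.exp(eps) - 1.0))

total = basic_composition([0.5, 0.5, 1.0])   # total budget eps = 2.0
amplified = subsampled_epsilon(1.0, 0.01)    # about 0.017, far below 1.0
```

Composition lets a privacy budget be split across many analyses, while subsampling shows that touching only a small random fraction of the data sharply reduces the per-analysis cost.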


Improved Differential Privacy for SGD via Optimal Private Linear Operators on Adaptive Streams

This work instantiates this framework with concrete matrices that arise naturally in machine learning and trains user-level differentially private models with the resulting optimal mechanisms, yielding significant improvements on a notable problem in federated learning with user-level differential privacy.

Practical Differentially Private Hyperparameter Tuning with Subsampling

This work focuses on lowering both the DP bounds and the computational complexity of these methods by using only a random subset of the sensitive data for the hyperparameter tuning and by extrapolating the optimal values from the small dataset to a larger dataset.

Lemmas of Differential Privacy

We aim to collect buried lemmas that are useful for proofs. In particular, we try to provide self-contained proofs for those lemmas and categorise them according to their usage.

Amplification by Shuffling: From Local to Central Differential Privacy via Anonymity

It is shown, via a new and general privacy amplification technique, that any permutation-invariant algorithm satisfying ε-local differential privacy will satisfy [MATH HERE]-central differential privacy.

Privacy Amplification by Subsampling: Tight Analyses via Couplings and Divergences

This paper presents a general method that recovers and improves prior analyses, yields lower bounds, and derives new instances of privacy amplification by subsampling. The method leverages a characterization of differential privacy as a divergence, which emerged in the program verification community.

Individual Privacy Accounting via a Renyi Filter

This work gives a method for tighter privacy loss accounting based on a personalized privacy loss estimate for each individual in each analysis, building on a new composition theorem for Rényi differential privacy that allows adaptively chosen privacy parameters.

Gaussian differential privacy

It is proved that the privacy guarantees of any hypothesis-testing-based definition of privacy (including the original definition of differential privacy) converge to GDP in the limit under composition, and a Berry–Esseen-style version of the central limit theorem is established, giving a computationally inexpensive tool for tractably analysing the exact composition of private algorithms.

Stronger Privacy Amplification by Shuffling for Rényi and Approximate Differential Privacy

The shuffle model of differential privacy has gained significant interest as an intermediate trust model between the standard local and central models; this work proves stronger privacy amplification by shuffling, leading to tighter numerical bounds in all parameter settings.

Optimal Accounting of Differential Privacy via Characteristic Function

This work proposes a unification of recent advances in Rényi DP, privacy profiles, f-DP, and the PLD formalism via the characteristic function of a certain dominating privacy loss random variable, and proposes an analytical Fourier accountant that represents the complex logarithm of φ-functions symbolically and uses Gaussian quadrature for numerical computation.

Privacy Odometers and Filters: Pay-as-you-Go Composition

This work initiates the study of adaptive composition in differential privacy, where the length of the composition and the privacy parameters themselves can be chosen adaptively, as a function of the outcomes of previously run analyses.

Concurrent Composition of Differential Privacy

It is proved that when the interactive mechanisms being composed are purely differentially private, their concurrent composition achieves privacy parameters (with respect to pure or approximate differential privacy) that match the optimal composition theorem for noninteractive differential privacy.

Calibrating Noise to Sensitivity in Private Data Analysis

The study is extended to general functions f, proving that privacy can be preserved by calibrating the standard deviation of the noise according to the sensitivity of the function f, which is the amount that any single argument to f can change its output.
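As a concrete illustration of this calibration (a minimal sketch of the standard Laplace mechanism, not code from the cited paper), noise with scale sensitivity/ε is added to the function's output:

```python
import math
import random

def laplace_mechanism(f_value, sensitivity, epsilon, rng=None):
    """Release f_value with Laplace noise of scale sensitivity/epsilon.
    This satisfies epsilon-DP when `sensitivity` bounds how much any
    single individual's data can change f's output."""
    rng = rng or random.Random()
    scale = sensitivity / epsilon
    # Sample Laplace(0, scale) by inverse-CDF from a uniform on (-1/2, 1/2).
    u = rng.random() - 0.5
    noise = -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))
    return f_value + noise
```

For example, a counting query changes by at most 1 when one person's record changes, so sensitivity 1 and ε = 1 give Laplace noise with scale 1.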

Poisson Subsampled Rényi Differential Privacy

This work addresses the problem of "privacy amplification by subsampling" under the Rényi Differential Privacy (RDP) framework and proves an exact analytical formula for the case when the base mechanism M is the Gaussian mechanism or the Laplace mechanism.