Concurrent Composition of Differential Privacy

@article{Vadhan2021ConcurrentCO,
  title={Concurrent Composition of Differential Privacy},
  author={Salil P. Vadhan and Tianhao Wang},
  journal={IACR Cryptol. ePrint Arch.},
  year={2021}
}
We initiate a study of the composition properties of interactive differentially private mechanisms. An interactive differentially private mechanism is an algorithm that allows an analyst to adaptively ask queries about a sensitive dataset, with the property that an adversarial analyst's view of the interaction is approximately the same regardless of whether or not any individual's data is in the dataset. Previous studies of composition of differential privacy have focused on non-interactive…
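An interactive mechanism of the kind the abstract describes can be sketched as a budget-tracked noisy query interface. This is an illustrative toy, not the paper's construction: the class name, budget-splitting rule, and use of the Laplace mechanism are all assumptions made for the example.

```python
import random

class InteractiveLaplaceMechanism:
    """Illustrative sketch of an interactive DP mechanism: an analyst
    adaptively asks counting queries, each answer is perturbed with
    Laplace noise, and a fixed privacy budget limits the interaction."""

    def __init__(self, dataset, epsilon_total, num_queries):
        self.dataset = list(dataset)
        # Basic sequential composition: split the total budget evenly.
        self.eps_per_query = epsilon_total / num_queries
        self.remaining = num_queries

    def query(self, predicate):
        if self.remaining == 0:
            raise RuntimeError("privacy budget exhausted")
        self.remaining -= 1
        true_count = sum(1 for row in self.dataset if predicate(row))
        # Counting queries have sensitivity 1, so Laplace(1/eps) noise suffices.
        scale = 1.0 / self.eps_per_query
        # A Laplace sample is the difference of two exponential samples.
        noise = random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)
        return true_count + noise
```

The analyst may choose each `predicate` after seeing earlier answers; the composition question the paper studies is what privacy guarantee holds when several such interactive mechanisms run concurrently against the same dataset.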


References

Showing 1-10 of 26 references
The Composition Theorem for Differential Privacy
This paper proves an upper bound on the overall privacy level and constructs a sequence of privatization mechanisms that achieves this bound, by introducing an operational interpretation of differential privacy and using a data processing inequality.
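The flavor of bound this reference optimizes can be illustrated by comparing basic composition with the earlier "advanced composition" bound of Dwork, Rothblum, and Vadhan (the optimal bound proved in this reference is tighter still). A minimal numerical sketch, with illustrative parameter values:

```python
import math

def basic_composition(eps, k):
    # k adaptively chosen eps-DP mechanisms compose to (k * eps)-DP.
    return k * eps

def advanced_composition(eps, k, delta_prime):
    # Advanced composition: k eps-DP mechanisms are (eps', delta')-DP for
    # eps' = eps * sqrt(2k ln(1/delta')) + k * eps * (e^eps - 1).
    return (eps * math.sqrt(2 * k * math.log(1 / delta_prime))
            + k * eps * (math.exp(eps) - 1))

eps, k, delta = 0.1, 100, 1e-5
print(basic_composition(eps, k))             # grows linearly in k
print(advanced_composition(eps, k, delta))   # roughly sqrt(k) growth
```

With these values, advanced composition gives roughly epsilon 5.85 versus 10 from basic composition, at the cost of a small delta.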
Interactive privacy via the median mechanism
The median mechanism is the first privacy mechanism capable of identifying and exploiting correlations among queries in an interactive setting; an efficient implementation is given, with running time polynomial in the number of queries, the database size, and the domain size.
A Multiplicative Weights Mechanism for Privacy-Preserving Data Analysis
A new differentially private multiplicative weights mechanism for answering a large number of interactive counting (or linear) queries that arrive online and may be adaptively chosen. When the input database is drawn from a smooth distribution, i.e., one that does not place too much weight on any single data item, accuracy is preserved and the running time becomes poly-logarithmic in the size of the data universe.
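The core step of such a multiplicative weights mechanism can be sketched as follows. This is a simplified, hypothetical rendering (function names and the learning rate are assumptions), showing only the reweighting of a synthetic distribution toward a noisy answer, not the full private mechanism:

```python
import math

def synthetic_answer(weights, query):
    # Expected value of a 0/1 query under the synthetic distribution.
    return sum(w * query(x) for x, w in weights.items())

def mw_update(weights, query, noisy_answer, estimate, eta=0.5):
    # One multiplicative-weights update: reweight each universe item x
    # by exp(eta * q(x)) in the direction that moves the synthetic
    # answer toward the noisy answer, then renormalize.
    sign = 1.0 if noisy_answer > estimate else -1.0
    updated = {x: w * math.exp(eta * sign * query(x))
               for x, w in weights.items()}
    total = sum(updated.values())
    return {x: w / total for x, w in updated.items()}
```

In the actual mechanism the `noisy_answer` comes from a differentially private measurement, and updates are performed only when the synthetic estimate deviates significantly, which is what keeps the privacy cost low.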
Distributed Private Data Analysis: On Simultaneously Solving How and What
Examines the combination of two directions in privacy-preserving computation over distributed private inputs, secure function evaluation (SFE) and differential privacy, yielding new separations between the local and global models of computation for private data analysis.
The Complexity of Computing the Optimal Composition of Differential Privacy
Shows that computing the optimal composition in general is #P-complete; since computing it exactly is infeasible unless FP = #P, this work gives an approximation algorithm that computes the composition to arbitrary accuracy in polynomial time.
The Algorithmic Foundations of Differential Privacy
The preponderance of this monograph is devoted to fundamental techniques for achieving differential privacy and to applying these techniques in creative combinations, using the query-release problem as a running example.
Gaussian Differential Privacy
Proposes a new relaxation of privacy, `Gaussian differential privacy' (GDP), defined in terms of hypothesis testing between two shifted Gaussians; it has a number of appealing properties and, in particular, avoids difficulties associated with divergence-based relaxations.
Rényi Differential Privacy
Ilya Mironov · 2017 IEEE 30th Computer Security Foundations Symposium (CSF), 2017
This work argues that Rényi divergence, a useful analytical tool, can itself be used as a privacy definition, compactly and accurately representing guarantees on the tails of the privacy loss, and demonstrates that the new definition shares many important properties with the standard definition of differential privacy.
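For intuition, the Rényi DP curve of the Gaussian mechanism and the standard conversion from RDP back to (epsilon, delta)-DP can be sketched as below. Parameter values and function names are illustrative; the formulas are the standard ones from this line of work:

```python
import math

def gaussian_rdp(alpha, sensitivity, sigma):
    # Renyi DP of the Gaussian mechanism at order alpha:
    # eps(alpha) = alpha * sensitivity^2 / (2 * sigma^2).
    return alpha * sensitivity ** 2 / (2 * sigma ** 2)

def rdp_to_dp(alpha, rdp_eps, delta):
    # (alpha, eps)-RDP implies (eps + log(1/delta) / (alpha - 1), delta)-DP.
    return rdp_eps + math.log(1 / delta) / (alpha - 1)

def best_eps(sensitivity, sigma, delta, alphas=range(2, 64)):
    # RDP composes additively in eps; for a single release, optimize
    # the conversion over the order alpha.
    return min(rdp_to_dp(a, gaussian_rdp(a, sensitivity, sigma), delta)
               for a in alphas)
```

The linear-in-alpha RDP curve is what makes composition bookkeeping simple: summing curves across mechanisms and converting once at the end typically beats converting each mechanism to (epsilon, delta)-DP first.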
On the complexity of differentially private data release: efficient algorithms and hardness results
Considers private data analysis in the setting in which a trusted and trustworthy curator releases to the public a "sanitization" of the data set that simultaneously protects the privacy of the individual contributors of data and offers utility to the data analyst.
Our Data, Ourselves: Privacy Via Distributed Noise Generation
Provides efficient distributed protocols for generating shares of random noise, secure against malicious participants, and introduces a technique for distributing shares of many unbiased coins with fewer executions of verifiable secret sharing than previous approaches would need.