The Algorithmic Foundations of Differential Privacy

@article{Dwork2014TheAF,
  title={The Algorithmic Foundations of Differential Privacy},
  author={Cynthia Dwork and Aaron Roth},
  journal={Found. Trends Theor. Comput. Sci.},
  year={2014},
  volume={9},
  pages={211-407}
}
  • Cynthia Dwork, Aaron Roth
  • Published 11 August 2014
  • Computer Science
  • Found. Trends Theor. Comput. Sci.
The problem of privacy-preserving data analysis has a long history spanning multiple disciplines. [...] Key result: despite some astonishingly powerful computational results, there are still fundamental limitations, not just on what can be achieved with differential privacy but on what can be achieved with any method that protects against a complete breakdown in privacy. Virtually all the algorithms discussed herein maintain differential privacy against adversaries of arbitrary computational power. [...]
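The monograph's workhorse primitive is the Laplace mechanism, which calibrates noise to a query's sensitivity. A minimal Python sketch, assuming a single numeric query with known L1-sensitivity (the function name and example values are illustrative, not taken from the text):

```python
import numpy as np

def laplace_mechanism(true_answer: float, sensitivity: float, epsilon: float) -> float:
    """Release a numeric query answer with epsilon-differential privacy.

    Adds Laplace(sensitivity / epsilon) noise, where sensitivity is the
    most the answer can change when one individual's record is added or
    removed.
    """
    scale = sensitivity / epsilon
    return true_answer + np.random.laplace(loc=0.0, scale=scale)

# Example: a counting query ("how many records satisfy P?") has sensitivity 1.
noisy_count = laplace_mechanism(true_answer=42.0, sensitivity=1.0, epsilon=0.5)
```

Smaller epsilon means more noise and stronger privacy; as the abstract notes, the guarantee holds against adversaries of arbitrary computational power.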
A framework for adaptive differential privacy
TLDR
An interpreter for Adaptive Fuzz is described and results from two case studies demonstrating its effectiveness for implementing common statistical algorithms over real data sets are reported.
New Separations in the Complexity of Differential Privacy
TLDR
It is shown, for the first time, that approximate differential privacy can demand higher sample complexity than what is needed to ensure statistical accuracy alone, revealing a price of privacy even for low-dimensional query families.
Data Privacy Beyond Differential Privacy
TLDR
This dissertation provides privacy-preserving algorithms for solving a family of economic optimization problems under joint differential privacy, a strong relaxation of the standard definition of differential privacy, and shows that (joint) differential privacy can serve as a novel tool for mechanism design when solving these optimization problems.
Flexible Accuracy for Differential Privacy
TLDR
The main contribution is augmenting differential privacy with Flexible Accuracy, which permits small distortions in the input before the accuracy of the output is measured, making it possible to extend DP mechanisms to high-sensitivity functions.
Privacy-Preserving Parametric Inference: A Case for Robust Statistics
TLDR
It is demonstrated that differential privacy is a weaker stability requirement than infinitesimal robustness, and it is shown that robust M-estimators can easily be randomized to guarantee both differential privacy and robustness in the presence of contaminated data.
Constrained Differential Privacy for Count Data
TLDR
This work focuses on the core problem of count queries and seeks to design mechanisms for releasing data associated with a group of n individuals, introducing a set of desirable properties that such mechanisms can obey.
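The paper designs mechanisms around those properties; as a generic illustration of why constraints and differential privacy are compatible at all, note that post-processing a private release never weakens the guarantee, so noisy counts can be projected onto a constraint set after the fact. A sketch assuming add/remove-one adjacency (not necessarily the paper's own mechanism):

```python
import numpy as np

def constrained_noisy_counts(counts, epsilon):
    """Release histogram counts under epsilon-DP, then enforce constraints.

    Each individual contributes to exactly one count, so the histogram has
    L1-sensitivity 1 under add/remove-one adjacency, and Laplace(1/epsilon)
    noise per cell gives epsilon-DP. Clamping and rounding are pure
    post-processing, which cannot weaken the privacy guarantee.
    """
    counts = np.asarray(counts, dtype=float)
    noisy = counts + np.random.laplace(scale=1.0 / epsilon, size=counts.shape)
    return np.maximum(np.round(noisy), 0).astype(int)

# Example: three non-negative integer counts released at epsilon = 1.
print(constrained_noisy_counts([120, 4, 0], epsilon=1.0))
```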
Plausible Deniability for Privacy-Preserving Data Synthesis
TLDR
This paper presents a criterion called plausible deniability that provides a formal privacy guarantee, notably for releasing sensitive datasets: an output record can be released only if a certain number of input records are indistinguishable, up to a privacy parameter.
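A hedged sketch of the releasability test described in that summary; model_prob is a hypothetical helper standing in for the paper's generative model, and the paper's exact thresholding may differ:

```python
def plausibly_deniable(candidate, inputs, model_prob, k, gamma):
    """(k, gamma)-style plausible-deniability check for one candidate output.

    model_prob(y, x) is a hypothetical helper giving the generative model's
    probability of producing synthetic record y from input record x. The
    candidate may be released only if at least k input records could have
    produced it with probabilities within a factor gamma of one another.
    """
    probs = [model_prob(candidate, x) for x in inputs]
    p_max = max(probs)
    # Every record within a gamma factor of the most likely seed is also
    # within a gamma factor of every other such record.
    plausible_seeds = [p for p in probs if p > 0 and p_max <= gamma * p]
    return len(plausible_seeds) >= k
```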
Privacy Profiles and Amplification by Subsampling
TLDR
The privacy-profiles machinery is applied to study the so-called "privacy amplification by subsampling" principle, which ensures that a differentially private mechanism run on a random subsample of a population provides stronger privacy guarantees than when run on the entire population.
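For intuition, the classic (and loose) form of the amplification bound is easy to state: an epsilon-DP mechanism run on a Poisson subsample that includes each record independently with probability q satisfies epsilon' = log(1 + q(e^epsilon - 1)), and any delta term scales to q * delta. A tiny sketch of that calculation (the paper's privacy-profile analysis is sharper):

```python
import math

def amplified_epsilon(epsilon: float, q: float) -> float:
    """Classic amplification-by-subsampling bound for Poisson subsampling.

    If a mechanism is epsilon-DP, running it on a subsample that includes
    each record independently with probability q satisfies
    epsilon' = log(1 + q * (exp(epsilon) - 1)), which is roughly q * epsilon
    for small epsilon and always below epsilon when q < 1.
    """
    return math.log(1.0 + q * (math.exp(epsilon) - 1.0))

# Example: an epsilon = 1 mechanism run on a 5% subsample.
print(amplified_epsilon(1.0, 0.05))  # ~0.082
```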
Differential Privacy: A Primer for a Non-Technical Audience
TLDR
This primer aims to provide a foundation that can guide future decisions when analyzing and sharing statistical data about individuals, informing individuals about the privacy protection they will be afforded, and designing policies and regulations for robust privacy protection.
Advanced Probabilistic Couplings for Differential Privacy
TLDR
A new formalism is presented that extends apRHL, a relational program logic that has been used for proving differential privacy of non-interactive algorithms, and incorporates HL, a (non-relational) program logic for accuracy properties; the formalism exemplifies three classes of algorithms and explores new variants of the Sparse Vector technique.

References

SHOWING 1-10 OF 99 REFERENCES
Differential privacy under continual observation
TLDR
This work identifies the problem of maintaining a counter in a privacy-preserving manner and shows its wide applicability to many different problems.
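The construction associated with this line of work is the binary (tree) counter: maintain noisy sums over dyadic blocks so that every prefix decomposes into logarithmically many blocks. A hedged Python sketch of that idea (names and constants are illustrative; the paper's construction may differ in details):

```python
import numpy as np

def private_running_counts(stream, epsilon):
    """Tree-based counter for a 0/1 stream under continual observation.

    Every prefix [0, t) decomposes into at most L dyadic blocks, and each
    stream position lies in exactly one block per level, so Laplace(L /
    epsilon) noise on each block sum gives epsilon-DP for the whole
    sequence of running counts, with only polylog(T) error per release.
    """
    T = len(stream)                    # assumes a non-empty stream
    L = int(np.ceil(np.log2(T))) + 1   # number of dyadic levels
    noisy_block = {}                   # lazily cached noisy block sums

    def block_sum(start, length):
        key = (start, length)
        if key not in noisy_block:
            true_sum = sum(stream[start:start + length])
            noisy_block[key] = true_sum + np.random.laplace(scale=L / epsilon)
        return noisy_block[key]

    counts = []
    for t in range(1, T + 1):
        # Decompose [0, t) into dyadic blocks via the binary digits of t.
        total, start = 0.0, 0
        for bit in range(L - 1, -1, -1):
            length = 1 << bit
            if t & length:
                total += block_sum(start, length)
                start += length
        counts.append(total)
    return counts

# Example: running counts over a short stream at epsilon = 1.
print(private_running_counts([1, 0, 1, 1, 0, 1, 0, 0], epsilon=1.0))
```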
Computational Differential Privacy
TLDR
This work extends the dense model theorem of Reingold et al. to demonstrate equivalence between two definitions (indistinguishability- and simulatability-based) of computational differential privacy, and presents a differentially-private protocol for computing the distance between two vectors.
Limits of Computational Differential Privacy in the Client/Server Setting
TLDR
It is shown, for queries with output in R^n and with respect to a large class of utilities, that any computationally private mechanism can be converted to a statistically private mechanism that is equally efficient and achieves roughly the same utility.
Lower Bounds in Differential Privacy
TLDR
This paper combines the techniques of Hardt and Talwar [11] and McGregor et al.
The Limits of Two-Party Differential Privacy
TLDR
These bounds expose a dramatic gap between the accuracy that can be obtained by differentially private data analysis and the accuracy obtainable when privacy is relaxed to a computational variant of differential privacy.
The Privacy of the Analyst and the Power of the State
TLDR
It is argued that the problem is real by proving an exponential gap between the number of queries that can be answered (with non-trivial error) by stateless and stateful differentially private mechanisms.
A Multiplicative Weights Mechanism for Privacy-Preserving Data Analysis
TLDR
A new differentially private multiplicative weights mechanism is given for answering a large number of interactive counting (or linear) queries that arrive online and may be adaptively chosen. When the input database is drawn from a smooth distribution, one that does not place too much weight on any single data item, accuracy remains as above and the running time becomes polylogarithmic in the size of the data universe.
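To make the update rule concrete, here is a hedged sketch of the multiplicative-weights idea in the spirit of this mechanism and the related offline MWEM algorithm (function names are illustrative; the actual mechanism is interactive and selects queries and allocates the privacy budget far more carefully):

```python
import numpy as np

def mw_update(weights, query, target_frac, learning_rate=0.5):
    """One multiplicative-weights step toward a (noisy) query answer.

    weights is a distribution over the data universe acting as a synthetic
    database; query is a 0/1 vector over that universe (a counting query).
    Items the query selects are up- or down-weighted depending on whether
    the synthetic answer is too low or too high.
    """
    synthetic_frac = float(weights @ query)
    direction = np.sign(target_frac - synthetic_frac)
    weights = weights * np.exp(learning_rate * direction * query)
    return weights / weights.sum()

def mw_answer_queries(histogram, queries, epsilon, rounds=10):
    """Simplified offline loop: cycle through the queries, answer each with
    Laplace noise (counting queries have sensitivity 1), and fold the noisy
    answer into the synthetic distribution."""
    histogram = np.asarray(histogram, dtype=float)
    n = histogram.sum()
    weights = np.full(len(histogram), 1.0 / len(histogram))
    eps_per_round = epsilon / rounds  # basic sequential composition
    for t in range(rounds):
        q = np.asarray(queries[t % len(queries)], dtype=float)
        noisy = float(histogram @ q) + np.random.laplace(scale=1.0 / eps_per_round)
        weights = mw_update(weights, q, noisy / n)
    return n * weights  # synthetic histogram approximating the true answers
```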
On the complexity of differentially private data release: efficient algorithms and hardness results
TLDR
Private data analysis in the setting in which a trusted and trustworthy curator releases to the public a "sanitization" of the data set that simultaneously protects the privacy of the individual contributors of data and offers utility to the data analyst is considered.
Sample Complexity Bounds for Differentially Private Learning
TLDR
An upper bound on the sample requirement of learning with label privacy is provided that depends on a measure of closeness between [...] and the unlabeled data distribution and applies to the non-realizable as well as the realizable case.
Is privacy compatible with truthfulness?
  • David Xiao
  • Mathematics, Computer Science
    ITCS '13
  • 2013
TLDR
It is shown that mechanisms that release a perturbed histogram of the database may reveal too much information, and that there exists a differentially private, truthful, and approximately efficient mechanism for any social welfare game with a small type space.