Corpus ID: 148567614

Bootstrap Differential Privacy

Christine M. O'Keefe and Anne-Sophie Charest, "Bootstrap Differential Privacy", Transactions on Data Privacy.
This paper concerns the challenge of protecting confidentiality while making statistically useful data and analytical outputs available for research and policy analysis. In this context, the confidentiality protection measure known as differential privacy is an attractive methodology because of its clear definition and the strong guarantees that it promises. However, concerns about differential privacy include the possibility that in some situations the guarantees may be so strong that…


Differential Privacy and the Risk-Utility Tradeoff for Multi-dimensional Contingency Tables
This paper explores how well the mechanism works in the context of a series of examples, and the extent to which the proposed differential-privacy mechanism allows for sensible inferences from the released data.
Towards a Systematic Analysis of Privacy Definitions
This work adds a novel methodology for analyzing the Bayesian properties of a privacy definition; its goal is to identify precisely the type of information being protected, making it easier to identify (and later remove) unnecessary data protections.
Privacy, accuracy, and consistency too: a holistic solution to contingency table release
This work proposes a solution that provides strong guarantees for all three desiderata simultaneously, privacy, accuracy, and consistency among the tables, and applies equally well to the logical cousin of the contingency table, the OLAP cube.
The Algorithmic Foundations of Differential Privacy
The preponderance of this monograph is devoted to fundamental techniques for achieving differential privacy, and application of these techniques in creative combinations, using the query-release problem as an ongoing example.
On the Meaning and Limits of Empirical Differential Privacy
It is shown that EDP is not well-defined, in that its value depends crucially on the choice of discretization used in the procedure, and that it can be very computationally intensive to apply in practice.
Differential Privacy
A general impossibility result is given showing that a formalization of Dalenius' goal along the lines of semantic security cannot be achieved, which suggests a new measure, differential privacy, which, intuitively, captures the increased risk to one's privacy incurred by participating in a database.
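The differential-privacy guarantee referenced throughout these abstracts can be stated formally. A standard formulation (paraphrased, not quoted verbatim from any of the papers above) is:

```latex
% A randomized mechanism M is \epsilon-differentially private if,
% for every pair of datasets D, D' differing in a single record
% and every measurable set S of outputs:
\Pr[\mathcal{M}(D) \in S] \;\le\; e^{\epsilon} \, \Pr[\mathcal{M}(D') \in S].
```

Intuitively, the bound says that any single individual's presence or absence in the database changes the probability of any output by at most a factor of $e^{\epsilon}$.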
Smooth sensitivity and sampling in private data analysis
This is the first formal analysis of the effect of instance-based noise in the context of data privacy, and shows how to do this efficiently for several different functions, including the median and the cost of the minimum spanning tree.
Calibrating Noise to Sensitivity in Private Data Analysis
The study is extended to general functions f, proving that privacy can be preserved by calibrating the standard deviation of the noise according to the sensitivity of the function f, which is the amount that any single argument to f can change its output.
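The noise-calibration idea summarized above is commonly realized as the Laplace mechanism. A minimal sketch in Python follows; the function name and parameters are illustrative choices, not code from the cited paper.

```python
import math
import random

def laplace_mechanism(value, sensitivity, epsilon, rng=random):
    """Return value plus Laplace noise with scale sensitivity / epsilon.

    Calibrating the noise scale to the sensitivity of the query (the
    largest change a single record can cause in its output) yields
    epsilon-differential privacy for that query.
    """
    scale = sensitivity / epsilon
    # Inverse-CDF sampling of the Laplace distribution from a uniform draw.
    u = rng.random() - 0.5
    return value - scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

# Example: a counting query has sensitivity 1 (adding or removing one
# record changes the count by at most 1), so the noise scale is 1/epsilon.
true_count = 42
noisy_count = laplace_mechanism(true_count, sensitivity=1.0, epsilon=0.5)
```

Smaller values of epsilon give a stronger privacy guarantee at the cost of larger noise, which is the risk-utility tradeoff discussed in several of the papers listed here.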
Privacy: Theory meets Practice on the Map
In this paper, we propose the first formal privacy analysis of a data anonymization process known as the synthetic data generation, a technique becoming popular in the statistics community.
Random Differential Privacy
It is shown that RDP histograms are much more accurate than histograms obtained using ordinary differential privacy, and an analog of the global sensitivity framework is developed for the release of functions under this privacy definition.