Differential Privacy with Bounded Priors: Reconciling Utility and Privacy in Genome-Wide Association Studies

Abstract

Differential privacy (DP) has become widely accepted as a rigorous definition of data privacy, with stronger privacy guarantees than traditional statistical methods. However, recent studies have shown that for reasonable privacy budgets, differential privacy significantly degrades the expected utility. Many alternative privacy notions that relax DP have since been proposed, with the hope of providing a better tradeoff between privacy and utility. At CCS'13, Li et al. introduced the membership privacy framework, which aims to protect against set-membership disclosure by adversaries whose prior knowledge is captured by a family of probability distributions. Within this framework, we investigate a relaxation of DP by considering prior distributions that capture more reasonable amounts of background knowledge. We show that, for different privacy budgets, DP can be used to achieve membership privacy for various adversarial settings, thus leading to an interesting tradeoff between privacy guarantees and utility. We re-evaluate methods for releasing differentially private χ²-statistics in genome-wide association studies and show that we can achieve higher utility than in previous works, while still guaranteeing membership privacy in a relevant adversarial setting.
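For context, the standard way to release a numeric statistic (such as a per-SNP χ² value) under ε-DP is the Laplace mechanism, which adds noise scaled to the statistic's global sensitivity divided by ε. The sketch below is illustrative only: the sensitivity and ε values are placeholders, and it is not the specific calibration or release mechanism evaluated in the paper.

import numpy as np

def laplace_release(value, sensitivity, epsilon, rng=None):
    """Release `value` under epsilon-DP via the Laplace mechanism.

    `sensitivity` is the global sensitivity of the statistic; for GWAS
    chi-squared statistics it would have to be derived from the
    contingency-table structure (placeholder parameter here).
    """
    rng = rng or np.random.default_rng()
    scale = sensitivity / epsilon
    return value + rng.laplace(loc=0.0, scale=scale)

# Hypothetical usage: noisy release of one SNP's chi-squared statistic.
chi2_stat = 12.7  # value computed on the private cohort (illustrative)
noisy_stat = laplace_release(chi2_stat, sensitivity=1.0, epsilon=0.5)

A relaxed membership-privacy guarantee against adversaries with bounded priors permits a larger effective ε, and hence a smaller noise scale, which is the source of the utility gain the abstract describes.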

DOI: 10.1145/2810103.2813610

Cite this paper

@inproceedings{Tramr2015DifferentialPW,
  title     = {Differential Privacy with Bounded Priors: Reconciling Utility and Privacy in Genome-Wide Association Studies},
  author    = {Florian Tram{\`e}r and Zhicong Huang and Jean-Pierre Hubaux and Erman Ayday},
  booktitle = {ACM Conference on Computer and Communications Security},
  year      = {2015}
}