Differential Privacy versus Quantitative Information Flow
@article{Alvim2010DifferentialPV, title={Differential Privacy versus Quantitative Information Flow}, author={M{\'a}rio S. Alvim and Konstantinos Chatzikokolakis and Pierpaolo Degano and Catuscia Palamidessi}, journal={ArXiv}, year={2010}, volume={abs/1012.4250} }
Differential privacy is a notion of privacy that has become very popular in the database community. Roughly, the idea is that a randomized query mechanism provides sufficient privacy protection if the ratio between the probabilities that two different entries originate a certain answer is bounded by e^ε. In the fields of anonymity and information flow there is a similar concern for controlling information leakage, i.e. limiting the possibility of inferring the secret information from the…
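For reference, a minimal sketch of the standard formulation behind the abstract's description (ε is the privacy parameter and D, D' are adjacent databases differing in one entry; the notation is generic, not taken from the paper itself):

```latex
% \epsilon-differential privacy (standard formulation):
% a randomized mechanism K satisfies \epsilon-differential privacy if,
% for all adjacent databases D, D' and every set S of possible answers,
\Pr[\mathcal{K}(D) \in S] \;\le\; e^{\epsilon} \cdot \Pr[\mathcal{K}(D') \in S]
```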
17 Citations
Differential Privacy: On the Trade-Off between Utility and Information Leakage
- Computer ScienceFormal Aspects in Security and Trust
- 2011
It is shown that the notion of differential privacy implies a bound on utility, also tight, and a method is proposed that under certain conditions builds an optimal randomization mechanism, i.e. a mechanism which provides the best utility while guaranteeing differential privacy.
On the Relation between Differential Privacy and Quantitative Information Flow
- Computer ScienceICALP
- 2011
This paper analyzes critically the notion of differential privacy in light of the conceptual framework provided by the Renyi min information theory, and shows that there is a close relation between differential privacy and leakage, due to the graph symmetries induced by the adjacency relation.
Quantitative Information Flow and Applications to Differential Privacy
- Computer ScienceFOSAD
- 2011
The bound that differential privacy induces on leakage, and the trade-off between utility and privacy are discussed, using the information-theoretic view.
On the information leakage of differentially-private mechanisms
- Computer ScienceJ. Comput. Secur.
- 2015
Differential privacy aims at protecting the privacy of participants in statistical databases. Roughly, a mechanism satisfies differential privacy if the presence or value of a single individual in…
Quantitative Information Flow and applications
- Computer Science
- 2011
Using the information-theoretic view, the bound that differential privacy induces on leakage and the trade-off between utility and privacy are discussed.
Information-Theoretic Bounds for Differentially Private Mechanisms
- Computer Science2011 IEEE 24th Computer Security Foundations Symposium
- 2011
The purpose of this article is to establish formal connections between both notions of confidentiality and to compare them in terms of the security guarantees they deliver; it is also shown that the question of providing optimal upper bounds for the leakage of eps-differentially private mechanisms in terms of rational functions of eps is in fact decidable.
Model for a Common Notion of Privacy Leakage on Public Database
- Computer ScienceJ. Wirel. Mob. Networks Ubiquitous Comput. Dependable Appl.
- 2011
A unified model is proposed that is based on the perturbation method, but which is applicable to generalized data sets, and applies the notion of differential privacy to data sets that satisfy k-anonymity.
On Privacy and Accuracy in Data Releases
- Computer Science
- 2020
A model of quantitative information flow is used to describe the trade-off between the privacy of individuals' data and the utility of queries to that data, by modelling the effectiveness of adversaries attempting to make inferences after a data release.
Differential privacy with information flow control
- Computer Science, MathematicsPLAS '11
- 2011
It is found that information flow for differentially private observations is no harder than dependency tracking, allowing the use of existing technology to extend and improve the state of the art for the analysis of differentially private computations.
Common Criterion of Privacy Metrics and Parameters Analysis Based on Error Probability for Randomized Response
- Computer ScienceIEEE Access
- 2019
The error probability is proposed as a measure of data utility for unifying different privacy metrics; the error probability can be applied to any prior distribution, whereas the average distortion criterion is only a special case with a uniform distribution.
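As an illustration of how an error probability can serve as a utility measure, here is a minimal Python sketch of binary randomized response (a standard ε-differentially private mechanism; the function names and the uniform-prior assumption are ours, not taken from the cited paper):

```python
import math

def flip_probability(eps: float) -> float:
    """Binary randomized response: report the true bit with probability
    e^eps / (e^eps + 1), otherwise flip it. The ratio of the two output
    probabilities is exactly e^eps, so the mechanism is eps-DP."""
    return 1.0 / (math.exp(eps) + 1.0)

def error_probability(eps: float) -> float:
    """Bayes error of guessing the true bit from the reported bit under a
    uniform prior: the reported bit is the optimal guess, so the guess is
    wrong exactly when the mechanism flipped the bit."""
    return flip_probability(eps)

if __name__ == "__main__":
    for eps in (0.1, 0.5, 1.0, 2.0):
        print(f"eps={eps:3.1f}  error probability={error_probability(eps):.3f}")
```

In this toy setting a smaller ε gives a higher flip probability, hence a larger error probability (less utility) but a stronger privacy guarantee.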
References
On the Foundations of Quantitative Information Flow
- Computer ScienceFoSSaCS
- 2009
This paper argues that the consensus definitions based on Shannon entropy actually fail to give good security guarantees, and explores an alternative foundation based on a concept of vulnerability, which measures uncertainty using Rényi's min-entropy rather than Shannon entropy.
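For reference, a minimal sketch of the min-entropy notions this entry refers to (the standard definitions of vulnerability, Rényi min-entropy, and min-entropy leakage; the notation is generic, not copied from the cited paper):

```latex
% Vulnerability of a secret X, prior and posterior (after observing Y):
V(X) = \max_{x} p(x), \qquad
V(X \mid Y) = \sum_{y} p(y) \max_{x} p(x \mid y)
% Renyi min-entropy and min-entropy leakage:
H_\infty(X) = -\log_2 V(X), \quad
H_\infty(X \mid Y) = -\log_2 V(X \mid Y), \quad
\mathcal{L} = H_\infty(X) - H_\infty(X \mid Y)
```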
Differential privacy in new settings
- Computer ScienceSODA '10
- 2010
New work is described that extends differentially private data analysis beyond the traditional setting of a trusted curator operating, in perfect isolation, on a static dataset, and considers differential privacy under continual observation.
Differential Privacy
- Computer ScienceEncyclopedia of Cryptography and Security
- 2006
A general impossibility result is given showing that a formalization of Dalenius' goal along the lines of semantic security cannot be achieved, which suggests a new measure, differential privacy, which, intuitively, captures the increased risk to one's privacy incurred by participating in a database.
Lagrange multipliers and maximum information leakage in different observational models
- Computer SciencePLAS '08
- 2008
A uniform definition of leakage is provided, based on information theory, that allows one to formalize and prove some intuitive relationships between the amount leaked by the same program in different models, and to determine which input distribution causes the maximum leakage.
Anonymity vs. Information Leakage in Anonymity Systems
- Computer Science25th IEEE International Conference on Distributed Computing Systems (ICDCS'05)
- 2005
A new measure of the anonymity degree is proposed that takes into account possible heterogeneity; the effectiveness of single mixes or of mix networks is modeled in terms of information leakage and measured in terms of covert channel capacity.
Quantitative Information Flow, Relations and Polymorphic Types
- Computer ScienceJ. Log. Comput.
- 2005
With this presentation, it is shown how relational parametricity can be used to derive upper and lower bounds on information flows through families of functions defined in the second-order lambda calculus.
Covert channels and anonymizing networks
- Computer ScienceWPES '03
- 2003
An initial inquiry into the relationship between covert channel capacity and anonymity; it investigates how much information a sender to a MIX can leak to an eavesdropping outsider, despite the concealment efforts of MIXes acting as firewalls.
Belief in information flow
- Computer Science18th IEEE Computer Security Foundations Workshop (CSFW'05)
- 2005
A model is developed that describes how attacker beliefs change due to the attacker's observation of the execution of a probabilistic (or deterministic) program, which leads to a new metric for quantitative information flow that measures accuracy rather than uncertainty of beliefs.