Corpus ID: 15712103

Differential Privacy versus Quantitative Information Flow

@article{Alvim2010DifferentialPV,
  title={Differential Privacy versus Quantitative Information Flow},
  author={M{\'a}rio S. Alvim and Konstantinos Chatzikokolakis and Pierpaolo Degano and Catuscia Palamidessi},
  journal={ArXiv},
  year={2010},
  volume={abs/1012.4250}
}
Differential privacy is a notion of privacy that has become very popular in the database community. Roughly, the idea is that a randomized query mechanism provides sufficient privacy protection if the ratio between the probabilities that two different entries originate a certain answer is bounded by e^ε. In the fields of anonymity and information flow there is a similar concern for controlling information leakage, i.e. limiting the possibility of inferring the secret information from the… 
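For reference, the ε-differential-privacy condition alluded to above can be stated in its standard textbook form (given here for concreteness rather than in the paper's own notation), where K is the randomized query mechanism:

  \Pr[\mathcal{K}(D_1) \in S] \;\le\; e^{\epsilon} \cdot \Pr[\mathcal{K}(D_2) \in S]
  \qquad \text{for all databases } D_1, D_2 \text{ differing in a single entry and all answer sets } S.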

Citations

Differential Privacy: On the Trade-Off between Utility and Information Leakage

It is shown that the notion of differential privacy implies a tight bound on utility, and a method is proposed that, under certain conditions, builds an optimal randomization mechanism, i.e. a mechanism which provides the best utility while guaranteeing differential privacy.
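As an illustration of how the privacy parameter directly caps utility, consider binary randomized response (a minimal sketch, not the optimal mechanism constructed in the cited paper; the function name and parameters are purely illustrative):

  import math
  import random

  def randomized_response(true_bit: int, epsilon: float) -> int:
      """Report the true bit with probability e^eps / (1 + e^eps), otherwise flip it.

      The ratio between the probabilities of producing any fixed answer from
      inputs 0 and 1 is exactly e^eps, so the mechanism is eps-differentially
      private; its probability of answering correctly (its utility under the
      identity gain function) is e^eps / (1 + e^eps).
      """
      p_truth = math.exp(epsilon) / (1.0 + math.exp(epsilon))
      return true_bit if random.random() < p_truth else 1 - true_bit

  if __name__ == "__main__":
      eps = 0.5
      print("truth probability (utility):", math.exp(eps) / (1.0 + math.exp(eps)))
      print("sample reports for true bit 1:", [randomized_response(1, eps) for _ in range(10)])

Raising ε improves the probability of reporting the true value but weakens the privacy guarantee, which is the trade-off this line of work quantifies in general.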

On the Relation between Differential Privacy and Quantitative Information Flow

This paper analyzes critically the notion of differential privacy in light of the conceptual framework provided by Rényi min information theory, and shows that there is a close relation between differential privacy and leakage, due to the graph symmetries induced by the adjacency relation.

Quantitative Information Flow and Applications to Differential Privacy

The bound that differential privacy induces on leakage, and the trade-off between utility and privacy are discussed, using the information-theoretic view.

On the information leakage of differentially-private mechanisms

Differential privacy aims at protecting the privacy of participants in statistical databases. Roughly, a mechanism satisfies differential privacy if the presence or value of a single individual in the database does not significantly affect the probability of obtaining any particular answer.
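For instance, the well-known Laplace mechanism makes this definition concrete (a minimal sketch, not the contribution of the cited paper):

  import random

  def laplace_count(true_count: float, epsilon: float, sensitivity: float = 1.0) -> float:
      """Answer a counting query with Laplace noise of scale sensitivity/epsilon.

      A single individual changes a counting query by at most `sensitivity`,
      so the distributions of the noisy answer on two adjacent databases
      differ by a factor of at most e^epsilon, which is exactly the
      epsilon-differential-privacy condition.
      """
      scale = sensitivity / epsilon
      # A Laplace(0, scale) sample as the difference of two exponentials.
      noise = random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)
      return true_count + noise

  if __name__ == "__main__":
      print(laplace_count(true_count=42, epsilon=0.1))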

Quantitative Information Flow and applications

Using the information-theoretic view, the bound that differential privacy induces on leakage and the trade-off between utility and privacy are discussed.

Information-Theoretic Bounds for Differentially Private Mechanisms

The purpose of this article is to establish formal connections between both notions of confidentiality and to compare them in terms of the security guarantees they deliver, and it is shown that the question of providing optimal upper bounds for the leakage of ε-differentially private mechanisms in terms of rational functions of ε is in fact decidable.

Model for a Common Notion of Privacy Leakage on Public Database

  • S. Kiyomoto, K. Martin
  • Computer Science
    J. Wirel. Mob. Networks Ubiquitous Comput. Dependable Appl.
  • 2011
A unified model is proposed that is based on the perturbation method, but which is applicable to generalized data sets, and applies the notion of differential privacy to data sets that satisfy k-anonymity.

On Privacy and Accuracy in Data Releases

A model of quantitative information flow is used to describe the trade-off between the privacy of individuals’ data and the utility of queries to that data, by modelling the effectiveness of adversaries attempting to make inferences after a data release.

Differential privacy with information flow control

It is found that information flow for differentially private observations is no harder than dependency tracking, allowing the use of existing technology to extend and improve the state of the art for the analysis of differentially private computations.

Common Criterion of Privacy Metrics and Parameters Analysis Based on Error Probability for Randomized Response

The error probability is proposed as a measure of data utility for unifying different privacy metrics; it can be applied to any prior distribution, whereas the average distortion criterion is only a special case with a uniform distribution.

References


On the Foundations of Quantitative Information Flow

This paper argues that the consensus definitions based on Shannon entropy actually fail to give good security guarantees, and explores an alternative foundation based on a concept of vulnerability, which measures uncertainty using Rényi's min-entropy rather than Shannon entropy.
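The min-entropy quantities this foundation rests on (standard definitions from the quantitative information flow literature) are, for a secret X and an observable Y:

  V(X) = \max_{x} p(x), \qquad H_\infty(X) = -\log_2 V(X),
  V(X \mid Y) = \sum_{y} p(y)\, \max_{x} p(x \mid y), \qquad H_\infty(X \mid Y) = -\log_2 V(X \mid Y),
  \text{leakage} = H_\infty(X) - H_\infty(X \mid Y).

Here V is the vulnerability: the probability that an adversary guesses the secret correctly in one try.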

Quantitative Notions of Leakage for One-try Attacks

Differential privacy in new settings

New work is described that extends differentially private data analysis beyond the traditional setting of a trusted curator operating, in perfect isolation, on a static dataset, and considers differential privacy under continual observation.

Differential Privacy

  • C. Dwork
  • Computer Science
    Encyclopedia of Cryptography and Security
  • 2006
A general impossibility result is given showing that a formalization of Dalenius' goal along the lines of semantic security cannot be achieved, which suggests a new measure, differential privacy, which, intuitively, captures the increased risk to one's privacy incurred by participating in a database.

Anonymity protocols as noisy channels

Lagrange multipliers and maximum information leakage in different observational models

A uniform definition of leakage, based on information theory, is provided that makes it possible to formalize and prove some intuitive relationships between the amounts leaked by the same program in different models, and to determine which input distribution causes the maximum leakage.

Anonymity vs. Information Leakage in Anonymity Systems

  • Ye Zhu, R. Bettati
  • Computer Science
    25th IEEE International Conference on Distributed Computing Systems (ICDCS'05)
  • 2005
A new measure of the anonymity degree is proposed that takes into account possible heterogeneity; the effectiveness of single mixes or of mix networks is modelled in terms of information leakage and measured in terms of covert channel capacity.

Quantitative Information Flow, Relations and Polymorphic Types

With this presentation, it is shown how relational parametricity can be used to derive upper and lower bounds on information flows through families of functions defined in the second-order lambda calculus.

Covert channels and anonymizing networks

An initial inquiry into the relationship between covert channel capacity and anonymity, investigating how much information a sender to a MIX can leak to an eavesdropping outsider despite the concealment efforts of MIXes acting as firewalls.

Belief in information flow

A model is developed that describes how attacker beliefs change due to the attacker's observation of the execution of a probabilistic (or deterministic) program, which leads to a new metric for quantitative information flow that measures accuracy rather than uncertainty of beliefs.
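A minimal sketch of the belief-revision step such a model relies on (plain Bayesian conditioning on one observed output; the accuracy metric itself is the paper's contribution and is not reproduced here, and all names below are illustrative):

  def update_belief(prior: dict, channel: dict, observation: str) -> dict:
      """Revise the attacker's belief over secret inputs after seeing one output.

      `prior` maps each secret to the attacker's prior probability, and
      `channel[secret][observation]` is the probability that the program
      produces `observation` when run on `secret`.  The result is the
      Bayesian posterior, i.e. the attacker's updated belief.
      """
      joint = {s: prior[s] * channel[s].get(observation, 0.0) for s in prior}
      total = sum(joint.values())
      return {s: (joint[s] / total if total > 0 else prior[s]) for s in joint}

  if __name__ == "__main__":
      prior = {"secret_a": 0.5, "secret_b": 0.5}
      channel = {"secret_a": {"out1": 0.9, "out2": 0.1},
                 "secret_b": {"out1": 0.2, "out2": 0.8}}
      print(update_belief(prior, channel, "out1"))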