On the Relation between Differential Privacy and Quantitative Information Flow

  Mário S. Alvim, Miguel E. Andrés, Konstantinos Chatzikokolakis, Catuscia Palamidessi
Differential privacy is a notion that has emerged in the community of statistical databases as a response to the problem of protecting the privacy of a database's participants when performing statistical queries. The idea is that a randomized query satisfies differential privacy if the likelihood of obtaining a certain answer for a database x is not too different from the likelihood of obtaining the same answer on adjacent databases, i.e. databases which differ from x in only one individual… 
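As a reminder, the standard formulation of this condition (not quoted from the abstract) requires, for every pair of adjacent databases x, x′ and every possible answer z:

```latex
% eps-differential privacy for a randomized mechanism K:
% for all adjacent databases x, x' and every answer z,
\Pr[\mathcal{K}(x) = z] \;\le\; e^{\varepsilon}\,\Pr[\mathcal{K}(x') = z]
```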
On the information leakage of differentially-private mechanisms
Differential privacy aims at protecting the privacy of participants in statistical databases. Roughly, a mechanism satisfies differential privacy if the presence or value of a single individual in
Quantitative Information Flow and Applications to Differential Privacy
The bound that differential privacy induces on leakage, and the trade-off between utility and privacy are discussed, using the information-theoretic view.
Quantitative Approaches to Information Protection
This talk revises the main recent approaches that have been proposed to quantify and reason about leakage: the information-theoretic approaches based on Shannon entropy and on Rényi min-entropy, and a novel one based on decision theory.
A Privacy-Preserving Game Model for Local Differential Privacy by Using Information-Theoretic Approach
This paper proposes a privacy-preserving attack and defense (PPAD) game framework, that is, a two-person zero-sum (TPZS) game, and develops an alternating optimization algorithm to compute the saddle point of the proposed PPAD game.
Information-theoretic models of confidentiality and privacy
A strong semantic notion of security is introduced that expresses the absence of any privacy breach above a given level of seriousness, irrespective of any background information, and thus the protection of sensitive information concerning individuals.
Identifying the optimal differentially private mechanisms for different users?
This work describes, for a general query and privacy level, a randomisation mechanism which satisfies differential privacy and at the same time is optimal for a class of users having various priors, and presents the properties of this mechanism in terms of utility and privacy.
Linear dependent types for differential privacy
DFuzz is presented, an extension of Fuzz with a combination of linear indexed types and lightweight dependent types that allows a richer sensitivity analysis that is able to certify a larger class of queries as differentially private, including ones whose sensitivity depends on runtime information.
Development and Analysis of Deterministic Privacy-Preserving Policies Using Non-Stochastic Information Theory
  • Farhad Farokhi
  • Computer Science
    IEEE Transactions on Information Forensics and Security
  • 2019
The measure of privacy is used to analyze $k$-anonymity (a popular deterministic mechanism for privacy-preserving release of datasets using suppression and generalization techniques), proving that it is in fact not privacy-preserving.
Worst- and Average-Case Privacy Breaches in Randomization Mechanisms
The starting point is a semantic notion of security that expresses absence of any privacy breach above a given level of seriousness ε, irrespective of any background information, represented as a prior probability on the secret inputs.
Beyond Differential Privacy: Composition Theorems and Relational Logic for f-divergences between Probabilistic Programs
This paper observes that the notion of α-distance used to characterize approximate differential privacy is an instance of the family of f-divergences, and proposes a relational program logic to prove upper bounds for the f-divergence between two probabilistic programs.
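For reference, the standard definition of an f-divergence (not quoted from the paper), for a convex function f with f(1) = 0:

```latex
% f-divergence of P from Q, for convex f with f(1) = 0:
D_f(P \,\|\, Q) \;=\; \sum_{x} Q(x)\, f\!\left(\frac{P(x)}{Q(x)}\right)
```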


Differential Privacy versus Quantitative Information Flow
It is shown how to model the query system as an information-theoretic channel, and that the notion of differential privacy is strictly stronger, in the sense that it implies a bound on the mutual information, but not vice versa.
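A toy numerical illustration of this kind of bound (a sketch, not code from the paper): the binary randomized-response mechanism satisfies ε-differential privacy, and its mutual information under a uniform prior stays below ε·log₂(e) bits.

```python
import math

def mutual_information_uniform(C):
    """I(X;Y) in bits for uniform input over channel matrix C[x][y] = p(y|x)."""
    n_in, n_out = len(C), len(C[0])
    px = 1.0 / n_in
    # Output marginal p(y) = sum_x p(x) p(y|x).
    py = [sum(C[x][y] * px for x in range(n_in)) for y in range(n_out)]
    I = 0.0
    for x in range(n_in):
        for y in range(n_out):
            if C[x][y] > 0:
                I += px * C[x][y] * math.log2(C[x][y] / py[y])
    return I

eps = 1.0
p = math.exp(eps) / (1 + math.exp(eps))   # truthful-report probability
C = [[p, 1 - p], [1 - p, p]]              # channel satisfies eps-DP
I = mutual_information_uniform(C)
print(I)  # well below the bound eps * log2(e) ≈ 1.443 bits
```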
Differential Privacy: On the Trade-Off between Utility and Information Leakage
It is shown that the notion of differential privacy implies a bound on utility, also tight, and a method is proposed that under certain conditions builds an optimal randomization mechanism, i.e. a mechanism which provides the best utility while guaranteeing differential privacy.
Information-Theoretic Bounds for Differentially Private Mechanisms
The purpose of this article is to establish formal connections between both notions of confidentiality and to compare them in terms of the security guarantees they deliver; it is shown that the question of providing optimal upper bounds for the leakage of ε-differentially private mechanisms in terms of rational functions of ε is in fact decidable.
Universally utility-maximizing privacy mechanisms
Every potential user u, no matter what its side information and preferences, derives as much utility from M* as from interacting with a differentially private mechanism Mu that is optimally tailored to u, subject to differential privacy.
On the Foundations of Quantitative Information Flow
This paper argues that the consensus definitions, which are based on Shannon entropy, actually fail to give good security guarantees, and explores an alternative foundation based on a concept of vulnerability, which measures uncertainty using Rényi's min-entropy rather than Shannon entropy.
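For reference, the standard definitions behind this approach (vulnerability, Rényi min-entropy, and min-entropy leakage of a channel), not quoted from the abstract:

```latex
% Vulnerability, Rényi min-entropy, and min-entropy leakage:
V(X) = \max_{x} \Pr[X = x], \qquad
H_{\infty}(X) = -\log_2 V(X), \qquad
\mathcal{L} = H_{\infty}(X) - H_{\infty}(X \mid Y)
```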
Differential privacy in new settings
New work is described that extends differentially private data analysis beyond the traditional setting of a trusted curator operating, in perfect isolation, on a static dataset, and considers differential privacy under continual observation.
Differential Privacy
A general impossibility result is given showing that a formalization of Dalenius' goal along the lines of semantic security cannot be achieved, which suggests a new measure, differential privacy, which, intuitively, captures the increased risk to one's privacy incurred by participating in a database.
Quantitative Notions of Leakage for One-try Attacks
Probability of Error in Information-Hiding Protocols
A constructive characterization of a convex base of the probability of error is presented, which allows us to compute its maximum value (over all possible input distributions), and to identify upper bounds for it in terms of simple functions.
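The quantity bounded here, the Bayes probability of error of a MAP adversary, can be computed directly from a channel matrix and a prior. A minimal sketch (function name and example channel are illustrative, not from the paper):

```python
def prob_error(pi, C):
    """Bayes probability of error for prior pi over channel C[x][y] = p(y|x)."""
    n_out = len(C[0])
    # MAP adversary: for each observable y, guess the most likely input,
    # succeeding with probability max_x pi[x] * C[x][y].
    p_correct = sum(max(pi[x] * C[x][y] for x in range(len(pi)))
                    for y in range(n_out))
    return 1.0 - p_correct

# Binary symmetric channel with flip probability 0.1, uniform prior:
C = [[0.9, 0.1], [0.1, 0.9]]
print(prob_error([0.5, 0.5], C))  # ≈ 0.1
```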
Asymptotic Information Leakage under One-Try Attacks
The asymptotic behaviour of quantities (a) and (b) can be determined in a simple way from the channel matrix, and simple and tight bounds on them as functions of n show that the convergence is exponential.