• Corpus ID: 211076277

Guidelines for Implementing and Auditing Differentially Private Systems

@article{Kifer2020GuidelinesFI,
  title={Guidelines for Implementing and Auditing Differentially Private Systems},
  author={Daniel Kifer and Solomon Messing and Aaron Roth and Abhradeep Thakurta and Danfeng Zhang},
  journal={ArXiv},
  year={2020},
  volume={abs/2002.04049}
}
Differential privacy is an information-theoretic constraint on algorithms and code. It provides a quantification of privacy leakage and formal privacy guarantees that are currently considered the gold standard in privacy protection. In this paper we provide an initial set of "best practices" for developing differentially private platforms, techniques for unit testing that are specific to differential privacy, guidelines for checking if differential privacy is being applied correctly in an…
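The unit-testing theme in the abstract can be made concrete: one DP-specific test draws many samples from a mechanism on two adjacent databases and checks that no output region is observed with a probability ratio far above e^ε. The following is a minimal sketch of that idea (function names, bin widths, and tolerances are our own illustrative choices, not taken from the paper):

```python
import math
import random

def laplace_mechanism(true_count, epsilon, rng):
    """Release a count (sensitivity 1) with Laplace noise of scale 1/epsilon."""
    u = rng.random() - 0.5
    return true_count - (1.0 / epsilon) * math.copysign(math.log(1.0 - 2.0 * abs(u)), u)

def empirical_privacy_ratio(epsilon, n_samples=100_000, bin_width=0.5, seed=0):
    """Histogram outputs on two adjacent inputs (true counts 0 and 1) and
    return the largest probability ratio over well-populated bins; a correct
    epsilon-DP mechanism should keep this near exp(epsilon)."""
    rng = random.Random(seed)
    hist0, hist1 = {}, {}
    for _ in range(n_samples):
        b0 = round(laplace_mechanism(0, epsilon, rng) / bin_width)
        b1 = round(laplace_mechanism(1, epsilon, rng) / bin_width)
        hist0[b0] = hist0.get(b0, 0) + 1
        hist1[b1] = hist1.get(b1, 0) + 1
    worst = 0.0
    for b in hist0.keys() & hist1.keys():
        if min(hist0[b], hist1[b]) >= 200:  # skip sparsely populated tail bins
            ratio = hist0[b] / hist1[b]
            worst = max(worst, ratio, 1.0 / ratio)
    return worst

# With generous slack for sampling error, the worst ratio stays below exp(eps):
assert empirical_privacy_ratio(1.0) <= math.exp(1.0) * 1.5
```

A test like this can only catch gross violations, such as noise calibrated to the wrong sensitivity; it cannot prove privacy, which is why testing is paired with auditing guidelines.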

Citations

DDUO: General-Purpose Dynamic Analysis for Differential Privacy
TLDR
The novel core of the DDuo system is formalized and proved sound for sensitivity analysis via a logical relation for metric preservation; DDuo's usability and flexibility are illustrated through case studies implementing state-of-the-art machine learning algorithms.
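The flavor of a dynamic sensitivity analysis can be conveyed with a toy value wrapper that propagates sensitivity bounds through arithmetic at runtime (a hypothetical sketch; DDuo's actual API and metric-preservation machinery are far richer):

```python
class Sensitive:
    """Toy dynamic sensitivity tracking: each value carries an upper bound
    on how much it can change between adjacent datasets."""
    def __init__(self, value, sensitivity=1.0):
        self.value = value
        self.sensitivity = sensitivity

    def __add__(self, other):
        if isinstance(other, Sensitive):
            # Sensitivities add when two sensitive values are added.
            return Sensitive(self.value + other.value,
                             self.sensitivity + other.sensitivity)
        return Sensitive(self.value + other, self.sensitivity)  # constants are free

    def scale(self, c):
        # Scaling a value scales its sensitivity by |c|.
        return Sensitive(self.value * c, self.sensitivity * abs(c))

x = Sensitive(3.0, sensitivity=1.0)
y = (x + x).scale(2.0)
assert y.value == 12.0 and y.sensitivity == 4.0
```

The tracked sensitivity is exactly what a mechanism needs to calibrate its noise, which is why dynamic analyses of this shape compose naturally with DP releases.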
Privacy-Preserving Bandits
TLDR
Comparisons of the proposed Privacy-Preserving Bandits system with a non-private, as well as a fully-private (local) system, show competitive performance on both synthetic benchmarks and real-world data, suggesting P2B is an effective approach to challenges arising in on-device privacy-preserving personalization.
Privacy Budget Scheduling
TLDR
PrivateKube is described, an extension to the popular Kubernetes datacenter orchestrator that adds privacy as a new type of resource to be managed alongside other traditional compute resources, such as CPU, GPU, and memory.
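PrivateKube's central idea, managing privacy budget the way an orchestrator manages CPU or memory, can be sketched with a toy allocator (all names here are hypothetical; PrivateKube's actual interface differs):

```python
class PrivacyBudget:
    """Toy privacy-budget allocator: epsilon is a consumable resource."""
    def __init__(self, epsilon_total):
        self.remaining = epsilon_total

    def try_allocate(self, epsilon_request):
        """Grant a pipeline's request only if budget remains. Unlike CPU or
        memory, spent privacy budget is never returned to the pool."""
        if epsilon_request <= self.remaining:
            self.remaining -= epsilon_request
            return True
        return False

budget = PrivacyBudget(epsilon_total=1.0)
assert budget.try_allocate(0.4)       # granted
assert budget.try_allocate(0.4)       # granted
assert not budget.try_allocate(0.4)   # denied: roughly 0.2 remains
```

A real scheduler must additionally handle fairness across competing pipelines and (ε, δ) composition accounting, which is where most of the system-level complexity lies.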
Solo: Enforcing Differential Privacy Without Fancy Types
TLDR
This work proposes a new type system that enforces differential privacy, avoids the use of linear and relational refinement types, and can be easily embedded in mainstream richly typed programming languages such as Scala, OCaml and Haskell.
Dopamine: Differentially Private Secure Federated Learning on Medical Data (The Second AAAI Workshop on Privacy-Preserving Artificial Intelligence, PPAI-21)
TLDR
Dopamine is a system for training DNNs on distributed medical data that employs federated learning with differentially private stochastic gradient descent; in combination with secure multi-party aggregation, it establishes a better privacy-utility trade-off than existing approaches.
Differential Privacy for Black-Box Statistical Analyses
We formalize a notion of a privacy wrapper, defined as an algorithm that can take an arbitrary and untrusted script and produce an output with differential privacy guarantees. Our novel privacy…
High-Dimensional Differentially-Private EM Algorithm: Methods and Near-Optimal Statistical Guarantees
TLDR
A general framework for designing differentially private expectation-maximization algorithms in high-dimensional latent variable models, based on noisy iterative hard-thresholding, is developed, and a near rate-optimal EM algorithm with differential privacy guarantees is proposed.
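The hard-thresholding step at the heart of such algorithms is easy to illustrate: perturb each coordinate, then keep only the k largest entries so that sparsity is preserved. This sketch uses an arbitrary noise scale for illustration, not the calibrated mechanism or EM iteration from the paper:

```python
import math
import random

def laplace(scale, rng):
    """Draw one Laplace(0, scale) sample via the inverse CDF."""
    u = rng.random() - 0.5
    return -scale * math.copysign(math.log(1.0 - 2.0 * abs(u)), u)

def noisy_hard_threshold(v, k, scale, rng):
    """Perturb every coordinate with Laplace noise, then zero out all but
    the k largest-magnitude entries."""
    noisy = [x + laplace(scale, rng) for x in v]
    keep = set(sorted(range(len(v)), key=lambda i: -abs(noisy[i]))[:k])
    return [noisy[i] if i in keep else 0.0 for i in range(len(v))]

# On a clearly sparse vector, the support survives thresholding:
v = [10.0, 0.0, 0.0, -9.0, 0.0]
sparse = noisy_hard_threshold(v, k=2, scale=0.01, rng=random.Random(1))
assert [i for i, x in enumerate(sparse) if x != 0.0] == [0, 3]
```

Keeping only k coordinates is what makes the approach viable in high dimensions: the noise that must be paid for privacy then scales with the sparsity k rather than the ambient dimension.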
Differentially Private Histograms under Continual Observation: Streaming Selection into the Unknown
TLDR
A meta-algorithm is presented that can use existing one-shot top-k private algorithms as a subroutine to continuously release DP histograms from a stream, and more practical DP algorithms for two settings: continuously releasing the top-k counts from a histogram over a known domain when an event can consist of an arbitrary number of items.
The limits of differential privacy (and its misuse in data release and machine learning)
TLDR
Differential privacy is not a silver bullet for all privacy problems, but it can be a step forward in the right direction.

References

Showing 1–10 of 77 references
Differential Privacy: An Economic Method for Choosing Epsilon
TLDR
A simple model is proposed that expresses the role of the differential privacy parameter in concrete applications as a formula over a handful of parameters, and is used to choose ε for a series of simple statistical studies.
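The economic intuition can be sketched numerically: joining an ε-DP study multiplies a participant's expected cost by at most e^ε, so an acceptable cost increase pins down the largest tolerable ε. This is a simplified reading of that style of model, with illustrative names:

```python
import math

def max_epsilon(base_cost, acceptable_cost):
    """Joining an epsilon-DP study multiplies a participant's expected cost
    by at most exp(epsilon); invert that bound to get the largest epsilon
    keeping the expected cost below an acceptable threshold."""
    return math.log(acceptable_cost / base_cost)

# A participant who tolerates a 10% increase in expected cost:
eps = max_epsilon(1.0, 1.1)
assert 0.095 < eps < 0.096  # about ln(1.1)
```

The appeal of such a model is that it turns "how much privacy is enough?" into a question about quantities an analyst can at least attempt to estimate.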
GUPT: privacy preserving data analysis made easy
TLDR
The design and evaluation of a new system, GUPT, that guarantees differential privacy to programs not developed with privacy in mind, makes no trust assumptions about the analysis program, and is secure to all known classes of side-channel attacks.
A framework for adaptive differential privacy
TLDR
An interpreter for Adaptive Fuzz is described and results from two case studies demonstrating its effectiveness for implementing common statistical algorithms over real data sets are reported.
Differential Privacy by Typing in Security Protocols
  • F. Eigner, Matteo Maffei
  • Computer Science, Mathematics
    2013 IEEE 26th Computer Security Foundations Symposium
  • 2013
TLDR
A symbolic definition of differential privacy for distributed databases is proposed, which takes into account Dolev-Yao intruders and can be used to reason about compromised parties, and a linear, distance-aware type system is developed to statically and automatically enforce distributed differential privacy in cryptographic protocol implementations.
Sampling and partitioning for differential privacy
TLDR
This paper demonstrates an attack on PINQ (McSherry, SIGMOD 2009), one of these tools, relying on the difference between its internal mechanics and the formal theory for the sampling operation, and studies a range of sampling methods, showing how they can be correctly implemented in a system for differential privacy.
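Correct sampling-based implementations lean on the standard amplification-by-subsampling bound, which is easy to state in code (a sketch of the textbook bound, not PINQ's internals):

```python
import math

def amplified_epsilon(epsilon, q):
    """An epsilon-DP mechanism run on a random q-fraction subsample of the
    data satisfies log(1 + q*(exp(epsilon) - 1))-DP."""
    return math.log1p(q * math.expm1(epsilon))

# Subsampling shrinks the effective epsilon, by roughly the factor q
# when epsilon is small:
assert amplified_epsilon(1.0, 0.1) < 1.0
assert abs(amplified_epsilon(0.01, 0.1) - 0.001) < 1e-4
```

The attack described in the paper exploits precisely the gap that opens up when an implementation's sampling procedure does not match the sampling model this bound assumes.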
Property Testing For Differential Privacy
  • A. Gilbert, Audra McMillan
  • Computer Science, Mathematics
    2018 56th Annual Allerton Conference on Communication, Control, and Computing (Allerton)
  • 2018
TLDR
It is shown that any privacy guarantee that can be efficiently verified is also efficiently breakable, in the sense that there exist two databases that can be efficiently distinguished.
The Algorithmic Foundations of Differential Privacy
TLDR
The preponderance of this monograph is devoted to fundamental techniques for achieving differential privacy, and application of these techniques in creative combinations, using the query-release problem as an ongoing example.
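One of the monograph's fundamental techniques, the exponential mechanism, can be sketched in a few lines (a minimal version; the candidate set, scores, and parameters below are illustrative):

```python
import math
import random

def exponential_mechanism(candidates, score, sensitivity, epsilon, rng):
    """Select a candidate with probability proportional to
    exp(epsilon * score / (2 * sensitivity))."""
    weights = [math.exp(epsilon * score(c) / (2.0 * sensitivity))
               for c in candidates]
    r = rng.random() * sum(weights)
    acc = 0.0
    for c, w in zip(candidates, weights):
        acc += w
        if r <= acc:
            return c
    return candidates[-1]  # guard against floating-point round-off

# Privately pick the most common item from a small histogram:
hist = {"a": 50, "b": 3, "c": 1}
rng = random.Random(0)
picks = [exponential_mechanism(list(hist), hist.get, 1.0, 2.0, rng)
         for _ in range(200)]
assert picks.count("a") > 190  # "a" wins with overwhelming probability
```

Unlike the Laplace mechanism, this works for arbitrary (even non-numeric) outputs, which is why it recurs throughout the query-release material.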
Probabilistic Relational Reasoning for Differential Privacy
TLDR
The central component of CertiPriv is a quantitative extension of probabilistic relational Hoare logic that enables one to derive differential privacy guarantees for programs from first principles, and provides the first machine-checked proofs of correctness of the Laplacian, Gaussian, and exponential mechanisms and of the privacy of randomized and streaming algorithms from the literature.
Linear dependent types for differential privacy
TLDR
DFuzz is presented, an extension of Fuzz with a combination of linear indexed types and lightweight dependent types that allows a richer sensitivity analysis that is able to certify a larger class of queries as differentially private, including ones whose sensitivity depends on runtime information.
EKTELO: A Framework for Defining Differentially-Private Computations
TLDR
This work proposes a novel programming framework and system, Ektelo, for implementing both existing and new privacy algorithms, and shows that nearly all existing algorithms can be composed from operators, each conforming to one of a small number of operator classes.