Making the Most of Parallel Composition in Differential Privacy

@article{Smith2021MakingTM,
  title={Making the Most of Parallel Composition in Differential Privacy},
  author={Joshua Smith and Hassan Jameel Asghar and Gianpaolo Gioiosa and Sirine Mrabet and Serge Gaspers and Paul Tyler},
  journal={Proceedings on Privacy Enhancing Technologies},
  year={2021},
  volume={2022},
  pages={253--273}
}
Abstract

We show that the ‘optimal’ use of the parallel composition theorem corresponds to finding the size of the largest subset of queries that ‘overlap’ on the data domain, a quantity we call the maximum overlap of the queries. It has previously been shown that a certain instance of this problem, formulated in terms of determining the sensitivity of the queries, is NP-hard, but also that it is possible to use graph-theoretic algorithms, such as finding the maximum clique, to approximate…
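To make the abstract's idea concrete, here is a minimal sketch (not the paper's code; the query sets and epsilon value below are hypothetical): for queries that each touch a subset of a partitioned data domain, sequential composition charges ε per query, while parallel composition charges only for queries that overlap on the same partition, so the total privacy cost scales with the maximum overlap.

```python
def max_overlap(query_domains):
    """Largest number of queries that touch any single domain element."""
    counts = {}
    for dom in query_domains:
        for x in dom:
            counts[x] = counts.get(x, 0) + 1
    return max(counts.values(), default=0)

# Hypothetical queries, each described by the set of partitions it reads.
queries = [{0, 1}, {1, 2}, {3}, {4, 5}]
eps = 0.1

sequential_cost = eps * len(queries)        # naive: every query is charged
parallel_cost = eps * max_overlap(queries)  # only overlapping queries add up
```

Here at most two queries share a partition (partition 1), so the parallel-composition cost is 2ε rather than the naive 4ε.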


References

Showing 1–10 of 43 references
Sensitivity Analysis for Non-Interactive Differential Privacy: Bounds and Efficient Algorithms
This work proposes upper bounds on sensitivity that are tighter than those in previous work, and introduces methods that build a graph model based on a query set, such that implementing the aforementioned bounds can be achieved by solving two well-known clique problems.
The Complexity of Computing the Optimal Composition of Differential Privacy
This work shows that computing the optimal composition in general is #P-complete and, since exact computation is infeasible unless FP=#P, gives an approximation algorithm that computes the composition to arbitrary accuracy in polynomial time.
Analyzing Graphs with Node Differential Privacy
A generic, efficient reduction is derived that allows any differentially private algorithm for bounded-degree graphs to be applied to an arbitrary graph, based on analyzing the smooth sensitivity of the ‘naive’ truncation that simply discards nodes of high degree.
A Multiplicative Weights Mechanism for Privacy-Preserving Data Analysis
A new differentially private multiplicative weights mechanism is proposed for answering a large number of interactive counting (or linear) queries that arrive online and may be adaptively chosen. It is shown that when the input database is drawn from a smooth distribution — a distribution that does not place too much weight on any single data item — accuracy remains as above and the running time becomes poly-logarithmic in the size of the data universe.
The Composition Theorem for Differential Privacy
This paper proves an upper bound on the overall privacy level and constructs a sequence of privatization mechanisms that achieves this bound, by introducing an operational interpretation of differential privacy and using a data processing inequality.
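For context, the earlier, widely used advanced-composition bound of Dwork, Rothblum, and Vadhan (a sketch for comparison, not the optimal bound this reference derives) can be computed directly; the parameter values below are illustrative:

```python
import math

def advanced_composition(eps, k, delta_prime):
    """eps' such that k adaptive eps-DP steps are
    (eps', k*delta + delta_prime)-DP under advanced composition."""
    return (eps * math.sqrt(2 * k * math.log(1 / delta_prime))
            + k * eps * (math.exp(eps) - 1))

# For many small-eps steps this beats naive summation (k * eps).
eps_prime = advanced_composition(0.01, 100, 1e-6)  # < 100 * 0.01
```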
The Algorithmic Foundations of Differential Privacy
The preponderance of this monograph is devoted to fundamental techniques for achieving differential privacy and the application of these techniques in creative combinations, using the query-release problem as an ongoing example.
EKTELO: A Framework for Defining Differentially-Private Computations
This work proposes a novel programming framework and system, Ektelo, for implementing both existing and new privacy algorithms, and shows that nearly all existing algorithms can be composed from operators, each conforming to one of a small number of operator classes.
Optimizing error of high-dimensional statistical queries under differential privacy
HDMM is proposed, a new differentially private algorithm for answering a workload of predicate counting queries, that is especially effective for higher-dimensional datasets and can efficiently answer queries with lower error than state-of-the-art techniques on a variety of low- and high-dimensional datasets.
Gaussian Differential Privacy
A new relaxation of differential privacy, ‘Gaussian differential privacy’ (GDP), is proposed; defined in terms of hypothesis testing between two shifted Gaussians, it has a number of appealing properties and, in particular, avoids difficulties associated with divergence-based relaxations.
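The "testing two shifted Gaussians" idea can be illustrated via GDP's trade-off function G_μ(α) = Φ(Φ⁻¹(1 − α) − μ): the type II error of the optimal test between N(0, 1) and N(μ, 1) at type I error α (a sketch based on the published definition; the function name is mine):

```python
from statistics import NormalDist

def gdp_tradeoff(mu, alpha):
    """Type II error of the optimal test between N(0,1) and N(mu,1)
    at type I error alpha; mu = 0 gives blind guessing, 1 - alpha."""
    std = NormalDist()
    return std.cdf(std.inv_cdf(1 - alpha) - mu)

# Larger mu means weaker privacy: the adversary's type II error shrinks
# below the perfect-privacy baseline of 1 - alpha.
```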
Output perturbation with query relaxation
TLDR
This work considers the problem of constructing a statistical database using output perturbation, which protects privacy by injecting a small noise into each query result, and develops a new technique that enforces e-different privacy with economical cost. Expand