Amplification by Shuffling: From Local to Central Differential Privacy via Anonymity

@inproceedings{Erlingsson2019AmplificationBS,
  title={Amplification by Shuffling: From Local to Central Differential Privacy via Anonymity},
  author={{\'U}lfar Erlingsson and Vitaly Feldman and Ilya Mironov and Ananth Raghunathan and Kunal Talwar and Abhradeep Thakurta},
  booktitle={SODA},
  year={2019}
}
Sensitive statistics are often collected across sets of users, with repeated collection of reports done over time. [...] Key Result: As a practical corollary, our results imply that several LDP-based industrial deployments may have much lower privacy cost than their advertised ε would indicate, at least if reports are anonymized.
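The setting can be illustrated with a toy sketch (my own illustration under simple assumptions, not the paper's actual protocol): each user locally randomizes one private bit with ε0-randomized response, and an anonymizing shuffler severs the link between users and reports before the analyst sees them.

```python
import math
import random

def randomized_response(bit, eps0):
    """eps0-LDP randomized response: keep the true bit with
    probability e^eps0 / (e^eps0 + 1), otherwise flip it."""
    p = math.exp(eps0) / (math.exp(eps0) + 1.0)
    return bit if random.random() < p else 1 - bit

def shuffle_reports(reports):
    """The anonymizing shuffler: sever the user-to-report link
    by forwarding the reports in a uniformly random order."""
    shuffled = list(reports)
    random.shuffle(shuffled)
    return shuffled

# Each of n users submits one locally randomized bit; the analyst
# only ever sees the anonymized, shuffled multiset of reports.
true_bits = [random.randint(0, 1) for _ in range(1000)]
reports = shuffle_reports(randomized_response(b, eps0=1.0) for b in true_bits)
```

The amplification result says the analyst's view of this shuffled multiset satisfies a much stronger central guarantee than the per-user ε0 suggests.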
Distributed Differential Privacy via Shuffling
TLDR
Evidence is given that the power of the shuffled model lies strictly between those of the central and local models: for a natural restriction of the model, it is shown that shuffled protocols for a widely studied selection problem require exponentially higher sample complexity than do central-model protocols.
Privacy Profiles and Amplification by Subsampling
TLDR
The privacy profiles machinery is applied to study the so-called "privacy amplification by subsampling" principle, which ensures that a differentially private mechanism run on a random subsample of a population provides higher privacy guarantees than when run on the entire population.
Manipulation Attacks in Local Differential Privacy
TLDR
It is shown that any non-interactive locally differentially private protocol can be manipulated to a much greater extent when the privacy level is high or the input domain is large, and the importance of efficient cryptographic techniques for emulating mechanisms from central differential privacy in distributed settings is reinforced.
Local Differential Privacy for Evolving Data
TLDR
A new technique for local differential privacy is introduced that makes it possible to maintain up-to-date statistics over time, with privacy guarantees that degrade only in the number of changes in the underlying distribution rather than the number of collection periods.
POSTER: Data Collection via Local Differential Privacy with Secret Parameters
TLDR
This paper studies how the privacy level and utility change in a new privacy model, Parameter Blending Privacy, in which data providers keep their privacy parameter secret, and concludes that this amplifies the privacy level with a small loss of utility, improving the utility-privacy trade-off.
Hiding Numerical Vectors in Local Private and Shuffled Messages
  • Shaowei Wang, Jin Li, Yuqiu Qian, Jiachun Du, Wenqing Lin, Wei Yang
  • Computer Science
  • IJCAI
  • 2021
Numerical vector aggregation has numerous applications in privacy-sensitive scenarios, such as distributed gradient estimation in federated learning, and statistical analysis on key-value data.
On the Rényi Differential Privacy of the Shuffle Model
TLDR
The principal result in this paper is the first non-trivial RDP guarantee for general discrete local randomization mechanisms in the shuffled privacy model, and new analysis techniques for deriving the results which could be of independent interest.
Rényi Differential Privacy of the Subsampled Shuffle Model in Distributed Learning
TLDR
A privacy-optimization performance trade-off for discrete randomization mechanisms in this sub-sampled shuffle privacy model enables significant improvement in privacy guarantee over the state-of-the-art approximate Differential Privacy (DP) guarantee (with strong composition) for sub-sampled shuffled models.
Hiding Among the Clones: A Simple and Nearly Optimal Analysis of Privacy Amplification by Shuffling
TLDR
This work shows that an ε0-locally differentially private algorithm, under shuffling with n users, amplifies to a (Θ((1 − e^−ε0) · √(e^ε0 · log(1/δ) / n)), δ)-central differential privacy guarantee, which significantly improves over previous work and achieves the asymptotically optimal dependence on ε0.
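Read as a formula, the bound above is easy to evaluate numerically; a minimal sketch (the constant hidden in Θ(·) is dropped, so the numbers only show the asymptotic shape, not a certified guarantee):

```python
import math

def amplified_eps(eps0, n, delta):
    """Asymptotic shape of the shuffled-model bound, constants in
    Theta(.) suppressed:
    eps ~ (1 - e^{-eps0}) * sqrt(e^{eps0} * log(1/delta) / n)."""
    return (1.0 - math.exp(-eps0)) * math.sqrt(
        math.exp(eps0) * math.log(1.0 / delta) / n
    )

# A local guarantee of eps0 = 1 shuffled over a million users
# amplifies to a far smaller central epsilon.
print(amplified_eps(1.0, n=10**6, delta=1e-6))
```

Note how the central ε shrinks like 1/√n, which is the source of the "amplification" as the user population grows.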
Estimating Numerical Distributions under Local Differential Privacy
TLDR
This work introduces a new reporting mechanism, called the square wave (SW) mechanism, which exploits the numerical nature in reporting and develops an Expectation Maximization with Smoothing algorithm, which is applied to aggregated histograms from the SW mechanism to estimate the original distributions.

References

Showing 1-10 of 42 references
Differential privacy under continual observation
TLDR
This work identifies the problem of maintaining a counter in a privacy preserving manner and shows its wide applicability to many different problems.
Distributed Differential Privacy via Mixnets
TLDR
A mixnet model for distributed differentially private algorithms, which lies between the local and central models, is proposed and it is shown that mixnet protocols for a widely studied selection problem require exponentially higher sample complexity than do central-model protocols.
Local Differential Privacy for Evolving Data
TLDR
A new technique for local differential privacy is introduced that makes it possible to maintain up-to-date statistics over time, with privacy guarantees that degrade only in the number of changes in the underlying distribution rather than the number of collection periods.
The Algorithmic Foundations of Differential Privacy
TLDR
The preponderance of this monograph is devoted to fundamental techniques for achieving differential privacy, and application of these techniques in creative combinations, using the query-release problem as an ongoing example.
Our Data, Ourselves: Privacy Via Distributed Noise Generation
TLDR
This work provides efficient distributed protocols for generating shares of random noise, secure against malicious participants, and introduces a technique for distributing shares of many unbiased coins with fewer executions of verifiable secret sharing than would be needed using previous approaches.
Privacy Loss in Apple's Implementation of Differential Privacy on MacOS 10.12
TLDR
It is found that although Apple's deployment ensures that the (differential) privacy loss per each datum submitted to its servers is 1 or 2, the overall privacy loss permitted by the system is significantly higher, as high as 16 per day for the four initially announced applications of Emojis, New words, Deeplinks and Lookup Hints.
Local, Private, Efficient Protocols for Succinct Histograms
TLDR
Efficient protocols and matching accuracy lower bounds for frequency estimation in the local model for differential privacy are given and it is shown that each user need only send 1 bit to the server in a model with public coins.
Locally Differentially Private Protocols for Frequency Estimation
TLDR
This paper introduces a framework that generalizes several LDP protocols proposed in the literature and yields a simple and fast aggregation algorithm, whose accuracy can be precisely analyzed, resulting in two new protocols that provide better utility than protocols previously proposed.
Privacy Amplification by Subsampling: Tight Analyses via Couplings and Divergences
TLDR
This paper presents a general method that recovers and improves prior analyses, yields lower bounds and derives new instances of privacy amplification by subsampling, which leverages a characterization of differential privacy as a divergence which emerged in the program verification community.
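The phenomenon this paper tightens can be stated via the classical amplification-by-subsampling bound for Poisson subsampling, ε' = log(1 + q·(e^ε − 1)); a small sketch of that textbook formula (the paper's own analyses are sharper and more general):

```python
import math

def subsampled_eps(eps, q):
    """Classical amplification-by-subsampling bound for Poisson
    subsampling at rate q: eps' = log(1 + q * (e^eps - 1))."""
    return math.log(1.0 + q * (math.exp(eps) - 1.0))

# Running an eps = 1 mechanism on a 1% subsample of the data yields
# a much stronger guarantee with respect to the full population.
print(subsampled_eps(1.0, q=0.01))
```

At q = 1 (no subsampling) the bound collapses back to the original ε, as it should.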
Calibrating Noise to Sensitivity in Private Data Analysis
TLDR
The study is extended to general functions f, proving that privacy can be preserved by calibrating the standard deviation of the noise according to the sensitivity of the function f, which is the amount that any single argument to f can change its output.
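The mechanism described is the Laplace mechanism; a minimal sketch (sampling the Laplace variate as the difference of two exponentials, one standard construction): release f(x) plus noise of scale sensitivity/ε.

```python
import random

def laplace_mechanism(true_value, sensitivity, eps):
    """Release true_value + Lap(sensitivity / eps): the noise scale is
    calibrated to how much one individual's data can change f."""
    scale = sensitivity / eps
    # A Laplace(scale) variate is the difference of two iid Exp(scale) variates.
    noise = random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)
    return true_value + noise

# A counting query has sensitivity 1; release it with eps = 0.5.
print(laplace_mechanism(1234, sensitivity=1.0, eps=0.5))
```

Larger sensitivity or smaller ε both widen the noise, which is exactly the calibration the paper proves sufficient for privacy.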