The Power of the Hybrid Model for Mean Estimation

@article{Dubey2020ThePO,
  title={The Power of the Hybrid Model for Mean Estimation},
  author={Yatharth Dubey and Aleksandra Korolova},
  journal={Proceedings on Privacy Enhancing Technologies},
  year={2020},
  volume={2020},
  pages={48--68}
}
Abstract
We explore the power of the hybrid model of differential privacy (DP), in which some users desire the guarantees of the local model of DP and others are content with receiving the trusted-curator model guarantees. In particular, we study the utility of hybrid model estimators that compute the mean of arbitrary real-valued distributions with bounded support. When the curator knows the distribution's variance, we design a hybrid estimator that, for realistic datasets and parameter…
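(The paper's own estimators are not reproduced in this snippet; the following Python sketch only illustrates the general hybrid-model idea under assumed conditions: data supported on [0, 1], a Laplace-based central estimate for the opt-in users, a Laplace-based local randomizer for the rest, and a simple inverse-variance combination. All names are illustrative.)

import numpy as np

def central_dp_mean(x, eps, rng):
    """Trusted-curator estimate: mean of opt-in users' data in [0, 1]
    plus Laplace noise calibrated to the mean's sensitivity 1/n."""
    n = len(x)
    return np.mean(x) + rng.laplace(scale=1.0 / (n * eps))

def local_dp_mean(x, eps, rng):
    """Local-model estimate: each user perturbs their own value in [0, 1]
    with Laplace noise of scale 1/eps before reporting; the curator averages."""
    reports = x + rng.laplace(scale=1.0 / eps, size=len(x))
    return np.mean(reports)

def hybrid_mean(x_opt_in, x_local, eps, rng):
    """Combine the two eps-DP estimates by inverse-variance weighting,
    using the known noise variances as rough proxies for estimator variance."""
    m, n = len(x_opt_in), len(x_local)
    var_central = 2.0 / (m * eps) ** 2       # variance of Laplace(1/(m*eps))
    var_local = (2.0 / eps ** 2) / n         # average of n Laplace(1/eps) noise terms
    w_central = (1.0 / var_central) / (1.0 / var_central + 1.0 / var_local)
    est_c = central_dp_mean(x_opt_in, eps, rng)
    est_l = local_dp_mean(x_local, eps, rng)
    return w_central * est_c + (1.0 - w_central) * est_l

rng = np.random.default_rng(0)
data = rng.beta(2, 5, size=10_000)           # bounded support in [0, 1]
opt_in, local = data[:1_000], data[1_000:]   # a small fraction trusts the curator
print(hybrid_mean(opt_in, local, eps=1.0, rng=rng))

The weighting reflects the usual intuition behind the hybrid model: a small opt-in group with curator-level noise can carry most of the accuracy despite being far smaller than the local-model group.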

Citations

A Private and Computationally-Efficient Estimator for Unbounded Gaussians
TLDR
The primary new technical tool in the algorithm is a new differentially private preconditioner that takes samples from an arbitrary Gaussian N(0, Σ) and returns a matrix A such that A Σ A^T has constant condition number.
Advances and Open Problems in Federated Learning
TLDR
Motivated by the explosive growth in FL research, this paper discusses recent advances and presents an extensive collection of open problems and challenges.
Efficient Mean Estimation with Pure Differential Privacy via a Sum-of-Squares Exponential Mechanism
TLDR
This work gives the first polynomial-time algorithm to estimate the mean of a d-variate probability distribution with bounded covariance from Õ(d) independent samples subject to pure differential privacy, and proves a meta-theorem capturing this phenomenon.

References

SHOWING 1-10 OF 70 REFERENCES
Minimax Optimal Procedures for Locally Private Estimation
TLDR
Private versions of classical information-theoretical bounds, in particular those due to Le Cam, Fano, and Assouad, are developed to allow for a precise characterization of statistical rates under local privacy constraints and the development of provably (minimax) optimal estimation procedures.
The power of synergy in differential privacy: Combining a small curator with local randomizers
TLDR
The theoretical study is initiated of a hybrid model, introduced by "Blender", in which differentially private protocols of n agents working in the local model are assisted by a differentially private curator that has access to the data of m additional users.
Average-Case Averages: Private Algorithms for Smooth Sensitivity and Mean Estimation
TLDR
The trimmed mean estimator is proposed, which interpolates between the mean and the median, as a way of attaining much lower sensitivity on average while losing very little in terms of statistical accuracy in the mean estimation problem.
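(Not the cited paper's private estimator; just a minimal non-private trimmed mean for reference, with illustrative names, showing how it interpolates between the mean and the median.)

import numpy as np

def trimmed_mean(x, trim_frac):
    """Drop the lowest and highest trim_frac fraction of the sorted sample and
    average the rest; trim_frac=0 gives the mean, trim_frac near 0.5 the median."""
    x = np.sort(np.asarray(x, dtype=float))
    k = int(trim_frac * len(x))
    core = x[k:len(x) - k] if k > 0 else x
    return core.mean()

print(trimmed_mean([0.1, 0.2, 0.25, 0.3, 9.0], trim_frac=0.2))  # robust to the outlier 9.0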
Locally Private Mean Estimation: Z-test and Tight Confidence Intervals
This work provides tight upper- and lower-bounds for the problem of mean estimation under differential privacy in the local model, when the input is composed of n i.i.d. samples drawn from a…
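(The cited work's mechanisms and bounds are not reproduced here; below is only a back-of-the-envelope Python sketch of a normal-approximation confidence interval for a naive local-model estimator in which each user adds Laplace(1/ε) noise to a value in [0, 1]. The names and the calibration choices are assumptions of the sketch, not taken from the paper.)

import numpy as np
from scipy.stats import norm

def local_mean_with_ci(x, eps, alpha, rng):
    """Naive eps-local-DP mean of data in [0, 1] with an approximate
    (1 - alpha) confidence interval via the central limit theorem."""
    n = len(x)
    reports = x + rng.laplace(scale=1.0 / eps, size=n)
    est = reports.mean()
    # Variance of one report: data variance (at most 1/4 on [0, 1]) plus Laplace variance 2/eps^2.
    var_report = 0.25 + 2.0 / eps ** 2
    half_width = norm.ppf(1 - alpha / 2) * np.sqrt(var_report / n)
    return est, (est - half_width, est + half_width)

rng = np.random.default_rng(1)
x = rng.uniform(0, 1, size=50_000)
print(local_mean_with_ci(x, eps=1.0, alpha=0.05, rng=rng))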
Locally Private Gaussian Estimation
TLDR
This work provides both adaptive two-round and nonadaptive one-round solutions for locally private Gaussian estimation and partially matches these upper bounds with an information-theoretic lower bound.
Extremal Mechanisms for Local Differential Privacy
TLDR
It is shown that for all information-theoretic utility functions studied in this paper, maximizing utility is equivalent to solving a linear program, the outcome of which is the optimal staircase mechanism, which is universally optimal in the high and low privacy regimes.
CoinPress: Practical Private Mean and Covariance Estimation
TLDR
This work presents simple differentially private estimators for the mean and covariance of multivariate sub-Gaussian data that are accurate at small sample sizes and shows that their asymptotic error rates match the state-of-the-art theoretical bounds.
Differentially Private k-Means Clustering with Guaranteed Convergence
TLDR
A novel differentially private clustering framework for the interactive setting is proposed, which ensures convergence by injecting DP noise in a selected area to control the orientation of the centroids' movement across iterations.
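(The convergence-guaranteed framework proposed in the cited paper is not shown here; the following is only a sketch of the classic noisy-Lloyd approach to differentially private k-means, with Laplace noise on per-cluster counts and coordinate sums, data assumed to lie in [0, 1]^d, and illustrative names.)

import numpy as np

def dp_kmeans(x, k, iters, eps, rng):
    """Noisy Lloyd iterations: in each iteration, add Laplace noise to every
    cluster's point count and coordinate sums before recomputing centroids.
    The per-iteration budget eps/iters is split evenly between counts and sums."""
    n, d = x.shape
    eps_iter = eps / iters
    centroids = x[rng.choice(n, size=k, replace=False)]
    for _ in range(iters):
        # Assign each point to its nearest centroid.
        labels = np.argmin(((x[:, None, :] - centroids[None, :, :]) ** 2).sum(-1), axis=1)
        for j in range(k):
            members = x[labels == j]
            # Adding/removing one point changes a count by 1 and a sum by at most 1 per coordinate.
            noisy_count = len(members) + rng.laplace(scale=2.0 / eps_iter)
            noisy_sum = members.sum(axis=0) + rng.laplace(scale=2.0 * d / eps_iter, size=d)
            centroids[j] = np.clip(noisy_sum / max(noisy_count, 1.0), 0.0, 1.0)
    return centroids

rng = np.random.default_rng(2)
x = np.vstack([rng.normal(0.25, 0.05, (500, 2)), rng.normal(0.75, 0.05, (500, 2))]).clip(0, 1)
print(dp_kmeans(x, k=2, iters=5, eps=2.0, rng=rng))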
Smooth sensitivity and sampling in private data analysis
TLDR
This is the first formal analysis of the effect of instance-based noise in the context of data privacy, and it shows how to compute smooth sensitivity efficiently for several different functions, including the median and the cost of the minimum spanning tree.
Calibrating Noise to Sensitivity in Private Data Analysis
TLDR
The study is extended to general functions f, proving that privacy can be preserved by calibrating the standard deviation of the noise according to the sensitivity of the function f, which is the amount that any single argument to f can change its output.
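(A minimal sketch of noise calibrated to global sensitivity in this style, assuming a real-valued query with known global sensitivity and Laplace noise of scale sensitivity/ε; the helper name is illustrative.)

import numpy as np

def laplace_mechanism(true_answer, sensitivity, eps, rng):
    """Release the query answer with Laplace noise of scale sensitivity / eps,
    which gives eps-differential privacy for a query with that global sensitivity."""
    return true_answer + rng.laplace(scale=sensitivity / eps)

# Example: the mean of n values in [0, 1] has global sensitivity 1/n.
rng = np.random.default_rng(3)
data = rng.uniform(0, 1, size=1_000)
print(laplace_mechanism(data.mean(), sensitivity=1.0 / len(data), eps=0.5, rng=rng))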