Corpus ID: 235266257

Instance-optimal Mean Estimation Under Differential Privacy

Ziyue Huang, Yuting Liang, Ke Yi
Mean estimation under differential privacy is a fundamental problem, but worst-case optimal mechanisms do not offer meaningful utility guarantees in practice when the global sensitivity is very large. Instead, various heuristics have been proposed to reduce the error on real-world data that do not resemble the worst-case instance. This paper takes a principled approach, yielding a mechanism that is instance-optimal in a strong sense. In addition to its theoretical optimality, the mechanism is… 
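The gap the abstract describes — noise calibrated to a huge global sensitivity versus noise scaled to the data at hand — can be seen in a minimal sketch. This is not the paper's mechanism; it is the standard clip-then-Laplace baseline, with an illustrative clipping radius and data distribution chosen here for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def clipped_mean_laplace(x, clip, epsilon, rng=rng):
    """Release a differentially private mean of x: clip each value to
    [-clip, clip], so one record changes the mean by at most 2*clip/n,
    and add Laplace noise calibrated to that (local) sensitivity rather
    than to the worst-case global range of the data."""
    n = len(x)
    clipped = np.clip(x, -clip, clip)
    sensitivity = 2 * clip / n
    return clipped.mean() + rng.laplace(scale=sensitivity / epsilon)

# Data concentrated near 5, even though values could in principle be huge;
# clipping at 10 keeps the noise scale at 2*10/(n*epsilon) instead of
# scaling with the (possibly unbounded) global range.
x = rng.normal(5.0, 1.0, size=10_000)
est = clipped_mean_laplace(x, clip=10.0, epsilon=1.0)
```

Choosing the clip radius well is exactly the hard part on real data; the instance-optimal mechanism of the paper removes that manual tuning.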

Efficient mean estimation with pure differential privacy via a sum-of-squares exponential mechanism

This work gives the first polynomial-time algorithm to estimate the mean of a d-variate probability distribution with bounded covariance from Õ(d) independent samples subject to pure differential privacy, and proves a meta-theorem capturing this phenomenon.

Universal Private Estimators

For certain distribution families, such as Gaussians or heavy-tailed distributions, it is shown that the universal estimators presented match or improve upon existing estimators, which are often specifically designed for the given family and rely on a priori boundedness assumptions on the mean and variance of P.

Differentially Private Covariance Revisited

In this paper, we present two new algorithms for covariance estimation under concentrated differential privacy (zCDP). The first algorithm achieves a Frobenius error of Õ(d^{1/4}·√tr/√n + √d/…

Universal Private Estimators

We present universal estimators for the statistical mean, variance, and scale (in particular, the interquartile range) under pure differential privacy. These estimators are universal in the sense

A Nearly Instance-optimal Differentially Private Mechanism for Conjunctive Queries

This work provides the first DP mechanism for this problem with a fairly strong notion of optimality, which can be considered a natural constant-factor relaxation of instance-optimality.

Covariance-Aware Private Mean Estimation Without Private Covariance Estimation

Two sample-efficient differentially private mean estimators for d-dimensional (sub)Gaussian distributions with unknown covariance are presented; the sample complexity guarantees hold more generally for subgaussian distributions, albeit with a slightly worse dependence on the privacy parameter.

R2T: Instance-optimal Truncation for Differentially Private Query Evaluation with Foreign Keys

This paper proposes the first DP mechanism for answering arbitrary SPJA queries in a database with foreign-key constraints, and shows that it offers order-of-magnitude improvements in terms of utility over existing techniques, even those specifically designed for graph pattern counting.

New Lower Bounds for Private Estimation and a Generalized Fingerprinting Lemma

New lower bounds for statistical estimation tasks under the constraint of (ε, δ)-differential privacy are proved, and a tight Ω(d/(α²ε)) lower bound is shown for estimating the mean of a distribution with bounded covariance to α-error in ℓ₂-distance.

FriendlyCore: Practical Differentially Private Aggregation

Surprisingly, FriendlyCore is lightweight, with no dependence on the dimension; empirically, it boosts the accuracy of mean estimation and of clustering tasks such as k-means and k-GMM, outperforming tailored methods.

A Private and Computationally-Efficient Estimator for Unbounded Gaussians

The primary new technical tool in the algorithm is a new differentially private preconditioner that takes samples from an arbitrary Gaussian N(0, Σ) and returns a matrix A such that AΣAᵀ has constant condition number.



Instance-optimality in differential privacy via approximate inverse sensitivity mechanisms

We study and provide instance-optimal algorithms in differential privacy by extending and approximating the inverse sensitivity mechanism. We provide two approximation frameworks, one which only

Learning with User-Level Privacy

User-level DP protects a user's entire contribution, providing more stringent but more realistic protection against information leaks. It is shown that for high-dimensional mean estimation, empirical risk minimization with smooth losses, stochastic convex optimization, and learning of hypothesis classes with finite metric entropy, the privacy cost decreases as O(1/√m) as users provide more samples.

Local Privacy, Data Processing Inequalities, and Minimax Rates

This work proves bounds on information-theoretic quantities, including mutual information and Kullback-Leibler divergence, that depend on the privacy guarantees, and provides a treatment of several canonical families of problems: mean estimation, parameter estimation in fixed-design regression, multinomial probability estimation, and nonparametric density estimation.

CoinPress: Practical Private Mean and Covariance Estimation

This work presents simple differentially private estimators for the mean and covariance of multivariate sub-Gaussian data that are accurate at small sample sizes and shows that their asymptotic error rates match the state-of-the-art theoretical bounds.

Bounding User Contributions: A Bias-Variance Trade-off in Differential Privacy

It is shown that in general there is a “sweet spot” that depends on measurable properties of the dataset, but that there is also a concrete cost to privacy that cannot be avoided simply by collecting more data.
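The "sweet spot" can be illustrated with a toy simulation (our own construction, not the paper's analysis; the cap values, data distribution, and error metric are all hypothetical): capping each user's contribution bounds the sensitivity of a noisy sum, trading truncation bias against Laplace noise.

```python
import numpy as np

rng = np.random.default_rng(1)

def dp_sum_error(user_counts, cap, epsilon, trials=200, rng=rng):
    """Toy bias-variance trade-off for a DP sum: each user's contribution
    is truncated to `cap`, so the sensitivity is `cap` and Laplace noise
    of scale cap/epsilon is added. Returns mean absolute error vs the
    true (untruncated) sum."""
    true_sum = user_counts.sum()
    truncated = np.minimum(user_counts, cap).sum()  # bias from truncation
    errs = [abs(truncated + rng.laplace(scale=cap / epsilon) - true_sum)
            for _ in range(trials)]
    return float(np.mean(errs))

counts = rng.geometric(0.2, size=500)  # skewed per-user contribution counts
errors = {cap: dp_sum_error(counts, cap, epsilon=1.0)
          for cap in (1, 5, 20, 100)}
# Small caps bias the sum heavily; large caps inject too much noise;
# an intermediate cap minimizes the total error.
```

The best cap depends on the contribution distribution, which is the measurable dataset property the paper refers to.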

Lower Bounds for Locally Private Estimation via Communication Complexity

Lower bounds for estimation under local privacy constraints are developed by showing an equivalence between private estimation and communication-restricted estimation problems, and it is shown that the minimax mean-squared error for estimating the mean of a bounded or Gaussian random vector in $d$ dimensions scales as $\frac{d}{n} \cdot \frac{d}{\min\{\varepsilon, \varepsilon^2\}}$.

Finite Sample Differentially Private Confidence Intervals

These algorithms guarantee finite-sample coverage, as opposed to asymptotic coverage, and lower bounds are proved on the expected size of any differentially private confidence set, showing that the parameters are optimal up to polylogarithmic factors.

Locally Private Mean Estimation: Z-test and Tight Confidence Intervals

This work provides tight upper and lower bounds for the problem of mean estimation under differential privacy in the local model, when the input is composed of n samples drawn i.i.d. from a

Answering Range Queries Under Local Differential Privacy

This work studies the problem of answering 1-dimensional range count queries under local differential privacy (LDP), a framework of differential privacy for privacy-preserving data analysis.

Differentially Private Learning with Adaptive Clipping

It is shown that adaptively setting the clipping norm applied to each user's update, based on a differentially private estimate of a target quantile of the distribution of unclipped norms, is sufficient to remove the need for extensive parameter tuning.
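The quantile-tracking idea behind adaptive clipping can be sketched as follows. This is a toy simulation with made-up parameters, not the exact algorithm of the paper; in particular, the noisy fraction of norms below the clip stands in for the differentially private quantile estimate.

```python
import numpy as np

rng = np.random.default_rng(2)

def adapt_clip(norms_per_round, target_quantile=0.5, lr=0.2, clip=1.0,
               noise_std=0.05, rng=rng):
    """After each round, estimate (noisily) the fraction of update norms
    below the current clip, then nudge the clip toward the target
    quantile with a geometric update: the clip grows while too many
    norms are clipped and shrinks while too few are."""
    history = []
    for norms in norms_per_round:
        frac_below = np.mean(norms <= clip)
        noisy_frac = frac_below + rng.normal(0, noise_std)  # DP-estimate stand-in
        clip *= np.exp(-lr * (noisy_frac - target_quantile))
        history.append(clip)
    return history

# Update norms concentrated around 3: starting from clip=1, the clip
# should drift toward the median norm and hover there.
rounds = [rng.normal(3.0, 0.5, size=100) for _ in range(200)]
clips = adapt_clip(rounds)
```

At the fixed point, roughly a `target_quantile` fraction of updates is left unclipped, which is what removes the need to hand-tune the clipping norm.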