Corpus ID: 235489722

Non-parametric Differentially Private Confidence Intervals for the Median

@article{Drechsler2021NonparametricDP,
  title={Non-parametric Differentially Private Confidence Intervals for the Median},
  author={Joerg Drechsler and Ira Globus-Harris and Audra McMillan and Jayshree Sarathy and Adam D. Smith},
  journal={ArXiv},
  year={2021},
  volume={abs/2106.10333}
}
Differential privacy is a restriction on data-processing algorithms that provides strong confidentiality guarantees for individual records in the data. However, research on proper statistical inference, that is, on correctly quantifying the uncertainty of a (noisy) sample estimate with respect to the true population value, is still limited. This paper proposes and evaluates several strategies for computing valid differentially private confidence intervals for the median…
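One standard building block behind such strategies is the exponential mechanism applied to the median; the sketch below illustrates that general technique, not necessarily the paper's exact algorithm, and the function name `dp_median_exponential` is illustrative. It assumes the data are known to lie in a bounded range.

```python
import numpy as np

def dp_median_exponential(x, epsilon, lower, upper, rng=None):
    """Epsilon-DP median via the exponential mechanism.

    The utility of a candidate value is minus its rank distance from
    the median; the sensitivity of this utility is 1, so each interval
    between order statistics gets weight width * exp(eps * u / 2).
    Assumes data lie in [lower, upper].
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.clip(np.sort(np.asarray(x, dtype=float)), lower, upper)
    n = len(x)
    # Interval endpoints: lower, x_(1), ..., x_(n), upper
    edges = np.concatenate(([lower], x, [upper]))
    widths = np.diff(edges)               # n + 1 intervals
    ranks = np.arange(n + 1)              # number of points below each interval
    utility = -np.abs(ranks - n / 2.0)
    with np.errstate(divide="ignore"):    # zero-width intervals get -inf
        logw = np.log(widths) + epsilon * utility / 2.0
    logw -= logw.max()                    # stabilize before exponentiating
    probs = np.exp(logw)
    probs /= probs.sum()
    i = rng.choice(n + 1, p=probs)
    return rng.uniform(edges[i], edges[i + 1])
```

With a generous privacy budget and a moderately large sample, the released value concentrates tightly around the true sample median.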

Unbiased Statistical Estimation and Valid Confidence Intervals Under Differential Privacy

TLDR
If the user can bound the parameters of the BLB-induced distributions and provide heavier-tailed families, the algorithm produces unbiased parameter estimates and valid confidence intervals that hold with arbitrarily high probability.

Locally private nonparametric confidence intervals and sequences

TLDR
A nonparametric and sequentially interactive generalization of Warner’s famous “randomized response” mechanism is introduced, satisfying local differential privacy (LDP) for arbitrary bounded random variables, and confidence intervals (CIs) and confidence sequences (CSs) are provided for their means given access to the resulting privatized observations.
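Warner's classical randomized response, which the work above generalizes, can be sketched for a single bit as follows; the function names are illustrative, not this paper's API.

```python
import numpy as np

def randomized_response(bit, epsilon, rng=None):
    """Report the true bit with probability e^eps / (e^eps + 1),
    otherwise flip it; this satisfies eps-local differential privacy."""
    rng = np.random.default_rng() if rng is None else rng
    p_truth = np.exp(epsilon) / (np.exp(epsilon) + 1.0)
    return int(bit) if rng.random() < p_truth else 1 - int(bit)

def debiased_proportion(reports, epsilon):
    """Unbiased estimate of the true proportion mu of 1s:
    E[report] = p*mu + (1-p)*(1-mu), solved for mu."""
    p = np.exp(epsilon) / (np.exp(epsilon) + 1.0)
    return (np.mean(reports) - (1.0 - p)) / (2.0 * p - 1.0)
```

The debiasing step inverts the known flipping probability, so the estimator is unbiased, at the cost of variance inflated by a factor of roughly 1/(2p-1)^2.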

Analyzing the Differentially Private Theil-Sen Estimator for Simple Linear Regression

In this paper, we study differentially private point and confidence interval estimators for simple linear regression. Motivated by recent work that highlights the strong empirical performance of an…

References

SHOWING 1-10 OF 53 REFERENCES

General-Purpose Differentially-Private Confidence Intervals

TLDR
This work develops two broadly applicable methods for private confidence-interval construction based on asymptotics and the parametric bootstrap, which applies "out of the box" to a wide class of private estimators and has good coverage at small sample sizes, but with increased computational cost.

Bootstrap Inference and Differential Privacy: Standard Errors for Free

TLDR
For a broad class of functions under zero-concentrated differential privacy (zCDP), the bootstrap can be implemented at no additional privacy cost for a given choice of privacy parameter and the associated expected error of a query.
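For context, a standard way to satisfy ρ-zCDP for a query with sensitivity Δ is Gaussian noise with σ = Δ/√(2ρ); a minimal sketch of that mechanism (the function name is illustrative, not this paper's API):

```python
import numpy as np

def gaussian_mechanism_zcdp(value, sensitivity, rho, rng=None):
    """rho-zCDP release of a sensitivity-`sensitivity` query:
    add Gaussian noise with sigma = sensitivity / sqrt(2 * rho)."""
    rng = np.random.default_rng() if rng is None else rng
    sigma = sensitivity / np.sqrt(2.0 * rho)
    return value + rng.normal(0.0, sigma)
```

Smaller ρ (stronger privacy) forces a larger σ, which directly scales the expected error of the released statistic.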

Privacy-preserving statistical estimation with optimal convergence rates

TLDR
It is shown that for a large class of statistical estimators T and input distributions P, there is a differentially private estimator A_T with the same asymptotic distribution as T, which implies that A_T(X) is essentially as good as the original statistic T(X) for statistical inference, for sufficiently large samples.

Exact Inference with Approximate Computation for Differentially Private Data via Perturbations

TLDR
This paper shows that approximate Bayesian computation, a practical suite of methods for simulating from approximate posterior distributions of complex Bayesian models, produces exact posterior samples when applied to data perturbed for differential privacy.

On the Theory and Practice of Privacy-Preserving Bayesian Data Analysis

TLDR
A simple alternative based on the Laplace mechanism, the workhorse of differential privacy, is asymptotically as efficient as non-private posterior inference under general assumptions, and has practical advantages including efficient use of the privacy budget for MCMC.
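The Laplace mechanism itself is simple; a minimal sketch for an ε-DP mean of data clipped to a known range (the helper name `dp_mean` is illustrative):

```python
import numpy as np

def dp_mean(x, lower, upper, epsilon, rng=None):
    """Epsilon-DP mean of data clipped to [lower, upper].
    The sensitivity of the clipped mean is (upper - lower) / n,
    so Laplace noise with scale sensitivity / epsilon suffices."""
    rng = np.random.default_rng() if rng is None else rng
    x = np.clip(np.asarray(x, dtype=float), lower, upper)
    n = len(x)
    sensitivity = (upper - lower) / n
    return x.mean() + rng.laplace(0.0, sensitivity / epsilon)
```

Because the sensitivity shrinks as 1/n, the added noise becomes negligible for large samples, which is the intuition behind asymptotically efficient private inference.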

Privacy for Free: Posterior Sampling and Stochastic Gradient Monte Carlo

TLDR
It is shown that, under standard assumptions, drawing one sample from a posterior distribution is differentially private "for free"; this sample, as a statistical estimator, is often consistent, near optimal, and computationally tractable; and these observations lead to an "anytime" algorithm for Bayesian learning under privacy constraints.

Statistically Valid Inferences from Privacy Protected Data

TLDR
This work builds on the standard of “differential privacy”; it corrects for biases induced by the privacy-preserving procedures, provides a proper accounting of uncertainty, and imposes minimal constraints on the choice of statistical methods and quantities estimated.

Finite Sample Differentially Private Confidence Intervals

TLDR
These algorithms guarantee finite-sample coverage, as opposed to asymptotic coverage, and lower bounds are proved on the expected size of any differentially private confidence set, showing that the proposed intervals are optimal up to polylogarithmic factors.

Locally Private Mean Estimation: Z-test and Tight Confidence Intervals

This work provides tight upper and lower bounds for the problem of mean estimation under differential privacy in the local model, when the input is composed of n i.i.d. drawn samples from a…

Differentially Private Hypothesis Testing, Revisited

TLDR
A permutation-based testbed is proposed that can allow experimenters to empirically estimate the behavior of new test statistics for private hypothesis testing before fully working out their mathematical details (such as approximate null distributions).
...