Corpus ID: 49272478

The Right Complexity Measure in Locally Private Estimation: It is not the Fisher Information

@article{Duchi2018TheRC,
  title={The Right Complexity Measure in Locally Private Estimation: It is not the Fisher Information},
  author={John C. Duchi and Feng Ruan},
  journal={ArXiv},
  year={2018},
  volume={abs/1806.05756}
}
We identify fundamental tradeoffs between statistical utility and privacy under local models of privacy, in which data are kept private even from the statistician, providing instance-specific bounds for private estimation and learning problems by developing the \emph{local minimax risk}. In contrast to approaches based on worst-case (minimax) error, which are conservative, this allows us to evaluate the difficulty of individual problem instances and delineate the possibilities for adaptation in…
Lower Bounds for Locally Private Estimation via Communication Complexity
Lower bounds for estimation under local privacy constraints are developed by showing an equivalence between private estimation and communication-restricted estimation problems; it is shown that the minimax mean-squared error for estimating the mean of a bounded or Gaussian random vector in $d$ dimensions scales as $\frac{d}{n} \cdot \frac{d}{\min\{\varepsilon, \varepsilon^2\}}$.
High Dimensional Sparse Linear Regression under Local Differential Privacy: Power and Limitations
In this paper, we study high-dimensional sparse linear regression under the Local Differential Privacy (LDP) model, and give both negative and positive results. On the negative side, we show that…
Near Instance-Optimality in Differential Privacy
Two notions of instance optimality in differential privacy are developed: one by defining a local minimax risk, the other by considering unbiased mechanisms and analogizing the Cramér-Rao bound; it is shown that the local modulus of continuity of the estimand of interest completely determines these quantities.
The power of factorization mechanisms in local and central differential privacy
New characterizations of the sample complexity of answering linear queries in the local and central models of differential privacy show that a particular factorization mechanism is approximately optimal, and that the optimal sample complexity is bounded from above and below by well-studied factorization norms of a matrix associated with the queries.
Generalized Linear Models in Non-interactive Local Differential Privacy with Public Data
In this paper, we study the problem of estimating smooth Generalized Linear Models (GLMs) in the Non-interactive Local Differential Privacy (NLDP) model. Different from its classical setting, our…
The structure of optimal private tests for simple hypotheses
Hypothesis testing plays a central role in statistical inference, and is used in many settings where privacy concerns are paramount. This work answers a basic question about privately testing simple…
On Sparse Linear Regression in the Local Differential Privacy Model
This paper shows that polynomial dependence on the dimensionality of the space is unavoidable for the estimation error in both the non-interactive and sequentially interactive local models, suggesting that differential privacy in high-dimensional space is unlikely to be achievable for this problem.
Tight lower bound of sparse covariance matrix estimation in the local differential privacy model
This paper gives a lower bound of $\Omega\left(\frac{s^2 \log p}{n\epsilon^2}\right)$ on the $\epsilon$ non-interactive private minimax risk in the metric of squared spectral norm, where $s$ is the row sparsity of the underlying covariance matrix, $n$ is the sample size, and $p$ is the dimensionality of the data.
Gaussian Differential Privacy
A new relaxation of privacy, `Gaussian differential privacy' (GDP), is proposed, defined based on testing two shifted Gaussians; it has a number of appealing properties and, in particular, avoids difficulties associated with divergence-based relaxations.
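As a concrete illustration of the "testing two shifted Gaussians" definition, the trade-off function of $\mu$-GDP, $G_\mu(\alpha) = \Phi(\Phi^{-1}(1-\alpha) - \mu)$, can be evaluated with the Python standard library alone; the function name below is illustrative, not from the paper:

```python
from statistics import NormalDist

def gdp_tradeoff(alpha, mu):
    """Type II error of the optimal test distinguishing N(0,1) from
    N(mu,1) at type I error `alpha`; this curve defines mu-GDP."""
    nd = NormalDist()
    return nd.cdf(nd.inv_cdf(1.0 - alpha) - mu)
```

At $\mu = 0$ the trade-off is the perfect-privacy line $\beta = 1 - \alpha$, and larger $\mu$ (weaker privacy) pushes the achievable type II error down.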
The Cost of Privacy in Generalized Linear Models: Algorithms and Minimax Lower Bounds
The proposed algorithms for parameter estimation in both low-dimensional and high-dimensional sparse generalized linear models (GLMs) are shown to be nearly rate-optimal by characterizing their statistical performance and establishing privacy-constrained minimax lower bounds for GLMs.

References

Showing 1–10 of 74 references
Minimax Optimal Procedures for Locally Private Estimation
Private versions of classical information-theoretic bounds, in particular those due to Le Cam, Fano, and Assouad, are developed to allow for a precise characterization of statistical rates under local privacy constraints and the development of provably (minimax) optimal estimation procedures.
Lower Bounds for Locally Private Estimation via Communication Complexity
Lower bounds for estimation under local privacy constraints are developed by showing an equivalence between private estimation and communication-restricted estimation problems; it is shown that the minimax mean-squared error for estimating the mean of a bounded or Gaussian random vector in $d$ dimensions scales as $\frac{d}{n} \cdot \frac{d}{\min\{\varepsilon, \varepsilon^2\}}$.
Local privacy and statistical minimax rates
Bounds on information-theoretic quantities that influence estimation rates as a function of the amount of privacy preserved are developed; these can be viewed as quantitative data-processing inequalities that allow for precise characterization of statistical rates under local privacy constraints.
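The simplest local randomizer to which such rates apply is binary randomized response. The sketch below (helper names are our own, not from any of the cited papers) reports the true bit with probability $e^\varepsilon/(1+e^\varepsilon)$, which satisfies $\varepsilon$-local differential privacy, and debiases the aggregate:

```python
import math
import random

def randomized_response(bit, epsilon, rng=random):
    """eps-LDP release of one bit: tell the truth w.p. e^eps/(1+e^eps)."""
    p_true = math.exp(epsilon) / (1.0 + math.exp(epsilon))
    return bit if rng.random() < p_true else 1 - bit

def debias_mean(reports, epsilon):
    """Unbiased estimate of the population mean from randomized reports."""
    p = math.exp(epsilon) / (1.0 + math.exp(epsilon))
    return (sum(reports) / len(reports) - (1.0 - p)) / (2.0 * p - 1.0)
```

The debiasing step inflates the variance by a factor growing like $1/\varepsilon^2$ for small $\varepsilon$, which is exactly the privacy penalty these minimax rates quantify.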
Calibrating Noise to Sensitivity in Private Data Analysis
The study is extended to general functions $f$, proving that privacy can be preserved by calibrating the standard deviation of the noise according to the sensitivity of the function $f$, which is the amount that any single argument to $f$ can change its output.
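The calibration described in this reference is realized by the Laplace mechanism: add noise with scale equal to the sensitivity divided by $\varepsilon$. A minimal standard-library sketch (function names are illustrative), using inverse-CDF sampling for the Laplace draw:

```python
import math
import random

def laplace_noise(scale, rng=random):
    """Draw one Laplace(0, scale) sample via inverse-CDF sampling."""
    u = rng.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def laplace_mechanism(value, sensitivity, epsilon, rng=random):
    """eps-DP release of `value`: noise scale = sensitivity / epsilon."""
    return value + laplace_noise(sensitivity / epsilon, rng)
```

The noise variance is $2(\Delta f/\varepsilon)^2$, so halving $\varepsilon$ quadruples the error, the basic utility/privacy trade-off.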
On the geometry of differential privacy
The lower bound is strong enough to separate the concept of differential privacy from the notion of approximate differential privacy, where an upper bound of $O(\sqrt{d}/\varepsilon)$ can be achieved.
Amplification by Shuffling: From Local to Central Differential Privacy via Anonymity
It is shown, via a new and general privacy amplification technique, that any permutation-invariant algorithm satisfying $\varepsilon$-local differential privacy will satisfy [MATH HERE]-central differential privacy.
The Algorithmic Foundations of Differential Privacy
The preponderance of this monograph is devoted to fundamental techniques for achieving differential privacy, and to the application of these techniques in creative combinations, using the query-release problem as an ongoing example.
A Statistical Framework for Differential Privacy
One goal of statistical privacy research is to construct a data release mechanism that protects individual privacy while preserving information content. An example is a random mechanism that takes an…
Local Minimax Complexity of Stochastic Convex Optimization
This work shows how the computational modulus of continuity can be explicitly calculated in concrete cases and relates it to the curvature of the function at the optimum, and proves a superefficiency result demonstrating that it is a meaningful benchmark, acting as a computational analogue of the Fisher information in statistical estimation.
Geometrizing rates of convergence under differential privacy constraints
We study estimation of a functional $\theta(\mathbb P)$ of an unknown probability distribution $\mathbb P \in \mathcal P$ in which the original i.i.d. sample $X_1,\dots, X_n$ is kept private even from…