Corpus ID: 235795273

Unified lower bounds for interactive high-dimensional estimation under information constraints

@article{Acharya2020UnifiedLB,
  title={Unified lower bounds for interactive high-dimensional estimation under information constraints},
  author={Jayadev Acharya and Cl{\'e}ment L. Canonne and Ziteng Sun and Himanshu Tyagi},
  journal={ArXiv},
  year={2020},
  volume={abs/2010.06562}
}
We consider the task of distributed parameter estimation using interactive protocols subject to local information constraints such as bandwidth limitations, local differential privacy, and restricted measurements. We provide a unified framework enabling us to derive a variety of (tight) minimax lower bounds for different parametric families of distributions, both continuous and discrete, under any ℓp loss. Our lower bound framework is versatile and yields "plug-and-play" bounds that are widely…
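As a rough sketch of the quantity such bounds control (the notation below is my own shorthand, not taken from the paper): each of n samples must pass through a channel from a constrained family, possibly chosen interactively, and the risk is worst-case over the parameter and best-case over protocols and estimators.

```latex
% Minimax risk under information constraints (notation assumed for
% illustration): samples X_i ~ p_theta pass through channels
% W_1,...,W_n from a constrained family \mathcal{W} (bandwidth, LDP,
% restricted measurements), possibly chosen interactively; the
% estimator sees only the messages Y^n = (Y_1,...,Y_n).
\[
  R^*_n(\mathcal{P}, \ell_p, \mathcal{W})
  = \inf_{\substack{\text{interactive protocols}\\ W_1,\dots,W_n \in \mathcal{W}}}
    \inf_{\hat{\theta}}
    \sup_{\theta \in \Theta}
    \mathbb{E}_{\theta}\!\left[\lVert \hat{\theta}(Y^n) - \theta \rVert_p^p\right].
\]
```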

Citations

Fundamental limits of over-the-air optimization: Are analog schemes optimal?
TLDR
Over-the-air convex optimization on a d-dimensional space is considered, where coded gradients are sent over an additive Gaussian noise channel with variance σ²; it is shown that any code must slow down the convergence rate by a factor of roughly √(d/log(1 + SNR)).
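A toy simulation of this setting may help fix ideas; everything below (the quadratic objective, constants, power normalization) is my own illustrative choice, not the paper's scheme:

```python
import numpy as np

# Toy over-the-air SGD on f(x) = 0.5 * ||x - x_star||^2: each gradient is
# power-normalized, sent through an AWGN channel, and rescaled at the
# receiver, giving an unbiased but noisy gradient estimate.
rng = np.random.default_rng(0)
d, steps, sigma, P = 64, 500, 1.0, 1.0   # dimension, iterations, noise, power
x_star = rng.normal(size=d)
x = np.zeros(d)

for t in range(steps):
    g = x - x_star                                     # true gradient
    scale = np.sqrt(P * d) / (np.linalg.norm(g) + 1e-12)
    received = scale * g + sigma * rng.normal(size=d)  # AWGN channel
    g_hat = received / scale                           # unbiased estimate
    x -= (1.0 / (t + 2)) * g_hat                       # decaying step size

print("final distance to optimum:", np.linalg.norm(x - x_star))
```

Raising the power budget P (i.e., the SNR) shrinks the effective gradient noise and speeds up convergence, which is the trade-off the paper quantifies.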
Optimal Rates for Nonparametric Density Estimation under Communication Constraints
TLDR
A noninteractive adaptive estimator is proposed that exploits the sparsity of wavelet bases, combined with a simulate-and-infer technique from parametric estimation under communication constraints; the estimator is nearly rate-optimal for Besov and Sobolev spaces.
Uniformity Testing in the Shuffle Model: Simpler, Better, Faster
TLDR
This work considerably simplifies the analysis of the known uniformity testing algorithm in the shuffle model and provides an alternative algorithm attaining the same guarantees via an elementary and streamlined argument.
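A minimal sketch of a shuffle-model-style uniformity test, under my own simplifications (plain k-ary randomized response as the local randomizer, a histogram-based statistic, and no calibrated threshold), not the algorithm analyzed in the paper:

```python
import numpy as np

# Each user randomizes its sample locally; the shuffler reveals only the
# multiset (histogram) of reports; the analyzer debiases the histogram
# and measures its L2 distance from uniform.
rng = np.random.default_rng(1)
k, n, eps = 20, 20_000, 1.0
p_keep = np.exp(eps) / (np.exp(eps) + k - 1)   # approx. eps-LDP in this variant

samples = rng.integers(0, k, size=n)           # data drawn under H0 (uniform)
keep = rng.random(n) < p_keep
reports = np.where(keep, samples, rng.integers(0, k, size=n))

hist = np.bincount(reports, minlength=k)
# E[hist_j] / n = p_keep * q_j + (1 - p_keep) / k under true distribution q.
q_hat = (hist / n - (1 - p_keep) / k) / p_keep
stat = n * np.sum((q_hat - 1.0 / k) ** 2)      # compare to a threshold set under H0
print("test statistic:", stat)
```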
Geometric Lower Bounds for Distributed Parameter Estimation under Communication Constraints
TLDR
This work circumvents the strong data processing inequalities used in prior work and develops a geometric approach, built on a new representation of the communication constraint, that strengthens and generalizes existing results with simpler and more transparent proofs.

References

SHOWING 1-10 OF 37 REFERENCES
Interactive Inference under Information Constraints
TLDR
The main technical contribution is an approach to handle the correlation that builds up due to interactivity, quantifying how effectively this correlation can be exploited in spite of the local constraints.
Inference Under Information Constraints III: Local Privacy Constraints
TLDR
It is shown that the availability of shared (public) randomness greatly reduces the sample complexity, and, under local differential privacy, simple, sample-optimal, and communication-efficient protocols are proposed for these two questions in the noninteractive setting.
On Learning Parametric Distributions from Quantized Samples
  • Septimia Sârbu, A. Zaidi
  • 2021 IEEE International Symposium on Information Theory (ISIT)
TLDR
First, a generalization of the well-known van Trees inequality to general Lp norms, p > 1, is established in terms of generalized Fisher information; minimax lower bounds on the estimation error are then developed for two losses.
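For reference, the classical scalar van Trees inequality, of which the entry above describes a generalization: for a smooth prior λ on Θ,

```latex
\[
  \mathbb{E}_{\theta \sim \lambda}\,\mathbb{E}_{X \sim p_\theta}
    \big[(\hat{\theta}(X) - \theta)^2\big]
  \;\ge\;
  \frac{1}{\mathbb{E}_{\lambda}[I_X(\theta)] + I(\lambda)},
  \qquad
  I(\lambda) = \int \frac{\lambda'(\theta)^2}{\lambda(\theta)}\, d\theta,
\]
% where I_X(theta) is the Fisher information of a single observation.
% Since Bayes risk lower-bounds minimax risk, such inequalities plug
% directly into minimax lower-bound arguments.
```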
Distributed Signal Detection under Communication Constraints
TLDR
Lower bounds for protocols with public randomness are obtained; these are tight when $\ell = O(1)$.
Fisher Information Under Local Differential Privacy
TLDR
Data processing inequalities are derived that describe how Fisher information from statistical samples scales with the privacy parameter under local differential privacy constraints; these imply order-optimal lower bounds for private estimation in both the Gaussian location model and discrete distribution estimation, at all levels of privacy.
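Schematically (my paraphrase, with constants and regularity conditions omitted), the shape of the argument is: an ε-LDP channel contracts Fisher information by a factor of order ε² when ε = O(1), and this contraction feeds directly into Cramér–Rao / van Trees lower bounds:

```latex
% Z is the privatized view of the raw sample X; I(.) denotes the Fisher
% information matrix. The O(eps^2) scaling is the key point; the exact
% constants and conditions are the paper's, not reproduced here.
\[
  \operatorname{tr} I_Z(\theta) \;\lesssim\; \varepsilon^{2}\,
  \operatorname{tr} I_X(\theta)
  \qquad (\varepsilon = O(1)),
\]
% so estimation from n eps-LDP users behaves, roughly, as if only about
% n * eps^2 raw samples were available.
```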
Inference Under Information Constraints I: Lower Bounds From Chi-Square Contraction
TLDR
Lower bounds on the sample complexity of learning and testing discrete distributions in this information-constrained setting are derived from a characterization of the contraction in chi-square distance between the observed distributions of the samples when information constraints are imposed.
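The driving quantity in that characterization can be written as follows (notation mine): for a channel W and a family of perturbed distributions {p_z} around a reference p_0,

```latex
% Chi-square distance between the induced message distributions, where
% (W \circ p)(y) = \sum_x W(y \mid x)\, p(x); the lower bounds rest on
% how strongly any channel satisfying the information constraint must
% contract this quantity.
\[
  \chi^2\big(W \circ p_z \,\big\|\, W \circ p_0\big)
  = \sum_{y} \frac{\big((W \circ p_z)(y) - (W \circ p_0)(y)\big)^2}
                  {(W \circ p_0)(y)}.
\]
```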
Lecture notes on: Information-theoretic methods for high-dimensional statistics
  • 2020
Private Identity Testing for High-Dimensional Distributions
TLDR
The proposed testers have improved sample complexity compared to those derived from previous techniques, and are the first whose sample complexity matches the order-optimal minimax sample complexity of $O(d^{1/2}/\alpha^2)$ in many parameter regimes.
Fisher Information for Distributed Estimation under a Blackboard Communication Protocol
We consider the problem of learning high-dimensional discrete distributions and structured (e.g., Gaussian) distributions in distributed networks, where each node in the network observes an…
Hadamard Response: Estimating Distributions Privately, Efficiently, and with Little Communication
TLDR
Hadamard Response (HR) is proposed: a local privatization scheme that requires no shared randomness, is symmetric with respect to the users, and runs about 100x faster than Randomized Response, RAPPOR, and subset-selection mechanisms.
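A minimal sketch of a Hadamard-Response-style mechanism, in a simplified variant of my own for illustration (the authors' exact construction and parameter choices differ):

```python
import numpy as np
from scipy.linalg import hadamard

# Each input symbol x is associated with a half-size block of outputs
# (the +1 entries of a Hadamard matrix row); the report lands inside the
# block with boosted probability, which makes this exactly eps-LDP.
rng = np.random.default_rng(2)
k, n, eps = 8, 50_000, 1.0
K = 1 << int(np.ceil(np.log2(k + 1)))        # power of two with K >= k + 1
H = hadamard(K)
blocks = [np.flatnonzero(H[x + 1] == 1) for x in range(k)]  # skip all-ones row
comps = [np.setdiff1d(np.arange(K), b) for b in blocks]
a = np.exp(eps) / (np.exp(eps) + 1)          # P(report inside own block)

true_p = rng.dirichlet(np.ones(k))
samples = rng.choice(k, size=n, p=true_p)

reports = np.empty(n, dtype=np.int64)
for i, x in enumerate(samples):
    pool = blocks[x] if rng.random() < a else comps[x]
    reports[i] = pool[rng.integers(len(pool))]

# Decode: P(report in blocks[x]) = a * p_x + (1/2) * (1 - p_x), since
# distinct Hadamard rows overlap in exactly K/4 positions.
freq = np.array([np.isin(reports, blocks[x]).mean() for x in range(k)])
p_hat = (freq - 0.5) / (a - 0.5)
print("L1 estimation error:", np.abs(p_hat - true_p).sum())
```

No shared randomness is needed: the Hadamard matrix is a fixed public object, which is the symmetry and efficiency point the summary highlights.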