Corpus ID: 235795273

Unified lower bounds for interactive high-dimensional estimation under information constraints

@article{Acharya2020UnifiedLB,
  title={Unified lower bounds for interactive high-dimensional estimation under information constraints},
  author={Jayadev Acharya and Cl{\'e}ment L. Canonne and Ziteng Sun and Himanshu Tyagi},
  journal={ArXiv},
  year={2020},
  volume={abs/2010.06562}
}
We consider the task of distributed parameter estimation using interactive protocols subject to local information constraints such as bandwidth limitations, local differential privacy, and restricted measurements. We provide a unified framework enabling us to derive a variety of (tight) minimax lower bounds for different parametric families of distributions, both continuous and discrete, under any ℓp loss. Our lower bound framework is versatile and yields “plug-and-play” bounds that are widely…
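As context for the abstract, a minimax lower bound under information constraints is typically stated for a quantity of the following generic form (an illustrative formulation with generic symbols, not the paper's exact statement):

```latex
% Minimax risk for estimating \theta \in \Theta from n users, each holding a
% sample from p_\theta and communicating through a channel W in the constraint
% set \mathcal{W} (bandwidth, privacy, ...), possibly chosen interactively;
% \hat{\theta} is the referee's estimator and \Pi ranges over protocols.
\mathcal{R}(\Theta, \mathcal{W}, \ell_p)
  = \inf_{\Pi,\, \hat{\theta}} \; \sup_{\theta \in \Theta}
    \mathbb{E}_{\theta}\!\left[ \bigl\lVert \hat{\theta} - \theta \bigr\rVert_p^p \right].
```

The "plug-and-play" claim then amounts to lower-bounding this quantity uniformly over the constraint set, for each parametric family and each p.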


COLT 2021 Tutorial – Recitation
This document contains: (1) a short prelude, to check that some of the bounds obtained or claimed during the lecture part of the tutorial make sense; (2) a two-part online component, where you will…
Fundamental limits of over-the-air optimization: Are analog schemes optimal?
TLDR
Over-the-air convex optimization on a d-dimensional space, where coded gradients are sent over an additive Gaussian noise channel with variance σ², is considered; it is shown that any code must slow down the convergence rate by a factor of roughly √(d/log(1 + SNR)).
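As a rough numerical illustration of the claimed slowdown (reading the garbled formula as √(d/log(1+SNR)); the function name and parameters here are ours, not the paper's):

```python
import math

def over_the_air_slowdown(d: int, snr: float) -> float:
    """Approximate convergence-rate slowdown factor sqrt(d / log(1 + SNR)).

    d:   dimension of the optimization space
    snr: signal-to-noise ratio of the additive Gaussian channel
    """
    return math.sqrt(d / math.log(1.0 + snr))

# Higher dimension hurts; higher SNR helps, but only logarithmically.
print(over_the_air_slowdown(100, math.e - 1))  # log(1+snr) = 1, so sqrt(100) = 10.0
```

The point of the formula is that increasing channel quality (SNR) buys only a logarithmic improvement against a polynomial dimension penalty.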
Geometric Lower Bounds for Distributed Parameter Estimation Under Communication Constraints
TLDR
This work circumvents the need for the strong data-processing inequalities used in prior work and develops a geometric approach, built on a new representation of the communication constraint, which strengthens and generalizes existing results with simpler and more transparent proofs.
Optimal Rates for Nonparametric Density Estimation under Communication Constraints
TLDR
A noninteractive adaptive estimator is proposed that exploits the sparsity of wavelet bases, along with a simulate-and-infer technique from parametric estimation under communication constraints, and is shown to be nearly rate-optimal for Besov and Sobolev spaces.
Uniformity Testing in the Shuffle Model: Simpler, Better, Faster
TLDR
This work considerably simplifies the analysis of the known uniformity testing algorithm in the shuffle model, and provides an alternative algorithm attaining the same guarantees with an elementary and streamlined argument.

References

SHOWING 1-10 OF 37 REFERENCES
Interactive Inference under Information Constraints
TLDR
The main technical contribution is an approach to handle the correlation that builds up due to interactivity, quantifying how effectively this correlation can be exploited in spite of the local constraints.
Geometric Lower Bounds for Distributed Parameter Estimation Under Communication Constraints
TLDR
This work circumvents the need for the strong data-processing inequalities used in prior work and develops a geometric approach, built on a new representation of the communication constraint, which strengthens and generalizes existing results with simpler and more transparent proofs.
Inference Under Information Constraints III: Local Privacy Constraints
TLDR
It is shown that the availability of shared (public) randomness greatly reduces the sample complexity, and, under local differential privacy, simple, sample-optimal, and communication-efficient protocols are proposed for these two questions in the noninteractive setting.
On Learning Parametric Distributions from Quantized Samples
  • Septimia Sârbu, A. Zaidi
  • Computer Science, Mathematics
    2021 IEEE International Symposium on Information Theory (ISIT)
  • 2021
TLDR
First, a generalization of the well-known van Trees inequality to general ℓp-norms, with p > 1, is established in terms of generalized Fisher information; minimax lower bounds on the estimation error are then developed for two losses.
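For reference, the classical scalar van Trees (Bayesian Cramér–Rao) inequality that this entry generalizes reads as follows; this is the standard p = 2 special case, not the paper's ℓp extension:

```latex
% Scalar van Trees inequality: for any estimator \hat{\theta} and a smooth
% prior \pi on \Theta, with I(\theta) the Fisher information of the model,
\mathbb{E}_{\pi}\,\mathbb{E}_{\theta}\!\left[(\hat{\theta} - \theta)^2\right]
  \;\ge\; \frac{1}{\mathbb{E}_{\pi}[I(\theta)] + I(\pi)},
\qquad
I(\pi) = \int \frac{\pi'(\theta)^2}{\pi(\theta)}\, d\theta.
```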
Distributed Signal Detection under Communication Constraints
TLDR
Lower bounds for protocols with public randomness are obtained, which are tight when $\ell=O(1)$.
Fisher Information Under Local Differential Privacy
TLDR
Data processing inequalities that describe how Fisher information from statistical samples can scale with the privacy parameter under local differential privacy constraints imply order-optimal lower bounds for private estimation for both the Gaussian location model and discrete distribution estimation for all levels of privacy.
Inference Under Information Constraints I: Lower Bounds From Chi-Square Contraction
TLDR
Lower bounds for the sample complexity of learning and testing discrete distributions in this information-constrained setting are derived from a characterization of the contraction in chi-square distance between the distributions of the observed messages when information constraints are imposed.
Lecture notes on: Information-theoretic methods for high-dimensional statistics
  • 2020
Private Identity Testing for High-Dimensional Distributions
TLDR
The proposed testers have improved sample complexity compared to those derived from previous techniques, and are the first testers whose sample complexity matches the order-optimal minimax sample complexity of $O(d^{1/2}/\alpha^2)$ in many parameter regimes.
Fisher Information for Distributed Estimation under a Blackboard Communication Protocol
We consider the problem of learning high-dimensional discrete distributions and structured (e.g. Gaussian) distributions in distributed networks, where each node in the network observes an…