# Geometric Lower Bounds for Distributed Parameter Estimation Under Communication Constraints

```
@article{Han2021GeometricLB,
  title   = {Geometric Lower Bounds for Distributed Parameter Estimation Under Communication Constraints},
  author  = {Yanjun Han and Ayfer {\"O}zg{\"u}r and Tsachy Weissman},
  journal = {IEEE Transactions on Information Theory},
  year    = {2021},
  volume  = {67},
  pages   = {8248--8263}
}
```

We consider parameter estimation in distributed networks, where each sensor in the network observes an independent sample from an underlying distribution and has $k$ bits to communicate its sample to a centralized processor, which computes an estimate of a desired parameter. We develop lower bounds for the minimax risk of estimating the underlying parameter for a large class of losses and distributions. Our results show…
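To make the setting concrete, here is a minimal sketch (not the authors' scheme, and the estimator is a standard illustrative choice) of communication-constrained estimation with $k=1$: each sensor observes $X_i \sim \mathcal{N}(\theta, 1)$ and sends only the bit $\mathbf{1}\{X_i > 0\}$, and the centralized processor recovers $\theta$ by inverting $\mathbb{P}(X_i > 0) = \Phi(\theta)$.

```python
import random
from statistics import NormalDist

def one_bit_mean_estimate(theta: float, n: int, seed: int = 0) -> float:
    """Simulate n sensors, each observing X_i ~ N(theta, 1) and sending
    a single bit 1{X_i > 0} to a centralized processor (k = 1 bit each).

    The processor uses P(X_i > 0) = Phi(theta) and inverts the standard
    normal CDF at the empirical fraction of ones.
    """
    rng = random.Random(seed)
    ones = sum(rng.gauss(theta, 1) > 0 for _ in range(n))
    p_hat = ones / n  # empirical estimate of Phi(theta)
    return NormalDist().inv_cdf(p_hat)

estimate = one_bit_mean_estimate(theta=0.5, n=100_000)
```

Even this crude one-bit protocol is consistent, but the quantization inflates the variance relative to the unconstrained sample mean; lower bounds of the kind developed in the paper quantify how much inflation is unavoidable for any $k$-bit protocol.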


## 61 Citations

Lower Bounds for Learning Distributions under Communication Constraints via Fisher Information

- Computer Science
- 2019

We consider the problem of learning high-dimensional, nonparametric and structured (e.g. Gaussian) distributions in distributed networks, where each node in the network observes an independent sample…

Pointwise Bounds for Distribution Estimation under Communication Constraints

- Computer Science, Mathematics · ArXiv
- 2021

It is shown that the $\ell_2$ error decays as $O\left(\max\left(\frac{\|p\|_{1/2}}{n2^b}, \frac{1}{n}\right)\right)$ when $n$ is sufficiently large, hence it is governed by the half-norm of $p$ instead of the ambient dimension $d$, and the correct measure of the local communication complexity at $p$ is its Rényi entropy.

Learning Distributions from their Samples under Communication Constraints

- Computer Science, Mathematics · ArXiv
- 2019

We consider the problem of learning high-dimensional, nonparametric and structured (e.g. Gaussian) distributions in distributed networks, where each node in the network observes an independent sample…

Distributed Statistical Estimation of High-Dimensional and Nonparametric Distributions

- Computer Science · 2018 IEEE International Symposium on Information Theory (ISIT)
- 2018

This work builds on a new representation of the communication constraint, which leads to a tight characterization of the problem of estimating high-dimensional and nonparametric distributions in distributed networks.

Breaking The Dimension Dependence in Sparse Distribution Estimation under Communication Constraints

- Computer Science, Mathematics · COLT
- 2021

Surprisingly, it is shown that when the sample size $n$ exceeds a minimum threshold $n^*(s, d, b)$, an $\ell_2$ estimation error of $O\left(\frac{s}{n2^b}\right)$ can be achieved, which implies that for $n > n^*(s, d, b)$ the convergence rate does not depend on the ambient dimension $d$ and is the same as if the support of the distribution were known beforehand.

General lower bounds for interactive high-dimensional estimation under information constraints

- Computer Science
- 2020

This work provides a general framework enabling us to derive a variety of (tight) minimax lower bounds under different parametric families of distributions, both continuous and discrete, under any $\ell_p$ loss.

Communication Complexity in Locally Private Distribution Estimation and Heavy Hitters

- Computer Science, Mathematics · ICML
- 2019

This work proposes a sample-optimal $\varepsilon$-locally differentially private (LDP) scheme for distribution estimation, where each user communicates only one bit, and requires no public randomness.

Lower Bounds for Locally Private Estimation via Communication Complexity

- Mathematics, Computer Science · COLT
- 2019

Lower bounds for estimation under local privacy constraints are developed by showing an equivalence between private estimation and communication-restricted estimation problems, and it is shown that the minimax mean-squared error for estimating the mean of a bounded or Gaussian random vector in $d$ dimensions scales as $\frac{d}{n} \cdot \frac{d}{\min\{\varepsilon, \varepsilon^2\}}$.

Distributed Gaussian Mean Estimation under Communication Constraints: Optimal Rates and Communication-Efficient Algorithms

- Mathematics, Computer Science · ArXiv
- 2020

Although optimal estimation of a Gaussian mean is relatively simple in the conventional setting, it is quite involved under the communication constraints, both in terms of the optimal procedure design and lower bound argument.

Distributed Signal Detection under Communication Constraints

- Computer Science · COLT
- 2020

Lower bounds for protocols with public randomness are obtained, which are tight when $\ell=O(1)$.

## References

Showing 1–10 of 68 references

Fisher Information Under Local Differential Privacy

- Computer Science, Mathematics · IEEE Journal on Selected Areas in Information Theory
- 2020

Data processing inequalities are established that describe how the Fisher information obtainable from statistical samples scales with the privacy parameter under local differential privacy constraints; these imply order-optimal lower bounds for private estimation in both the Gaussian location model and discrete distribution estimation, at all levels of privacy.

Lower Bounds for Learning Distributions under Communication Constraints via Fisher Information

- Computer Science
- 2019

We consider the problem of learning high-dimensional, nonparametric and structured (e.g. Gaussian) distributions in distributed networks, where each node in the network observes an independent sample…

Communication lower bounds for statistical estimation problems via a distributed data processing inequality

- Computer Science, Mathematics · STOC
- 2016

A distributed data processing inequality is proved, as a generalization of usual data processing inequalities, which might be of independent interest and useful for other problems.

Distributed Statistical Estimation of High-Dimensional and Nonparametric Distributions

- Computer Science · 2018 IEEE International Symposium on Information Theory (ISIT)
- 2018

This work builds on a new representation of the communication constraint, which leads to a tight characterization of the problem of estimating high-dimensional and nonparametric distributions in distributed networks.

Breaking The Dimension Dependence in Sparse Distribution Estimation under Communication Constraints

- Computer Science, Mathematics · COLT
- 2021

Surprisingly, it is shown that when the sample size $n$ exceeds a minimum threshold $n^*(s, d, b)$, an $\ell_2$ estimation error of $O\left(\frac{s}{n2^b}\right)$ can be achieved, which implies that for $n > n^*(s, d, b)$ the convergence rate does not depend on the ambient dimension $d$ and is the same as if the support of the distribution were known beforehand.

Information-theoretic lower bounds for distributed statistical estimation with communication constraints

- Computer Science, Mathematics · NIPS
- 2013

Lower bounds on minimax risks for distributed statistical estimation under a communication budget are established for several problems, including various types of location models, as well as for parameter estimation in regression models.

Communication-Efficient Distributed Learning of Discrete Distributions

- Computer Science · NIPS
- 2017

This work designs distributed learning algorithms that achieve significantly better communication guarantees than the naive ones, and obtains tight upper and lower bounds in several regimes of distribution learning.

Communication Complexity in Locally Private Distribution Estimation and Heavy Hitters

- Computer Science, Mathematics · ICML
- 2019

This work proposes a sample-optimal $\varepsilon$-locally differentially private (LDP) scheme for distribution estimation, where each user communicates only one bit, and requires no public randomness.

Lower Bounds for Locally Private Estimation via Communication Complexity

- Mathematics, Computer Science · COLT
- 2019

Lower bounds for estimation under local privacy constraints are developed by showing an equivalence between private estimation and communication-restricted estimation problems, and it is shown that the minimax mean-squared error for estimating the mean of a bounded or Gaussian random vector in $d$ dimensions scales as $\frac{d}{n} \cdot \frac{d}{\min\{\varepsilon, \varepsilon^2\}}$.

On Communication Cost of Distributed Statistical Estimation and Dimensionality

- Computer Science, Mathematics · NIPS
- 2014

It is conjectured that the tradeoff between communication and squared loss demonstrated by this protocol is essentially optimal up to a logarithmic factor, and the study of strong lower bounds in the general setting is initiated.