Corpus ID: 235694600

Distributed Nonparametric Function Estimation: Optimal Rate of Convergence and Cost of Adaptation

@article{Cai2021DistributedNF,
  title={Distributed Nonparametric Function Estimation: Optimal Rate of Convergence and Cost of Adaptation},
  author={T. Tony Cai and Hongji Wei},
  journal={ArXiv},
  year={2021},
  volume={abs/2107.00179}
}
Distributed minimax estimation and distributed adaptive estimation under communication constraints for the Gaussian sequence model and the white noise model are studied. The minimax rate of convergence for distributed estimation over a given Besov class, which serves as a benchmark for the cost of adaptation, is established. We then quantify the exact communication cost for adaptation and construct an optimally adaptive procedure for distributed estimation over a range of Besov classes. The results…


References

Showing 1–10 of 44 references
Distributed function estimation: adaptation using minimal communication
We investigate whether, in a distributed setting, adaptive estimation of a smooth function at the optimal rate is possible under minimal communication. It turns out that the answer depends on the risk…
Distributed Gaussian Mean Estimation under Communication Constraints: Optimal Rates and Communication-Efficient Algorithms
Although optimal estimation of a Gaussian mean is relatively simple in the conventional setting, it is quite involved under communication constraints, both in terms of the optimal procedure design and the lower bound argument.
Distributed Nonparametric Regression under Communication Constraints
This paper studies the problem of nonparametric estimation of a smooth function with data distributed across multiple machines. We assume an independent sample from a white noise model is collected…
Geometric Lower Bounds for Distributed Parameter Estimation under Communication Constraints
This work circumvents the need for the strong data processing inequalities used in prior work and develops a geometric approach, built on a new representation of the communication constraint, which strengthens and generalizes existing results with simpler and more transparent proofs.
Asymptotic Equivalence of Density Estimation and Gaussian White Noise
Signal recovery in Gaussian white noise with variance tending to zero has served for some time as a representative model for nonparametric curve estimation, having all the essential traits in a pure…
Adaptive wavelet estimation : A block thresholding and oracle inequality approach
We study wavelet function estimation via the approach of block thresholding and ideal adaptation with oracle. Oracle inequalities are derived and serve as guides for the selection of smoothing…
Communication lower bounds for statistical estimation problems via a distributed data processing inequality
A distributed data processing inequality is proved, as a generalization of the usual data processing inequalities, which might be of independent interest and useful for other problems.
On Communication Cost of Distributed Statistical Estimation and Dimensionality
It is conjectured that the tradeoff between communication and squared loss demonstrated by this protocol is essentially optimal up to a logarithmic factor, and the study of strong lower bounds in the general setting is initiated.
Communication-Efficient Distributed Learning of Discrete Distributions
This work designs distributed learning algorithms that achieve significantly better communication guarantees than the naive ones, and obtains tight upper and lower bounds in several regimes of distribution learning.
Distributed Testing and Estimation under Sparse High Dimensional Models
This paper addresses the important question of how large k can be, as n grows large, such that the loss of efficiency due to the divide-and-conquer algorithm is negligible.