Information-Theoretic Determination of Minimax Rates of Convergence

@article{Yang1999InformationTheoreticDO,
  title={Information-Theoretic Determination of Minimax Rates of Convergence},
  author={Yuhong Yang and Andrew R. Barron},
  journal={The Annals of Statistics},
  year={1999}
}
We present general results that determine minimax bounds on statistical risk for density estimation from certain information-theoretic considerations. These bounds depend only on metric entropy conditions and are used to identify the minimax rates of convergence.
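As a rough illustration of the kind of entropy-to-rate calculation the abstract refers to (a standard sketch, not the paper's exact theorem statement), suppose the target class has metric entropy of polynomial order in \(1/\epsilon\); balancing entropy against sample size then yields the rate:

```latex
% Illustrative sketch only: the canonical balance between metric entropy
% and sample size, stated for a class \mathcal{F} with entropy exponent 1/\alpha.
\[
  \log N(\epsilon,\mathcal{F}) \asymp \epsilon^{-1/\alpha}
  \quad\text{and}\quad
  n\,\epsilon_n^{2} \asymp \log N(\epsilon_n,\mathcal{F})
  \quad\Longrightarrow\quad
  \epsilon_n^{2} \asymp n^{-2\alpha/(2\alpha+1)},
\]
% Solving n \epsilon^2 = \epsilon^{-1/\alpha} gives \epsilon_n = n^{-\alpha/(2\alpha+1)},
% so the minimax risk converges at rate n^{-2\alpha/(2\alpha+1)}.
```

For example, for densities with \(\alpha\) derivatives in one dimension this recovers the familiar \(n^{-2\alpha/(2\alpha+1)}\) rate.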

Citations

Publications citing this paper (3 of 273 shown).

  • Orthogonal Statistical Learning (cites background and methods; highly influenced).

  • Approximation Methods for Supervised Learning, Foundations of Computational Mathematics, 2006 (cites methods; highly influenced).

  • Minimax Lower Bounds (cites background and methods; highly influenced).


CITATION STATISTICS

  • 60 highly influenced citations.
  • Averaged 23 citations per year from 2017 through 2019.

References

Publications referenced by this paper (3 of 78 shown).

  • R. Ash, Information Theory, Wiley Interscience, New York, 1965 (highly influential).

  • Y. Yang, "Complexity-based model selection," prospectus submitted to the Department of Statistics, Yale University, 1993 (highly influential).

  • A. R. Barron, "Neural net approximation," Proc. Yale Workshop on Adaptive Learning Systems, 1991 (highly influential).