Finite-Sample Concentration of the Multinomial in Relative Entropy

@article{Agrawal2020FiniteSampleCO,
  title={Finite-Sample Concentration of the Multinomial in Relative Entropy},
  author={Rohit Agrawal},
  journal={arXiv: Information Theory},
  year={2020}
}
Abstract

We show that the moment generating function of the Kullback-Leibler divergence (relative entropy) between the empirical distribution of $n$ independent samples from a distribution $P$ over a finite alphabet of size $k$ (e.g. a multinomial distribution) and $P$ itself is no more than that of a gamma distribution with shape $k - 1$ and rate $n$. The resulting exponential concentration inequality becomes meaningful (less than 1) when the divergence $\varepsilon$ is larger than $(k-1)/n$, whereas …
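
The abstract states the gamma MGF comparison but not the explicit tail bound it implies; the following is a sketch of the standard Chernoff step, derived here from the stated MGF bound rather than quoted from the paper. Since the MGF of a gamma distribution with shape $k-1$ and rate $n$ is $(1 - \lambda/n)^{-(k-1)}$ for $\lambda < n$, the comparison gives:

% Chernoff bound from the gamma MGF comparison; \hat{P}_n denotes the
% empirical distribution of the n samples (notation assumed, not from the page).
\begin{align*}
\Pr\bigl[D(\hat{P}_n \,\|\, P) \ge \varepsilon\bigr]
  &\le \inf_{0 < \lambda < n} e^{-\lambda\varepsilon}\,
       \mathbb{E}\bigl[e^{\lambda D(\hat{P}_n \| P)}\bigr]
   \le \inf_{0 < \lambda < n} e^{-\lambda\varepsilon}
       \Bigl(\frac{n}{n-\lambda}\Bigr)^{k-1} \\
  &= e^{-n\varepsilon}\Bigl(\frac{e\,n\varepsilon}{k-1}\Bigr)^{k-1},
  \quad\text{taking } \lambda = n - \frac{k-1}{\varepsilon},
  \text{ which requires } \varepsilon > \frac{k-1}{n}.
\end{align*}

At $\varepsilon = (k-1)/n$ this right-hand side equals exactly $1$, and for larger $\varepsilon$ it decays as $e^{-(k-1)(x - 1 - \ln x)}$ with $x = n\varepsilon/(k-1) > 1$, consistent with the threshold quoted in the abstract.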

