ENTROPY AND THE CENTRAL LIMIT THEOREM

@article{Barron1986ENTROPYAT,
  title={ENTROPY AND THE CENTRAL LIMIT THEOREM},
  author={Andrew R. Barron},
  journal={Annals of Probability},
  year={1986},
  volume={14},
  pages={336-342}
}
  • A. Barron
  • Published 1986
  • Mathematics
  • Annals of Probability
An argument of Brown (1982) is extended to show that the Fisher informations converge to the reciprocal of the variance.
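
For context (a standard background fact, not part of the abstract): for a random variable $X$ with smooth density $p$, the Fisher information and the Cramér–Rao bound read

$$ I(X) = \int_{-\infty}^{+\infty} \frac{p'(x)^2}{p(x)}\,dx, \qquad I(X) \,\ge\, \frac{1}{\mathrm{Var}(X)}, $$

with equality exactly for Gaussian $X$. Convergence of $I$ to the reciprocal of the variance along standardized sums is therefore an information-theoretic form of the central limit theorem.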

On the rate of convergence in the entropic central limit theorem

We study the rate at which entropy is produced by linear combinations of independent random variables that satisfy a spectral gap condition.
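
As a hedged gloss (the shape of the result, not a quotation from the paper): writing $S_n = (X_1 + \cdots + X_n)/\sqrt{n}$ for standardized sums and $Z$ for a standard normal, a spectral gap (Poincaré) condition is known in this literature to give the rate

$$ D(S_n \,\|\, Z) \,=\, h(Z) - h(S_n) \,=\, O(1/n), $$

where $h$ is differential entropy and $D$ is relative entropy.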

Berry–Esseen bounds in the entropic central limit theorem

Berry–Esseen-type bounds for total variation and relative entropy distances to the normal law are established for sums of non-i.i.d. random variables.
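
For orientation (a standard fact, not specific to this paper): Pinsker's inequality connects the two distances, so entropic bounds imply total-variation bounds. With $\|P - Q\|_{\mathrm{TV}} = \sup_A |P(A) - Q(A)|$,

$$ \|P - Q\|_{\mathrm{TV}} \,\le\, \sqrt{\tfrac{1}{2}\, D(P \,\|\, Q)}, $$

and the classical Berry–Esseen rate under finite third moments is of order $1/\sqrt{n}$.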

Asymptotic Behavior of Rényi Entropy in the Central Limit Theorem

We explore the asymptotic behavior of Rényi entropy along convolutions in the central limit theorem, as the number of i.i.d. summands increases. In particular, the problem of monotonicity…
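
For reference, the Rényi entropy of order $\alpha$ is the standard quantity (a definition, included for readability):

$$ h_\alpha(X) = \frac{1}{1-\alpha} \log \int p(x)^\alpha \, dx, \qquad \alpha > 0,\ \alpha \neq 1, $$

which recovers the Shannon differential entropy $h(X) = -\int p \log p$ in the limit $\alpha \to 1$.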

Entropy Power Inequality for the Rényi Entropy

The classical entropy power inequality is extended to the Rényi entropy. We also discuss the question of the existence of the entropy for sums of independent random variables.
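
The classical inequality being extended is Shannon's entropy power inequality; in the usual normalization (a standard statement, not a quotation from this paper),

$$ N(X+Y) \,\ge\, N(X) + N(Y), \qquad N(X) = \frac{1}{2\pi e}\, e^{2h(X)}, $$

for independent $X$ and $Y$, with equality exactly when both are Gaussian with proportional covariances.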

Entropy and the Discrete Central Limit Theorem

A strengthened version of the central limit theorem for discrete random variables is established, relying only on information-theoretic tools and elementary arguments. It is shown that the relative…
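
In the discrete setting the relative entropy takes the familiar sum form (standard definition; the truncated sentence above presumably refers to this distance between the standardized sum and a suitably discretized Gaussian):

$$ D(P \,\|\, Q) = \sum_{x} P(x) \log \frac{P(x)}{Q(x)}. $$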

The Monotonicity of Information in the Central Limit Theorem and Entropy Power Inequalities

A simple proof of the monotonicity of information in the central limit theorem for i.i.d. summands is provided, and new families of Fisher information and entropy power inequalities are discussed.
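
The monotonicity statement in question can be written as follows (the standard formulation, first proved by Artstein, Ball, Barthe, and Naor): for i.i.d. $X_i$ with finite variance,

$$ h\!\left( \frac{X_1 + \cdots + X_{n+1}}{\sqrt{n+1}} \right) \,\ge\, h\!\left( \frac{X_1 + \cdots + X_n}{\sqrt{n}} \right), $$

so the entropy of standardized sums increases monotonically toward the Gaussian maximum.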

Rate of convergence and Edgeworth-type expansion in the entropic central limit theorem

An Edgeworth-type expansion is established for the entropy distance to the class of normal distributions of sums of i.i.d. random variables or vectors, satisfying minimal moment conditions.
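
Schematically, such an expansion has the shape (a sketch of the form of the result, not its precise statement or hypotheses):

$$ D(S_n \,\|\, Z) = \frac{c_1}{n} + \frac{c_2}{n^2} + \cdots + \frac{c_s}{n^s} + o\!\left(n^{-s}\right), $$

with coefficients $c_j$ determined by the cumulants of the summands.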

Fisher information and convergence to stable laws

The convergence to stable laws is studied in relative Fisher information for sums of i.i.d. random variables.
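
The relative Fisher information used in such results is typically defined as (standard definition; the paper's exact normalization may differ)

$$ I(X \,\|\, Y) = \int p(x) \left( \frac{p'(x)}{p(x)} - \frac{q'(x)}{q(x)} \right)^{\!2} dx, $$

where $p$ and $q$ are the densities of $X$ and of the limiting stable law.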

Fisher information inequalities and the central limit theorem

We give conditions for an O(1/n) rate of convergence of Fisher information and relative entropy in the Central Limit Theorem. We use the theory of projections in $L^2$ spaces and Poincaré inequalities…
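
A Poincaré inequality here means (standard form, included as background): there is a finite constant $C$ such that

$$ \mathrm{Var}\big(f(X)\big) \,\le\, C\, \mathbb{E}\big[f'(X)^2\big] $$

for all smooth $f$; this spectral gap condition is what drives the O(1/n) rate.
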
...

References

Showing 1–10 of 12 references.

An Information-Theoretic Proof of the Central Limit Theorem with Lindeberg Conditions

The proof is based on the "information functional" $\int_{-\infty}^{+\infty} p(x)\log p(x)\,dx + \tfrac{1}{2}\log \mathbf{D}(X)$, where $p(x)$ is the density of the random variable $X$ and $\mathbf{D}(X)$ its variance. Some new…
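
Up to an additive constant, this functional is the relative entropy to the matching Gaussian (a standard identity, stated for context): if $G$ is normal with the same mean and variance $\sigma^2 = \mathbf{D}(X)$, then

$$ D(X \,\|\, G) = h(G) - h(X) = \int p(x)\log p(x)\,dx + \tfrac{1}{2}\log\!\big(2\pi e\,\sigma^2\big), $$

so the functional vanishes exactly when $X$ is Gaussian.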

The convolution inequality for entropy powers

An improved version of Stam's proof of Shannon's convolution inequality for entropy powers is presented; the multidimensional case is obtained by mathematical induction from the one-dimensional case.
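
Stam's route to the entropy power inequality runs through Fisher information; the key one-dimensional step (the standard Stam–Blachman inequality) is

$$ \frac{1}{I(X+Y)} \,\ge\, \frac{1}{I(X)} + \frac{1}{I(Y)} $$

for independent $X$ and $Y$, which combines with de Bruijn's identity to yield the entropy power inequality.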

Probability and Measure

Probability. Measure. Integration. Random Variables and Expected Values. Convergence of Distributions. Derivatives and Conditional Probability. Stochastic Processes. Appendix. Notes on the Problems.

Some Inequalities Satisfied by the Quantities of Information of Fisher and Shannon

Information Theory and Reliable Communication

Covers coding for discrete sources, techniques for coding and decoding, and source coding with a fidelity criterion.

$I$-Divergence Geometry of Probability Distributions and Minimization Problems

Monotonic central limit theorem

  • 1984

Probability Theory