# ENTROPY AND THE CENTRAL LIMIT THEOREM

```bibtex
@article{Barron1986ENTROPYAT,
  title   = {ENTROPY AND THE CENTRAL LIMIT THEOREM},
  author  = {Andrew R. Barron},
  journal = {Annals of Probability},
  year    = {1986},
  volume  = {14},
  pages   = {336-342}
}
```

An argument of Brown (1982) is extended to show that the Fisher information converges to the reciprocal of the variance.
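
A brief sketch of the quantities behind this statement, using standard definitions rather than the paper's own notation (the symbols $S_n$, $I$, $D$, and $\varphi$ are ours):

```latex
% Standardized sums of i.i.d. random variables X_1, X_2, ... with variance sigma^2:
\[
  S_n = \frac{X_1 + \cdots + X_n}{\sigma\sqrt{n}} .
\]
% Fisher information of a density f, and relative entropy to the standard normal phi:
\[
  I(f) = \int \frac{f'(x)^2}{f(x)}\,dx , \qquad
  D(f) = \int f(x)\log\frac{f(x)}{\varphi(x)}\,dx .
\]
% In this notation, the abstract's claim reads I(S_n) -> 1 = 1/Var(S_n), which is the
% minimum permitted by the Cramer-Rao-type bound I(f) >= 1/Var(f), attained only by
% the normal law; D(S_n) -> 0 is the corresponding entropic form of the CLT.
```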

## 361 Citations

### On the rate of convergence in the entropic central limit theorem

- Mathematics
- 2004

We study the rate at which entropy is produced by linear combinations of independent random variables which satisfy a spectral gap condition.
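
As a point of reference, the spectral gap (Poincaré) condition mentioned above can be stated in its standard form (the constant $c$ and the notation are ours, not the paper's):

```latex
% Spectral gap / Poincare condition for a random variable X with finite variance:
% there exists c > 0 such that for every smooth function f,
\[
  c \,\operatorname{Var}\bigl(f(X)\bigr) \;\le\; \mathbb{E}\bigl[f'(X)^2\bigr].
\]
% The largest such c is the spectral gap; the standard normal satisfies this with c = 1.
```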

### Berry–Esseen bounds in the entropic central limit theorem

- Mathematics
- 2011

Berry–Esseen-type bounds for total variation and relative entropy distances to the normal law are established for the sums of non-i.i.d. random variables.
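
For readers comparing the two metrics, these are the standard definitions of the distances involved, together with the Pinsker (Csiszár–Kullback–Pinsker) inequality relating them; the notation is ours:

```latex
% Total variation distance and relative entropy between probability measures P and Q:
\[
  \|P - Q\|_{\mathrm{TV}} = \sup_{A} \bigl|P(A) - Q(A)\bigr| , \qquad
  D(P \,\|\, Q) = \int \log\frac{dP}{dQ}\, dP ,
\]
% related by Pinsker's inequality, so entropic bounds imply total-variation bounds:
\[
  \|P - Q\|_{\mathrm{TV}} \;\le\; \sqrt{\tfrac{1}{2}\, D(P \,\|\, Q)} .
\]
```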

### Asymptotic Behavior of Rényi Entropy in the Central Limit Theorem

- Mathematics, Progress in Probability
- 2019

We explore the asymptotic behavior of Rényi entropy along convolutions in the central limit theorem with respect to the increasing number of i.i.d. summands. In particular, the problem of monotonicity…
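
For completeness, the quantity being tracked is the Rényi (differential) entropy; its standard definition (not quoted from the paper) is:

```latex
% Renyi differential entropy of order alpha for a random variable X with density f:
\[
  h_\alpha(X) \;=\; \frac{1}{1-\alpha}\,\log \int f(x)^{\alpha}\,dx ,
  \qquad \alpha > 0,\ \alpha \ne 1 ,
\]
% which recovers the Shannon differential entropy h(X) = -\int f \log f as alpha -> 1.
```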

### Entropy Power Inequality for the Rényi Entropy

- Computer Science, IEEE Transactions on Information Theory
- 2015

The classical entropy power inequality is extended to the Rényi entropy. We also discuss the question of the existence of the entropy for sums of independent random variables.
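
For comparison, the classical Shannon entropy power inequality that is being extended (a standard statement, not quoted from the paper):

```latex
% Entropy power inequality for independent random vectors X, Y in R^n with densities:
\[
  N(X + Y) \;\ge\; N(X) + N(Y), \qquad
  N(X) \;=\; \frac{1}{2\pi e}\, e^{\,2 h(X)/n},
\]
% where h denotes differential entropy; equality holds iff X and Y are Gaussian with
% proportional covariance matrices.
```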

### Asymptotic behavior of Rényi entropy in the central limit theorem

- Mathematics
- 2018

We explore the asymptotic behavior of Rényi entropy along convolutions in the central limit theorem with respect to the increasing number of i.i.d. summands. In particular, the problem of monotonicity…

### Entropy and the Discrete Central Limit Theorem

- Mathematics, ArXiv
- 2021

A strengthened version of the central limit theorem for discrete random variables is established, relying only on information-theoretic tools and elementary arguments. It is shown that the relative…

### The Monotonicity of Information in the Central Limit Theorem and Entropy Power Inequalities

- Mathematics, 2006 IEEE International Symposium on Information Theory
- 2006

A simple proof of the monotonicity of information in the central limit theorem for i.i.d. summands is provided and new families of Fisher information and entropy power inequalities are discussed.
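
The monotonicity statement in question, in standard form (this is the Artstein–Ball–Barthe–Naor result for which the paper gives a simple proof; the notation is ours):

```latex
% For i.i.d. square-integrable X_1, X_2, ... with a density, the entropy of the
% standardized sums never decreases:
\[
  h\!\left(\frac{X_1 + \cdots + X_{n+1}}{\sqrt{n+1}}\right)
  \;\ge\;
  h\!\left(\frac{X_1 + \cdots + X_n}{\sqrt{n}}\right),
  \qquad n \ge 1 .
\]
```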

### Rate of convergence and Edgeworth-type expansion in the entropic central limit theorem

- Mathematics
- 2013

An Edgeworth-type expansion is established for the entropy distance to the class of normal distributions of sums of i.i.d. random variables or vectors, satisfying minimal moment conditions.

### Fisher information and convergence to stable laws

- Mathematics
- 2012

The convergence to stable laws is studied in relative Fisher information for sums of i.i.d. random variables.

### Fisher information inequalities and the central limit theorem

- Mathematics
- 2004

We give conditions for an O(1/n) rate of convergence of Fisher information and relative entropy in the Central Limit Theorem. We use the theory of projections in $L^2$ spaces and Poincaré…
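
A hedged reading of the stated rate in terms of standardized Fisher information (the symbols $J$ and $S_n$ are ours; the precise moment and regularity hypotheses are those of the paper and are not reproduced here):

```latex
% Standardized Fisher information of the normalized sums S_n = (X_1+...+X_n)/(sigma sqrt(n)):
\[
  J(S_n) \;=\; \operatorname{Var}(S_n)\, I(S_n) - 1 \;=\; I(S_n) - 1 ,
\]
% which vanishes exactly for the standard normal; the abstract's claim is that
% J(S_n) = O(1/n), and likewise D(S_n) = O(1/n), under a Poincare-type condition.
```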

## References


### An Information-Theoretic Proof of the Central Limit Theorem with Lindeberg Conditions

- Mathematics
- 1959

The proof is based on the “information functional” $l = \int_{-\infty}^{+\infty} p(x)\log p(x)\,dx + \tfrac{1}{2}\log \mathbf{D}(X)$, where $p(x)$ is the density of the random variable $X$ and $\mathbf{D}(X)$ its variance. Some new…
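
A hedged side note (ours, not the reference's) on how this functional relates to the relative entropy from the normal law with matching variance:

```latex
% For a mean-zero density p with variance D(X), and phi the normal density with the
% same mean and variance,
\[
  \int_{-\infty}^{+\infty} p(x)\log p(x)\,dx \;+\; \tfrac{1}{2}\log \mathbf{D}(X)
  \;=\;
  \int p(x) \log\frac{p(x)}{\varphi(x)}\,dx \;-\; \tfrac{1}{2}\log(2\pi e),
\]
% so the functional attains its minimum exactly at the normal law, and driving it to
% that minimum is equivalent to driving the relative entropy to the normal to zero.
```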

### The convolution inequality for entropy powers

- Computer Science, IEEE Trans. Inf. Theory
- 1965

An improved version of Stam's proof of Shannon's convolution inequality for entropy power is presented, which is obtained by mathematical induction from the one-dimensional case.
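
For orientation, the Fisher-information inequality at the heart of Stam's approach (a standard statement, with notation as in the other sketches above):

```latex
% Stam's inequality for independent random variables X and Y with smooth densities:
\[
  \frac{1}{I(X + Y)} \;\ge\; \frac{1}{I(X)} + \frac{1}{I(Y)},
\]
% with equality iff X and Y are normal; combined with de Bruijn's identity it yields
% the entropy power inequality of the title.
```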

### Probability and Measure

- Mathematics
- 1979

Contents: Probability; Measure; Integration; Random Variables and Expected Values; Convergence of Distributions; Derivatives and Conditional Probability; Stochastic Processes; Appendix; Notes on the Problems.

### Some Inequalities Satisfied by the Quantities of Information of Fisher and Shannon

- Mathematics, Inf. Control.
- 1959

### Information Theory and Reliable Communication

- Computer Science
- 1968

This chapter discusses Coding for Discrete Sources, Techniques for Coding and Decoding, and Source Coding with a Fidelity Criterion.

### $I$-Divergence Geometry of Probability Distributions and Minimization Problems

- Mathematics
- 1975


### Monotonic central limit theorem

- 1984