# On the entropy of sums of Bernoulli random variables via the Chen-Stein method

@article{Sason2012OnTE, title={On the entropy of sums of Bernoulli random variables via the Chen-Stein method}, author={Igal Sason}, journal={2012 IEEE Information Theory Workshop}, year={2012}, pages={542-546} }

This paper considers the entropy of a sum of (possibly dependent and non-identically distributed) Bernoulli random variables. Upper bounds are derived on the error incurred when this entropy is approximated by the entropy of a Poisson random variable with the same mean. The derivation of these bounds combines elements of information theory with the Chen-Stein method for Poisson approximation. The resulting bounds are easy to compute, and their applicability is exemplified. This…
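The approximation described in the abstract can be illustrated numerically in the simplest i.i.d. case, where the sum of n independent Bernoulli(p) variables is Binomial(n, p). The sketch below (function names are illustrative, not from the paper) compares its entropy with that of a Poisson variable of the same mean np:

```python
import math

def entropy(pmf):
    """Shannon entropy (in nats) of a discrete pmf given as a list of probabilities."""
    return -sum(p * math.log(p) for p in pmf if p > 0)

def binomial_pmf(n, p):
    """Exact pmf of Binomial(n, p) on {0, ..., n}."""
    return [math.comb(n, k) * p**k * (1 - p) ** (n - k) for k in range(n + 1)]

def poisson_pmf(lam, kmax):
    """Pmf of Poisson(lam) truncated at kmax (tail mass is negligible here)."""
    return [math.exp(-lam) * lam**k / math.factorial(k) for k in range(kmax + 1)]

# Sum of n i.i.d. Bernoulli(p) variables; match the Poisson mean lam = n * p.
n, p = 200, 0.025
lam = n * p
h_binom = entropy(binomial_pmf(n, p))
h_poisson = entropy(poisson_pmf(lam, 100))
gap = abs(h_binom - h_poisson)
print(h_binom, h_poisson, gap)  # the entropies nearly coincide for small p
```

For small p the gap is tiny, which is the regime where the paper's Poisson-approximation bounds are informative.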

## 6 Citations

### Entropy bounds for discrete random variables via coupling

- Mathematics, Computer Science
- 2013 IEEE International Symposium on Information Theory
- 2013

New bounds on the difference between the entropies of two discrete random variables in terms of the local and total variation distances between their probability mass functions are provided.

### Entropy Bounds for Discrete Random Variables via Maximal Coupling

- Mathematics, Computer Science
- IEEE Transactions on Information Theory
- 2013

New bounds on the difference of the entropies of two discrete random variables in terms of the local and total variation distances between their probability mass functions are derived.

### Local Pinsker Inequalities via Stein's Discrete Density Approach

- Mathematics
- IEEE Transactions on Information Theory
- 2013

This paper introduces generalized Fisher information distances, proves that these also dominate the square of the total variation distance, and introduces a general discrete Stein operator for which a useful covariance identity is proved.

### Stein’s density approach and information inequalities

- Mathematics, Computer Science
- 2012

A new perspective on Stein's so-called density approach is provided by introducing a new operator and characterizing class that are valid for a much wider family of probability distributions on the real line, and by proposing a new Stein identity that is used to derive information inequalities in terms of the "generalized Fisher information distance".

### Integration by parts and representation of information functionals

- Mathematics, Computer Science
- 2014 IEEE International Symposium on Information Theory
- 2014

A new formalism for computing expectations of functionals of arbitrary random vectors is introduced using generalised integration-by-parts formulae; it yields a representation for the standardised Fisher information of sums of i.i.d. random vectors and a Stein bound on the Fisher information distance.

### Cleaning Method for Status Monitoring Data of Power Equipment Based on Stacked Denoising Autoencoders

- Computer Science
- IEEE Access
- 2017

The results show that the proposed data cleaning method based on stacked denoising autoencoder networks can effectively identify and repair outliers and missing information and can perform rapid anomaly detection when the equipment is running abnormally.

## References

Showing 1–10 of 49 references

### Sharp Bounds on the Entropy of the Poisson Law and Related Quantities

- Mathematics, Computer Science
- IEEE Transactions on Information Theory
- 2010

Upper and lower bounds for the Poisson entropy H(λ) are derived that are asymptotically tight and easy to compute, and analogous bounds follow easily for the relative entropy D(n, p) between a binomial and a Poisson distribution.
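The Poisson entropy admits easily computed approximations; the sketch below compares the exact entropy with the leading asymptotic term ½ log(2πeλ) (a standard expansion, shown here for illustration rather than as this reference's exact bound):

```python
import math

def poisson_entropy(lam, kmax=2000):
    """Entropy (in nats) of Poisson(lam), summing the pmf up to kmax."""
    total = 0.0
    for k in range(kmax + 1):
        # log pmf computed in log-space to avoid overflow in factorials
        logp = -lam + k * math.log(lam) - math.lgamma(k + 1)
        total -= math.exp(logp) * logp
    return total

lam = 100.0
h_exact = poisson_entropy(lam)
h_leading = 0.5 * math.log(2 * math.pi * math.e * lam)  # leading asymptotic term
print(h_exact, h_leading)  # the two agree closely for large lam
```

For λ = 100 the leading term is already accurate to well under 0.01 nats, illustrating why such bounds are "asymptotically tight and easy to compute".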

### Entropy bounds for discrete random variables via coupling

- Mathematics, Computer Science
- 2013 IEEE International Symposium on Information Theory
- 2013

New bounds on the difference between the entropies of two discrete random variables in terms of the local and total variation distances between their probability mass functions are provided.

### Compound Poisson Approximation via Information Functionals

- Mathematics, Computer Science
- ArXiv
- 2010

An information-theoretic development is given for the problem of compound Poisson approximation, which parallels earlier treatments for Gaussian and Poisson approximation. Nonasymptotic bounds are…

### Monotonicity, Thinning, and Discrete Versions of the Entropy Power Inequality

- Computer Science
- IEEE Transactions on Information Theory
- 2010

A stronger version of the concavity of entropy is proved, which implies a strengthened form of Shannon's discrete EPI and gives a sharp bound on how the entropy of ultra-log-concave random variables behaves under thinning.

### Log-concavity, ultra-log-concavity, and a maximum entropy property of discrete compound Poisson measures

- Mathematics
- Discret. Appl. Math.
- 2013

### The Interplay Between Entropy and Variational Distance

- Computer Science
- IEEE Transactions on Information Theory
- 2010

The relation between the Shannon entropy and variational distance, two fundamental and frequently-used quantities in information theory, is studied in this paper by means of certain bounds on the…

### Thinning, Entropy, and the Law of Thin Numbers

- Mathematics
- IEEE Transactions on Information Theory
- 2010

A "thinning Markov chain" is introduced, and it is shown to play a role analogous to that of the Ornstein-Uhlenbeck process in connection with the entropy power inequality.
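The thinning operation at the heart of this line of work (Rényi's α-thinning: each unit of mass survives independently with probability α) can be sketched as a simulation sanity check. This is an illustrative demo, not code from the paper; it uses the fact that thinning Binomial(n, p) by α yields Binomial(n, αp):

```python
import random

def thin(x, alpha, rng):
    """Renyi alpha-thinning: each of the x unit masses survives independently w.p. alpha."""
    return sum(1 for _ in range(x) if rng.random() < alpha)

rng = random.Random(42)  # fixed seed for reproducibility
n, p, alpha, trials = 50, 0.2, 0.5, 50000

# Draw Binomial(n, p) as a sum of Bernoulli(p) indicators, then thin it.
samples = [thin(sum(rng.random() < p for _ in range(n)), alpha, rng)
           for _ in range(trials)]
mean = sum(samples) / trials
print(mean)  # close to n * alpha * p = 5.0 in expectation
```

The "law of thin numbers" studied in this reference concerns the Poisson limit of such thinned convolutions.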

### On the Maximum Entropy Properties of the Binomial Distribution

- Mathematics
- IEEE Transactions on Information Theory
- 2008

It is shown that the Binomial(n, p) distribution maximizes the entropy in the class of ultra-log-concave distributions of order n with fixed mean np; the proof constructs a Markov chain along whose iterations the entropy never decreases.

### Poisson Approximation for Dependent Trials

- Mathematics
- 1975

A method is presented for approximating the distribution of a sum W of dependent trials by a Poisson distribution, together with a derivation of a bound on the distance between the distribution of W and the Poisson distribution with mean E(W). This new method is based on previous work by C. Stein…
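A textbook consequence of this method, for independent (not necessarily identical) Bernoulli(p_i) summands, is the total variation bound d_TV(W, Po(λ)) ≤ min(1, 1/λ) Σ p_i² with λ = Σ p_i. The check below (a standard form of the estimate, not necessarily the 1975 paper's exact statement) computes the exact distance by convolution and verifies it sits under the bound:

```python
import math

def bernoulli_sum_pmf(ps):
    """Exact pmf of W = sum of independent Bernoulli(p_i), by iterative convolution."""
    pmf = [1.0]
    for p in ps:
        nxt = [0.0] * (len(pmf) + 1)
        for k, q in enumerate(pmf):
            nxt[k] += q * (1 - p)      # this trial fails
            nxt[k + 1] += q * p        # this trial succeeds
        pmf = nxt
    return pmf

ps = [0.1, 0.05, 0.2, 0.15, 0.08]
lam = sum(ps)
pmf_w = bernoulli_sum_pmf(ps)
pois = [math.exp(-lam) * lam**k / math.factorial(k) for k in range(len(pmf_w))]
tail = 1.0 - sum(pois)  # Poisson mass beyond the support of W
tv = 0.5 * (sum(abs(a - b) for a, b in zip(pmf_w, pois)) + tail)
bound = min(1.0, 1.0 / lam) * sum(p * p for p in ps)
print(tv, bound)  # the exact distance stays below the Chen-Stein-style bound
```

Bounds of this type are what the main paper combines with information-theoretic arguments to control the entropy approximation error.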