# An Information-Theoretic Perspective of the Poisson Approximation via the Chen-Stein Method

@article{Sason2012AnIP,
  title={An Information-Theoretic Perspective of the Poisson Approximation via the Chen-Stein Method},
  author={Igal Sason},
  journal={ArXiv},
  year={2012},
  volume={abs/1206.6811}
}

The first part of this work considers the entropy of the sum of (possibly dependent and non-identically distributed) Bernoulli random variables. Upper bounds on the error that follows from an approximation of this entropy by the entropy of a Poisson random variable with the same mean are derived via the Chen-Stein method. The second part of this work derives new lower bounds on the total variation distance and relative entropy between the distribution of the sum of independent Bernoulli random…
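As a numerical sanity check on the approximation the abstract describes, the sketch below (not from the paper; the parameter choices and truncation tolerance are illustrative assumptions) compares the entropy of a sum of n i.i.d. Bernoulli(p) variables, i.e. a Binomial(n, p), with the entropy of a Poisson random variable of the same mean np:

```python
import math

def binomial_entropy(n, p):
    """Entropy (in nats) of Binomial(n, p), the sum of n i.i.d. Bernoulli(p)."""
    h = 0.0
    for k in range(n + 1):
        pk = math.comb(n, k) * p**k * (1 - p)**(n - k)
        if pk > 0:
            h -= pk * math.log(pk)
    return h

def poisson_entropy(lam, tol=1e-12):
    """Entropy (in nats) of Poisson(lam), summed until the tail mass is below tol."""
    h, k = 0.0, 0
    pk = math.exp(-lam)  # Poisson pmf at k = 0
    cum = 0.0
    while cum < 1.0 - tol:
        if pk > 0:
            h -= pk * math.log(pk)
        cum += pk
        k += 1
        pk *= lam / k    # advance pmf to the next k
    return h

# Small p: the Poisson approximation of the Bernoulli sum should be accurate.
n, p = 200, 0.05
print(binomial_entropy(n, p), poisson_entropy(n * p))
```

The two entropies agree to within a few hundredths of a nat for these parameters; the paper's contribution is to make such error terms explicit, via the Chen-Stein method, for dependent and non-identically distributed summands as well.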

## 14 Citations

### On the entropy of sums of Bernoulli random variables via the Chen-Stein method

- Computer Science, Mathematics
- 2012 IEEE Information Theory Workshop
- 2012

Upper bounds on the error that follows from approximating this entropy by the entropy of a Poisson random variable with the same mean are derived; the approach combines elements of information theory with the Chen-Stein method for Poisson approximation.

### Improved lower bounds on the total variation distance and relative entropy for the Poisson approximation

- Mathematics, Computer Science
- 2013 Information Theory and Applications Workshop (ITA)
- 2013

New lower bounds on the total variation distance between the distribution of a sum of independent Bernoulli random variables and the Poisson random variable (with the same mean) are derived via the…
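For intuition on the quantity being bounded, the sketch below (illustrative only; the parameters are arbitrary choices, not taken from the cited paper) computes the total variation distance between a Binomial(n, p) law, i.e. a sum of n i.i.d. Bernoulli(p) variables, and the Poisson law of the same mean, and compares it against Le Cam's classical upper bound n·p²:

```python
import math

def tv_binomial_poisson(n, p):
    """Total variation distance between Binomial(n, p) and Poisson(n * p)."""
    lam = n * p
    poi = math.exp(-lam)            # Poisson pmf at k = 0
    total = 0.0
    for k in range(4 * n + 1):      # truncate: tail mass beyond 4n is negligible here
        binom = math.comb(n, k) * p**k * (1 - p)**(n - k) if k <= n else 0.0
        total += abs(binom - poi)
        poi *= lam / (k + 1)        # advance Poisson pmf to k + 1
    return 0.5 * total

n, p = 50, 0.04
tv = tv_binomial_poisson(n, p)
print(tv, "<=", n * p * p)  # Le Cam's upper bound: TV <= n * p^2
```

The computed distance sits well below Le Cam's upper bound, which is why matching lower bounds, such as those derived in the cited work, are needed to pin down the true approximation error.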

### Entropy bounds for discrete random variables via coupling

- Mathematics, Computer Science
- 2013 IEEE International Symposium on Information Theory
- 2013

New bounds on the difference between the entropies of two discrete random variables in terms of the local and total variation distances between their probability mass functions are provided.

### Entropy Bounds for Discrete Random Variables via Maximal Coupling

- Mathematics, Computer Science
- IEEE Transactions on Information Theory
- 2013

New bounds on the difference of the entropies of two discrete random variables in terms of the local and total variation distances between their probability mass functions are derived.

### Stein’s density approach and information inequalities

- Mathematics, Computer Science
- 2012

A new perspective on Stein's so-called density approach is provided by introducing a new operator and characterizing class that are valid for a much wider family of probability distributions on the real line. A new Stein identity is also proposed and used to derive information inequalities in terms of the "generalized Fisher information distance".

### On Improved Bounds for Probability Metrics and f-Divergences

- Computer Science
- 2014

Derivation of tight bounds for probability metrics and f-divergences is of interest in information theory and statistics. This paper provides elementary proofs that lead, in some cases, to…

### Local Pinsker Inequalities via Stein's Discrete Density Approach

- Mathematics
- IEEE Transactions on Information Theory
- 2013

This paper introduces generalized Fisher information distances and proves that these also dominate the square of the total variation distance; a general discrete Stein operator is also introduced, for which a useful covariance identity is proved.

### Concentration of Measure Inequalities in Information Theory, Communications, and Coding

- Computer Science
- Found. Trends Commun. Inf. Theory
- 2013

This third edition of the bestselling book introduces the reader to the martingale method and the Efron-Stein-Steele inequalities in completely new sections and includes various new recent results derived by the authors.

### Bounds on $f$-Divergences and Related Distances

- Computer Science
- 2014

An improved version of a reversed Pinsker's inequality is derived for an arbitrary pair of probability distributions on a finite set, and a new inequality for lossless source coding is derived and studied.

## References

Showing 1–10 of 58 references

### Sharp Bounds on the Entropy of the Poisson Law and Related Quantities

- Mathematics, Computer Science
- IEEE Transactions on Information Theory
- 2010

Upper and lower bounds for H(λ) are derived that are asymptotically tight and easy to compute; similar bounds follow for the relative entropy D(n, p) between a binomial and a Poisson distribution.

### Compound Poisson Approximation via Information Functionals

- Mathematics, Computer Science
- ArXiv
- 2010

An information-theoretic development is given for the problem of compound Poisson approximation, which parallels earlier treatments for Gaussian and Poisson approximation. Nonasymptotic bounds are…

### Lower bounds on Information Divergence

- Mathematics, Computer Science
- ArXiv
- 2011

Lower bounds on information divergence are established, mainly using orthogonal polynomials and the related exponential families. For several convergence theorems where a rate of convergence can be computed, this rate is determined by the lower bounds proved in this paper.

### Poisson Approximation for Dependent Trials

- Mathematics
- 1975

The distribution of a sum W of dependent indicator random variables is approximated by a Poisson distribution, and a bound is derived on the distance between the distribution of W and the Poisson distribution with mean E(W). This new method is based on previous work by C. Stein…

### Poisson approximation using the Stein-Chen method and coupling: number of exceedances of Gaussian random variables

- Mathematics
- 1990

Consider a family of (dependent) Gaussian random variables and count the number of them that exceed some given levels. An explicit upper bound is given for the total variation distance between the…

### Poisson Approximation and the Chen-Stein Method

- Mathematics
- 1990

The Chen-Stein method of Poisson approximation is a powerful tool for computing an error bound when approximating probabilities using the Poisson distribution. In many cases, this bound may be given…

### Two moments suffice for Poisson approximations: the Chen-Stein method

- Mathematics
- 1989

Convergence to the Poisson distribution, for the number of occurrences of dependent events, can often be established by computing only first and second moments, but not higher ones. This remarkable…

### The Interplay Between Entropy and Variational Distance

- Computer Science
- IEEE Transactions on Information Theory
- 2010

The relation between the Shannon entropy and variational distance, two fundamental and frequently-used quantities in information theory, is studied in this paper by means of certain bounds on the…

### Log-concavity, ultra-log-concavity, and a maximum entropy property of discrete compound Poisson measures

- Mathematics
- Discret. Appl. Math.
- 2013

### Thinning, Entropy, and the Law of Thin Numbers

- Mathematics
- IEEE Transactions on Information Theory
- 2010

A “thinning Markov chain” is introduced, and it is shown to play a role analogous to that of the Ornstein-Uhlenbeck process in connection to the entropy power inequality.