• Corpus ID: 5818275

An Information-Theoretic Perspective of the Poisson Approximation via the Chen-Stein Method

@article{Sason2012AnIP,
  title={An Information-Theoretic Perspective of the Poisson Approximation via the Chen-Stein Method},
  author={Igal Sason},
  journal={ArXiv},
  year={2012},
  volume={abs/1206.6811}
}
  • I. Sason
  • Published 28 June 2012
  • Computer Science, Mathematics
  • ArXiv
The first part of this work considers the entropy of the sum of (possibly dependent and non-identically distributed) Bernoulli random variables. Upper bounds on the error that follows from an approximation of this entropy by the entropy of a Poisson random variable with the same mean are derived via the Chen-Stein method. The second part of this work derives new lower bounds on the total variation distance and relative entropy between the distribution of the sum of independent Bernoulli random…
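
As a concrete illustration of the quantity being approximated, here is a minimal numerical sketch (not from the paper; the helper names are ours) that compares the entropy of a sum of independent Bernoulli(p_i) random variables with the entropy of a Poisson random variable of the same mean:

```python
# Minimal numerical sketch (illustrative, not from the paper): compare the
# entropy of a sum of independent Bernoulli(p_i) random variables with the
# entropy of a Poisson random variable having the same mean lambda = sum(p_i).
import numpy as np
from scipy.stats import poisson

def bernoulli_sum_pmf(ps):
    """PMF of S = X_1 + ... + X_n for independent X_i ~ Bernoulli(p_i)."""
    pmf = np.array([1.0])
    for p in ps:
        pmf = np.convolve(pmf, [1.0 - p, p])
    return pmf

def entropy_nats(pmf):
    """Shannon entropy (in nats) of a probability mass function given as an array."""
    p = pmf[pmf > 0]
    return float(-np.sum(p * np.log(p)))

ps = np.full(1000, 0.01)          # many small success probabilities: the Poisson regime
lam = ps.sum()
h_sum = entropy_nats(bernoulli_sum_pmf(ps))
h_poi = entropy_nats(poisson.pmf(np.arange(0, 200), lam))  # truncated tail is negligible
print(f"H(S) = {h_sum:.6f} nats,  H(Po({lam:.1f})) = {h_poi:.6f} nats,  gap = {abs(h_sum - h_poi):.2e}")
```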

Citations

On the entropy of sums of Bernoulli random variables via the Chen-Stein method

  • I. Sason
  • Computer Science, Mathematics
    2012 IEEE Information Theory Workshop
  • 2012
Upper bounds on the error that follows from approximating this entropy by the entropy of a Poisson random variable with the same mean are derived; the derivation combines elements of information theory with the Chen-Stein method for Poisson approximation.

Improved lower bounds on the total variation distance and relative entropy for the Poisson approximation

  • I. Sason
  • Mathematics, Computer Science
    2013 Information Theory and Applications Workshop (ITA)
  • 2013
New lower bounds on the total variation distance between the distribution of a sum of independent Bernoulli random variables and the Poisson random variable (with the same mean) are derived via the Chen-Stein method.

Entropy and the fourth moment phenomenon

Entropy bounds for discrete random variables via coupling

  • I. Sason
  • Mathematics, Computer Science
    2013 IEEE International Symposium on Information Theory
  • 2013
New bounds on the difference between the entropies of two discrete random variables in terms of the local and total variation distances between their probability mass functions are provided.

Entropy Bounds for Discrete Random Variables via Maximal Coupling

  • I. Sason
  • Mathematics, Computer Science
    IEEE Transactions on Information Theory
  • 2013
New bounds on the difference of the entropies of two discrete random variables in terms of the local and total variation distances between their probability mass functions are derived.
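
As background, the coupling characterization of the total variation distance that such maximal-coupling arguments presumably build on is the following standard identity (recalled here for context, not a result of the cited paper):

```latex
% Coupling characterization of total variation distance (standard fact):
% the minimum is over all couplings of X ~ P and Y ~ Q, and a maximal
% coupling attains it.
d_{\mathrm{TV}}(P,Q) \;=\; \tfrac{1}{2}\sum_{x}\bigl|P(x)-Q(x)\bigr|
\;=\; \min_{X\sim P,\; Y\sim Q} \Pr[X \neq Y].
```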

Stein’s density approach and information inequalities

A new perspective on Stein's so-called density approach is provided by introducing a new operator and characterizing class that are valid for a much wider family of probability distributions on the real line, and by proposing a new Stein identity that is used to derive information inequalities in terms of a "generalized Fisher information distance".
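
For orientation, the classical identity behind density-approach arguments is the integration-by-parts relation below, stated under standard regularity assumptions (this is the textbook form, not necessarily the generalized operator introduced in the paper): for X with differentiable density p on the real line and suitable test functions f,

```latex
% Classical density-approach (score-function) identity, assuming p is
% differentiable, f(x) p(x) vanishes at the boundary, and all expectations exist:
\mathbb{E}\bigl[f'(X) + \rho_p(X)\, f(X)\bigr] = 0,
\qquad \rho_p(x) \;=\; \frac{p'(x)}{p(x)} \quad (\text{the score function of } p).
```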

On Improved Bounds for Probability Metrics and f-Divergences

Derivation of tight bounds for probability metrics and f-divergences is of interest in information theory and statistics. This paper provides elementary proofs that lead, in some cases, to

Local Pinsker Inequalities via Stein's Discrete Density Approach

This paper introduces generalized Fisher information distances, proves that these also dominate the square of the total variation distance, and introduces a general discrete Stein operator for which a useful covariance identity is proved.
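
For context, the classical Pinsker inequality that such local variants refine can be stated as follows (relative entropy in nats, with d_TV the half-L1 distance):

```latex
% Classical Pinsker inequality (standard statement, recalled for context):
D(P \| Q) \;\ge\; 2\, d_{\mathrm{TV}}^{2}(P,Q),
\qquad d_{\mathrm{TV}}(P,Q) \;=\; \tfrac{1}{2}\sum_{x}\bigl|P(x)-Q(x)\bigr|.
```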

Concentration of Measure Inequalities in Information Theory, Communications, and Coding

This third edition of the bestselling book introduces the reader to the martingale method and the Efron-Stein-Steele inequalities in completely new sections and includes various new recent results derived by the authors.

Bounds on $f$-Divergences and Related Distances

An improved version of a reversed Pinsker’s inequality is derived for an arbitrary pair of probability distributions on a finite set, and a new inequality for lossless source coding is derived and studied.
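
For reference, the family of f-divergences discussed in this line of work is defined in the standard way (recalled for context):

```latex
% Standard definition of an f-divergence between discrete distributions P and Q,
% where f is convex on (0, infinity) with f(1) = 0; e.g., f(t) = t log t gives
% relative entropy, and f(t) = |t - 1| / 2 gives total variation distance.
D_f(P \| Q) \;=\; \sum_{x} Q(x)\, f\!\left(\frac{P(x)}{Q(x)}\right).
```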

References

Showing 1-10 of 58 references

Sharp Bounds on the Entropy of the Poisson Law and Related Quantities

Upper and lower bounds for H(λ) are derived that are asymptotically tight and easy to compute, and analogous bounds follow easily for the relative entropy D(n, p) between a binomial and a Poisson distribution.
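
For orientation only (this is a commonly cited asymptotic, not the sharp non-asymptotic bounds of this reference), the Poisson entropy behaves for large λ like the entropy of a Gaussian with matching variance:

```latex
% Commonly cited large-\lambda behaviour of the Poisson entropy (in nats);
% the cited paper provides sharper, computable two-sided bounds:
H(\mathrm{Po}(\lambda)) \;=\; \tfrac{1}{2}\log(2\pi e \lambda) \;-\; \frac{1}{12\lambda} \;+\; O(\lambda^{-2}).
```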

Compound Poisson Approximation via Information Functionals

An information-theoretic development is given for the problem of compound Poisson approximation, which parallels earlier treatments for Gaussian and Poisson approximation. Nonasymptotic bounds are
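
For context, the compound Poisson law targeted by such approximations is defined in the standard way (recalled here, not a result of the reference):

```latex
% Standard definition of the compound Poisson distribution CP(\lambda, Q):
% the law of a random sum of i.i.d. jumps with a Poisson number of terms.
S \;=\; \sum_{i=1}^{N} Y_i, \qquad N \sim \mathrm{Po}(\lambda), \quad
Y_1, Y_2, \ldots \ \text{i.i.d.} \sim Q, \ \text{independent of } N.
```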

Lower bounds on Information Divergence

Lower bounds on information divergence are established using mainly orthogonal polynomials and the related exponential families and for several convergence theorems where a rate of convergence can be computed, this rate is determined by the lower bounds proved in this paper.

Poisson Approximation for Dependent Trials

A Poisson approximation for the distribution of a sum W of dependent trials is considered, together with a derivation of a bound on the distance between the distribution of W and the Poisson distribution with mean E(W). This new method is based on previous work by C. Stein

Poisson approximation using the Stein-Chen method and coupling : number of exceedances of Gaussian random variables

Consider a family of (dependent) Gaussian random variables and count the number of them that exceed some given levels. An explicit upper bound is given for the total variation distance between the

Poisson Approximation and the Chen-Stein Method

The Chen-Stein method of Poisson approximation is a powerful tool for computing an error bound when approximating probabilities using the Poisson distribution. In many cases, this bound may be given

Two moments suffice for Poisson approximations: the Chen-Stein method

Convergence to the Poisson distribution, for the number of occurrences of dependent events, can often be established by computing only first and second moments, but not higher ones. This remarkable
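
In the simplest case of independent Bernoulli summands, the Chen-Stein approach yields the well-known bound d_TV(P_S, Po(λ)) ≤ min(1, 1/λ) Σ_i p_i² with λ = Σ_i p_i. A minimal numerical check of this bound (illustrative code, not from the cited references; helper names are ours):

```python
# Minimal numerical check (illustrative, not from the cited references) of the
# classical Chen-Stein-type bound for independent Bernoulli summands:
#     d_TV(P_S, Po(lambda)) <= min(1, 1/lambda) * sum(p_i^2),  lambda = sum(p_i).
import numpy as np
from scipy.stats import poisson

def bernoulli_sum_pmf(ps):
    """PMF of S = X_1 + ... + X_n for independent X_i ~ Bernoulli(p_i)."""
    pmf = np.array([1.0])
    for p in ps:
        pmf = np.convolve(pmf, [1.0 - p, p])
    return pmf

ps = np.random.default_rng(0).uniform(0.0, 0.05, size=400)
lam = ps.sum()
pmf_s = bernoulli_sum_pmf(ps)                       # support {0, ..., 400}
pmf_poi = poisson.pmf(np.arange(len(pmf_s)), lam)   # Poisson mass on the same range

# Total variation distance: half the L1 difference on {0,...,n} plus the Poisson
# tail mass beyond n (where the Bernoulli sum puts no mass at all).
tv = 0.5 * (np.abs(pmf_s - pmf_poi).sum() + poisson.sf(len(pmf_s) - 1, lam))
bound = min(1.0, 1.0 / lam) * np.sum(ps ** 2)
print(f"d_TV = {tv:.5f}  <=  bound = {bound:.5f}")
```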

The Interplay Between Entropy and Variational Distance

The relation between the Shannon entropy and variational distance, two fundamental and frequently-used quantities in information theory, is studied in this paper by means of certain bounds on the
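
One classical bound of this flavour, for distributions on a common finite alphabet of size m, is the Fannes-type inequality below (recalled for context; the cited paper studies refined relations):

```latex
% Fannes-type bound on a finite alphabet of size m, with d_TV the half-L1
% distance and h_b the binary entropy function; valid for d_TV <= 1 - 1/m:
\bigl|H(P) - H(Q)\bigr| \;\le\; d_{\mathrm{TV}}(P,Q)\,\log(m-1) \;+\; h_b\bigl(d_{\mathrm{TV}}(P,Q)\bigr).
```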

Thinning, Entropy, and the Law of Thin Numbers

A “thinning Markov chain” is introduced, and it is shown to play a role analogous to that of the Ornstein-Uhlenbeck process in connection to the entropy power inequality.
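
For reference, the thinning operation underlying the law of thin numbers acts on a distribution P supported on the non-negative integers as follows (standard definition, recalled for context):

```latex
% Thinning of P by alpha in [0,1]: each of the n "points" is kept independently
% with probability alpha. The law of thin numbers states that thinning the
% n-fold convolution of P by 1/n converges to a Poisson law with P's mean.
(T_\alpha P)(k) \;=\; \sum_{n \ge k} P(n) \binom{n}{k} \alpha^{k} (1-\alpha)^{\,n-k},
\qquad k = 0, 1, 2, \ldots
```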