# On the Entropy of Compound Distributions on Nonnegative Integers

@article{Yu2009OnTE, title={On the Entropy of Compound Distributions on Nonnegative Integers}, author={Yaming Yu}, journal={IEEE Transactions on Information Theory}, year={2009}, volume={55}, pages={3645-3650} }

Some entropy comparison results are presented concerning compound distributions on nonnegative integers. The main result shows that, under a log-concavity assumption, two compound distributions are ordered in terms of Shannon entropy if both the "numbers of claims" and the "claim sizes" are ordered accordingly in the convex order. Several maximum/minimum entropy theorems follow as a consequence. Most importantly, two recent results of Johnson (2008) on maximum entropy…
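The maximum-entropy theme behind these results can be illustrated numerically. The sketch below (an illustration, not taken from the paper) checks the Johnson-style fact that the Poisson distribution has at least the entropy of a Binomial(n, λ/n), which is ultra-log-concave with the same mean λ:

```python
from math import comb, exp, factorial, log

def shannon_entropy(pmf):
    """Shannon entropy (in nats) of a discrete distribution given as a list of probabilities."""
    return -sum(p * log(p) for p in pmf if p > 0)

lam, n = 2.0, 20
# Poisson(lam) pmf, truncated far into the tail (the remaining mass is negligible here)
poisson = [exp(-lam) * lam**k / factorial(k) for k in range(60)]
# Binomial(n, lam/n): ultra-log-concave with the same mean lam
p = lam / n
binomial = [comb(n, k) * p**k * (1 - p) ** (n - k) for k in range(n + 1)]

h_pois = shannon_entropy(poisson)
h_binom = shannon_entropy(binomial)
assert h_binom <= h_pois  # the Poisson entropy dominates, as the maximum entropy property predicts
```

Increasing n (with λ fixed) pushes the binomial entropy up toward the Poisson entropy, consistent with the convex-order intuition.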

## 29 Citations

### Log-concavity, ultra-log-concavity, and a maximum entropy property of discrete compound Poisson measures

- Mathematics, Discret. Appl. Math.
- 2013

### Monotonic Convergence in an Information-Theoretic Law of Small Numbers

- Mathematics, IEEE Transactions on Information Theory
- 2009

Overall, the parallel between the information-theoretic central limit theorem and law of small numbers explored by Kontoyiannis and Harremoës is extended, and monotonic convergence in relative entropy is established for general discrete distributions, while monotonic increase of Shannon entropy is proved for the special class of ultra-log-concave distributions.

### Entropy and thinning of discrete random variables

- Mathematics, Computer Science, ArXiv
- 2015

We describe five types of results concerning information and concentration of discrete random variables, and relationships between them, motivated by their counterparts in the continuous case. The…

### On the entropy of sums of Bernoulli random variables via the Chen-Stein method

- Computer Science, Mathematics, 2012 IEEE Information Theory Workshop
- 2012

Upper bounds are derived on the error incurred by approximating this entropy with the entropy of a Poisson random variable of the same mean; the analysis combines elements of information theory with the Chen-Stein method for Poisson approximation.

### Forward and Reverse Entropy Power Inequalities in Convex Geometry

- Computer Science, ArXiv
- 2016

This work surveys various recent developments on forward and reverse entropy power inequalities, not just for the Shannon-Boltzmann entropy but also more generally for Rényi entropy, and discusses connections between the so-called functional and probabilistic analogues of some classical inequalities in geometric functional analysis.

### Compound Poisson Approximation via Information Functionals

- Mathematics, Computer Science, ArXiv
- 2010

An information-theoretic development is given for the problem of compound Poisson approximation, which parallels earlier treatments for Gaussian and Poisson approximation. Nonasymptotic bounds are…

### Negative dependence and stochastic orderings

- Mathematics
- 2015

We explore negative dependence and stochastic orderings, showing that if an integer-valued random variable $W$ satisfies a certain negative dependence assumption, then $W$ is smaller (in the convex…

### Information in probability: Another information-theoretic proof of a finite de Finetti theorem

- Computer Science, ArXiv
- 2022

An upper bound is derived on the relative entropy between the distribution of a finite subsequence of exchangeable random variables and an appropriate mixture over product distributions, with de Finetti's classical representation theorem obtained as a corollary.

### An Information-Theoretic Perspective of the Poisson Approximation via the Chen-Stein Method

- Computer Science
- 2012

The analysis in this work combines elements of information theory with the Chen-Stein method for the Poisson approximation to derive new lower bounds on the total variation distance and relative entropy between the distribution of the sum of independent Bernoulli random variables and the Poisson distribution.

### An Information-Theoretic Perspective of the Poisson Approximation via the Chen-Stein Method

- Computer Science, Mathematics, ArXiv
- 2012

The analysis in this work combines elements of information theory with the Chen-Stein method for the Poisson approximation to derive new lower bounds on the total variation distance and relative entropy between the distribution of the sum of independent Bernoulli random variables and the Poisson distribution.
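These information-theoretic lower bounds complement Le Cam's classical upper bound, d_TV(W, Poisson(λ)) ≤ Σ pᵢ², for a sum W of independent Bernoulli(pᵢ) variables with λ = Σ pᵢ. The sketch below (illustrative, not from the cited paper) verifies Le Cam's bound numerically for a small example:

```python
from math import exp, factorial

def bernoulli_sum_pmf(ps):
    """Exact pmf of a sum of independent Bernoulli(p_i) variables, by sequential convolution."""
    pmf = [1.0]
    for p in ps:
        new = [0.0] * (len(pmf) + 1)
        for k, q in enumerate(pmf):
            new[k] += (1 - p) * q      # this Bernoulli contributes 0
            new[k + 1] += p * q        # this Bernoulli contributes 1
        pmf = new
    return pmf

ps = [0.1, 0.2, 0.3]
lam = sum(ps)
pb = bernoulli_sum_pmf(ps)
poisson = [exp(-lam) * lam**k / factorial(k) for k in range(40)]

# total variation distance = half the l1 distance between the two pmfs
tv = 0.5 * sum(abs((pb[k] if k < len(pb) else 0.0) - poisson[k]) for k in range(40))
assert tv <= sum(p * p for p in ps)  # Le Cam's bound: d_TV <= sum of p_i^2
```

The bound is loose here; Stein-method refinements (e.g. the Barbour-Hall factor (1 − e^(−λ))/λ) tighten it considerably.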

## References

SHOWING 1-10 OF 37 REFERENCES

### On the Maximum Entropy Properties of the Binomial Distribution

- Mathematics, IEEE Transactions on Information Theory
- 2008

It is shown that the Binomial(n, p) distribution maximizes the entropy in the class of ultra-log-concave distributions of order n with fixed mean np; this is proved by showing that the entropy never decreases along the iterations of a certain Markov chain.
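This maximum-entropy property can be checked numerically in a small case. The sketch below (illustrative, not from the cited paper) compares a non-i.i.d. Bernoulli sum against the Binomial distribution of the same order and mean; both are ultra-log-concave of order 2:

```python
from math import log

def entropy(pmf):
    """Shannon entropy (in nats) of a discrete pmf."""
    return -sum(p * log(p) for p in pmf if p > 0)

def bernoulli_sum_pmf(ps):
    """Exact pmf of a sum of independent Bernoulli(p_i) variables, by sequential convolution."""
    pmf = [1.0]
    for p in ps:
        new = [0.0] * (len(pmf) + 1)
        for k, q in enumerate(pmf):
            new[k] += (1 - p) * q
            new[k + 1] += p * q
        pmf = new
    return pmf

mixed = bernoulli_sum_pmf([0.2, 0.6])  # non-i.i.d. Bernoulli sum, mean 0.8
binom = bernoulli_sum_pmf([0.4, 0.4])  # Binomial(2, 0.4), same mean 0.8
assert entropy(mixed) <= entropy(binom)  # the i.i.d. (binomial) case maximizes entropy
```

Equalizing the success probabilities while keeping the mean fixed can only increase the entropy, which is the convex-order mechanism exploited in the paper.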

### On the entropy and log-concavity of compound Poisson measures

- Mathematics, ArXiv
- 2008

It is shown that the natural analog of the Poisson maximum entropy property remains valid if the measures under consideration are log-concave, but that it fails in general.

### Entropy inequalities for classes of probability distributions I. The univariate case

- Mathematics, Computer Science, Advances in Applied Probability
- 1981

For a given parametric family of densities, the member of maximal (or sometimes minimal) entropy is ascertained, and a natural (partial) ordering over the family for which the entropy functional is monotone is determined.

### Generalized Entropy Power Inequalities and Monotonicity Properties of Information

- Mathematics, Computer Science, IEEE Transactions on Information Theory
- 2007

A simple proof of the monotonicity of information in central limit theorems is obtained, both in the setting of independent and identically distributed (i.i.d.) summands as well as in the more general setting of independent summands with variance-standardized sums.

### Monotonic Convergence in an Information-Theoretic Law of Small Numbers

- Mathematics, IEEE Transactions on Information Theory
- 2009

Overall, the parallel between the information-theoretic central limit theorem and law of small numbers explored by Kontoyiannis and Harremoës is extended, and monotonic convergence in relative entropy is established for general discrete distributions, while monotonic increase of Shannon entropy is proved for the special class of ultra-log-concave distributions.

### Solution of Shannon's problem on the monotonicity of entropy

- Mathematics, Computer Science
- 2004

It is shown that if X1, X2, … are independent and identically distributed square-integrable random variables, then the entropy of the normalized sum, Ent((X1 + · · · + Xn)/√n), is an increasing…

### Entropy, compound Poisson approximation, log-Sobolev inequalities and measure concentration

- Mathematics, Computer Science, Information Theory Workshop
- 2004

A new logarithmic Sobolev inequality for the compound Poisson measure is derived and used to prove measure-concentration bounds for a large class of discrete distributions.

### Compound Poisson Approximation for Nonnegative Random Variables Via Stein's Method

- Mathematics
- 1992

The aim of this paper is to extend Stein's method to a compound Poisson distribution setting. The compound Poisson distributions of concern here are those of the form POIS$(\nu)$, where $\nu$ is a…

### Fisher Information, Compound Poisson Approximation, and the Poisson Channel

- Mathematics, Computer Science, 2007 IEEE International Symposium on Information Theory
- 2007

The first results show that the scaled Fisher information also admits a minimum mean squared error characterization with respect to the Poisson channel, and that it satisfies a monotonicity property that parallels the monotonicity recently established for the central limit theorem in terms of Fisher information.