On the Entropy of Compound Distributions on Nonnegative Integers

@article{Yu2009OnTE,
  title={On the Entropy of Compound Distributions on Nonnegative Integers},
  author={Yaming Yu},
  journal={IEEE Transactions on Information Theory},
  year={2009},
  volume={55},
  pages={3645-3650}
}
  • Yaming Yu
  • Published 1 August 2009
  • Mathematics
  • IEEE Transactions on Information Theory
Some entropy comparison results are presented concerning compound distributions on nonnegative integers. The main result shows that, under a log-concavity assumption, two compound distributions are ordered in terms of Shannon entropy if both the "numbers of claims" and the "claim sizes" are ordered accordingly in the convex order. Several maximum/minimum entropy theorems follow as a consequence. Most importantly, two recent results of Johnson (2008) on maximum entropy…
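
As a rough numerical illustration of the objects the abstract refers to (not the paper's proof technique), the sketch below computes the probability mass function and Shannon entropy of a compound distribution $\sum_{i=1}^{N} X_i$ on the nonnegative integers by conditioning on the number of claims $N$ and repeatedly convolving the claim-size distribution. The helper names (`compound_pmf`, `shannon_entropy`), the truncation length, and the specific pair of claim-count distributions compared are illustrative assumptions; the precise hypotheses under which the resulting entropies are ordered are those stated in the paper.

```python
import numpy as np
from scipy.stats import poisson, binom

def compound_pmf(n_pmf, x_pmf, support):
    """PMF of S = X_1 + ... + X_N on {0, ..., support-1}, where N has pmf
    n_pmf on {0, 1, ...} and the X_i are i.i.d. with pmf x_pmf, independent of N."""
    pmf = np.zeros(support)
    conv = np.zeros(support)
    conv[0] = 1.0                                   # distribution of the empty sum (N = 0)
    for p_n in n_pmf:
        pmf += p_n * conv                           # contribution of this value of N
        conv = np.convolve(conv, x_pmf)[:support]   # add one more claim
    return pmf

def shannon_entropy(pmf):
    p = pmf[pmf > 0]
    return -np.sum(p * np.log(p))

# Illustrative pair: Binomial(10, 0.2) vs. Poisson(2) claim counts (same mean 2),
# with a common log-concave claim-size pmf on {0, 1, 2}.
claim_sizes = np.array([0.6, 0.3, 0.1])
n_binom = binom.pmf(np.arange(11), 10, 0.2)
n_pois = poisson.pmf(np.arange(60), 2.0)
print(shannon_entropy(compound_pmf(n_binom, claim_sizes, 80)),
      shannon_entropy(compound_pmf(n_pois, claim_sizes, 80)))
```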

Monotonic Convergence in an Information-Theoretic Law of Small Numbers

  • Yaming Yu
  • Mathematics
    IEEE Transactions on Information Theory
  • 2009
The parallel between the information-theoretic central limit theorem and the law of small numbers explored by Kontoyiannis and Harremoës is extended: monotonic convergence in relative entropy is established for general discrete distributions, while monotonic increase of Shannon entropy is proved for the special class of ultra-log-concave distributions.

Entropy and thinning of discrete random variables

We describe five types of results concerning information and concentration of discrete random variables, and relationships between them, motivated by their counterparts in the continuous case.

Quantitative limit theorems via relative log-concavity

In this paper we develop tools for studying limit theorems by means of convexity. We establish bounds for the discrepancy in total variation between probability measures µ and ν such that ν is

On the entropy of sums of Bernoulli random variables via the Chen-Stein method

  • I. Sason
  • Computer Science, Mathematics
    2012 IEEE Information Theory Workshop
  • 2012
Upper bounds are derived on the error that follows from approximating this entropy by the entropy of a Poisson random variable with the same mean; the analysis combines elements of information theory with the Chen-Stein method for Poisson approximation.

Forward and Reverse Entropy Power Inequalities in Convex Geometry

This work surveys recent developments on forward and reverse entropy power inequalities, not just for the Shannon-Boltzmann entropy but more generally for Rényi entropy, and discusses connections between the so-called functional and probabilistic analogues of some classical inequalities in geometric functional analysis.

Compound Poisson Approximation via Information Functionals

An information-theoretic development is given for the problem of compound Poisson approximation, which parallels earlier treatments for Gaussian and Poisson approximation. Nonasymptotic bounds are

Negative dependence and stochastic orderings

We explore negative dependence and stochastic orderings, showing that if an integer-valued random variable $W$ satisfies a certain negative dependence assumption, then $W$ is smaller (in the convex

Information in probability: Another information-theoretic proof of a finite de Finetti theorem

An upper bound on the relative entropy is derived between the joint distribution of finitely many variables in an exchangeable sequence and an appropriate mixture over product distributions, yielding de Finetti's classical representation theorem as a corollary.

An Information-Theoretic Perspective of the Poisson Approximation via the Chen-Stein Method

The analysis in this work combines elements of information theory with the Chen-Stein method for the Poisson approximation to derive new lower bounds on the total variation distance and relative entropy between the distribution of the sum of independent Bernoulli random variables and the Poisson distribution.

References

Showing 1-10 of 37 references

On the Maximum Entropy Properties of the Binomial Distribution

  • Yaming Yu
  • Mathematics
    IEEE Transactions on Information Theory
  • 2008
It is shown that the Binomial(n, p) distribution maximizes the entropy in the class of ultra-log-concave distributions of order n with fixed mean np; the proof constructs a Markov chain and shows that the entropy never decreases along its iterations.
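
For a concrete, informal check of this maximum entropy property, the sketch below compares the entropy of Binomial(n, p) with that of a Poisson-binomial distribution (a sum of independent Bernoulli variables with unequal success probabilities but the same mean), which is a standard example of an ultra-log-concave distribution of order n. The helper names and the particular success probabilities are arbitrary illustrative assumptions.

```python
import numpy as np
from scipy.stats import binom

def poisson_binomial_pmf(probs):
    """PMF of a sum of independent Bernoulli(p_i) variables, by sequential convolution."""
    pmf = np.array([1.0])
    for p in probs:
        pmf = np.convolve(pmf, [1.0 - p, p])
    return pmf

def shannon_entropy(pmf):
    p = pmf[pmf > 0]
    return -np.sum(p * np.log(p))

# Ten Bernoulli probabilities chosen (arbitrarily) to sum to 3, so both
# distributions live on {0, ..., 10} and share the mean np = 3.
probs = np.array([0.1, 0.2, 0.25, 0.3, 0.35, 0.4, 0.45, 0.5, 0.2, 0.25])
h_binomial = shannon_entropy(binom.pmf(np.arange(11), 10, probs.mean()))
h_poisson_binomial = shannon_entropy(poisson_binomial_pmf(probs))
print(h_binomial, h_poisson_binomial)   # expect h_binomial >= h_poisson_binomial
```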

On the entropy and log-concavity of compound Poisson measures

It is shown that the natural analog of the Poisson maximum entropy property remains valid if the measures under consideration are log-concave, but that it fails in general.

Entropy inequalities for classes of probability distributions I. The univariate case

For a given parametric family of densities, the member of maximal (or sometimes minimal) entropy is ascertained, and a natural (partial) ordering over the family, for which the entropy functional is monotone, is determined.

Generalized Entropy Power Inequalities and Monotonicity Properties of Information

A simple proof of the monotonicity of information in central limit theorems is obtained, both in the setting of independent and identically distributed (i.i.d.) summands and in the more general setting of independent summands with variance-standardized sums.

Monotonic Convergence in an Information-Theoretic Law of Small Numbers

  • Yaming Yu
  • Mathematics
    IEEE Transactions on Information Theory
  • 2009
The parallel between the information-theoretic central limit theorem and the law of small numbers explored by Kontoyiannis and Harremoës is extended: monotonic convergence in relative entropy is established for general discrete distributions, while monotonic increase of Shannon entropy is proved for the special class of ultra-log-concave distributions.

Solution of Shannon's problem on the monotonicity of entropy

It is shown that if $X_1, X_2, \ldots$ are independent and identically distributed square-integrable random variables, then the entropy of the normalized sum, $\mathrm{Ent}\big((X_1 + \cdots + X_n)/\sqrt{n}\big)$, is an increasing function of $n$.

Entropy, compound Poisson approximation, log-Sobolev inequalities and measure concentration

A new logarithmic Sobolev inequality for the compound Poisson measure is derived and used to prove measure-concentration bounds for a large class of discrete distributions.

Compound Poisson Approximation for Nonnegative Random Variables Via Stein's Method

The aim of this paper is to extend Stein's method to a compound Poisson distribution setting. The compound Poisson distributions of concern here are those of the form POIS$(\nu)$, where $\nu$ is a

Fisher Information, Compound Poisson Approximation, and the Poisson Channel

The first results show that the scaled Fisher information also admits a minimum mean squared error characterization with respect to the Poisson channel, and that it satisfies a monotonicity property that parallels the monotonicity recently established for the central limit theorem in terms of Fisher information.