Log-concavity and the maximum entropy property of the Poisson distribution


We prove that the Poisson distribution maximises entropy in the class of ultra-log-concave distributions, extending a result of Harremoës. The proof uses ideas concerning log-concavity, and a semigroup action involving adding Poisson variables and thinning. We go on to show that the entropy is a concave function along this semigroup.

1 Maximum entropy distributions

It is well known that the distributions which maximise entropy under certain very natural conditions take a simple form. For example, among random variables with fixed mean and variance, the entropy is maximised by the normal distribution. Similarly, for random variables with positive support and fixed mean, the entropy is maximised by the exponential distribution. The standard technique for proving such results uses the Gibbs inequality, and exploits the fact that, given a function f(x), and fixing Λ(p) = ∫ p(x)f(x) dx, the maximum entropy density is of the form α exp(−βf(x)).

Example 1.1 For a density p with mean μ and variance σ², write φ_{μ,σ²} for the density of a N(μ, σ²) random variable, and define the function Λ(p) = − ∫ p(x) log φ_{μ,σ²}(x) dx.

∗ Statistical Laboratory, Centre for Mathematical Sciences, University of Cambridge, Wilberforce Rd, Cambridge, CB3 0WB, UK. Email: otj1000@cam.ac.uk Fax: +44 1223 337956 Phone: +44 1223 337946
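The claim that the normal distribution maximises entropy for fixed mean and variance can be checked numerically. The sketch below is not from the paper; it uses the standard closed-form differential entropies of three densities normalised to mean 0 and variance 1, and confirms the normal dominates:

```python
import math

# Differential entropies (in nats), via standard closed forms, of three
# densities each normalised to mean 0 and variance 1:
#   N(0, 1):                         (1/2) log(2*pi*e)
#   Laplace(b), variance 2b^2 = 1:   1 + log(2b),  with b = 1/sqrt(2)
#   Uniform(-w/2, w/2), w^2/12 = 1:  log(w),       with w = sqrt(12)
h_normal = 0.5 * math.log(2 * math.pi * math.e)
h_laplace = 1 + math.log(2 / math.sqrt(2))
h_uniform = math.log(math.sqrt(12))

# The normal density should have the largest entropy of the three.
print(f"normal  {h_normal:.4f}")   # ≈ 1.4189
print(f"laplace {h_laplace:.4f}")  # ≈ 1.3466
print(f"uniform {h_uniform:.4f}")  # ≈ 1.2425
```

Of course, three examples do not prove the result; the Gibbs-inequality argument sketched above is what gives the claim in full generality.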

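The semigroup mentioned in the abstract combines adding independent Poisson variables with Bernoulli thinning. The simulation below (an illustration with assumed parameters λ and α, not code from the paper) checks the standard fact underlying that construction: α-thinning a Poisson(λ) variable yields a Poisson(αλ) variable, so the thinned counts should have mean and variance both close to αλ.

```python
import math
import random

def sample_poisson(lam, rng):
    # Knuth's multiplication method, adequate for small lambda.
    threshold = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= threshold:
            return k
        k += 1

def thin(n, alpha, rng):
    # Keep each of the n points independently with probability alpha.
    return sum(rng.random() < alpha for _ in range(n))

rng = random.Random(0)
lam, alpha, trials = 5.0, 0.3, 20000
samples = [thin(sample_poisson(lam, rng), alpha, rng) for _ in range(trials)]

mean = sum(samples) / trials
var = sum((s - mean) ** 2 for s in samples) / trials
# For a Poisson(alpha * lam) variable, mean and variance both equal 1.5.
print(f"mean {mean:.3f}, variance {var:.3f}")
```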
Cite this paper

@inproceedings{Johnson2008LogconcavityAT,
  title={Log-concavity and the maximum entropy property of the Poisson distribution},
  author={Oliver Johnson},
  year={2008}
}