# Log-concavity and the maximum entropy property of the Poisson distribution

```bibtex
@article{Johnson2007LogconcavityAT,
  title   = {Log-concavity and the maximum entropy property of the Poisson distribution},
  author  = {Oliver Johnson},
  journal = {Stochastic Processes and their Applications},
  year    = {2007},
  volume  = {117},
  pages   = {791--802}
}
```

## 79 Citations

### On the Maximum Entropy Properties of the Binomial Distribution

- Mathematics, IEEE Transactions on Information Theory
- 2008

It is shown that the Binomial(n,p) distribution maximizes the entropy in the class of ultra-log-concave distributions of order n with fixed mean np, and this fact is used to show that the entropy never decreases along the iterations of an associated Markov chain.
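
As a numerical illustration (our own sketch, not from the paper), the Python snippet below compares the Shannon entropy of Binomial(10, 1/2) with that of a hypergeometric distribution that is also ultra-log-concave of order 10 and has the same mean 5. Consistent with the maximum entropy property, the binomial's entropy comes out larger.

```python
from math import comb, log

def entropy(p):
    # Shannon entropy in nats of a discrete distribution given as a pmf list
    return -sum(x * log(x) for x in p if x > 0)

def is_ulc(p, n):
    # p is ultra-log-concave of order n iff p_k / C(n,k) is log-concave
    q = [p[k] / comb(n, k) for k in range(len(p))]
    return all(q[k] ** 2 >= q[k - 1] * q[k + 1] - 1e-12
               for k in range(1, len(q) - 1))

n = 10
binom = [comb(n, k) * 0.5 ** n for k in range(n + 1)]          # mean np = 5
N, K = 20, 10
hyper = [comb(K, k) * comb(N - K, n - k) / comb(N, n)
         for k in range(n + 1)]                                 # mean nK/N = 5

assert is_ulc(binom, n) and is_ulc(hyper, n)
print(entropy(binom), entropy(hyper))  # binomial entropy is the larger one
```

Both distributions are ultra-log-concave of order 10 with mean 5, so the comparison sits inside the class the theorem describes.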

### Log-concavity, ultra-log-concavity, and a maximum entropy property of discrete compound Poisson measures

- Mathematics, Discret. Appl. Math.
- 2013

### Ultra-log-concavity and discrete degrees of freedom

- Mathematics, ArXiv
- 2022

We develop the notion of discrete degrees of freedom of a log-concave sequence and use it to prove that the quantity P(X = EX) is maximized, under fixed integral mean, for a Poisson…

### Entropy and thinning of discrete random variables

- Mathematics, Computer Science, ArXiv
- 2015

We describe five types of results concerning information and concentration of discrete random variables, and relationships between them, motivated by their counterparts in the continuous case. The…

### On the Rényi Entropy of Log-Concave Sequences

- Computer Science, 2020 IEEE International Symposium on Information Theory (ISIT)
- 2020

A discrete analog of the Rényi entropy comparison due to Bobkov and Madiman is established for log-concave variables on the integers, with the additional assumption that the variable is monotone, and a sharp bound of log e is obtained.

### A criterion for the compound Poisson distribution to be maximum entropy

- Mathematics, 2009 IEEE International Symposium on Information Theory
- 2009

It is shown that the compound Poisson does indeed have a natural maximum entropy characterization when the distributions under consideration are log-concave, which complements the recent development by the same authors of an information-theoretic foundation for compound Poisson approximation inequalities and limit theorems.

### On the entropy and log-concavity of compound Poisson measures

- Mathematics, ArXiv
- 2008

It is shown that the natural analog of the Poisson maximum entropy property remains valid if the measures under consideration are log-concave, but that it fails in general.

### Displacement convexity of entropy and related inequalities on graphs

- Mathematics
- 2012

We introduce the notion of an interpolating path on the set of probability measures on finite graphs. Using this notion, we first prove a displacement convexity property of entropy along such a path…

### Quantitative limit theorems via relative log-concavity

- Mathematics
- 2022

In this paper we develop tools for studying limit theorems by means of convexity. We establish bounds for the discrepancy in total variation between probability measures µ and ν such that ν is…

### Concentration inequalities for ultra log-concave distributions

- Mathematics, Studia Mathematica
- 2022

We establish concentration inequalities in the class of ultra log-concave distributions. In particular, we show that ultra log-concave distributions satisfy Poisson concentration bounds. As an…
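
As an illustrative check (ours, not from the paper), the Python sketch below verifies that the upper tail of Binomial(20, 1/4), an ultra log-concave distribution with mean λ = 5, sits below the standard optimized Chernoff bound for Poisson(λ), namely exp(−λ)(eλ/x)^x for x ≥ λ. This is rigorous for the binomial because its moment generating function is dominated by the Poisson one.

```python
from math import comb, exp

# Binomial(n, p) is ultra log-concave with mean lam = n * p
n, p = 20, 0.25
lam = n * p
pmf = [comb(n, k) * p ** k * (1 - p) ** (n - k) for k in range(n + 1)]

def upper_tail(x):
    # exact P(X >= x) for the binomial above
    return sum(pmf[k] for k in range(x, n + 1))

def poisson_chernoff(x, lam):
    # optimized Chernoff bound for Poisson(lam), valid for x >= lam
    return exp(-lam) * (exp(1) * lam / x) ** x

# the Poisson-type bound dominates the binomial tail at every point x > lam
for x in range(int(lam) + 1, n + 1):
    assert upper_tail(x) <= poisson_chernoff(x, lam)
```

The choice Binomial(20, 1/4) is just one convenient member of the ultra log-concave class; any other member with the same mean admits the same Poisson-type bound.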

## References

Showing 1–10 of 30 references

### Preservation of log-concavity on summation

- Computer Science, Mathematics
- 2006

The main theorem is used to give simple proofs of the log-concavity of the Stirling numbers of the second kind and of the Eulerian numbers and it is argued that these conditions are natural by giving some applications.

### Ultra Logconcave Sequences and Negative Dependence

- Mathematics, J. Comb. Theory, Ser. A
- 1997

It is proved that the convolution of two ultra-logconcave sequences is ultra-logconcave, which implies that a natural negative dependence property is preserved under the operation of “joining” families of exchangeable Bernoulli random variables.
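
A small numeric sanity check (ours, not from the paper): convolving two sequences that are each ultra-logconcave of order 2 yields a sequence that is ultra-logconcave of order 4, matching the closure under convolution that the theorem asserts.

```python
from math import comb

def convolve(a, b):
    # convolution of two finite sequences
    c = [0] * (len(a) + len(b) - 1)
    for i, x in enumerate(a):
        for j, y in enumerate(b):
            c[i + j] += x * y
    return c

def is_ulc(a, n):
    # a is ultra-logconcave of order n iff a_k / C(n,k) is log-concave
    q = [a[k] / comb(n, k) for k in range(len(a))]
    return all(q[k] ** 2 >= q[k - 1] * q[k + 1] - 1e-12
               for k in range(1, len(q) - 1))

a, b = [1, 2, 1], [1, 3, 2]   # each ultra-logconcave of order 2
assert is_ulc(a, 2) and is_ulc(b, 2)
c = convolve(a, b)
assert is_ulc(c, 4)           # the convolution is ultra-logconcave of order 4
print(c)                      # [1, 5, 9, 7, 2]
```

The orders add under convolution: ULC of order n convolved with ULC of order m gives ULC of order n + m.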

### Entropy of the Sum of Independent Bernoulli Random Variables and of the Multinomial Distribution

- Mathematics
- 1981

### Binomial and Poisson distributions as maximum entropy distributions

- Mathematics, IEEE Trans. Inf. Theory
- 2001

The binomial and the Poisson distributions are shown to be maximum entropy distributions of suitably defined sets. Poisson's law is considered as a case of entropy maximization, and also convergence…

### A new entropy power inequality

- Mathematics, Computer Science, IEEE Trans. Inf. Theory
- 1985

A strengthened version of Shannon's entropy power inequality for the case where one of the random vectors involved is Gaussian is proved. In particular it is shown that if independent Gaussian noise…

### Binomial-Poisson entropic inequalities and the M/M/∞ queue

- Mathematics
- 2006

This article provides entropic inequalities for binomial-Poisson distributions, derived from the two point space. They appear as local inequalities of the M/M/∞ queue. They describe in particular the…

### Maximum entropy versus minimum risk and applications to some classical discrete distributions

- Mathematics, IEEE Trans. Inf. Theory
- 2002

The game which can be taken to lie behind the maximum-entropy principle is studied and new theoretical results are obtained.

### The convolution inequality for entropy powers

- Computer Science, IEEE Trans. Inf. Theory
- 1965

An improved version of Stam's proof of Shannon's convolution inequality for entropy power is presented, which is obtained by mathematical induction from the one-dimensional case.

### Pólya sequences, binomial convolution and the union of random sets

- Mathematics, Journal of Applied Probability
- 1976

A basic result in the theory of total positivity is that the convolution of any two Pólya frequency sequences is again a Pólya frequency sequence. The like result for binomial convolution, associated…