Log-concavity and the maximum entropy property of the Poisson distribution

@article{Johnson2007LogconcavityAT,
  title={Log-concavity and the maximum entropy property of the Poisson distribution},
  author={Oliver Johnson},
  journal={Stochastic Processes and their Applications},
  year={2007},
  volume={117},
  pages={791--802}
}
  • O. Johnson
  • Published 28 March 2006
  • Mathematics
  • Stochastic Processes and their Applications

On the Maximum Entropy Properties of the Binomial Distribution

  • Yaming Yu
  • Mathematics
    IEEE Transactions on Information Theory
  • 2008
It is shown that the Binomial(n,p) distribution maximizes the entropy in the class of ultra-log-concave distributions of order n with fixed mean np; the proof relies on a Markov chain along whose iterations the entropy never decreases.
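The maximization claim can be checked numerically. Below is a minimal sketch (my own illustration, not code from the paper; helper names are assumptions) comparing the entropy of Binomial(n, p) with that of a Poisson-binomial law (a sum of independent Bernoullis with unequal success probabilities, also ultra-log-concave of order n) having the same mean:

```python
from math import log

def poisson_binomial_pmf(ps):
    """PMF of a sum of independent Bernoulli(p_i) variables,
    computed by dynamic programming over the Bernoulli factors."""
    pmf = [1.0]
    for p in ps:
        new = [0.0] * (len(pmf) + 1)
        for k, q in enumerate(pmf):
            new[k] += q * (1 - p)
            new[k + 1] += q * p
        pmf = new
    return pmf

def entropy(pmf):
    return -sum(q * log(q) for q in pmf if q > 0)

n, mean = 4, 2.0
# Binomial(n, mean/n): equal success probabilities
H_binom = entropy(poisson_binomial_pmf([mean / n] * n))
# Another ultra-log-concave law of order n with the same mean:
# unequal Bernoulli probabilities summing to 2.0
H_other = entropy(poisson_binomial_pmf([0.9, 0.7, 0.3, 0.1]))
print(H_binom > H_other)  # True: the binomial has the larger entropy
```

Any choice of unequal probabilities with the same total should give a strictly smaller entropy, consistent with the theorem.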

Ultra-log-concavity and discrete degrees of freedom

We develop the notion of discrete degrees of freedom of a log-concave sequence and use it to prove that the quantity P(X = EX) is maximized, under a fixed integral mean, for a Poisson distribution.
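For reference, a sequence (a_k) is ultra-log-concave of infinite order when k * a_k^2 >= (k+1) * a_{k+1} * a_{k-1} for all k, and the Poisson pmf attains equality in this inequality. A quick numerical check (my own sketch, not code from the paper):

```python
from math import exp, factorial

def poisson_pmf(lam, k):
    return exp(-lam) * lam**k / factorial(k)

lam = 3  # an integer mean, so that P(X = E X) is attained at a support point
pmf = [poisson_pmf(lam, k) for k in range(40)]

# Ultra-log-concavity (order infinity): k*p_k^2 >= (k+1)*p_{k+1}*p_{k-1};
# the Poisson pmf attains equality for every k.
for k in range(1, 38):
    lhs = k * pmf[k] ** 2
    rhs = (k + 1) * pmf[k + 1] * pmf[k - 1]
    assert abs(lhs - rhs) < 1e-15

# The probability at the mean, P(X = E X) = e^(-lam) * lam^lam / lam!
print(pmf[lam])
```

The equality case explains why the Poisson is extremal: any other ultra-log-concave law with the same integer mean satisfies the inequality strictly somewhere.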

Entropy and thinning of discrete random variables

We describe five types of results concerning information and concentration of discrete random variables, and relationships between them, motivated by their counterparts in the continuous case.
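Thinning, one of the operations studied in this line of work, replaces a count X by a Binomial(X, alpha) number of retained units; it maps Poisson(lam) to Poisson(alpha * lam). A small Monte Carlo sketch of this fact (assumed setup, not code from the survey):

```python
import random
from math import exp

random.seed(0)

def thin(x, alpha):
    """Bernoulli thinning: keep each of x units independently w.p. alpha."""
    return sum(random.random() < alpha for _ in range(x))

def sample_poisson(lam):
    # Knuth's multiplication method (fine for small lam)
    L, k, p = exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

lam, alpha, trials = 4.0, 0.3, 20000
samples = [thin(sample_poisson(lam), alpha) for _ in range(trials)]
mean = sum(samples) / trials
print(mean)  # close to alpha * lam = 1.2
```

The empirical distribution of the thinned samples should also match Poisson(1.2), not just in the mean, since thinning a Poisson law is again Poisson.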

On the Rényi Entropy of Log-Concave Sequences

  • J. Melbourne, T. Tkocz
  • Computer Science
    2020 IEEE International Symposium on Information Theory (ISIT)
  • 2020
A discrete analog of the Rényi entropy comparison due to Bobkov and Madiman is established for log-concave variables on the integers, with the additional assumption that the variable is monotone, and a sharp bound of log e is obtained.

A criterion for the compound Poisson distribution to be maximum entropy

It is shown that the compound Poisson does indeed have a natural maximum entropy characterization when the distributions under consideration are log-concave, which complements the recent development by the same authors of an information-theoretic foundation for compound Poisson approximation inequalities and limit theorems.

On the entropy and log-concavity of compound Poisson measures

It is shown that the natural analog of the Poisson maximum entropy property remains valid if the measures under consideration are log-concave, but that it fails in general.

Displacement convexity of entropy and related inequalities on graphs

We introduce the notion of an interpolating path on the set of probability measures on finite graphs. Using this notion, we first prove a displacement convexity property of entropy along such a path …

Quantitative limit theorems via relative log-concavity

In this paper we develop tools for studying limit theorems by means of convexity. We establish bounds for the discrepancy in total variation between probability measures µ and ν such that ν is …

Concentration inequalities for ultra log-concave distributions

We establish concentration inequalities in the class of ultra log-concave distributions. In particular, we show that ultra log-concave distributions satisfy Poisson concentration bounds.
...

References


Preservation of log-concavity on summation

The main theorem is used to give simple proofs of the log-concavity of the Stirling numbers of the second kind and of the Eulerian numbers and it is argued that these conditions are natural by giving some applications.
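The log-concavity of the Stirling numbers of the second kind is easy to verify numerically. A short sketch (my own illustration, not from the paper) using the standard recurrence S(n,k) = k*S(n-1,k) + S(n-1,k-1):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def stirling2(n, k):
    """Stirling numbers of the second kind via the standard recurrence."""
    if n == k:
        return 1
    if k == 0 or k > n:
        return 0
    return k * stirling2(n - 1, k) + stirling2(n - 1, k - 1)

n = 10
row = [stirling2(n, k) for k in range(n + 1)]
# Log-concavity in k: S(n,k)^2 >= S(n,k-1) * S(n,k+1)
ok = all(row[k] ** 2 >= row[k - 1] * row[k + 1] for k in range(1, n))
print(ok)  # True
```

The same check applied to the Eulerian numbers (with their own recurrence) would pass as well, per the paper's result.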

Ultra Logconcave Sequences and Negative Dependence

It is proved that the convolution of two ultra-logconcave sequences is ultra-logconcave; this implies that a natural negative dependence property is preserved under the operation of “joining” families of exchangeable Bernoulli random variables.
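This closure property can be checked on a small example: binomial pmfs are ultra-logconcave of orders n and m, and their convolution should be ultra-logconcave of order n+m. A sketch (helper names are my own):

```python
from math import comb

def binom_pmf(n, p):
    return [comb(n, k) * p**k * (1 - p) ** (n - k) for k in range(n + 1)]

def convolve(a, b):
    out = [0.0] * (len(a) + len(b) - 1)
    for i, x in enumerate(a):
        for j, y in enumerate(b):
            out[i + j] += x * y
    return out

def is_ulc(a, n):
    """Ultra-log-concave of order n: a_k / C(n,k) is log-concave in k."""
    r = [a[k] / comb(n, k) for k in range(n + 1)]
    return all(r[k] ** 2 >= r[k - 1] * r[k + 1] for k in range(1, n))

a, b = binom_pmf(2, 0.3), binom_pmf(3, 0.6)   # ULC of orders 2 and 3
c = convolve(a, b)                             # should be ULC of order 5
print(is_ulc(c, 5))  # True
```

With equal success probabilities the convolution would be exactly Binomial(n+m, p); the unequal-p case above is the nontrivial instance of the theorem.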

Binomial and Poisson distributions as maximum entropy distributions

The binomial and the Poisson distributions are shown to be maximum entropy distributions of suitably defined sets. Poisson's law is considered as a case of entropy maximization, and also convergence …

A new entropy power inequality

  • M. H. M. Costa
  • Mathematics, Computer Science
    IEEE Trans. Inf. Theory
  • 1985
A strengthened version of Shannon's entropy power inequality is proved for the case where one of the random vectors involved is Gaussian. In particular, it is shown that if independent Gaussian noise …

Binomial-Poisson entropic inequalities and the M/M/∞ queue

This article provides entropic inequalities for binomial-Poisson distributions, derived from the two-point space. They appear as local inequalities of the M/M/∞ queue. They describe in particular the …

Chromatic polynomials and logarithmic concavity

Maximum entropy versus minimum risk and applications to some classical discrete distributions

The game which can be taken to lie behind the maximum-entropy principle is studied and new theoretical results are obtained.

The convolution inequality for entropy powers

An improved version of Stam's proof of Shannon's convolution inequality for entropy power is presented, which is obtained by mathematical induction from the one-dimensional case.
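The inequality in question, N(X+Y) >= N(X) + N(Y) for independent X and Y, with entropy power N(X) = exp(2h(X)) / (2*pi*e), can be illustrated with closed-form differential entropies: uniforms give a strict inequality, Gaussians give equality. A sketch (my own, using standard closed-form values; the triangular density of a sum of two Uniform(0,1) variables has h = 1/2 nat):

```python
from math import e, exp, log, pi

def entropy_power(h):
    """Entropy power N(X) = exp(2 h(X)) / (2 pi e), h in nats."""
    return exp(2 * h) / (2 * pi * e)

# Differential entropies in closed form:
h_uniform = log(1.0)   # Uniform(0,1): h = log(width) = 0
h_triangle = 0.5       # sum of two independent Uniform(0,1): h = 1/2 nat

lhs = entropy_power(h_triangle)        # N(X + Y)
rhs = 2 * entropy_power(h_uniform)     # N(X) + N(Y)
print(lhs >= rhs)  # True, and strictly: the uniform is not Gaussian

# Equality case: independent Gaussians, h = 0.5 * log(2 pi e var),
# so the entropy power of a Gaussian is just its variance.
var1, var2 = 2.0, 3.0
N1 = entropy_power(0.5 * log(2 * pi * e * var1))           # = var1
N2 = entropy_power(0.5 * log(2 * pi * e * var2))           # = var2
N_sum = entropy_power(0.5 * log(2 * pi * e * (var1 + var2)))
print(abs(N_sum - (N1 + N2)) < 1e-12)  # True: equality for Gaussians
```

The induction in Stam's proof reduces the n-dimensional statement to exactly this one-dimensional comparison.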

Pólya sequences, binomial convolution and the union of random sets

  • D. Walkup
  • Mathematics
    Journal of Applied Probability
  • 1976
A basic result in the theory of total positivity is that the convolution of any two Pólya frequency sequences is again a Pólya frequency sequence. The like result for binomial convolution, associated …