• Corpus ID: 9224342

# On the entropy and log-concavity of compound Poisson measures

@article{Johnson2008OnTE,
title={On the entropy and log-concavity of compound Poisson measures},
author={Oliver Johnson and Ioannis Kontoyiannis and Mokshay M. Madiman},
journal={ArXiv},
year={2008},
volume={abs/0805.4112}
}
• Published 27 May 2008
• Mathematics
• ArXiv
Motivated, in part, by the desire to develop an information-theoretic foundation for compound Poisson approximation limit theorems (analogous to the corresponding developments for the central limit theorem and for simple Poisson approximation), this work examines sufficient conditions under which the compound Poisson distribution has maximal entropy within a natural class of probability measures on the nonnegative integers. We show that the natural analog of the Poisson maximum entropy property…
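To make the object of study concrete: the compound Poisson law is the distribution of $S = X_1 + \cdots + X_N$ with $N \sim \mathrm{Poisson}(\lambda)$ and i.i.d. integer-valued claims $X_i$. A minimal Python sketch (not from the paper; function names and the truncation level `nmax` are illustrative) computes its pmf via the standard Panjer recursion and its Shannon entropy:

```python
import math

def compound_poisson_pmf(lam, claim_pmf, nmax):
    """Panjer recursion for the pmf of S = X_1 + ... + X_N, where
    N ~ Poisson(lam) and the X_i are i.i.d. on {1, 2, ...} with
    claim_pmf[j - 1] = P(X_i = j)."""
    p = [math.exp(-lam)]  # P(S = 0): the event of no claims
    for n in range(1, nmax + 1):
        acc = sum(j * claim_pmf[j - 1] * p[n - j]
                  for j in range(1, min(n, len(claim_pmf)) + 1))
        p.append(lam * acc / n)
    return p

def entropy(pmf):
    """Shannon entropy in nats, skipping zero-probability atoms."""
    return -sum(q * math.log(q) for q in pmf if q > 0)
```

When the claim size is degenerate at 1, the recursion reduces to the simple Poisson pmf, recovering the setting of the classical Poisson maximum entropy property that the paper generalizes.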
• Mathematics
2009 IEEE International Symposium on Information Theory
• 2009
It is shown that the compound Poisson does indeed have a natural maximum entropy characterization when the distributions under consideration are log-concave, which complements the recent development by the same authors of an information-theoretic foundation for compound Poisson approximation inequalities and limit theorems.
• Mathematics, Computer Science
ArXiv
• 2010
An information-theoretic development is given for the problem of compound Poisson approximation, which parallels earlier treatments for Gaussian and Poisson approximation. Nonasymptotic bounds are …
• Mathematics
• 2008
An information-theoretic foundation for compound Poisson approximation limit theorems is presented, in analogy to the corresponding developments for the central limit theorem and for simple Poisson …
• Yaming Yu
• Mathematics
IEEE Transactions on Information Theory
• 2009
Overall the parallel between the information-theoretic central limit theorem and law of small numbers explored by Kontoyiannis and Harremoës is extended and monotonic convergence in relative entropy is established for general discrete distributions, while monotonic increase of Shannon entropy is proved for the special class of ultra-log-concave distributions.
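The ultra-log-concave (ULC) class mentioned above admits a direct finite check: a pmf $p$ on the nonnegative integers is ULC exactly when $k\,p_k^2 \ge (k+1)\,p_{k+1}\,p_{k-1}$ for all interior $k$, i.e. when the ratio of $p$ to a Poisson pmf is log-concave. A short sketch (helper names are mine, not from the cited work):

```python
import math

def binomial_pmf(n, q):
    """pmf of Binomial(n, q) on {0, ..., n}."""
    return [math.comb(n, k) * q**k * (1 - q)**(n - k) for k in range(n + 1)]

def is_ultra_log_concave(pmf, tol=1e-12):
    """ULC check: k * p_k^2 >= (k+1) * p_{k+1} * p_{k-1} for interior k,
    i.e. pmf divided by a Poisson pmf is log-concave in k."""
    return all(k * pmf[k]**2 + tol >= (k + 1) * pmf[k + 1] * pmf[k - 1]
               for k in range(1, len(pmf) - 1))
```

Binomial distributions pass the check, while a geometric pmf is log-concave but fails the stronger ULC inequality (for the geometric, $p_{k+1}p_{k-1} = p_k^2$, so the extra factor $(k+1)/k$ breaks it).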
• Yaming Yu
• Mathematics
IEEE Transactions on Information Theory
• 2009
Two recent results of Johnson (2008) on maximum entropy characterizations of compound Poisson and compound binomial distributions are proved under fewer assumptions and with simpler arguments.
• Mathematics
IEEE Transactions on Information Theory
• 2011
New constraints on entropy per coordinate are given for so-called convex or hyperbolic probability measures on Euclidean spaces, which generalize the results under the log-concavity assumption, expose the extremal role of multivariate Pareto-type distributions, and give some applications.
• Mathematics
Potential Analysis
• 2021
It is well known that some important Markov semi-groups have a “regularization effect” – as for example the hypercontractivity property of the noise operator on the Boolean hypercube or the …
• Mathematics
• 2010
Compound Poisson distributions play an important role in many applications (telecommunication, hydrology, insurance, etc.). In this paper, we prove that some of the compound Poisson distributions have …
Selected topics in stochastic optimization. Doctoral dissertation by Anh Tuan Ninh; dissertation directors: András Prékopa and Yao Zhao. …

## References

SHOWING 1-10 OF 28 REFERENCES

• Mathematics
• 1992
The aim of this paper is to extend Stein's method to a compound Poisson distribution setting. The compound Poisson distributions of concern here are those of the form POIS$(\nu)$, where $\nu$ is a …
• Computer Science, Mathematics
• 2006
The main theorem is used to give simple proofs of the log-concavity of the Stirling numbers of the second kind and of the Eulerian numbers and it is argued that these conditions are natural by giving some applications.
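The log-concavity of the Stirling numbers of the second kind mentioned in this reference can be verified directly for small $n$: a sequence $a_k$ is log-concave when $a_k^2 \ge a_{k-1}a_{k+1}$. A self-contained Python sketch (function names are illustrative, not from the cited paper), using the recurrence $S(n,k) = k\,S(n-1,k) + S(n-1,k-1)$:

```python
def stirling2_row(n):
    """Row S(n, 0), ..., S(n, n) of Stirling numbers of the second
    kind, built with S(n, k) = k*S(n-1, k) + S(n-1, k-1)."""
    row = [1]  # S(0, 0) = 1
    for _ in range(n):
        new = [0]  # S(m, 0) = 0 for m >= 1
        for k in range(1, len(row) + 1):
            prev_k = row[k] if k < len(row) else 0
            new.append(k * prev_k + row[k - 1])
        row = new
    return row

def is_log_concave(seq):
    """Check a_k^2 >= a_{k-1} * a_{k+1} for all interior k."""
    return all(seq[k]**2 >= seq[k - 1] * seq[k + 1]
               for k in range(1, len(seq) - 1))
```

The rows are exact integers, so the inequality check involves no floating-point error; e.g. row $n=4$ is $(0, 1, 7, 6, 1)$.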
• Mathematics
2007 IEEE International Symposium on Information Theory
• 2007
The role and properties of thinning are examined in the context of information-theoretic inequalities for Poisson approximation and the classical Binomial-to-Poisson convergence is seen to be a special case of a thinning limit theorem for convolutions of discrete distributions.
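The thinning operation in this reference has a simple concrete form: $\alpha$-thinning keeps each of the $X$ “points” independently with probability $\alpha$. A hedged sketch (helper names are my own) checking numerically that thinning a point mass at $n$ with $\alpha = \lambda/n$ gives $\mathrm{Binomial}(n, \lambda/n)$, which approaches $\mathrm{Poisson}(\lambda)$ as $n$ grows – the classical Binomial-to-Poisson special case:

```python
import math

def thin(pmf, alpha):
    """alpha-thinning T_alpha: each of the X points survives
    independently with probability alpha (binomial mixing)."""
    out = [0.0] * len(pmf)
    for j, pj in enumerate(pmf):
        for k in range(j + 1):
            out[k] += pj * math.comb(j, k) * alpha**k * (1 - alpha)**(j - k)
    return out

def poisson_pmf(lam, nmax):
    return [math.exp(-lam) * lam**k / math.factorial(k)
            for k in range(nmax + 1)]

def tv(p, q):
    """Total variation distance between two pmfs of equal length."""
    return 0.5 * sum(abs(a - b) for a, b in zip(p, q))

def point_mass(n):
    """pmf of the constant random variable X = n."""
    return [0.0] * n + [1.0]
```

Increasing $n$ while holding $\lambda = n\alpha$ fixed shrinks the total variation distance to the Poisson law, illustrating the thinning limit theorem in its simplest instance.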
• Mathematics
IEEE Trans. Inf. Theory
• 1994
It is shown that max[h(X+Y)]=h(2X), under the constraints that X and Y have the same fixed marginal density f, if and only if f is log-concave, which should lead to capacity bounds for additive noise channels with feedback.
• Mathematics, Computer Science
IEEE Transactions on Information Theory
• 2007
A simple proof of the monotonicity of information in central limit theorems is obtained, both in the setting of independent and identically distributed (i.i.d.) summands and in the more general setting of independent summands with variance-standardized sums.
• Mathematics
• 1996
Table of contents (excerpt): Examples; Examples Related to Generalized Poisson Laws; A Remarkable Formula of Queueing Theory; Other Examples; Doubling with Repair; Mathematical Model; A Limit Theorem for the Trouble-Free Performance …
• Mathematics, Computer Science
• 2004
It is shown that if $X_1, X_2, \ldots$ are independent and identically distributed square-integrable random variables, then the entropy of the normalized sum, $\mathrm{Ent}\left((X_1 + \cdots + X_n)/\sqrt{n}\right)$, is increasing in $n$.
• Computer Science
IEEE Transactions on Information Theory
• 2006
A simplified proof using the relationship between non-Gaussianness and minimum mean-square error (MMSE) in Gaussian channels and the more general setting of nonidentically distributed random variables is given.