On the entropy and log-concavity of compound Poisson measures
@article{Johnson2008OnTE,
  title   = {On the entropy and log-concavity of compound Poisson measures},
  author  = {Oliver Johnson and Ioannis Kontoyiannis and Mokshay M. Madiman},
  journal = {ArXiv},
  year    = {2008},
  volume  = {abs/0805.4112}
}
Motivated, in part, by the desire to develop an information-theoretic foundation for compound Poisson approximation limit theorems (analogous to the corresponding developments for the central limit theorem and for simple Poisson approximation), this work examines sufficient conditions under which the compound Poisson distribution has maximal entropy within a natural class of probability measures on the nonnegative integers. We show that the natural analog of the Poisson maximum entropy property…
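For intuition, a compound Poisson measure CP(λ, Q) on the nonnegative integers is the law of a random sum Y₁ + · · · + Y_N with N ~ Poisson(λ) and Yᵢ i.i.d. with distribution Q. A minimal numerical sketch (the helper names are my own, not from the paper) of its pmf and Shannon entropy, via truncation of the Poisson mixture of convolutions:

```python
import math

def compound_poisson_pmf(lam, q, support, n_max=50):
    """PMF of X = Y_1 + ... + Y_N, N ~ Poisson(lam), Y_i i.i.d. with pmf q.

    q: list with q[j-1] = P(Y = j) on {1, ..., len(q)}.
    Returns the pmf of X on {0, ..., support} (truncated at n_max summands).
    """
    conv = [1.0] + [0.0] * support          # 0-fold convolution: point mass at 0
    pmf = [math.exp(-lam) * c for c in conv]  # n = 0 term of the Poisson mixture
    for n in range(1, n_max + 1):
        new = [0.0] * (support + 1)          # n-fold convolution of q
        for k in range(support + 1):
            if conv[k] == 0.0:
                continue
            for j, qj in enumerate(q, start=1):
                if k + j <= support:
                    new[k + j] += conv[k] * qj
        conv = new
        w = math.exp(-lam) * lam**n / math.factorial(n)  # P(N = n)
        pmf = [p + w * c for p, c in zip(pmf, conv)]
    return pmf

def entropy(pmf):
    """Shannon entropy (nats) of a pmf given as a list."""
    return -sum(p * math.log(p) for p in pmf if p > 0)

# Sanity check: with Q a point mass at 1, CP(lam, Q) is just Poisson(lam)
pmf = compound_poisson_pmf(2.0, [1.0], support=40)
poisson = [math.exp(-2.0) * 2.0**k / math.factorial(k) for k in range(41)]
assert all(abs(a - b) < 1e-9 for a, b in zip(pmf, poisson))
```

The sanity check recovers the simple Poisson case, which is the degenerate instance of the maximum entropy question studied in the paper.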
15 Citations
Log-concavity, ultra-log-concavity, and a maximum entropy property of discrete compound Poisson measures
- Mathematics, Discret. Appl. Math.
- 2013
A criterion for the compound Poisson distribution to be maximum entropy
- Mathematics, 2009 IEEE International Symposium on Information Theory
- 2009
It is shown that the compound Poisson does indeed have a natural maximum entropy characterization when the distributions under consideration are log-concave, which complements the recent development by the same authors of an information-theoretic foundation for compound Poisson approximation inequalities and limit theorems.
Compound Poisson Approximation via Information Functionals
- Mathematics, Computer Science, ArXiv
- 2010
An information-theoretic development is given for the problem of compound Poisson approximation, which parallels earlier treatments for Gaussian and Poisson approximation. Nonasymptotic bounds are…
ENTROPY AND THE ‘COMPOUND’ LAW OF SMALL NUMBERS
- Mathematics
- 2008
An information-theoretic foundation for compound Poisson approximation limit theorems is presented, in analogy to the corresponding developments for the central limit theorem and for simple Poisson…
Monotonic Convergence in an Information-Theoretic Law of Small Numbers
- Mathematics, IEEE Transactions on Information Theory
- 2009
Overall, the parallel between the information-theoretic central limit theorem and the law of small numbers explored by Kontoyiannis and Harremoës is extended, and monotonic convergence in relative entropy is established for general discrete distributions, while monotonic increase of Shannon entropy is proved for the special class of ultra-log-concave distributions.
On the Entropy of Compound Distributions on Nonnegative Integers
- Mathematics, IEEE Transactions on Information Theory
- 2009
Two recent results of Johnson (2008) on maximum entropy characterizations of compound Poisson and compound binomial distributions are proved under fewer assumptions and with simpler arguments.
The Entropy Per Coordinate of a Random Vector is Highly Constrained Under Convexity Conditions
- Mathematics, IEEE Transactions on Information Theory
- 2011
New constraints on entropy per coordinate are given for so-called convex or hyperbolic probability measures on Euclidean spaces, which generalize the results under the log-concavity assumption, expose the extremal role of multivariate Pareto-type distributions, and give some applications.
Log-Hessian and Deviation Bounds for Markov Semi-Groups, and Regularization Effect in L1
- Mathematics, Potential Analysis
- 2021
It is well known that some important Markov semi-groups have a “regularization effect”, as for example the hypercontractivity property of the noise operator on the Boolean hypercube or the…
Proof of logconcavity of some compound Poisson and related distributions
- Mathematics
- 2010
Compound Poisson distributions play an important role in many applications (telecommunication, hydrology, insurance, etc.). In this paper, we prove that some of the compound Poisson distributions have…
Selected topics in stochastic optimization by Anh Tuan Ninh
- Mathematics
- 2015
ABSTRACT OF THE DISSERTATION: Selected topics in stochastic optimization, by Anh Tuan Ninh. Dissertation Directors: András Prékopa and Yao Zhao. This report constitutes the Doctoral Dissertation for Anh Ninh and…
References
Compound Poisson Approximation for Nonnegative Random Variables Via Stein's Method
- Mathematics
- 1992
The aim of this paper is to extend Stein's method to a compound Poisson distribution setting. The compound Poisson distributions of concern here are those of the form POIS$(\nu)$, where $\nu$ is a…
Preservation of log-concavity on summation
- Computer Science, Mathematics
- 2006
The main theorem is used to give simple proofs of the log-concavity of the Stirling numbers of the second kind and of the Eulerian numbers, and it is argued that these conditions are natural by giving some applications.
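Log-concavity of a nonnegative sequence means a_k² ≥ a_{k−1}·a_{k+1} for all interior k. As a quick numerical illustration of the cited claim (a sketch using the standard recurrence S(n, k) = k·S(n−1, k) + S(n−1, k−1), not the paper's proof):

```python
def stirling2_row(n):
    """Row S(n, 0), ..., S(n, n) of Stirling numbers of the second kind,
    built from the recurrence S(n, k) = k*S(n-1, k) + S(n-1, k-1)."""
    row = [1]  # S(0, 0) = 1
    for m in range(1, n + 1):
        prev = row
        row = [0] * (m + 1)
        for k in range(1, m + 1):
            row[k] = (k * prev[k] if k < m else 0) + prev[k - 1]
    return row

def is_log_concave(seq):
    """Check a_k^2 >= a_{k-1} * a_{k+1} for all interior indices k."""
    return all(seq[k] ** 2 >= seq[k - 1] * seq[k + 1]
               for k in range(1, len(seq) - 1))

# The positive part of each row (k = 1..n) is log-concave in k
assert all(is_log_concave(stirling2_row(n)[1:]) for n in range(2, 20))
```

Checking rows numerically is of course no substitute for the paper's proof, but it makes the statement concrete.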
Thinning and the Law of Small Numbers
- Mathematics, 2007 IEEE International Symposium on Information Theory
- 2007
The role and properties of thinning are examined in the context of information-theoretic inequalities for Poisson approximation and the classical Binomial-to-Poisson convergence is seen to be a special case of a thinning limit theorem for convolutions of discrete distributions.
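Thinning T_α keeps each of the X "points" of an integer-valued random variable independently with probability α. A small sketch (helper names are my own) verifying the textbook fact that thinning Bin(n, p) yields Bin(n, αp), the mechanism behind the Binomial-to-Poisson limit mentioned above:

```python
import math

def binom_pmf(n, p):
    """PMF of the Binomial(n, p) distribution on {0, ..., n}."""
    return [math.comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]

def thin(pmf, alpha):
    """Bernoulli thinning T_alpha: given X ~ pmf, each of the X points is
    retained independently with probability alpha, so
    P(T_alpha X = k) = sum_x P(X = x) * C(x, k) * alpha^k * (1-alpha)^(x-k)."""
    out = [0.0] * len(pmf)
    for x, px in enumerate(pmf):
        for k in range(x + 1):
            out[k] += px * math.comb(x, k) * alpha**k * (1 - alpha)**(x - k)
    return out

# Thinning Bin(10, 0.4) by alpha = 0.5 gives Bin(10, 0.2)
thinned = thin(binom_pmf(10, 0.4), 0.5)
target = binom_pmf(10, 0.2)
assert all(abs(a - b) < 1e-10 for a, b in zip(thinned, target))
```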
On the maximum entropy of the sum of two dependent random variables
- Mathematics, IEEE Trans. Inf. Theory
- 1994
It is shown that max[h(X+Y)]=h(2X), under the constraints that X and Y have the same fixed marginal density f, if and only if f is log-concave, which should lead to capacity bounds for additive noise channels with feedback.
Generalized Entropy Power Inequalities and Monotonicity Properties of Information
- Mathematics, Computer Science, IEEE Transactions on Information Theory
- 2007
A simple proof of the monotonicity of information in central limit theorems is obtained, both in the setting of independent and identically distributed (i.i.d.) summands as well as in the more general setting of independent summands with variance-standardized sums.
Random Summation: Limit Theorems and Applications
- Mathematics
- 1996
Contents include: Examples Related to Generalized Poisson Laws; A Remarkable Formula of Queueing Theory; Other Examples; Doubling with Repair; Mathematical Model; A Limit Theorem for the Trouble-Free Performance…
Solution of Shannon's problem on the monotonicity of entropy
- Mathematics, Computer Science
- 2004
It is shown that if X1, X2, . . . are independent and identically distributed square-integrable random variables, then the entropy of the normalized sum Ent((X1 + · · · + Xn)/√n) is an increasing…
Monotonic Decrease of the Non-Gaussianness of the Sum of Independent Random Variables: A Simple Proof
- Computer Science, IEEE Transactions on Information Theory
- 2006
A simplified proof using the relationship between non-Gaussianness and minimum mean-square error (MMSE) in Gaussian channels and the more general setting of nonidentically distributed random variables is given.