# Log-concavity, ultra-log-concavity, and a maximum entropy property of discrete compound Poisson measures

```bibtex
@article{Johnson2013LogconcavityUA,
  title={Log-concavity, ultra-log-concavity, and a maximum entropy property of discrete compound Poisson measures},
  author={Oliver Johnson and Ioannis Kontoyiannis and Mokshay M. Madiman},
  journal={ArXiv},
  year={2013},
  volume={abs/0912.0581}
}
```

## 44 Citations

### Geometric and functional inequalities for log-concave probability sequences

- Mathematics
- 2020

We investigate various geometric and functional inequalities for the class of log-concave probability sequences. We prove dilation inequalities for log-concave probability measures on the integers. A…

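The defining property of the sequences studied in this line of work, log-concavity of a probability sequence (p_k^2 ≥ p_{k−1} p_{k+1} for all interior k), is straightforward to verify numerically. A minimal sketch (function names are illustrative, not from any cited paper), checking that the Poisson distribution is log-concave:

```python
from math import exp, factorial

def is_log_concave(p):
    """Check p_k^2 >= p_{k-1} * p_{k+1} for all interior indices k."""
    return all(p[k] ** 2 >= p[k - 1] * p[k + 1] for k in range(1, len(p) - 1))

def poisson_pmf(lam, n):
    """First n+1 probabilities of a Poisson(lam) distribution."""
    return [exp(-lam) * lam ** k / factorial(k) for k in range(n + 1)]

# For Poisson, p_k^2 / (p_{k-1} p_{k+1}) = (k+1)/k >= 1, so the check passes.
print(is_log_concave(poisson_pmf(3.0, 20)))
```

A bimodal sequence such as `[0.5, 0.01, 0.49]` fails the check, since the interior term is far smaller than its neighbours.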
### The Entropy Per Coordinate of a Random Vector is Highly Constrained Under Convexity Conditions

- Mathematics, IEEE Transactions on Information Theory
- 2011

New constraints on entropy per coordinate are given for so-called convex or hyperbolic probability measures on Euclidean spaces, which generalize the results under the log-concavity assumption, expose the extremal role of multivariate Pareto-type distributions, and give some applications.

### On the entropy of sums of Bernoulli random variables via the Chen-Stein method

- Computer Science, Mathematics, 2012 IEEE Information Theory Workshop
- 2012

Upper bounds are derived on the error that follows from approximating this entropy by the entropy of a Poisson random variable with the same mean; the analysis combines elements of information theory with the Chen-Stein method for Poisson approximation.

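The comparison underlying this and the related Chen-Stein papers below can be reproduced directly: compute the exact distribution of a sum of independent Bernoulli variables by convolution and compare its entropy with that of a Poisson distribution of the same mean. A minimal sketch (function names are illustrative); by Harremoës's maximum entropy result, the Poisson entropy is the larger of the two:

```python
from math import exp, factorial, log

def poisson_binomial_pmf(ps):
    """Exact PMF of a sum of independent Bernoulli(p_i) via iterated convolution."""
    pmf = [1.0]
    for p in ps:
        new = [0.0] * (len(pmf) + 1)
        for k, w in enumerate(pmf):
            new[k] += (1 - p) * w      # this Bernoulli contributes 0
            new[k + 1] += p * w        # this Bernoulli contributes 1
        pmf = new
    return pmf

def entropy(pmf):
    """Shannon entropy in nats, skipping zero-probability atoms."""
    return -sum(q * log(q) for q in pmf if q > 0)

ps = [0.1] * 20
lam = sum(ps)  # matching mean for the Poisson comparison
poisson = [exp(-lam) * lam ** k / factorial(k) for k in range(60)]
print(entropy(poisson_binomial_pmf(ps)), entropy(poisson))
```

Truncating the Poisson support at 60 terms is harmless here, since the neglected tail mass for mean 2 is astronomically small.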
### Log-Hessian and Deviation Bounds for Markov Semi-Groups, and Regularization Effect in $\mathbb {L}^{1}$

- Mathematics, Potential Analysis
- 2021

It is well known that some important Markov semi-groups have a “regularization effect” – as for example the hypercontractivity property of the noise operator on the Boolean hypercube or the…

### An Information-Theoretic Perspective of the Poisson Approximation via the Chen-Stein Method

- Computer Science, Mathematics, ArXiv
- 2012

The analysis in this work combines elements of information theory with the Chen-Stein method for the Poisson approximation to derive new lower bounds on the total variation distance and relative entropy between the distribution of the sum of independent Bernoulli random variables and the Poisson distribution.

### Log-Concavity and Strong Log-Concavity: a review.

- Computer Science, Statistics Surveys
- 2014

A new proof of Efron's theorem is provided using the recent asymmetric Brascamp-Lieb inequality due to Otto and Menz (2013); along the way, connections between log-concavity and other areas of mathematics and statistics are reviewed.

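Strong log-concavity, called ultra-log-concavity in the title paper, strengthens ordinary log-concavity: the ratio of p_k to a Poisson pmf must itself be log-concave, which reduces to k p_k^2 ≥ (k+1) p_{k−1} p_{k+1}. A minimal sketch of the check (function names are illustrative), verified on the binomial distribution, a standard example of an ultra-log-concave law:

```python
from math import comb

def is_ultra_log_concave(p):
    """ULC: k * p_k^2 >= (k+1) * p_{k-1} * p_{k+1}, i.e. the ratio of p
    to a Poisson pmf is log-concave."""
    return all(k * p[k] ** 2 >= (k + 1) * p[k - 1] * p[k + 1]
               for k in range(1, len(p) - 1))

def binomial_pmf(n, q):
    """Binomial(n, q) probabilities for k = 0, ..., n."""
    return [comb(n, k) * q ** k * (1 - q) ** (n - k) for k in range(n + 1)]

print(is_ultra_log_concave(binomial_pmf(10, 0.3)))
```

The geometric distribution separates the two notions: it satisfies p_k^2 = p_{k−1} p_{k+1} (log-concave, with equality), but fails the strict ULC inequality.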
### Combinatorial Entropy Power Inequalities: A Preliminary Study of the Stam Region

- Computer Science, Mathematics, IEEE Transactions on Information Theory
- 2019

It is shown that the class of fractionally superadditive set functions provides an outer bound to the Stam region, resolving a conjecture of Barron and Madiman.

### Information in probability: Another information-theoretic proof of a finite de Finetti theorem

- Computer Science, ArXiv
- 2022

An upper bound is derived on the relative entropy between the distribution of a finite block of exchangeable random variables and an appropriate mixture over product distributions, yielding a finite version of de Finetti's classical representation theorem as a corollary.

### Rogozin's convolution inequality for locally compact groups

- Mathematics
- 2017

General extensions of an inequality due to Rogozin, concerning the essential supremum of a convolution of probability density functions on the real line, are obtained. While a weak version of the…

## References

Showing 1–10 of 63 references.

### On the entropy and log-concavity of compound Poisson measures

- Mathematics, ArXiv
- 2008

It is shown that the natural analog of the Poisson maximum entropy property remains valid if the measures under consideration are log-concave, but that it fails in general.

### The Entropy Per Coordinate of a Random Vector is Highly Constrained Under Convexity Conditions

- Mathematics, IEEE Transactions on Information Theory
- 2011

New constraints on entropy per coordinate are given for so-called convex or hyperbolic probability measures on Euclidean spaces, which generalize the results under the log-concavity assumption, expose the extremal role of multivariate Pareto-type distributions, and give some applications.

### On the Entropy of Compound Distributions on Nonnegative Integers

- Mathematics, IEEE Transactions on Information Theory
- 2009

Two recent results of Johnson (2008) on maximum entropy characterizations of compound Poisson and compound binomial distributions are proved under fewer assumptions and with simpler arguments.

### Compound Poisson Approximation via Information Functionals

- Mathematics, Computer Science, ArXiv
- 2010

An information-theoretic development is given for the problem of compound Poisson approximation, which parallels earlier treatments for Gaussian and Poisson approximation. Nonasymptotic bounds are…

### Preservation of log-concavity on summation

- Computer Science, Mathematics
- 2006

The main theorem is used to give simple proofs of the log-concavity of the Stirling numbers of the second kind and of the Eulerian numbers, and its conditions are argued to be natural through several applications.

### On the maximum entropy of the sum of two dependent random variables

- Mathematics, IEEE Trans. Inf. Theory
- 1994

It is shown that max[h(X+Y)]=h(2X), under the constraints that X and Y have the same fixed marginal density f, if and only if f is log-concave, which should lead to capacity bounds for additive noise channels with feedback.

### A strong log-concavity property for measures on Boolean algebras

- Mathematics, J. Comb. Theory, Ser. A
- 2011

### Negative dependence and the geometry of polynomials

- Mathematics
- 2007

We introduce the class of strongly Rayleigh probability measures by means of geometric properties of their generating polynomials that amount to the stability of the latter. This class covers…

### Entropy Computations via Analytic Depoissonization

- Computer Science, Mathematics, IEEE Trans. Inf. Theory
- 1999

It is argued that analytic methods can offer new tools for information theory, especially for studying second-order asymptotics; recently there has been a resurgence of interest and a few successful applications of analytic methods to a variety of problems in information theory.