# Log-concavity, ultra-log-concavity, and a maximum entropy property of discrete compound Poisson measures

```bibtex
@article{Johnson2009LogconcavityUA,
  title   = {Log-concavity, ultra-log-concavity, and a maximum entropy property of discrete compound Poisson measures},
  author  = {Oliver Johnson and Ioannis Kontoyiannis and Mokshay M. Madiman},
  journal = {ArXiv},
  year    = {2009},
  volume  = {abs/0912.0581}
}
```

## 46 Citations

### Geometric and functional inequalities for log-concave probability sequences

- Mathematics
- 2020

We investigate various geometric and functional inequalities for the class of log-concave probability sequences. We prove dilation inequalities for log-concave probability measures on the integers. A…

### The Entropy Per Coordinate of a Random Vector is Highly Constrained Under Convexity Conditions

- Mathematics, IEEE Transactions on Information Theory
- 2011

New constraints on entropy per coordinate are given for so-called convex or hyperbolic probability measures on Euclidean spaces, which generalize the results under the log-concavity assumption, expose the extremal role of multivariate Pareto-type distributions, and give some applications.

### On the entropy of sums of Bernoulli random variables via the Chen-Stein method

- Computer Science, Mathematics, 2012 IEEE Information Theory Workshop
- 2012

Upper bounds are derived on the error incurred by approximating this entropy with the entropy of a Poisson random variable of the same mean; the analysis combines elements of information theory with the Chen-Stein method for Poisson approximation.

### Log-Hessian and Deviation Bounds for Markov Semi-Groups, and Regularization Effect in $\mathbb {L}^{1}$

- Mathematics, Potential Analysis
- 2021

It is well known that some important Markov semi-groups have a “regularization effect” – as for example the hypercontractivity property of the noise operator on the Boolean hypercube or the…

### An Information-Theoretic Perspective of the Poisson Approximation via the Chen-Stein Method

- Computer Science, Mathematics, ArXiv
- 2012

The analysis in this work combines elements of information theory with the Chen-Stein method for the Poisson approximation to derive new lower bounds on the total variation distance and relative entropy between the distribution of the sum of independent Bernoulli random variables and the Poisson distribution.

### Log-Concavity and Strong Log-Concavity: a review.

- Computer Science, Statistics Surveys
- 2014

A new proof of Efron's theorem is provided using the recent asymmetric Brascamp-Lieb inequality due to Otto and Menz (2013); along the way, connections between log-concavity and other areas of mathematics and statistics are reviewed.

### Combinatorial Entropy Power Inequalities: A Preliminary Study of the Stam Region

- Computer Science, Mathematics, IEEE Transactions on Information Theory
- 2019

It is shown that the class of fractionally superadditive set functions provides an outer bound to the Stam region, resolving a conjecture of Barron and Madiman.

### Reversal of Rényi Entropy Inequalities Under Log-Concavity

- Computer Science, IEEE Transactions on Information Theory
- 2021

A discrete analog of the Rényi entropy comparison due to Bobkov and Madiman is established, and the entropic Rogers-Shephard inequality studied by Madiman and Kontoyiannis is investigated.

### Quantitative limit theorems via relative log-concavity

- Mathematics
- 2022

In this paper we develop tools for studying limit theorems by means of convexity. We establish bounds for the discrepancy in total variation between probability measures µ and ν such that ν is…

## References

Showing 1-10 of 63 references

### On the entropy and log-concavity of compound Poisson measures

- Mathematics, ArXiv
- 2008

It is shown that the natural analog of the Poisson maximum entropy property remains valid if the measures under consideration are log-concave, but that it fails in general.

### The Entropy Per Coordinate of a Random Vector is Highly Constrained Under Convexity Conditions

- Mathematics, IEEE Transactions on Information Theory
- 2011

New constraints on entropy per coordinate are given for so-called convex or hyperbolic probability measures on Euclidean spaces, which generalize the results under the log-concavity assumption, expose the extremal role of multivariate Pareto-type distributions, and give some applications.

### On the Entropy of Compound Distributions on Nonnegative Integers

- Mathematics, IEEE Transactions on Information Theory
- 2009

Two recent results of Johnson (2008) on maximum entropy characterizations of compound Poisson and compound binomial distributions are proved under fewer assumptions and with simpler arguments.

### Compound Poisson Approximation via Information Functionals

- Mathematics, Computer Science, ArXiv
- 2010

An information-theoretic development is given for the problem of compound Poisson approximation, which parallels earlier treatments for Gaussian and Poisson approximation. Nonasymptotic bounds are…

### On the maximum entropy of the sum of two dependent random variables

- Mathematics, IEEE Trans. Inf. Theory
- 1994

It is shown that max[h(X+Y)]=h(2X), under the constraints that X and Y have the same fixed marginal density f, if and only if f is log-concave, which should lead to capacity bounds for additive noise channels with feedback.

### A strong log-concavity property for measures on Boolean algebras

- Mathematics, J. Comb. Theory, Ser. A
- 2011

### Negative dependence and the geometry of polynomials

- Mathematics
- 2007

We introduce the class of strongly Rayleigh probability measures by means of geometric properties of their generating polynomials that amount to the stability of the latter. This class covers…

### Entropy Computations via Analytic Depoissonization

- Computer Science, Mathematics, IEEE Trans. Inf. Theory
- 1999

It is argued that analytic methods can offer new tools for information theory, especially for studying second-order asymptotics, and there has been a resurgence of interest and a few successful applications of analytic methods to a variety of problems of information theory.

### Entropy and set cardinality inequalities for partition‐determined functions

- Mathematics, Random Struct. Algorithms
- 2012

A new notion of partition‐determined functions is introduced, and several basic inequalities are developed for the entropies of such functions of independent random variables, as well as for…