# Moments, Concentration, and Entropy of Log-Concave Distributions

    @article{Marsiglietti2022MomentsCA,
      title   = {Moments, Concentration, and Entropy of Log-Concave Distributions},
      author  = {Arnaud Marsiglietti and James Melbourne},
      journal = {ArXiv},
      year    = {2022},
      volume  = {abs/2205.08293}
    }

We utilize and extend a simple and classical mechanism, combining log-concavity and majorization in the convex order, to derive moment, concentration, and entropy inequalities for log-concave random variables with respect to a reference measure.
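The discrete log-concavity condition central to this line of work, p_k² ≥ p_{k−1} p_{k+1} for all k, is easy to check numerically. The sketch below is illustrative only (not from the paper); the helper names and the tolerance are assumptions, and the Poisson law is used as a standard example of a (ultra-)log-concave distribution.

```python
from math import exp, factorial

def poisson_pmf(k, lam):
    """Poisson probability mass function P(X = k) with mean lam."""
    return exp(-lam) * lam**k / factorial(k)

def is_log_concave(p, tol=1e-12):
    """Check the discrete log-concavity condition p_k^2 >= p_{k-1} p_{k+1}."""
    return all(p[k]**2 + tol >= p[k - 1] * p[k + 1] for k in range(1, len(p) - 1))

pmf = [poisson_pmf(k, 3.0) for k in range(30)]
print(is_log_concave(pmf))  # True: the Poisson law is log-concave
```

For the Poisson law the ratio p_k²/(p_{k−1} p_{k+1}) equals (k+1)/k > 1, so the condition holds strictly at every k.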

## References


Concentration functions and entropy bounds for discrete log-concave distributions

- Computer Science, Comb. Probab. Comput.
- 2022

Two-sided bounds are explored for concentration functions and Rényi entropies in the class of discrete log-concave probability distributions. They are used to derive certain variants of the entropy…
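The Rényi entropies bounded in this reference have a direct formula, H_α(p) = (1/(1−α)) log Σ p_k^α for α ≠ 1. The snippet below is a hedged sketch of that formula (not code from the cited work); the geometric law is chosen as a simple discrete log-concave example, truncated at 60 terms.

```python
from math import log

def renyi_entropy(pmf, alpha):
    """Rényi entropy of order alpha (alpha > 0, alpha != 1), in nats."""
    return log(sum(q**alpha for q in pmf if q > 0)) / (1 - alpha)

# Geometric(1/2) distribution, a discrete log-concave law, truncated at 60 terms
geom = [0.5 * 0.5**k for k in range(60)]
print(renyi_entropy(geom, 2.0))  # ~ log(3) ≈ 1.0986 nats
```

For Geometric(1/2), Σ p_k² = 1/3, so H₂ = log 3, which the truncated sum reproduces to high precision.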

Concentration inequalities for ultra log-concave distributions

- Mathematics, Studia Mathematica
- 2022

We establish concentration inequalities in the class of ultra log-concave distributions. In particular, we show that ultra log-concave distributions satisfy Poisson concentration bounds. As an…

On the Maximum Entropy Properties of the Binomial Distribution

- Mathematics, IEEE Transactions on Information Theory
- 2008

It is shown that the Binomial(n,p) distribution maximizes the entropy in the class of ultra-log-concave distributions of order n with fixed mean np, and that the entropy never decreases along the iterations of an associated Markov chain.

Binomial and Poisson distributions as maximum entropy distributions

- Mathematics, IEEE Trans. Inf. Theory
- 2001

The binomial and Poisson distributions are shown to be maximum entropy distributions over suitably defined sets. Poisson's law is considered as a case of entropy maximization, and also convergence…
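One concrete form of this maximum entropy property is that, among sums of n independent Bernoulli variables with a fixed mean, the Binomial(n,p) (all success probabilities equal) has the largest Shannon entropy. The numerical check below is an illustrative sketch, not from the cited paper; the Bernoulli parameters [0.2, 0.4, 0.6, 0.8] are an arbitrary choice with the same mean 2 as Binomial(4, 0.5).

```python
from math import log2

def bernoulli_sum_pmf(ps):
    """PMF of a sum of independent Bernoulli(p_i) variables, via convolution."""
    pmf = [1.0]
    for p in ps:
        new = [0.0] * (len(pmf) + 1)
        for k, q in enumerate(pmf):
            new[k] += q * (1 - p)
            new[k + 1] += q * p
        pmf = new
    return pmf

def entropy(pmf):
    """Shannon entropy in bits."""
    return -sum(q * log2(q) for q in pmf if q > 0)

H_binom = entropy(bernoulli_sum_pmf([0.5] * 4))          # Binomial(4, 0.5)
H_other = entropy(bernoulli_sum_pmf([0.2, 0.4, 0.6, 0.8]))  # same mean 2
print(H_binom > H_other)  # True: the binomial maximizes entropy
```

Both sums have mean 2, but equalizing the Bernoulli parameters spreads the distribution out most, yielding the larger entropy.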

Reversal of Rényi Entropy Inequalities Under Log-Concavity

- Computer Science, IEEE Transactions on Information Theory
- 2021

A discrete analog of the Rényi entropy comparison due to Bobkov and Madiman is established, and the entropic Rogers-Shephard inequality studied by Madiman and Kontoyiannis is investigated.

On the Entropy of Compound Distributions on Nonnegative Integers

- Mathematics, IEEE Transactions on Information Theory
- 2009

Two recent results of Johnson (2008) on maximum entropy characterizations of compound Poisson and compound binomial distributions are proved under fewer assumptions and with simpler arguments.

Log-concavity, ultra-log-concavity, and a maximum entropy property of discrete compound Poisson measures

- Mathematics, Discret. Appl. Math.
- 2013

Forward and Reverse Entropy Power Inequalities in Convex Geometry

- Computer Science, ArXiv
- 2016

This work surveys various recent developments on forward and reverse entropy power inequalities not just for the Shannon-Boltzmann entropy but also more generally for Renyi entropy and discusses connections between the so-called functional and probabilistic analogues of some classical inequalities in geometric functional analysis.

A discrete complement of Lyapunov's inequality and its information theoretic consequences

- Mathematics, ArXiv
- 2021

A reversal of Lyapunov's inequality for monotone log-concave sequences is established, settling a conjecture of Havrilla-Tkocz and Melbourne-Tkocz, and several information theoretic inequalities are derived as consequences.