Moments, Concentration, and Entropy of Log-Concave Distributions

@article{Marsiglietti2022MomentsCA,
  title={Moments, Concentration, and Entropy of Log-Concave Distributions},
  author={Arnaud Marsiglietti and James Melbourne},
  journal={ArXiv},
  year={2022},
  volume={abs/2205.08293}
}
We utilize and extend a simple and classical mechanism, combining log-concavity with majorization in the convex order, to derive moment, concentration, and entropy inequalities for log-concave random variables with respect to a reference measure.
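As a rough illustration of this mechanism (our sketch under standard definitions, not the paper's precise statements): if X is dominated by a reference variable Y in the convex order, every convex test function transfers bounds from Y to X, yielding moment and exponential-moment (hence concentration) inequalities at once:

\[
X \preceq_{\mathrm{cx}} Y \;:\Longleftrightarrow\; \mathbb{E}\,\varphi(X) \le \mathbb{E}\,\varphi(Y) \ \text{ for all convex } \varphi,
\]
\[
\text{so, with } \varphi(x) = |x - \mathbb{E}X|^p \ (p \ge 1) \text{ and } \varphi(x) = e^{sx}: \quad \mathbb{E}\,|X - \mathbb{E}X|^p \le \mathbb{E}\,|Y - \mathbb{E}Y|^p, \qquad \mathbb{E}\,e^{sX} \le \mathbb{E}\,e^{sY}.
\]

The second bound feeds directly into Chernoff-type tail estimates for X in terms of the reference variable Y.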

References

Concentration functions and entropy bounds for discrete log-concave distributions

Two-sided bounds are explored for concentration functions and Rényi entropies in the class of discrete log-concave probability distributions. They are used to derive certain variants of the entropy power inequality.

Ultra-log-concavity and discrete degrees of freedom

We develop the notion of discrete degrees of freedom of a log-concave sequence and use it to prove that the quantity P(X = EX) is maximized, under fixed integral mean, for a Poisson distribution.
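For context, the standard definition (our addition): a probability sequence (p_k) on the nonnegative integers is ultra-log-concave when k! p_k is log-concave in k, equivalently

\[
k\,p_k^2 \;\ge\; (k+1)\,p_{k-1}\,p_{k+1}, \qquad k \ge 1,
\]

with equality throughout exactly for the Poisson laws p_k = e^{-\lambda}\lambda^k/k!, which is why the Poisson emerges as the extremizer of P(X = EX) in this class.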

Concentration inequalities for ultra log-concave distributions

We establish concentration inequalities in the class of ultra log-concave distributions. In particular, we show that ultra log-concave distributions satisfy Poisson concentration bounds. As an application, we derive concentration bounds for the intrinsic volumes of convex bodies.
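For reference, the Poisson benchmark behind such bounds is the classical Chernoff computation (a standard derivation, our addition, not the paper's exact constants): for Z ~ Poisson(λ) and t > 0,

\[
\mathbb{P}(Z \ge \lambda + t) \;\le\; e^{-s(\lambda+t)}\,\mathbb{E}\,e^{sZ} \;=\; \exp\!\big(\lambda(e^{s}-1) - s(\lambda+t)\big),
\]

and optimizing at s = \log(1 + t/\lambda) gives

\[
\mathbb{P}(Z \ge \lambda + t) \;\le\; \exp\!\big(-\lambda\,h(t/\lambda)\big), \qquad h(u) = (1+u)\log(1+u) - u.
\]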

On the Maximum Entropy Properties of the Binomial Distribution

Yaming Yu · IEEE Transactions on Information Theory · 2008
It is shown that the Binomial(n, p) distribution maximizes entropy in the class of ultra-log-concave distributions of order n with fixed mean np; the proof runs a Markov chain along whose iterations the entropy never decreases.
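In symbols, our paraphrase of the result: writing ULC(n) for the distributions (p_k) such that p_k / \binom{n}{k} is log-concave in k,

\[
H(X) \;\le\; H(\mathrm{Bin}(n,p)) \qquad \text{for all } X \in \mathrm{ULC}(n) \text{ with } \mathbb{E}X = np.
\]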

Binomial and Poisson distributions as maximum entropy distributions

The binomial and the Poisson distributions are shown to be maximum entropy distributions of suitably defined sets. Poisson's law is considered as a case of entropy maximization, and convergence in information divergence is established.
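Concretely, our paraphrase of the extremal statement: for independent X_i ~ Bernoulli(p_i) with \frac{1}{n}\sum_i p_i = p held fixed,

\[
H(X_1 + \cdots + X_n) \;\le\; H(\mathrm{Bin}(n,p)),
\]

with the Poisson(λ) arising as the n → ∞ limit when \sum_i p_i = \lambda is held fixed.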

Reversal of Rényi Entropy Inequalities Under Log-Concavity

A discrete analog of the Rényi entropy comparison due to Bobkov and Madiman is established, and the entropic Rogers-Shephard inequality studied by Madiman and Kontoyiannis is investigated.
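The continuous comparison being discretized here reads as follows (recalled for context; the discrete analog is the paper's contribution): for a log-concave random vector X in \mathbb{R}^d with density f,

\[
0 \;\le\; h(X) - h_\infty(X) \;\le\; d, \qquad h_\infty(X) := -\log \|f\|_\infty,
\]

so the Shannon entropy of a log-concave vector exceeds its min-entropy by at most the dimension.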

Tail bounds for sums of geometric and exponential variables

On the Entropy of Compound Distributions on Nonnegative Integers

Yaming Yu · IEEE Transactions on Information Theory · 2009
Two recent results of Johnson (2008) on maximum entropy characterizations of compound Poisson and compound binomial distributions are proved under fewer assumptions and with simpler arguments.

Forward and Reverse Entropy Power Inequalities in Convex Geometry

This work surveys recent developments on forward and reverse entropy power inequalities, not only for the Shannon-Boltzmann entropy but also for the more general Rényi entropies, and discusses connections between the functional and probabilistic analogues of some classical inequalities in geometric functional analysis.