# Reversal of Rényi Entropy Inequalities Under Log-Concavity

```bibtex
@article{Melbourne2020ReversalOR,
  title   = {Reversal of R{\'e}nyi Entropy Inequalities Under Log-Concavity},
  author  = {James Melbourne and Tomasz Tkocz},
  journal = {IEEE Transactions on Information Theory},
  year    = {2020},
  volume  = {67},
  pages   = {45--51}
}
```
• Published 21 May 2020
• Computer Science
• IEEE Transactions on Information Theory
We establish a discrete analog of the Rényi entropy comparison due to Bobkov and Madiman. For log-concave variables on the integers, the min entropy is within $\log e$ of the usual Shannon entropy. Additionally we investigate the entropic Rogers-Shephard inequality studied by Madiman and Kontoyiannis, and establish a sharp Rényi version for certain parameters in both the continuous and discrete cases.
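As a numerical illustration of this gap (a minimal sketch, not from the paper; it assumes natural logarithms, so $\log e = 1$ nat, and uses the geometric distribution, which is log-concave on the integers):

```python
import numpy as np

def shannon_and_min_entropy(p):
    """Return (Shannon entropy, min entropy) of a pmf, both in nats."""
    p = p[p > 0]
    return -np.sum(p * np.log(p)), -np.log(p.max())

# Geometric pmf P(k) = q(1-q)^k on {0, 1, ...}, truncated for computation.
for q in [0.01, 0.1, 0.5, 0.9]:
    k = np.arange(5000)
    p = q * (1.0 - q) ** k
    H, H_min = shannon_and_min_entropy(p)
    # Claimed comparison for log-concave pmfs on Z: H - H_min <= 1 nat.
    print(f"q={q:.2f}  H={H:.4f}  H_min={H_min:.4f}  gap={H - H_min:.4f}")
```

For the geometric family the gap equals $-(1-q)\log(1-q)/q$, which increases toward 1 nat as $q \to 0$ but never exceeds it.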
A discrete version of the notion of degree of freedom is utilized to prove a sharp min-entropy-variance inequality for integer-valued log-concave random variables, and it is shown that the geometric distribution minimizes the min entropy within the class of log-concave probability sequences with fixed variance.
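A minimal numerical sketch of this extremality claim (an illustration, not the paper's argument; the Poisson comparison and the truncation level are assumptions):

```python
import numpy as np
from scipy.stats import poisson

def min_entropy(p):
    """Min entropy in nats: -log of the largest probability mass."""
    return -np.log(np.max(p))

k = np.arange(5000)
for var in [1.0, 4.0, 16.0]:
    # Geometric on {0, 1, ...}: pmf q(1-q)^k with variance (1-q)/q^2 = var.
    q = (np.sqrt(1.0 + 4.0 * var) - 1.0) / (2.0 * var)
    geo = q * (1.0 - q) ** k
    # Poisson(var) has the same variance and is also log-concave.
    poi = poisson.pmf(k, var)
    print(f"var={var:5.1f}  H_min(geometric)={min_entropy(geo):.4f}  "
          f"H_min(Poisson)={min_entropy(poi):.4f}")
```

At each matched variance the geometric value should come out smaller, as the stated result predicts.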
• Mathematics
ArXiv
• 2021
A reversal of Lyapunov’s inequality for monotone log-concave sequences is established, settling a conjecture of Havrilla-Tkocz and Melbourne-Tkocz, and several information-theoretic inequalities are derived as consequences.
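For context, the classical Lyapunov inequality being reversed is the log-convexity of the power sums of a probability sequence $(p_k)$ (a standard formulation; the paper's reversed constants are not reproduced here):

```latex
\[
  \sum_k p_k^{b} \;\le\; \Big(\sum_k p_k^{a}\Big)^{\frac{c-b}{c-a}}
                         \Big(\sum_k p_k^{c}\Big)^{\frac{b-a}{c-a}},
  \qquad 0 < a < b < c,
\]
% i.e. a -> log(sum_k p_k^a) is convex, by Holder's inequality; the cited
% work proves a reverse form for monotone log-concave sequences.
```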
• Mathematics
Journal of Theoretical Probability
• 2022
Several Schur-convexity type results under fixed variance for weighted sums of independent gamma random variables are established and nonasymptotic bounds on their Rényi entropies are obtained.
• Mathematics
• 2022
A remarkable conjecture of Feige (2006) asserts that for any collection of $n$ independent non-negative random variables $X_1, X_2, \ldots, X_n$, each with expectation at most 1, $P(X < E[X] + 1) \ge 1/e$, where $X = X_1 + \cdots + X_n$.
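A Monte Carlo sanity check of the conjectured bound (an illustrative sketch; the choice of exponential summands is an assumption, not Feige's extremal case):

```python
import numpy as np

rng = np.random.default_rng(0)

# Check P(X < E[X] + 1) >= 1/e for X = X_1 + ... + X_n with E[X_i] <= 1.
# Illustrative choice: X_i ~ Exp(1), so E[X_i] = 1 and E[X] = n.
n, trials = 20, 200_000
X = rng.exponential(scale=1.0, size=(trials, n)).sum(axis=1)
estimate = (X < n + 1).mean()
print(f"P(X < E[X] + 1) ~ {estimate:.4f}  (conjectured lower bound 1/e ~ 0.3679)")
```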
We give an alternative proof for discrete Brunn-Minkowski type inequalities, recently obtained by Halikias, Klartag and the author. This proof also implies stronger weighted versions of these inequalities.
• Computer Science
Comb. Probab. Comput.
• 2022
Two-sided bounds are explored for concentration functions and Rényi entropies in the class of discrete log-concave probability distributions. They are used to derive certain variants of the entropy power inequality.
• Economics
ArXiv
• 2022
We utilize and extend a simple and classical mechanism, combining log-concavity and majorization in the convex order, to derive moments, concentration, and entropy inequalities for log-concave random variables.
• Mathematics
Mathematics
• 2020
We consider a probability distribution $p_0(x), p_1(x), \ldots$ depending on a real parameter $x$. The associated information potential is $S(x) := \sum_k p_k^2(x)$. The Rényi entropy and the Tsallis entropy of order 2 can be expressed in terms of $S(x)$.
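The order-2 identities behind this are short: $H_2 = -\log S$ and $T_2 = 1 - S$. A minimal sketch (the binomial family is an illustrative assumption):

```python
import numpy as np
from math import comb

def information_potential(p):
    """S = sum_k p_k^2; Renyi-2 entropy is -log(S), Tsallis-2 entropy is 1 - S."""
    S = float(np.sum(p ** 2))
    return S, -np.log(S), 1.0 - S

# Example family p_k(x): binomial(n, x) probabilities depending on x.
n, x = 10, 0.3
p = np.array([comb(n, k) * x**k * (1.0 - x) ** (n - k) for k in range(n + 1)])
S, H2, T2 = information_potential(p)
print(f"S(x)={S:.4f}  Renyi-2={H2:.4f}  Tsallis-2={T2:.4f}")
```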
• Mathematics
Studia Mathematica
• 2022
We establish concentration inequalities in the class of ultra log-concave distributions. In particular, we show that ultra log-concave distributions satisfy Poisson concentration bounds.
We establish a reversal of Lyapunov’s inequality for monotone log-concave sequences, settling a conjecture of Havrilla-Tkocz and Melbourne-Tkocz.

## References


• Computer Science
2020 IEEE International Symposium on Information Theory (ISIT)
• 2020
A discrete analog of the Rényi entropy comparison due to Bobkov and Madiman is established for log-concave variables on the integers with the additional assumption that the variable is monotone, and a sharp bound of $\log e$ is obtained.
A comprehensive framework for deriving various EPIs for the Rényi entropy is presented that uses transport arguments from normal densities and a change of variable by rotation, and a simple transportation proof of a sharp varentropy bound is obtained.
• Computer Science
2018 IEEE International Symposium on Information Theory (ISIT)
• 2018
A Rényi entropy power inequality for log-concave random vectors when Rényi parameters belong to [0, 1] is derived using a sharp version of the reverse Young inequality and a result due to Fradelizi, Madiman, and Wang.
• Computer Science
IEEE Transactions on Information Theory
• 2019
The authors derive Rényi entropy power inequalities for log-concave random vectors when Rényi parameters belong to [0, 1], and the estimates are shown to be sharp up to absolute constants.
A refinement of the Rényi entropy power inequality recently obtained in BM16 is presented, and a conjecture in BNT15 and MMX16 is confirmed in two cases, largely following the approach in DCT91 of employing Young's convolution inequalities with sharp constants.
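For reference, the sharp Young convolution inequality invoked here reads as follows (the standard Beckner-Brascamp-Lieb form, not specific to this paper):

```latex
\[
  \|f * g\|_{r} \;\le\; \big(C_p\, C_q\, C_{r'}\big)^{d}\, \|f\|_{p}\, \|g\|_{q},
  \qquad \frac{1}{p} + \frac{1}{q} = 1 + \frac{1}{r},
  \qquad C_s^{2} = \frac{s^{1/s}}{s'^{\,1/s'}},
\]
% for f, g on R^d, where s' is the Holder conjugate of s (1/s + 1/s' = 1);
% equality is attained by suitable Gaussian functions.
```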
• Mathematics, Computer Science
Lecture Notes in Mathematics
• 2020
The role of convexity in Rényi entropy power inequalities is investigated, and convergence in the central limit theorem for Rényi entropies of order r ∈ (0, 1) is established for log-concave densities and for compactly supported, spherically symmetric and unimodal densities.
• Mathematics
• 2012
We prove a quantitative dimension-free bound in the Shannon-Stam entropy inequality for the convolution of two log-concave distributions in dimension d in terms of the spectral gap of the density.
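For context, one standard form of the Shannon-Stam inequality reads as follows (the Gaussian computation in the comment is a sanity check, not the paper's quantitative bound):

```latex
\[
  h\!\left(\frac{X+Y}{\sqrt{2}}\right) \;\ge\; \frac{h(X) + h(Y)}{2}
  \qquad \text{for independent } X, Y,
\]
% Sanity check at equality: for X, Y i.i.d. N(0, sigma^2 I_d), the left side
% is h(N(0, sigma^2 I_d)) = (d/2) log(2 pi e sigma^2), matching the right
% side; the cited work bounds the deficit via the spectral gap of the density.
```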
• Computer Science
2015 IEEE International Symposium on Information Theory (ISIT)
• 2015
This paper shows the counterpart of this result for the Rényi entropy and the Tsallis entropy, and considers a notion of generalized mutual information, namely α-mutual information, which is defined through the Rényi divergence.
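One standard way to define the α-mutual information through the Rényi divergence is Sibson's (assumed here; the paper's exact convention may differ):

```latex
\[
  D_\alpha(P \,\|\, Q) = \frac{1}{\alpha - 1}\,
    \log \sum_x P(x)^{\alpha}\, Q(x)^{1-\alpha},
  \qquad
  I_\alpha(X;Y) = \min_{Q_Y} D_\alpha\!\big(P_{XY} \,\big\|\, P_X \times Q_Y\big).
\]
```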
• Mathematics
2018 IEEE International Symposium on Information Theory (ISIT)
• 2018
The authors consider the analogous reversal of recent Rényi entropy power inequalities for random vectors and again show that not only do they hold for s-concave densities, but that s-concave densities are characterized by satisfying said inequalities.