Reversal of Rényi Entropy Inequalities Under Log-Concavity
@article{Melbourne2020ReversalOR,
  title   = {Reversal of R{\'e}nyi Entropy Inequalities Under Log-Concavity},
  author  = {James Melbourne and Tomasz Tkocz},
  journal = {IEEE Transactions on Information Theory},
  year    = {2020},
  volume  = {67},
  pages   = {45-51}
}
We establish a discrete analog of the Rényi entropy comparison due to Bobkov and Madiman. For log-concave variables on the integers, the min entropy is within $\log e$ of the usual Shannon entropy. Additionally, we investigate the entropic Rogers-Shephard inequality studied by Madiman and Kontoyiannis, and establish a sharp Rényi version for certain parameters in both the continuous and discrete cases.
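As an informal numerical illustration of the comparison stated in the abstract (a sketch, not code from the paper): the geometric distribution on the nonnegative integers is log-concave, and the gap between its Shannon entropy H and its min entropy H_∞ = -log max_k p(k) stays below log e = 1 nat, approaching it as the success probability shrinks.

```python
import math

def geometric_pmf(p, kmax=10_000):
    """PMF of the geometric distribution on {0, 1, 2, ...}: P(X = k) = (1-p)^k * p."""
    return [(1 - p) ** k * p for k in range(kmax)]

def shannon_entropy(pmf):
    """Shannon entropy in nats: H = -sum_k p_k log p_k."""
    return -sum(q * math.log(q) for q in pmf if q > 0)

def min_entropy(pmf):
    """Min entropy in nats: H_inf = -log max_k p_k."""
    return -math.log(max(pmf))

# The paper's comparison bounds H - H_inf by log e = 1 nat for
# log-concave integer-valued variables; the gap here never exceeds it.
for p in (0.1, 0.5, 0.9):
    pmf = geometric_pmf(p)
    gap = shannon_entropy(pmf) - min_entropy(pmf)
    print(f"p = {p}: H - H_inf = {gap:.4f} nats (bound: 1)")
```

For p = 0.5 the gap is exactly log 2 ≈ 0.6931 nats; for p = 0.1 it is already above 0.94 nats.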
12 Citations
Entropy-variance inequalities for discrete log-concave random variables via degree of freedom
- Computer Science, Mathematics
- 2022
A discrete version of the notion of degree of freedom is utilized to prove a sharp min-entropy-variance inequality for integer-valued log-concave random variables, and it is shown that the geometric distribution minimizes the min-entropy within the class of log-concave probability sequences with fixed variance.
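The extremal claim above can be checked numerically on a small example (a hedged sketch, not from the cited paper): a geometric distribution with success probability 0.5 and a Poisson distribution with mean 2 are both log-concave with variance 2, and the geometric attains the smaller min entropy.

```python
import math

def min_entropy(pmf):
    """Min entropy in nats: -log of the largest probability."""
    return -math.log(max(pmf))

# Geometric on {0, 1, ...} with success probability p: variance (1-p)/p^2.
p = 0.5                                      # variance = 0.5 / 0.25 = 2
geo = [(1 - p) ** k * p for k in range(60)]

# Poisson with mean 2 is also log-concave on the integers and has variance 2.
lam = 2.0
poi = [math.exp(-lam) * lam ** k / math.factorial(k) for k in range(60)]

# Consistent with the claim, the geometric has the smaller min entropy.
print(f"geometric: {min_entropy(geo):.4f} nats, Poisson: {min_entropy(poi):.4f} nats")
```

Here the geometric's min entropy is log 2 ≈ 0.6931 nats, while the Poisson's largest probability is 2e^{-2}, giving 2 - log 2 ≈ 1.3069 nats.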
A discrete complement of Lyapunov's inequality and its information theoretic consequences
- Mathematics, ArXiv
- 2021
A reversal of Lyapunov's inequality for monotone log-concave sequences is established, settling a conjecture of Havrilla-Tkocz and Melbourne-Tkocz, and several information theoretic inequalities are derived as consequences.
Entropies of sums of independent gamma random variables
- Mathematics, Journal of Theoretical Probability
- 2022
Several Schur-convexity type results under fixed variance for weighted sums of independent gamma random variables are established and nonasymptotic bounds on their Rényi entropies are obtained.
On a Conjecture of Feige for Discrete Log-Concave Distributions
- Mathematics
- 2022
A remarkable conjecture of Feige (2006) asserts that for any collection of n independent non-negative random variables X_1, X_2, ..., X_n, each with expectation at most 1, P(X < E[X] + 1)…
A Remark on discrete Brunn-Minkowski type inequalities via transportation of measure
- Mathematics
- 2020
We give an alternative proof for discrete Brunn-Minkowski type inequalities, recently obtained by Halikias, Klartag and the author. This proof also implies stronger weighted versions of these…
Concentration functions and entropy bounds for discrete log-concave distributions
- Computer Science, Comb. Probab. Comput.
- 2022
Two-sided bounds are explored for concentration functions and Rényi entropies in the class of discrete log-concave probability distributions. They are used to derive certain variants of the entropy…
Moments, Concentration, and Entropy of Log-Concave Distributions
- Economics, ArXiv
- 2022
We utilize and extend a simple and classical mechanism, combining log-concavity and majorization in the convex order to derive moments, concentration, and entropy inequalities for log-concave random…
Inequalities for Information Potentials and Entropies
- Mathematics, Mathematics
- 2020
We consider a probability distribution p_0(x), p_1(x), … depending on a real parameter x. The associated information potential is S(x) := ∑_k p_k^2(x). The Rényi entropy and the Tsallis entropy of order 2 can…
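The snippet above defines the information potential S = ∑_k p_k^2; as a quick sketch (function names are mine, not the paper's), the order-2 Rényi and Tsallis entropies follow directly from it:

```python
import math

def information_potential(pmf):
    """Information potential S = sum_k p_k^2 of a discrete distribution."""
    return sum(q * q for q in pmf)

def renyi2(pmf):
    """Rényi entropy of order 2: H_2 = -log S."""
    return -math.log(information_potential(pmf))

def tsallis2(pmf):
    """Tsallis entropy of order 2: T_2 = 1 - S."""
    return 1.0 - information_potential(pmf)

pmf = [0.5, 0.25, 0.25]        # S = 0.25 + 0.0625 + 0.0625 = 0.375
print(renyi2(pmf), tsallis2(pmf))
```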
Concentration inequalities for ultra log-concave distributions
- Mathematics, Studia Mathematica
- 2022
We establish concentration inequalities in the class of ultra log-concave distributions. In particular, we show that ultra log-concave distributions satisfy Poisson concentration bounds. As an…
A discrete complement of Lyapunov's inequality and its information theoretic consequences
- Mathematics
- 2022
We establish a reversal of Lyapunov’s inequality for monotone log-concave sequences, settling a conjecture of Havrilla-Tkocz and Melbourne-Tkocz. A strengthened version of the same conjecture is…
References
Showing 1-10 of 43 references
On the Rényi Entropy of Log-Concave Sequences
- Computer Science, 2020 IEEE International Symposium on Information Theory (ISIT)
- 2020
A discrete analog of the Rényi entropy comparison due to Bobkov and Madiman is established for log-concave variables on the integers, with the additional assumption that the variable is monotone, and a sharp bound of $\log e$ is obtained.
Rényi Entropy Power Inequalities via Normal Transport and Rotation
- Computer Science, Entropy
- 2018
A comprehensive framework for deriving various EPIs for the Rényi entropy is presented that uses transport arguments from normal densities and a change of variable by rotation, and a simple transportation proof of a sharp varentropy bound is obtained.
A Renyi Entropy Power Inequality for Log-Concave Vectors and Parameters in [0, 1]
- Computer Science, 2018 IEEE International Symposium on Information Theory (ISIT)
- 2018
A Rényi entropy power inequality for log-concave random vectors when the Rényi parameters belong to [0, 1] is derived using a sharp version of the reverse Young inequality and a result due to Fradelizi, Madiman, and Wang.
On the Entropy Power Inequality for the Rényi Entropy of Order [0, 1]
- Computer Science, IEEE Transactions on Information Theory
- 2019
The authors derive Rényi entropy power inequalities for log-concave random vectors when the Rényi parameters belong to [0, 1], and the estimates are shown to be sharp up to absolute constants.
Rényi entropy power inequality and a reverse
- Computer Science, ArXiv
- 2017
A refinement of the Rényi entropy power inequality recently obtained in BM16 is presented, and a conjecture in BNT15, MMX16 is confirmed in two cases; the argument largely follows the approach in DCT91 of employing Young's convolution inequalities with sharp constants.
Further Investigations of Rényi Entropy Power Inequalities and an Entropic Characterization of s-Concave Densities
- Mathematics, Computer Science, Lecture Notes in Mathematics
- 2020
The role of convexity in Rényi entropy power inequalities is investigated, and convergence in the Central Limit Theorem for Rényi entropies of order r ∈ (0, 1) is established for log-concave densities and for compactly supported, spherically symmetric, unimodal densities.
Entropy jumps for isotropic log-concave random vectors and spectral gap
- Mathematics
- 2012
We prove a quantitative dimension-free bound in the Shannon-Stam entropy inequality for the convolution of two log-concave distributions in dimension d in terms of the spectral gap of the density.…
Convexity/concavity of Rényi entropy and α-mutual information
- Computer Science, 2015 IEEE International Symposium on Information Theory (ISIT)
- 2015
This paper shows the counterpart of this result for the Rényi entropy and the Tsallis entropy, and considers a notion of generalized mutual information, namely α-mutual information, which is defined through the Rényi divergence.
Further Investigations of the Maximum Entropy of the Sum of Two Dependent Random Variables
- Mathematics, 2018 IEEE International Symposium on Information Theory (ISIT)
- 2018
The authors consider the analogous reversal of recent Rényi entropy power inequalities for random vectors, and again show that not only do these hold for s-concave densities, but that s-concave densities are characterized by satisfying said inequalities.