Reversal of Rényi Entropy Inequalities Under Log-Concavity

@article{Melbourne2020ReversalOR,
  title={Reversal of R{\'e}nyi Entropy Inequalities Under Log-Concavity},
  author={James Melbourne and Tomasz Tkocz},
  journal={IEEE Transactions on Information Theory},
  year={2020},
  volume={67},
  pages={45-51}
}
We establish a discrete analog of the Rényi entropy comparison due to Bobkov and Madiman. For log-concave variables on the integers, the min entropy is within log e of the usual Shannon entropy. Additionally, we investigate the entropic Rogers-Shephard inequality studied by Madiman and Kontoyiannis, and establish a sharp Rényi version for certain parameters in both the continuous and discrete cases.
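
As a quick numerical illustration (ours, not from the paper), the sketch below checks the stated gap for geometric distributions, which are log-concave on the non-negative integers: the Shannon entropy H exceeds the min entropy H_inf by at most log e, i.e., 1 nat. The truncation point and the parameter grid are arbitrary choices.

    import numpy as np

    def entropies(pmf):
        # Shannon entropy H and min entropy H_inf, both in nats.
        pmf = pmf[pmf > 0]
        return -np.sum(pmf * np.log(pmf)), -np.log(pmf.max())

    for p in [0.01, 0.1, 0.5, 0.9]:
        k = np.arange(5000)             # truncation; the remaining tail mass is negligible
        pmf = p * (1 - p) ** k          # geometric on {0, 1, 2, ...}: log-concave
        H, H_inf = entropies(pmf)
        assert H - H_inf <= 1.0 + 1e-9  # log e = 1 nat
        print(f"p={p}: H={H:.4f}, H_inf={H_inf:.4f}, gap={H - H_inf:.4f}")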

Entropy-variance inequalities for discrete log-concave random variables via degree of freedom

A discrete version of the notion of degree of freedom is utilized to prove a sharp min-entropy-variance inequality for integer-valued log-concave random variables, and it is shown that the geometric distribution minimizes the min entropy within the class of log-concave probability sequences with fixed variance.
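
A minimal sketch (our own, under the statement as summarized above): at matched variance, the geometric distribution should have min entropy no larger than that of another log-concave sequence, here a discrete uniform.

    import numpy as np

    n = 20
    var = ((n + 1) ** 2 - 1) / 12.0  # variance of the uniform pmf on {0, ..., n}

    # The geometric pmf P(X = k) = p (1 - p)^k has variance (1 - p) / p^2;
    # solve var * p^2 + p - 1 = 0 for the matching parameter p.
    p = (-1 + np.sqrt(1 + 4 * var)) / (2 * var)

    H_inf_geometric = -np.log(p)   # largest geometric atom is p, at k = 0
    H_inf_uniform = np.log(n + 1)  # every uniform atom has mass 1/(n + 1)
    assert H_inf_geometric <= H_inf_uniform
    print(H_inf_geometric, H_inf_uniform)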

A discrete complement of Lyapunov's inequality and its information theoretic consequences

A reversal of Lyapunov's inequality for monotone log-concave sequences is established, settling a conjecture of Havrilla-Tkocz and Melbourne-Tkocz, and several information-theoretic inequalities are derived as consequences.
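
For context (our addition), the classical Lyapunov inequality in question expresses the log-convexity of the map $s \mapsto \sum_k p_k^s$ for a non-negative sequence $(p_k)$: for exponents $0 < s < t < u$,

\[
\Big(\sum_k p_k^{t}\Big)^{u-s} \;\le\; \Big(\sum_k p_k^{s}\Big)^{u-t} \Big(\sum_k p_k^{u}\Big)^{t-s}.
\]

The paper establishes a reversed form of this inequality, with explicit constants, for monotone log-concave sequences.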

Entropies of sums of independent gamma random variables

Several Schur-convexity type results under fixed variance for weighted sums of independent gamma random variables are established and nonasymptotic bounds on their Rényi entropies are obtained.

On a Conjecture of Feige for Discrete Log-Concave Distributions

A remarkable conjecture of Feige (2006) asserts that for any collection of n independent non-negative random variables X_1, X_2, ..., X_n, each with expectation at most 1, P(X < E[X] + 1) ≥ 1/e, where X = X_1 + ... + X_n.
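
As a concrete check of the conjectured bound (our illustration; the exponential summands and the library calls are our choices), take X_i ~ Exp(1), so that each summand has mean 1 and X = X_1 + ... + X_n ~ Gamma(n, 1):

    import numpy as np
    from scipy.stats import gamma

    for n in [1, 2, 5, 10, 100]:
        prob = gamma(a=n, scale=1.0).cdf(n + 1)  # P(X < E[X] + 1) with E[X] = n
        assert prob >= 1 / np.e                  # Feige's conjectured lower bound
        print(f"n={n}: P(X < n + 1) = {prob:.4f}  (1/e = {1 / np.e:.4f})")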

A Remark on discrete Brunn-Minkowski type inequalities via transportation of measure

We give an alternative proof of discrete Brunn-Minkowski type inequalities recently obtained by Halikias, Klartag, and the author. This proof also yields stronger weighted versions of these inequalities.

Concentration functions and entropy bounds for discrete log-concave distributions

Two-sided bounds are explored for concentration functions and Rényi entropies in the class of discrete log-concave probability distributions. They are used to derive certain variants of the entropy power inequality.
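
For reference (our addition), the Lévy concentration function that typically appears in this context is

\[
Q(X;\lambda) \;=\; \sup_{x \in \mathbb{R}} \mathbb{P}\big(x \le X \le x + \lambda\big), \qquad \lambda \ge 0,
\]

so for integer-valued X the value Q(X;0) = max_k P(X = k) is the maximal probability mass, and -log Q(X;0) is the min entropy.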

Moments, Concentration, and Entropy of Log-Concave Distributions

We utilize and extend a simple and classical mechanism, combining log-concavity and majorization in the convex order, to derive moment, concentration, and entropy inequalities for log-concave random variables.

Inequalities for Information Potentials and Entropies

We consider a probability distribution p_0(x), p_1(x), ... depending on a real parameter x. The associated information potential is S(x) := ∑_k p_k^2(x). The Rényi entropy and the Tsallis entropy of order 2 can both be expressed in terms of S(x).
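
Concretely (our addition), with S(x) = ∑_k p_k^2(x) as above, the order-2 entropies are

\[
R_2(x) \;=\; -\log S(x), \qquad T_2(x) \;=\; 1 - S(x),
\]

so bounds on the information potential translate directly into bounds on both entropies.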

Concentration inequalities for ultra log-concave distributions

We establish concentration inequalities in the class of ultra log-concave distributions. In particular, we show that ultra log-concave distributions satisfy Poisson concentration bounds.
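
In one standard formulation (our addition), an integer-valued random variable with pmf (p_k) is ultra log-concave when its pmf is log-concave relative to the Poisson weights, that is,

\[
k\,p_k^2 \;\ge\; (k+1)\,p_{k+1}\,p_{k-1} \qquad \text{for all } k \ge 1,
\]

a strictly stronger requirement than ordinary log-concavity, p_k^2 ≥ p_{k+1} p_{k-1}.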

References

On the Rényi Entropy of Log-Concave Sequences

  • J. Melbourne, T. Tkocz
  • Computer Science
    2020 IEEE International Symposium on Information Theory (ISIT)
  • 2020
A discrete analog of the Rényi entropy comparison due to Bobkov and Madiman is established for log-concave variables on the integers, with the additional assumption that the variable is monotone, and a sharp bound of log e is obtained.

Rényi Entropy Power Inequalities via Normal Transport and Rotation

A comprehensive framework for deriving various EPIs for the Rényi entropy is presented that uses transport arguments from normal densities and a change of variable by rotation, and a simple transportation proof of a sharp varentropy bound is obtained.

A Rényi Entropy Power Inequality for Log-Concave Vectors and Parameters in [0, 1]

A Rényi entropy power inequality for log-concave random vectors with Rényi parameters in [0, 1] is derived using a sharp version of the reverse Young inequality and a result due to Fradelizi, Madiman, and Wang.

On the Entropy Power Inequality for the Rényi Entropy of Order [0, 1]

The authors derive Rényi entropy power inequalities for log-concave random vectors when the Rényi parameters belong to [0, 1], and the estimates are shown to be sharp up to absolute constants.

Rényi entropy power inequality and a reverse

A refinement of the Rényi entropy power inequality recently obtained in BM16 is presented, and a conjecture in BNT15 and MMX16 is confirmed in two cases; the approach largely follows DCT91 in employing Young's convolution inequalities with sharp constants.

Further Investigations of Rényi Entropy Power Inequalities and an Entropic Characterization of s-Concave Densities

The role of convexity in Rényi entropy power inequalities is investigated, and convergence in the central limit theorem for Rényi entropies of order r ∈ (0, 1) is established for log-concave densities and for compactly supported, spherically symmetric, unimodal densities.

Entropy jumps for isotropic log-concave random vectors and spectral gap

We prove a quantitative dimension-free bound in the Shannon-Stam entropy inequality for the convolution of two log-concave distributions in dimension d, in terms of the spectral gap of the density.

Convexity/concavity of Rényi entropy and α-mutual information

  • Siu-Wai Ho, S. Verdú
  • Computer Science
    2015 IEEE International Symposium on Information Theory (ISIT)
  • 2015
This paper shows the counterpart of this result for the Rényi entropy and the Tsallis entropy, and considers a notion of generalized mutual information, namely α-mutual information, which is defined through the Rényi divergence.
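
For reference (our addition), the Rényi divergence of order α between discrete distributions P = (p_k) and Q = (q_k) is

\[
D_\alpha(P\,\|\,Q) \;=\; \frac{1}{\alpha - 1}\,\log \sum_k p_k^{\alpha}\, q_k^{1-\alpha}, \qquad \alpha \in (0,1) \cup (1,\infty).
\]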

Log-concavity and the maximum entropy property of the Poisson distribution

Further Investigations of the Maximum Entropy of the Sum of Two Dependent Random Variables

The authors consider the analogous reversal of recent Rényi entropy power inequalities for random vectors and show that not only do these inequalities hold for s-concave densities, but that s-concave densities are characterized by satisfying them.