Interpretations of Rényi Entropies And Divergences

@inproceedings{Harremoes2005InterpretationsOR,
  title={Interpretations of Rényi Entropies And Divergences},
  author={Peter Harremoës},
  year={2005}
}
Rényi divergence and majorization
TLDR
It is shown how Rényi divergence appears when the theory of majorization is generalized from the finite to the continuous setting, and how it plays a role in analyzing the number of binary questions required to guess the values of a sequence of random variables.
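For reference, majorization has a simple finite-dimensional form: P majorizes Q when, after sorting both vectors in decreasing order, every partial sum of P dominates the corresponding partial sum of Q and the totals agree. A minimal Python sketch of that check (the vectors below are illustrative, not from the paper):

    import numpy as np

    def majorizes(p, q, tol=1e-12):
        """True if p majorizes q: sorted-descending partial sums of p
        dominate those of q, and the two vectors have equal totals."""
        p = np.sort(np.asarray(p, float))[::-1]
        q = np.sort(np.asarray(q, float))[::-1]
        if abs(p.sum() - q.sum()) > tol:
            return False
        return bool(np.all(np.cumsum(p) >= np.cumsum(q) - tol))

    # The uniform distribution is majorized by every distribution on 3 points.
    print(majorizes([0.5, 0.3, 0.2], [1/3, 1/3, 1/3]))  # True
    print(majorizes([1/3, 1/3, 1/3], [0.5, 0.3, 0.2]))  # False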
Rényi Divergence and Kullback-Leibler Divergence
TLDR
The most important properties of Rényi divergence and Kullback-Leibler divergence are reviewed, including convexity, continuity, limits of σ-algebras, and the relation of the special order 0 to the Gaussian dichotomy and contiguity.
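The two quantities compared in this entry are, for discrete distributions, the Rényi divergence D_alpha(P||Q) = (1/(alpha-1)) log sum_i p_i^alpha q_i^(1-alpha) for alpha != 1, which tends to the Kullback-Leibler divergence as alpha -> 1. A minimal numerical sketch of both (the distributions are made-up examples, not taken from the paper):

    import numpy as np

    def renyi_divergence(p, q, alpha):
        """D_alpha(P||Q) = 1/(alpha-1) * log(sum p^alpha * q^(1-alpha)), alpha != 1."""
        p, q = np.asarray(p, float), np.asarray(q, float)
        return np.log(np.sum(p**alpha * q**(1.0 - alpha))) / (alpha - 1.0)

    def kl_divergence(p, q):
        """Kullback-Leibler divergence D(P||Q) = sum p * log(p/q)."""
        p, q = np.asarray(p, float), np.asarray(q, float)
        return np.sum(p * np.log(p / q))

    # Illustrative distributions.
    p = [0.6, 0.3, 0.1]
    q = [0.2, 0.5, 0.3]

    for alpha in (0.5, 0.9, 0.999, 2.0):
        print(alpha, renyi_divergence(p, q, alpha))
    print("KL:", kl_divergence(p, q))  # D_alpha approaches KL as alpha -> 1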
Some generalized information theoretic measures based upon probability distributions
  • O. Parkash, C. P. Gandhi
  • Computer Science
    2010 2nd International Conference on Education Technology and Computer
  • 2010
TLDR
Based mainly on the postulate of concavity of the entropy function, two new generalized measures of probabilistic entropy are introduced, and their important properties are investigated.
On the Rényi divergence and the joint range of relative entropies
  • I. Sason
  • Mathematics
    2015 IEEE International Symposium on Information Theory (ISIT)
  • 2015
TLDR
This paper provides a geometric interpretation of the minimal Chernoff information subject to a minimal total variation distance, and shows that every point of this convex region is attained by a triple of 2-element probability distributions.
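The Chernoff information referred to here can be written as C(P,Q) = -min over lambda in [0,1] of log sum_i p_i^lambda q_i^(1-lambda), or equivalently as max over alpha in (0,1) of (1-alpha) D_alpha(P||Q). A minimal grid-search sketch for 2-element distributions (the grid search and the distribution values are illustrative choices, not the paper's method):

    import numpy as np

    def chernoff_information(p, q, grid=10001):
        """C(P,Q) = -min_{lambda in [0,1]} log sum p^lambda * q^(1-lambda),
        computed by a dense grid search (simple, not optimized)."""
        p, q = np.asarray(p, float), np.asarray(q, float)
        lams = np.linspace(0.0, 1.0, grid)
        vals = [np.log(np.sum(p**l * q**(1.0 - l))) for l in lams]
        return -min(vals)

    # Two-element distributions, as in the convex region the paper studies.
    print(chernoff_information([0.8, 0.2], [0.3, 0.7]))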
Rényi divergence measures for commonly used univariate continuous distributions
Point Divergence Gain and Multidimensional Data Sequences Analysis
TLDR
Novel information-entropic variables derived from the Rényi entropy are introduced; they describe spatio-temporal changes between two consecutive discrete multidimensional distributions.
Entropy operates in non-linear semifields
TLDR
It is demonstrated that the Rényi entropies with parameter α are better thought of as operating in a type of non-linear semiring called a positive semifield, and it is conjectured that this is one of the reasons why tropical algebra procedures are so successful in computational intelligence applications.
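One concrete instance of the limiting behaviour behind this semifield view (stated here only as a well-known limit, not as the paper's construction): as α -> infinity, the Rényi entropy H_alpha(P) = (1/(1-alpha)) log sum_i p_i^alpha tends to the min-entropy -log max_i p_i, i.e. the sum is effectively replaced by a max, the characteristic operation of tropical algebra. A small numerical check:

    import numpy as np

    def renyi_entropy(p, alpha):
        """H_alpha(P) = 1/(1-alpha) * log(sum p^alpha), alpha != 1."""
        p = np.asarray(p, float)
        return np.log(np.sum(p**alpha)) / (1.0 - alpha)

    p = [0.5, 0.3, 0.2]  # illustrative distribution
    for alpha in (2, 10, 100, 1000):
        print(alpha, renyi_entropy(p, alpha))
    print("min-entropy:", -np.log(max(p)))  # the alpha -> infinity limit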
Computation and Estimation of Generalized Entropy Rates for Denumerable Markov Chains
TLDR
All the entropy rates of random sequences for general entropy functionals, including the classical Shannon and Rényi entropies and the more recent Tsallis and Sharma-Mittal ones, are shown to be either infinite or zero, except at a threshold where they equal the Shannon or Rényi entropy rate up to a multiplicative constant.
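For a finite irreducible chain (a simpler case than the denumerable setting treated in the paper), the Rényi entropy rate admits the classical spectral-radius formula of Rached, Alajaji and Campbell: h_alpha = (1/(1-alpha)) log rho([p_ij^alpha]). A sketch assuming that formula, with an illustrative two-state chain:

    import numpy as np

    def renyi_entropy_rate(P, alpha):
        """Renyi entropy rate of a finite irreducible Markov chain:
        h_alpha = 1/(1-alpha) * log(spectral radius of [p_ij^alpha])."""
        P = np.asarray(P, float)
        rho = np.max(np.abs(np.linalg.eigvals(P**alpha)))  # elementwise power
        return np.log(rho) / (1.0 - alpha)

    # Illustrative 2-state transition matrix.
    P = np.array([[0.9, 0.1],
                  [0.4, 0.6]])
    for alpha in (0.5, 2.0, 5.0):
        print(alpha, renyi_entropy_rate(P, alpha))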
On the Rényi Divergence, Joint Range of Relative Entropies, and a Channel Coding Theorem
  • I. Sason
  • Computer Science
    IEEE Transactions on Information Theory
  • 2016
TLDR
An exponential upper bound is derived on the performance of binary linear block codes (or code ensembles) under maximum-likelihood decoding as a function of the deviation of their distance spectra from the binomial distribution.
Families of Alpha-, Beta- and Gamma-Divergences: Flexible and Robust Measures of Similarities
TLDR
It is shown that a new wide class of Gamma-divergences can be generated not only from the family of Beta-divergences but also from the family of Alpha-divergences.
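For orientation, one common parameterization of the Alpha- and Beta-divergence families for discrete distributions (conventions vary across the literature, and the distribution values below are illustrative, not from the paper):

    import numpy as np

    def alpha_divergence(p, q, a):
        """Alpha-divergence, one common convention for normalized
        distributions: (sum p^a * q^(1-a) - 1) / (a*(a-1)), a != 0, 1."""
        p, q = np.asarray(p, float), np.asarray(q, float)
        return (np.sum(p**a * q**(1.0 - a)) - 1.0) / (a * (a - 1.0))

    def beta_divergence(p, q, b):
        """Beta-divergence (density-power convention), b != 0, 1."""
        p, q = np.asarray(p, float), np.asarray(q, float)
        return np.sum(p**b + (b - 1.0) * q**b - b * p * q**(b - 1.0)) / (b * (b - 1.0))

    p = [0.6, 0.3, 0.1]
    q = [0.2, 0.5, 0.3]
    print(alpha_divergence(p, q, 0.5))  # 4*(1 - sum sqrt(p*q)), a Hellinger-type distance
    print(beta_divergence(p, q, 2.0))   # 0.5 * squared Euclidean distance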