## 20 Citations

Rényi divergence and majorization

- Mathematics, Computer Science · 2010 IEEE International Symposium on Information Theory
- 2010

It is shown how Rényi divergence appears when the theory of majorization is generalized from the finite to the continuous setting, and plays a role in analyzing the number of binary questions required to guess the values of a sequence of random variables.

Rényi Divergence and Kullback-Leibler Divergence

- Mathematics, Computer Science · IEEE Transactions on Information Theory
- 2014

The most important properties of Rényi divergence and Kullback-Leibler divergence are reviewed, including convexity, continuity, limits of σ-algebras, and the relation of the special order 0 to the Gaussian dichotomy and contiguity.

Some generalized information theoretic measures based upon probability distributions

- Computer Science · 2010 2nd International Conference on Education Technology and Computer
- 2010

Based mainly on the postulate of concavity of the entropy function, two new generalized measures of probabilistic entropy are introduced and their properties are investigated.

On the Rényi divergence and the joint range of relative entropies

- Mathematics, Computer Science · 2015 IEEE International Symposium on Information Theory (ISIT)
- 2015

This paper provides a geometric interpretation of the minimal Chernoff information subject to a minimal total variation distance and shows that all the points of this convex region are attained by a triple of 2-element probability distributions.

Point Divergence Gain and Multidimensional Data Sequences Analysis

- Computer Science, Physics · Entropy
- 2018

Novel information-entropic variables derived from the Rényi entropy are introduced; they describe spatio-temporal changes between two consecutive discrete multidimensional distributions.

Entropy operates in non-linear semifields

- Computer Science, Mathematics · ArXiv
- 2017

It is demonstrated that the Rényi entropies with parameter α are better thought of as operating in a type of non-linear semiring called a positive semifield, and it is conjectured that this is one of the reasons why tropical-algebra procedures are so successful in computational intelligence applications.

Computation and Estimation of Generalized Entropy Rates for Denumerable Markov Chains

- Mathematics, Computer Science · IEEE Transactions on Information Theory
- 2011

The entropy rates of random sequences for general entropy functionals, including the classical Shannon and Rényi entropies and the more recent Tsallis and Sharma-Mittal ones, are shown to be either infinite or zero, except at a threshold where they equal the Shannon or Rényi entropy rates up to a multiplicative constant.

On the Rényi Divergence, Joint Range of Relative Entropies, and a Channel Coding Theorem

- Mathematics, Computer Science · IEEE Transactions on Information Theory
- 2016

An exponential upper bound is derived on the performance of binary linear block codes (or code ensembles) under maximum-likelihood decoding as a function of the deviation of their distance spectra from the binomial distribution.

Families of Alpha-, Beta-, and Gamma-Divergences: Flexible and Robust Measures of Similarities

- Computer Science, Mathematics · Entropy
- 2010

It is shown that a new wide class of Gamma-divergences can be generated not only from the family of Beta-divergences but also from a family of Alpha-divergences.

On the α-q-Mutual Information and the α-q-Capacities

- Computer Science, Medicine · Entropy
- 2021

It is shown that, unlike the previous definition, the α-q-mutual information and the α-q-capacity satisfy a set of properties stated as axioms: they reduce to zero for totally destructive channels and to the (maximal) input Sharma-Mittal entropy for perfect transmission, which is consistent with the maximum-likelihood detection error.