Rényi entropy

Known as: Rényi's divergence, Collision entropy, Rényi's entropy 
In information theory, the Rényi entropy generalizes the Hartley entropy, the Shannon entropy, the collision entropy and the min-entropy. Entropies… (Wikipedia)
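As a quick illustration of how these special cases arise, here is a minimal Python sketch (the function name and the example distribution are ours, not taken from any of the sources below) that evaluates the Rényi entropy H_α(p) = (1/(1−α)) log Σ pᵢ^α of a discrete distribution and recovers the Hartley, Shannon, collision and min entropies at α = 0, 1, 2 and ∞:

```python
import numpy as np

def renyi_entropy(p, alpha):
    """Renyi entropy H_alpha(p) of a discrete distribution p, in nats."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                      # zero-probability outcomes do not contribute
    if np.isclose(alpha, 1.0):        # alpha -> 1 limit: Shannon entropy
        return -np.sum(p * np.log(p))
    if np.isinf(alpha):               # alpha -> infinity limit: min-entropy
        return -np.log(p.max())
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

p = [0.5, 0.25, 0.125, 0.125]
print(renyi_entropy(p, 0))        # Hartley entropy: log of the support size
print(renyi_entropy(p, 1))        # Shannon entropy
print(renyi_entropy(p, 2))        # collision entropy
print(renyi_entropy(p, np.inf))   # min-entropy
```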

Papers overview

Semantic Scholar uses AI to extract papers important to this topic.
2015
The classical entropy power inequality is extended to the Rényi entropy. We also discuss the question of the existence of the…
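For context, the classical entropy power inequality that the 2015 paper above extends states that for independent continuous random vectors \(X\) and \(Y\) in \(\mathbb{R}^n\),
\[
N(X+Y) \ge N(X) + N(Y), \qquad N(X) = \frac{1}{2\pi e}\, e^{2h(X)/n},
\]
where \(h\) denotes the differential Shannon entropy; the paper replaces the Shannon entropy by a Rényi entropy.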
2014
The Rényi entropy of general order unifies the well-known Shannon entropy with several other entropy notions, like the min…
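Spelled out, the unification mentioned above: for a discrete distribution \(P\), the Rényi entropy of order \(\alpha\) is
\[
H_\alpha(P) = \frac{1}{1-\alpha}\log\sum_x P(x)^\alpha ,
\]
with \(H_0(P) = \log|\operatorname{supp} P|\) (Hartley entropy), \(\lim_{\alpha\to 1} H_\alpha(P) = -\sum_x P(x)\log P(x)\) (Shannon entropy), \(H_2(P) = -\log\sum_x P(x)^2\) (collision entropy), and \(H_\infty(P) = -\log\max_x P(x)\) (min-entropy).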
Highly Cited
2013
We show that a recent definition of relative Rényi entropy is monotone under completely positive, trace preserving maps. This…
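The "recent definition" in the 2013 snippet is presumably the sandwiched Rényi relative entropy \(\tilde D_\alpha(\rho\|\sigma) = \frac{1}{\alpha-1}\log \operatorname{Tr}\bigl[(\sigma^{\frac{1-\alpha}{2\alpha}}\,\rho\,\sigma^{\frac{1-\alpha}{2\alpha}})^\alpha\bigr]\); taking that as an assumption, here is a minimal NumPy/SciPy sketch (the function name and the toy states are ours):

```python
import numpy as np
from scipy.linalg import fractional_matrix_power as mpow

def sandwiched_renyi_divergence(rho, sigma, alpha):
    """Sandwiched Renyi relative entropy of density matrices rho and sigma
    (sigma assumed full rank, alpha in (1/2, 1) or (1, inf))."""
    s = mpow(sigma, (1.0 - alpha) / (2.0 * alpha))  # sigma^{(1-alpha)/(2*alpha)}
    inner = s @ rho @ s
    return np.log(np.trace(mpow(inner, alpha)).real) / (alpha - 1.0)

# Toy example: a qubit state against the maximally mixed state.
rho = np.array([[0.8, 0.2], [0.2, 0.2]])
sigma = np.eye(2) / 2
print(sandwiched_renyi_divergence(rho, sigma, alpha=2.0))
```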
Highly Cited
2010
We present simple and computationally efficient nonparametric estimators of Rényi entropy and mutual information based on an i.i.d.…
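To make the estimation problem concrete, here is a naive histogram plug-in estimate of a differential Rényi entropy from an i.i.d. sample; this is only an illustration, not the estimator proposed in the 2010 paper (which avoids explicit density estimation), and all names are ours:

```python
import numpy as np

def renyi_entropy_plugin(samples, alpha=2.0, bins=32):
    """Histogram plug-in estimate of the differential Renyi entropy
    h_alpha = 1/(1-alpha) * log( integral f(x)^alpha dx ) of a 1-D sample."""
    counts, edges = np.histogram(samples, bins=bins)
    widths = np.diff(edges)
    p = counts / counts.sum()     # bin probabilities (empty bins contribute zero)
    f = p / widths                # piecewise-constant density estimate
    integral = np.sum((f ** alpha) * widths)
    return np.log(integral) / (1.0 - alpha)

# Example: order-2 (collision) entropy of a standard normal sample.
print(renyi_entropy_plugin(np.random.randn(10_000), alpha=2.0))
```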
2006
Sample entropy and approximate entropy are measures that have been successfully utilized to study the deterministic dynamics of…
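Since the 2006 snippet discusses sample entropy, a compact sketch of the standard sample entropy SampEn(m, r) may help; this is the usual Richman–Moorman form rather than necessarily the exact variant studied in that paper, and the names are ours:

```python
import numpy as np

def sample_entropy(x, m=2, r=None):
    """SampEn(m, r): -log of the conditional probability that two subsequences
    matching for m points (Chebyshev distance <= r) also match for m + 1 points."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * x.std()
    n = len(x)

    def count_pairs(dim):
        # Compare the n - m templates of length `dim`, excluding self-matches.
        templates = np.array([x[i:i + dim] for i in range(n - m)])
        count = 0
        for i in range(len(templates)):
            d = np.max(np.abs(templates - templates[i]), axis=1)
            count += np.sum(d <= r) - 1  # subtract the self-match
        return count

    b = count_pairs(m)      # matches of length m
    a = count_pairs(m + 1)  # matches of length m + 1
    return -np.log(a / b)

print(sample_entropy(np.sin(np.linspace(0, 20, 500)) + 0.1 * np.random.randn(500)))
```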
Highly Cited
2004
We introduce a new entropy measure, called smooth Rényi entropy. The measure characterizes fundamental properties of a random…
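In one common formulation (an assumption about the paper's definitions rather than a quotation from it), the smooth Rényi entropies are obtained by optimizing the ordinary Rényi entropies over all distributions within statistical distance \(\varepsilon\) of the given one; in particular,
\[
H_\infty^{\varepsilon}(P) = \max_{Q:\ \delta(P,Q)\le\varepsilon}\bigl(-\log\max_x Q(x)\bigr),
\qquad
H_0^{\varepsilon}(P) = \min_{Q:\ \delta(P,Q)\le\varepsilon}\log\bigl|\{x : Q(x) > 0\}\bigr|,
\]
the smooth min- and max-entropies used in randomness extraction and data compression.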
Highly Cited
2004
In this paper, we present a new thresholding technique based on two-dimensional Rényi’s entropy. The two-dimensional Rényi’s…
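As a simplified illustration of entropic thresholding (one-dimensional only; the 2004 paper above works with a two-dimensional histogram, which is not reproduced here), a threshold can be chosen to maximize the sum of the Rényi entropies of the background and foreground gray-level distributions; all names below are ours:

```python
import numpy as np

def renyi_threshold(hist, alpha=0.5):
    """Return the gray level t maximizing H_alpha(background) + H_alpha(foreground),
    where both class distributions are renormalized slices of the image histogram.
    Requires alpha != 1."""
    p = hist / hist.sum()
    best_t, best_score = None, -np.inf
    for t in range(1, len(p)):
        w0, w1 = p[:t].sum(), p[t:].sum()
        if w0 == 0 or w1 == 0:
            continue
        p0, p1 = p[:t] / w0, p[t:] / w1          # class-conditional distributions
        h0 = np.log(np.sum(p0 ** alpha)) / (1 - alpha)
        h1 = np.log(np.sum(p1 ** alpha)) / (1 - alpha)
        if h0 + h1 > best_score:
            best_t, best_score = t, h0 + h1
    return best_t

# Example: histogram of a synthetic bimodal image.
pixels = np.concatenate([np.random.normal(60, 10, 5000),
                         np.random.normal(180, 15, 5000)]).clip(0, 255)
hist, _ = np.histogram(pixels, bins=256, range=(0, 256))
print(renyi_threshold(hist))
```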
Highly Cited
2004
The moment-entropy inequality shows that a continuous random variable with given second moment and maximal Shannon entropy must…
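For reference, the classical moment-entropy inequality the snippet alludes to: if a continuous real random variable \(X\) satisfies \(\mathbb{E}[X^2] = \sigma^2\), then its differential entropy obeys
\[
h(X) \le \tfrac{1}{2}\log\bigl(2\pi e\,\sigma^2\bigr),
\]
with equality exactly for the centered Gaussian of variance \(\sigma^2\); the paper above extends this to the Rényi entropy.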
2004
For a large class of stationary probability measures on A^N, where A is a finite alphabet, we compute the specific Rényi…
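Here the "specific" Rényi entropy is presumably the per-symbol entropy rate, i.e.
\[
h_\alpha(\mu) = \lim_{n\to\infty}\frac{1}{n}\cdot\frac{1}{1-\alpha}\,\log\!\sum_{w\in A^n}\mu([w])^{\alpha},
\]
where \(\mu([w])\) denotes the measure of the cylinder set of sequences starting with the block \(w\in A^n\).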
Highly Cited
1965
Let \(L(t) = \tfrac{1}{t}\,\log_D\bigl(\sum_i p_i D^{t n_i}\bigr)\), where \(p_i\) is the probability of the \(i\)-th input symbol to a noiseless channel, and \(n_i\) is the length…
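If \(L(t)\) above is Campbell's exponential average codeword length, his coding theorem relates its optimum over \(D\)-ary prefix codes to the Rényi entropy of order \(\alpha = 1/(1+t)\) (with logarithms taken to base \(D\)):
\[
H_\alpha(p) \;\le\; \min L(t) \;<\; H_\alpha(p) + 1 .
\]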