Corpus ID: 12697438

Sharp Bounds Between Two Rényi Entropies of Distinct Positive Orders

@article{Sakai2016SharpBB,
  title={Sharp Bounds Between Two R{\'e}nyi Entropies of Distinct Positive Orders},
  author={Yuta Sakai and Ken-ichi Iwata},
  journal={ArXiv},
  year={2016},
  volume={abs/1605.00019}
}
Many axiomatic definitions of entropy, such as the Rényi entropy, of a random variable are closely related to the $\ell_{\alpha}$-norm of its probability distribution. This study considers probability distributions on finite sets, and examines the sharp bounds of the $\ell_{\beta}$-norm with a fixed $\ell_{\alpha}$-norm, $\alpha \neq \beta$, for $n$-dimensional probability vectors with an integer $n \ge 2$. From the results, we derive the sharp bounds of the Rényi entropy of positive order…
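The relation between the Rényi entropy and the $\ell_{\alpha}$-norm that the abstract builds on can be made concrete numerically: for order $\alpha \neq 1$, $H_{\alpha}(P) = \frac{\alpha}{1-\alpha} \log \|P\|_{\alpha}$ with $\|P\|_{\alpha} = (\sum_i p_i^{\alpha})^{1/\alpha}$. A minimal sketch (function names are my own, not from the paper):

```python
import math

def l_alpha_norm(p, alpha):
    """l_alpha-norm of a probability vector p: (sum_i p_i^alpha)^(1/alpha)."""
    return sum(x ** alpha for x in p) ** (1.0 / alpha)

def renyi_entropy(p, alpha):
    """Renyi entropy of positive order alpha != 1, in nats:
    H_alpha(P) = (alpha / (1 - alpha)) * log ||P||_alpha."""
    return (alpha / (1.0 - alpha)) * math.log(l_alpha_norm(p, alpha))

# The uniform distribution on n = 4 points has H_alpha = log 4 for every
# positive order, while a deterministic distribution has H_alpha = 0 --
# the two extremes between which the paper's sharp bounds live.
uniform = [0.25] * 4
print(renyi_entropy(uniform, 2.0))   # ~ 1.386 = log 4
print(renyi_entropy(uniform, 0.5))   # ~ 1.386 = log 4
```

Sweeping `alpha` against `beta` over such vectors is one way to visualize the bounding curves the paper characterizes exactly.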
1 Citation


Sharp bounds on Arimoto's conditional Rényi entropies between two distinct orders
  • Yuta Sakai, K. Iwata
  • Computer Science
    2017 IEEE International Symposium on Information Theory (ISIT)
  • 2017
TLDR
This study examines sharp bounds on Arimoto's conditional Rényi entropy of order β when another conditional Rényi entropy of distinct order α ≠ β is fixed, and identifies the specific probability distributions that achieve these sharp bounds.
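Arimoto's conditional Rényi entropy used in this follow-up work has the closed form $H_{\alpha}(X \mid Y) = \frac{\alpha}{1-\alpha} \log \sum_y P_Y(y)\, \|P_{X\mid Y=y}\|_{\alpha}$. A hedged sketch (names are illustrative, not from the paper):

```python
import math

def arimoto_conditional_renyi(p_y, p_x_given_y, alpha):
    """Arimoto's conditional Renyi entropy of order alpha != 1, in nats:
    H_alpha(X|Y) = (alpha/(1-alpha)) * log sum_y P_Y(y) * ||P_{X|Y=y}||_alpha.
    p_y: marginal P_Y as a list; p_x_given_y: one conditional
    distribution of X per value of Y, in the same order."""
    total = sum(
        py * sum(px ** alpha for px in cond) ** (1.0 / alpha)
        for py, cond in zip(p_y, p_x_given_y)
    )
    return (alpha / (1.0 - alpha)) * math.log(total)

# Sanity check: if X is independent of Y (every conditional equals the
# marginal of X), conditioning changes nothing and H_alpha(X|Y) = H_alpha(X).
p_y = [0.3, 0.7]
p_x_given_y = [[0.5, 0.5], [0.5, 0.5]]
print(arimoto_conditional_renyi(p_y, p_x_given_y, 2.0))  # ~ 0.693 = log 2
```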

References

SHOWING 1-10 OF 34 REFERENCES
Relations between conditional Shannon entropy and expectation of ℓα-norm
  • Yuta Sakai, K. Iwata
  • Computer Science
    2016 IEEE International Symposium on Information Theory (ISIT)
  • 2016
TLDR
The paper derives the sharp bounds of the expectation of the ℓα-norm with a fixed conditional Shannon entropy, and vice versa, and applies these results to discrete memoryless channels under a uniform input distribution.
On the Conditional Rényi Entropy
TLDR
This paper reconsiders the definition for the conditional Rényi entropy of general order as proposed by Arimoto in the seventies, and shows that this particular notion satisfies several natural properties, including monotonicity under conditioning and chain rule.
Extremal relations between shannon entropy and ℓα-norm
  • Yuta Sakai, K. Iwata
  • Computer Science
    2016 International Symposium on Information Theory and Its Applications (ISITA)
  • 2016
TLDR
The paper derives the sharp bounds between the Shannon entropy and several information measures determined by the ℓα-norm, and applies these results to uniformly focusing channels.
Relations between entropy and error probability
TLDR
It is concluded that for R ≥ C the equivocation achieves its minimal value of R − C at the rate of n^(-1/2), where n is the block length.
Elements of Information Theory
TLDR
This textbook examines the role of entropy, inequalities, and randomness in the design and construction of codes.
On Measures of Entropy and Information
Inequalities between entropy and index of coincidence derived from information diagrams
TLDR
The main result of the paper is the determination of the precise range of the map P → (IC(P), H(P)), which gives rise to precise lower as well as upper bounds for the entropy function.
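The index of coincidence appearing in this entry is the collision probability IC(P) = Σᵢ pᵢ², and it determines the Rényi entropy of order 2 via H₂(P) = −log IC(P), which is how such information-diagram bounds connect back to the present paper. A small sketch (function names are my own):

```python
import math

def index_of_coincidence(p):
    """IC(P) = sum_i p_i^2: probability that two independent
    draws from P collide."""
    return sum(x * x for x in p)

def collision_entropy(p):
    """Renyi entropy of order 2, in nats: H_2(P) = -log IC(P)."""
    return -math.log(index_of_coincidence(p))

p = [0.5, 0.25, 0.25]
print(index_of_coincidence(p))  # 0.375
print(collision_entropy(p))     # ~ 0.981 = log(8/3)
```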
Extremality for Gallager’s Reliability Function $E_{0}$
  • Mine Alsan
  • Computer Science
    IEEE Transactions on Information Theory
  • 2015
TLDR
The results characterize the extremality of the E_0(ρ) curves of the binary erasure channel and the binary symmetric channel among all E_0(ρ) curves that can be generated by the class of binary discrete memoryless channels whose E_0(ρ) curves pass through a given point.
Lower Bounds to Error Probability for Coding on Discrete Memoryless Channels. I
TLDR
The paper is presented in two parts: the first, appearing here, summarizes the major results and treats the case of high transmission rates in detail; the second, to appear in the subsequent issue, treats the cases of low transmission rates.