# On the α-q-Mutual Information and the α-q-Capacities

@article{Ilic2021OnT,
title={On the $\alpha$-q-Mutual Information and the $\alpha$-q-Capacities},
author={Velimir M. Ilic and Ivan B. Djordjevic},
journal={Entropy},
year={2021},
volume={23}
}
• Published 1 June 2021
• Computer Science
• Entropy
The measures of information transfer corresponding to non-additive entropies have been studied intensively in recent decades. The majority of the work concerns measures belonging to the Sharma–Mittal entropy class, such as the Rényi, the Tsallis, the Landsberg–Vedral and the Gaussian entropies. All of these considerations follow the same approach, mimicking some of the various and mutually equivalent definitions of the Shannon information measures, and the information transfer is quantified by…
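As background for the entropy families named in the abstract, here is a minimal sketch (pure Python, with an illustrative distribution) of the Rényi and Tsallis entropies; both recover the Shannon entropy in the limit as their order parameter tends to 1:

```python
import math

def shannon_entropy(p):
    """Shannon entropy in nats: H(p) = -sum_i p_i * log(p_i)."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def renyi_entropy(p, alpha):
    """Renyi entropy of order alpha != 1: H_alpha(p) = log(sum_i p_i^alpha) / (1 - alpha)."""
    return math.log(sum(pi ** alpha for pi in p)) / (1.0 - alpha)

def tsallis_entropy(p, q):
    """Tsallis entropy of order q != 1: S_q(p) = (1 - sum_i p_i^q) / (q - 1)."""
    return (1.0 - sum(pi ** q for pi in p)) / (q - 1.0)

# Illustrative distribution (not from the paper):
p = [0.5, 0.25, 0.25]

# Both families approach the Shannon entropy as the order tends to 1:
print(shannon_entropy(p))          # ~1.0397 nats
print(renyi_entropy(p, 1.0001))    # ~1.0397
print(tsallis_entropy(p, 1.0001))  # ~1.0397
```

The three functions agree at order 1 in the limit, which is why the Shannon measures appear as the boundary case of the Sharma–Mittal class discussed in the abstract.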
## 1 Citation

The Statistical Foundations of Entropy
• Physics
Entropy
• 2021
During the last few decades, the notion of entropy has become omnipresent in many scientific disciplines, ranging from traditional applications in statistical physics and chemistry, information…

## References

SHOWING 1-10 OF 61 REFERENCES
On the Daróczy-Tsallis capacities of discrete channels
• Computer Science
• 2015
New expressions for the Daróczy capacities of the weakly symmetric channel, the binary erasure channel and the Z-channel are provided, extending the previous work by Daróczy, who introduced a new parameterized generalization of the Shannon entropy.
Conditional Rényi Divergence Saddlepoint and the Maximization of α-Mutual Information
• Computer Science
Entropy
• 2019
This paper extends the major analytical results on the saddle point and saddle level of the conditional relative entropy to the conditional Rényi divergence, and lends further evidence to the notion that a Bayesian measure of statistical distinctness introduced by R. Sibson in 1969 is the most natural generalization of the mutual information.
Correlation detection and an operational interpretation of the Rényi mutual information
• Computer Science
2015 IEEE International Symposium on Information Theory (ISIT)
• 2015
This work shows that the Rényi mutual information attains operational significance in the context of composite hypothesis testing, when the null hypothesis is a fixed bipartite state and the alternate hypothesis consists of all product states that share one marginal with the null hypothesis.
Rényi entropy measure of noise-aided information transmission in a binary channel.
• Computer Science
Physical review. E, Statistical, nonlinear, and soft matter physics
• 2010
The results demonstrate that stochastic resonance occurs in the information channel and is registered by the Rényi entropy measures at any finite order, including the Shannon order.
Computation and Estimation of Generalized Entropy Rates for Denumerable Markov Chains
• Mathematics, Computer Science
IEEE Transactions on Information Theory
• 2011
All of the entropy rates of random sequences for general entropy functionals, including the classical Shannon and Rényi entropies and the more recent Tsallis and Sharma–Mittal ones, are shown to be either infinite or zero, except at a threshold where they are equal to the Shannon or Rényi entropy rates up to a multiplicative constant.
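As background for the summary above, the classical Shannon entropy rate that the generalized rates collapse to can be sketched for a finite-state Markov chain (hypothetical transition matrix, not from the paper): the rate is the stationary average of the row entropies.

```python
import math

def stationary_distribution(P, iters=10000):
    """Power-iterate a row-stochastic matrix to its stationary distribution pi = pi * P."""
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

def shannon_entropy_rate(P):
    """Shannon entropy rate h = sum_i pi_i * H(P_i.), where P_i. is the i-th row."""
    pi = stationary_distribution(P)
    return sum(
        pi[i] * -sum(p * math.log(p) for p in row if p > 0)
        for i, row in enumerate(P)
    )

# Illustrative two-state chain:
P = [[0.9, 0.1],
     [0.5, 0.5]]
print(shannon_entropy_rate(P))  # ~0.3864 nats per step
```

For denumerable (countably infinite) chains, the cited result shows that the generalized rates of Rényi, Tsallis or Sharma–Mittal type degenerate to zero or infinity away from a threshold order; the finite-state computation above is the unproblematic base case.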
On the Conditional Rényi Entropy
• Computer Science
IEEE Transactions on Information Theory
• 2014
This paper reconsiders the definition of the conditional Rényi entropy of general order as proposed by Arimoto in the 1970s, and shows that this particular notion satisfies several natural properties, including monotonicity under conditioning and the chain rule.
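Arimoto's conditional Rényi entropy, the notion discussed in the entry above, is H_α(X|Y) = α/(1−α) · log Σ_y P(y)·(Σ_x P(x|y)^α)^{1/α}. A short sketch with illustrative numbers (the channel below is hypothetical), checking the monotonicity-under-conditioning property the summary mentions:

```python
import math

def renyi_entropy(p, alpha):
    """Renyi entropy of order alpha != 1."""
    return math.log(sum(pi ** alpha for pi in p)) / (1.0 - alpha)

def arimoto_conditional_renyi(p_y, p_x_given_y, alpha):
    """Arimoto's conditional Renyi entropy of order alpha != 1:
    H_alpha(X|Y) = alpha/(1-alpha) * log sum_y P(y) * (sum_x P(x|y)^alpha)^(1/alpha)
    """
    s = sum(
        py * sum(pxy ** alpha for pxy in cond) ** (1.0 / alpha)
        for py, cond in zip(p_y, p_x_given_y)
    )
    return (alpha / (1.0 - alpha)) * math.log(s)

# Illustrative setup: binary X observed through a noisy channel Y.
p_y = [0.5, 0.5]
p_x_given_y = [[0.9, 0.1], [0.2, 0.8]]
p_x = [sum(py * cond[i] for py, cond in zip(p_y, p_x_given_y)) for i in range(2)]

alpha = 2.0
# Monotonicity under conditioning: H_alpha(X|Y) <= H_alpha(X).
assert arimoto_conditional_renyi(p_y, p_x_given_y, alpha) <= renyi_entropy(p_x, alpha)
```

The assertion reflects the property established in the cited paper: conditioning on side information cannot increase the (Arimoto) Rényi uncertainty about X.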
The Ziv–Zakai–Rényi Bound for Joint Source-Channel Coding
• Computer Science
IEEE Transactions on Information Theory
• 2015
Bounds are derived on the Ziv–Zakai–Rényi rate-distortion function and capacity for a broad class of sources and additive noise channels; these hold for arbitrary SNR and prove the conjectured asymptotic expressions in the limit of small distortion/high SNR.
Multiplicativity of Completely Bounded p-Norms Implies a Strong Converse for Entanglement-Assisted Capacity
• Computer Science
ArXiv
• 2013
The proof here demonstrates the extent to which the Arimoto approach can be helpful in proving strong converse theorems, provides an operational relevance for the multiplicativity result of Devetak et al., and adds to the growing body of evidence that the sandwiched Rényi relative entropy is the correct quantum generalization of the classical concept for all α > 1.
A Possible Extension of Shannon's Information Theory
It is shown that establishing the concept of mutual information is important for the generalization of Shannon's information theory, which exhibits nonadditivity of the associated uncertainty.
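The nonadditivity mentioned in the entry above is the defining feature of the Tsallis family: for independent systems A and B, the entropies combine by the pseudo-additivity rule S_q(A,B) = S_q(A) + S_q(B) + (1−q)·S_q(A)·S_q(B) rather than by plain addition. A short numerical check (illustrative distributions):

```python
def tsallis_entropy(p, q):
    """Tsallis entropy of order q != 1: S_q(p) = (1 - sum_i p_i^q) / (q - 1)."""
    return (1.0 - sum(pi ** q for pi in p)) / (q - 1.0)

# Two independent distributions (illustrative numbers):
pA = [0.7, 0.3]
pB = [0.6, 0.4]
joint = [a * b for a in pA for b in pB]  # product distribution of (A, B)

q = 2.0
lhs = tsallis_entropy(joint, q)
rhs = (tsallis_entropy(pA, q) + tsallis_entropy(pB, q)
       + (1 - q) * tsallis_entropy(pA, q) * tsallis_entropy(pB, q))
assert abs(lhs - rhs) < 1e-12  # pseudo-additivity holds exactly
```

The extra cross term (1−q)·S_q(A)·S_q(B) vanishes at q = 1, recovering the ordinary additivity of the Shannon entropy.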