# On the α-q-Mutual Information and the α-q-Capacities

@article{Ilic2021OnT, title={On the $\alpha$-q-Mutual Information and the $\alpha$-q-Capacities}, author={Velimir M. Ilic and Ivan B. Djordjevic}, journal={Entropy}, year={2021}, volume={23} }

The measures of information transfer corresponding to non-additive entropies have been intensively studied in recent decades. Most of this work concerns measures belonging to the Sharma–Mittal entropy class, such as the Rényi, the Tsallis, the Landsberg–Vedral and the Gaussian entropies. All of these considerations follow the same approach, mimicking some of the various and mutually equivalent definitions of Shannon information measures, and the information transfer is quantified by…
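As a concrete illustration of the entropy family named above, the following is a minimal sketch (not taken from the paper) of the Rényi, Tsallis and Sharma–Mittal entropies of a discrete distribution; the function names and the natural-log convention are assumptions made here for illustration:

```python
import math

def renyi_entropy(p, alpha):
    """Rényi entropy: H_a(p) = log(sum_i p_i^a) / (1 - a), with the
    Shannon entropy recovered in the limit a -> 1."""
    p = [x for x in p if x > 0]
    if abs(alpha - 1.0) < 1e-12:
        return -sum(x * math.log(x) for x in p)  # Shannon limit
    return math.log(sum(x ** alpha for x in p)) / (1.0 - alpha)

def tsallis_entropy(p, q):
    """Tsallis entropy: S_q(p) = (1 - sum_i p_i^q) / (q - 1)."""
    p = [x for x in p if x > 0]
    if abs(q - 1.0) < 1e-12:
        return -sum(x * math.log(x) for x in p)  # Shannon limit
    return (1.0 - sum(x ** q for x in p)) / (q - 1.0)

def sharma_mittal_entropy(p, alpha, beta):
    """Sharma–Mittal entropy:
    H_{a,b}(p) = ((sum_i p_i^a)^((1-b)/(1-a)) - 1) / (1 - b),
    which reduces to the Tsallis entropy for b = a and to the
    Rényi entropy in the limit b -> 1 (both a, b != 1 here)."""
    p = [x for x in p if x > 0]
    s = sum(x ** alpha for x in p)
    return (s ** ((1.0 - beta) / (1.0 - alpha)) - 1.0) / (1.0 - beta)
```

For a uniform distribution on two outcomes, `renyi_entropy([0.5, 0.5], 2.0)` equals `log 2` (the Rényi entropy of a uniform distribution is `log n` for every order), while `sharma_mittal_entropy` with `beta == alpha` coincides with `tsallis_entropy` of order `alpha`.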

## One Citation

The Statistical Foundations of Entropy

- Physics, Entropy
- 2021

During the last few decades, the notion of entropy has become omnipresent in many scientific disciplines, ranging from traditional applications in statistical physics and chemistry, information…

## References


On the Daróczy-Tsallis capacities of discrete channels

- Computer Science
- 2015

New expressions for the Daróczy capacities of the weakly symmetric channel, the binary erasure channel and the Z-channel are provided, extending the previous work by Daróczy, who introduced a new parameterized generalization of the Shannon entropy.

Conditional Rényi Divergence Saddlepoint and the Maximization of α-Mutual Information

- Computer Science, Entropy
- 2019

This paper extends the major analytical results on the saddle-point and saddle-level of the conditional relative entropy to the conditional Rényi divergence and lends further evidence to the notion that a Bayesian measure of statistical distinctness introduced by R. Sibson in 1969 is the most natural generalization.

Correlation detection and an operational interpretation of the Rényi mutual information

- Computer Science, 2015 IEEE International Symposium on Information Theory (ISIT)
- 2015

This work shows that the Rényi mutual information attains operational significance in the context of composite hypothesis testing, where the null hypothesis is a fixed bipartite state and the alternative hypothesis consists of all product states that share one marginal with the null hypothesis.

Rényi entropy measure of noise-aided information transmission in a binary channel.

- Computer Science, Physical Review E: Statistical, Nonlinear, and Soft Matter Physics
- 2010

The results demonstrate that stochastic resonance occurs in the information channel and is registered by the Rényi entropy measures at any finite order, including the Shannon order.

Computation and Estimation of Generalized Entropy Rates for Denumerable Markov Chains

- Mathematics, Computer Science, IEEE Transactions on Information Theory
- 2011

All entropy rates of random sequences for general entropy functionals, including the classical Shannon and Rényi entropies and the more recent Tsallis and Sharma–Mittal ones, are shown to be either infinite or zero, except at a threshold where they equal the Shannon or Rényi entropy rates up to a multiplicative constant.

On the Conditional Rényi Entropy

- Computer Science, IEEE Transactions on Information Theory
- 2014

This paper reconsiders the definition of the conditional Rényi entropy of general order proposed by Arimoto in the seventies, and shows that this particular notion satisfies several natural properties, including monotonicity under conditioning and a chain rule.
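For reference, Arimoto's conditional Rényi entropy of order $\alpha > 0$, $\alpha \neq 1$, can be written (in the commonly used form; the superscript $A$ is a labeling convention assumed here) as

$$
H_{\alpha}^{A}(X \mid Y) \;=\; \frac{\alpha}{1-\alpha} \,\log \sum_{y} P_Y(y) \left( \sum_{x} P_{X \mid Y}(x \mid y)^{\alpha} \right)^{1/\alpha},
$$

which recovers the Shannon conditional entropy $H(X \mid Y)$ in the limit $\alpha \to 1$.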

The Ziv–Zakai–Rényi Bound for Joint Source-Channel Coding

- Computer Science, IEEE Transactions on Information Theory
- 2015

Bounds are derived on the Ziv–Zakai–Rényi rate-distortion function and capacity for a broad class of sources and additive noise channels; these bounds hold for arbitrary SNR and prove the conjectured asymptotic expressions in the limit of small distortion/high SNR.

Tsallis entropy measure of noise-aided information transmission in a binary channel

- Computer Science
- 2011

Multiplicativity of Completely Bounded p-Norms Implies a Strong Converse for Entanglement-Assisted Capacity

- Computer Science, arXiv
- 2013

The proof here demonstrates the extent to which the Arimoto approach can be helpful in proving strong converse theorems; it provides an operational relevance for the multiplicativity result of Devetak et al. and adds to the growing body of evidence that the sandwiched Rényi relative entropy is the correct quantum generalization of the classical concept for all α > 1.

A Possible Extension of Shannon's Information Theory

- Computer Science, Entropy
- 2001

It is shown that establishing the concept of mutual information is important for the generalization of Shannon's information theory, which exhibits nonadditivity of the associated uncertainty.