On the α-q-Mutual Information and the α-q-Capacities

Velimir M. Ilic and Ivan B. Djordjevic
The measures of information transfer which correspond to non-additive entropies have been intensively studied in recent decades. The majority of the work considers measures belonging to the Sharma–Mittal entropy class, such as the Rényi, Tsallis, Landsberg–Vedral, and Gaussian entropies. All of these considerations follow the same approach, mimicking one of the various and mutually equivalent definitions of the Shannon information measures, and the information transfer is quantified by…
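For reference, the entropies named in the abstract are all special cases of the two-parameter Sharma–Mittal family. A standard statement of these definitions (quoted from the general literature, not from the paper itself) is:

```latex
% Sharma–Mittal entropy of a distribution P = (p_1, ..., p_n):
\[
  H_{\alpha,\beta}(P)
  = \frac{1}{1-\beta}
    \left[ \left( \sum_i p_i^{\alpha} \right)^{\frac{1-\beta}{1-\alpha}} - 1 \right],
  \qquad \alpha > 0,\ \alpha \neq 1 .
\]
% Rényi entropy: the limit \beta \to 1,
\[
  H_{\alpha}(P) = \frac{1}{1-\alpha} \ln \sum_i p_i^{\alpha} .
\]
% Tsallis entropy: the case \beta = \alpha,
\[
  S_{\alpha}(P) = \frac{1}{1-\alpha} \left( \sum_i p_i^{\alpha} - 1 \right) .
\]
```

The Shannon entropy is recovered from any of these in the limit $\alpha \to 1$.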


The Statistical Foundations of Entropy
During the last few decades, the notion of entropy has become omnipresent in many scientific disciplines, ranging from traditional applications in statistical physics and chemistry to information…


On the Daróczy-Tsallis capacities of discrete channels
New expressions for the Daróczy capacities of the weakly symmetric channel, the binary erasure channel, and the Z-channel are provided, extending the previous work by Daróczy, who introduced a new parameterized generalization of the Shannon entropy.
Conditional Rényi Divergence Saddlepoint and the Maximization of α-Mutual Information
This paper extends the major analytical results on the saddle-point and saddle-level of the conditional relative entropy to the conditional Rényi divergence and lends further evidence to the notion that a Bayesian measure of statistical distinctness introduced by R. Sibson in 1969 is the most natural generalization.
Correlation detection and an operational interpretation of the Rényi mutual information
This work shows that the Rényi mutual information attains operational significance in the context of composite hypothesis testing, when the null hypothesis is a fixed bipartite state and the alternative hypothesis consists of all product states that share one marginal with the null hypothesis.
Rényi entropy measure of noise-aided information transmission in a binary channel.
The results demonstrate that stochastic resonance occurs in the information channel and is registered by the Rényi entropy measures at any finite order, including the Shannon order.
Computation and Estimation of Generalized Entropy Rates for Denumerable Markov Chains
The entropy rates of random sequences for general entropy functionals, including the classical Shannon and Rényi entropies and the more recent Tsallis and Sharma–Mittal ones, are shown to be either infinite or zero, except at a threshold where they equal the Shannon or Rényi entropy rates up to a multiplicative constant.
On the Conditional Rényi Entropy
This paper reconsiders the definition for the conditional Rényi entropy of general order as proposed by Arimoto in the seventies, and shows that this particular notion satisfies several natural properties, including monotonicity under conditioning and chain rule.
The Ziv–Zakai–Rényi Bound for Joint Source-Channel Coding
Bounds are derived on the Ziv–Zakai–Rényi rate-distortion function and capacity for a broad class of sources and additive noise channels; these hold for arbitrary SNR and prove the conjectured asymptotic expressions in the limit of small distortion/high SNR.
Multiplicativity of Completely Bounded p-Norms Implies a Strong Converse for Entanglement-Assisted Capacity
The proof demonstrates the extent to which the Arimoto approach can be helpful in proving strong converse theorems, provides an operational relevance for the multiplicativity result of Devetak et al., and adds to the growing body of evidence that the sandwiched Rényi relative entropy is the correct quantum generalization of the classical concept for all α > 1.
A Possible Extension of Shannon's Information Theory
It is shown that establishing the concept of mutual information is important for generalizing Shannon's information theory to settings that exhibit nonadditivity of the associated uncertainty.