Tsallis entropy, a one-parameter generalization of Shannon entropy, has often been discussed in statistical physics as a new information measure. This new information measure has provided many satisfactory physical interpretations in nonextensive systems exhibiting chaos or fractal structure. We present the generalized Shannon-Khinchin axioms for nonextensive systems …
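For reference, the entropy and the pseudo-additivity relation behind these abstracts can be stated as follows (standard textbook forms with the Boltzmann constant set to 1; the precise axiom set used in the cited paper is not reproduced here):

```latex
% Tsallis entropy (Boltzmann constant set to 1) and its q -> 1 limit
S_q(p_1,\dots,p_n) \;=\; \frac{1-\sum_{i=1}^{n} p_i^{\,q}}{q-1},
\qquad
\lim_{q\to 1} S_q \;=\; -\sum_{i=1}^{n} p_i \ln p_i \;=\; S_1 .

% Pseudo-additivity for independent subsystems A and B, replacing the
% ordinary additivity of the Shannon--Khinchin axioms:
S_q(A \times B) \;=\; S_q(A) + S_q(B) + (1-q)\, S_q(A)\, S_q(B).
```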
To theoretically explain the ubiquitous power-law behavior in nature, such as chaos and fractals, Tsallis entropy has been successfully applied to the generalization of the traditional Boltzmann-Gibbs statistics, whose fundamental information measure is Shannon entropy. Tsallis entropy S_q is a one-parameter generalization …
In discrete systems, Shannon entropy is well known to be characterized by the Shannon-Khinchin axioms. Recently, this set of axioms was generalized for Tsallis entropy, a one-parameter generalization of Shannon entropy. In continuous systems, Shannon differential entropy has been introduced as a natural extension of the above Shannon entropy without using an …
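For concreteness, here is a minimal sketch of the differential entropy mentioned above, checked against the known closed form for a Gaussian, h(X) = (1/2) ln(2πeσ²); the numerical-integration setup (SciPy, the chosen grid and σ) is an illustration, not taken from the paper:

```python
import numpy as np
from scipy.stats import norm
from scipy.integrate import quad

# Differential entropy h(X) = -∫ f(x) ln f(x) dx for a Gaussian N(0, sigma^2),
# compared with the known closed form (1/2) ln(2 * pi * e * sigma^2).
sigma = 2.0

def integrand(x):
    f = norm.pdf(x, loc=0.0, scale=sigma)
    return -f * np.log(f)

h_numeric, _ = quad(integrand, -12 * sigma, 12 * sigma)
h_closed = 0.5 * np.log(2.0 * np.pi * np.e * sigma**2)

print(h_numeric, h_closed)  # the two values agree to numerical precision
```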
Based on the κ-deformed functions (κ-exponential and κ-logarithm) and the associated multiplication operation (κ-product) introduced by Kaniadakis (Phys. Rev. E 66 (2002) 056125), we present another one-parameter generalization of Gauss' law of error. The likelihood function in Gauss' law of error is generalized by means of the κ-product. This κ-generalized …
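A minimal sketch of the κ-deformed functions and the κ-product the abstract refers to, using the standard Kaniadakis definitions; the function names and the NumPy setting are illustrative, not taken from the paper:

```python
import numpy as np

KAPPA = 0.3  # deformation parameter; kappa -> 0 recovers exp, log and the ordinary product

def exp_kappa(x, k=KAPPA):
    # kappa-exponential: (sqrt(1 + k^2 x^2) + k x)^(1/k)
    return (np.sqrt(1.0 + k**2 * x**2) + k * x) ** (1.0 / k)

def log_kappa(x, k=KAPPA):
    # kappa-logarithm (inverse of exp_kappa): (x^k - x^(-k)) / (2k)
    return (x**k - x**(-k)) / (2.0 * k)

def kappa_product(x, y, k=KAPPA):
    # kappa-product, defined so that log_kappa(x ⊗_k y) = log_kappa(x) + log_kappa(y)
    return exp_kappa(log_kappa(x, k) + log_kappa(y, k), k)

# Sanity checks: inverse relation, and the ordinary product recovered as kappa -> 0.
print(log_kappa(exp_kappa(1.7)))        # ≈ 1.7
print(kappa_product(2.0, 3.0, k=1e-6))  # ≈ 6.0
```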
For a unified description of power-law behaviors such as chaos, fractals and scale-free networks, Tsallis entropy has been applied to the generalization of the traditional Boltzmann-Gibbs statistics as a fundamental information measure. Tsallis entropy S_q is a one-parameter generalization of Shannon entropy S_1 in the sense that …
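A minimal numerical illustration of the sense in which S_q generalizes S_1: for a fixed distribution, S_q approaches the Shannon entropy as q → 1 (the code and the example distribution are illustrative only):

```python
import numpy as np

def tsallis_entropy(p, q):
    # S_q = (1 - sum_i p_i^q) / (q - 1), with the Boltzmann constant set to 1
    p = np.asarray(p, dtype=float)
    return (1.0 - np.sum(p**q)) / (q - 1.0)

def shannon_entropy(p):
    # S_1 = -sum_i p_i ln p_i (natural log)
    p = np.asarray(p, dtype=float)
    return -np.sum(p * np.log(p))

p = [0.5, 0.3, 0.2]
for q in (2.0, 1.5, 1.1, 1.01, 1.001):
    print(q, tsallis_entropy(p, q))
print("q -> 1 limit:", shannon_entropy(p))  # the values above converge to this
```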
  • H. Suyari (2007)
We prove that the generalized Shannon additivity determines a lower bound on the average description length of the q-generalized Z3-ary code tree. To clarify our main result, it is first shown that the original Shannon additivity determines a lower bound on the average code length of a Z3-ary code tree. As a generalization of this, we present our main result …
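In the classical case (q = 1) the statement reduces to the familiar source-coding bound: for any uniquely decodable D-ary code, the average code length is at least the Shannon entropy in base D. The sketch below checks that classical bound only; the paper's q-generalized, "Z3-ary" setting is not reproduced, and D = 3 and the example code are assumptions made for illustration:

```python
import numpy as np

# Source symbols with probabilities, and codeword lengths of a ternary (D = 3)
# prefix code for them; the code (0, 1, 20, 21 over {0, 1, 2}) is illustrative.
p = np.array([0.5, 0.25, 0.15, 0.10])
lengths = np.array([1, 1, 2, 2])

D = 3
kraft_sum = np.sum(D ** (-lengths.astype(float)))
assert kraft_sum <= 1.0 + 1e-12     # Kraft inequality: the code is realizable

avg_length = np.sum(p * lengths)                     # average code length
entropy_base_D = -np.sum(p * np.log(p)) / np.log(D)  # Shannon entropy in base D

print(avg_length, entropy_base_D)   # avg_length >= entropy_base_D always holds
```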
The quantum mutual entropy was introduced by one of the present authors in 1983 as a quantum extension of the Shannon mutual information. It has been used for several studies, such as quantum information transmission in optical communication and quantum irreversible processes. In this paper, a nonlinear channel for a quantum teleportation process is …
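Ohya's quantum mutual entropy is defined through a compound-state construction that is not reproduced here. As a simpler, related illustration of the kind of quantity involved, the sketch below computes the standard quantum mutual information I(A:B) = S(ρ_A) + S(ρ_B) − S(ρ_AB) for a two-qubit Bell state; the choice of state and all function names are assumptions for illustration, not the construction used in the paper:

```python
import numpy as np

def von_neumann_entropy(rho):
    # S(rho) = -Tr(rho log rho), computed from the eigenvalues (natural log)
    eigvals = np.linalg.eigvalsh(rho)
    eigvals = eigvals[eigvals > 1e-12]
    return float(-np.sum(eigvals * np.log(eigvals)))

def partial_trace(rho, keep):
    # Partial trace of a two-qubit density matrix; keep = 0 keeps qubit A, keep = 1 keeps B.
    rho4 = rho.reshape(2, 2, 2, 2)
    return np.trace(rho4, axis1=1, axis2=3) if keep == 0 else np.trace(rho4, axis1=0, axis2=2)

# Bell state |Φ+> = (|00> + |11>) / sqrt(2)
psi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2.0)
rho_ab = np.outer(psi, psi.conj())

rho_a = partial_trace(rho_ab, keep=0)
rho_b = partial_trace(rho_ab, keep=1)

mutual_info = (von_neumann_entropy(rho_a) + von_neumann_entropy(rho_b)
               - von_neumann_entropy(rho_ab))
print(mutual_info)  # 2 ln 2 for a maximally entangled pure state
```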
The generalized binomial distribution in Tsallis statistics (power-law system) is explicitly formulated from the precise q-Stirling's formula. The α-divergence (or q-divergence) is uniquely derived from the generalized binomial distribution in the sense that when α → −1 (i.e., q → 1) it recovers the KL divergence obtained from the standard binomial …
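A minimal numerical illustration of the limit mentioned in the abstract, using Amari's convention for the α-divergence, D_α(p‖r) = 4/(1−α²) · (1 − Σ_i p_i^{(1−α)/2} r_i^{(1+α)/2}), which recovers the KL divergence as α → −1; the convention and the example distributions are assumptions, and the paper's q-divergence may be parametrized differently:

```python
import numpy as np

def alpha_divergence(p, r, alpha):
    # Amari alpha-divergence for alpha != +/- 1
    p, r = np.asarray(p, float), np.asarray(r, float)
    return 4.0 / (1.0 - alpha**2) * (1.0 - np.sum(p**((1 - alpha) / 2) * r**((1 + alpha) / 2)))

def kl_divergence(p, r):
    # Kullback-Leibler divergence D(p || r) = sum_i p_i ln(p_i / r_i)
    p, r = np.asarray(p, float), np.asarray(r, float)
    return np.sum(p * np.log(p / r))

p = [0.6, 0.3, 0.1]
r = [0.4, 0.4, 0.2]
for alpha in (-0.5, -0.9, -0.99, -0.999):
    print(alpha, alpha_divergence(p, r, alpha))
print("alpha -> -1 limit:", kl_divergence(p, r))  # the values above converge to this
```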