Tsallis entropy, a one-parameter generalization of Shannon entropy, has often been discussed in statistical physics as a new information measure. This information measure has provided many satisfactory physical interpretations for nonextensive systems exhibiting chaos or fractal structure. We present the generalized Shannon-Khinchin axioms for nonextensive systems…
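As a quick numerical illustration (my addition, not part of the abstract), the following sketch computes the Tsallis entropy S_q = (1 - Σ p_i^q)/(q - 1) of a discrete distribution and checks that it converges to the Shannon entropy S_1 = -Σ p_i ln p_i as q → 1; the function name is hypothetical.

```python
import numpy as np

def tsallis_entropy(p, q):
    """Tsallis entropy S_q = (1 - sum(p_i**q)) / (q - 1); Shannon entropy at q = 1."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                       # zero-probability outcomes contribute nothing
    if np.isclose(q, 1.0):
        return -np.sum(p * np.log(p))  # the q -> 1 (Shannon) limit
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

p = [0.5, 0.25, 0.125, 0.125]
for q in (0.5, 0.99, 1.0, 1.01, 2.0):
    print(f"q = {q:5}:  S_q = {tsallis_entropy(p, q):.6f}")
# As q -> 1, S_q approaches the Shannon value -sum(p ln p) ≈ 1.213008 nats.
```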
In order to theoretically explain the ubiquitous power-law behavior in nature, such as chaos and fractals, Tsallis entropy has been successfully applied to generalizing the traditional Boltzmann-Gibbs statistics, whose fundamental information measure is Shannon entropy. Tsallis entropy S_q is a one-parameter generalization…
In discrete systems, Shannon entropy is well known to be characterized by the Shannon-Khinchin axioms. Recently, this set of axioms was generalized for Tsallis entropy, a one-parameter generalization of Shannon entropy. In continuous systems, Shannon differential entropy has been introduced as a natural extension of the above Shannon entropy without using an…
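To make the continuous case concrete, here is a small self-check (my illustration, not the paper's axiomatic construction): the Shannon differential entropy of a Gaussian N(0, σ²) is h = ½ ln(2πeσ²), and a plain Riemann-sum estimate of -∫ f ln f dx reproduces it.

```python
import numpy as np

sigma = 1.5
x = np.linspace(-12 * sigma, 12 * sigma, 200_001)
f = np.exp(-x**2 / (2 * sigma**2)) / np.sqrt(2 * np.pi * sigma**2)

# Riemann-sum estimate of the differential entropy h = -int f ln f dx
dx = x[1] - x[0]
h_numeric = -np.sum(f * np.log(f)) * dx

h_exact = 0.5 * np.log(2 * np.pi * np.e * sigma**2)
print(h_numeric, h_exact)   # both ≈ 1.8244 nats for sigma = 1.5
```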
Based on the κ-deformed functions (the κ-exponential and κ-logarithm) and the associated multiplication operation (the κ-product) introduced by Kaniadakis (Phys. Rev. E 66 (2002) 056125), we present another one-parameter generalization of Gauss' law of error. The likelihood function in Gauss' law of error is generalized by means of the κ-product. This κ-generalized…
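For reference, here is a minimal sketch of the Kaniadakis deformations it builds on, using the standard definitions ln_κ(x) = (x^κ - x^(-κ))/(2κ), exp_κ(x) = (κx + √(1 + κ²x²))^(1/κ), and the κ-product x ⊗_κ y = exp_κ(ln_κ x + ln_κ y); the code is my own illustration, not the paper's derivation.

```python
import numpy as np

def kappa_log(x, k):
    """Kaniadakis kappa-logarithm; reduces to ln(x) as kappa -> 0."""
    return np.log(x) if k == 0 else (x**k - x**(-k)) / (2 * k)

def kappa_exp(x, k):
    """Kaniadakis kappa-exponential, the inverse function of kappa_log."""
    return np.exp(x) if k == 0 else (k * x + np.sqrt(1 + (k * x)**2)) ** (1 / k)

def kappa_product(x, y, k):
    """kappa-product: exp_k(ln_k(x) + ln_k(y)); the ordinary product at kappa = 0."""
    return kappa_exp(kappa_log(x, k) + kappa_log(y, k), k)

k = 0.3
print(kappa_exp(kappa_log(2.0, k), k))   # ≈ 2.0 (exp_k and ln_k are inverses)
print(kappa_product(2.0, 3.0, 1e-8))     # ≈ 6.0 (kappa -> 0 recovers x * y)
```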
The storage capacity of the perceptron with binary weights w_i ∈ {0, 1} is derived by introducing the minimum distance d between input patterns. The approach presented in this paper is based on results from information theory, and the obtained storage capacity 0.585 is in good agreement with the well-known value 0.59 from the replica method in…
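The toy simulation below (my sketch, not the paper's information-theoretic derivation) illustrates what storage capacity means operationally for binary weights: it brute-forces whether p random ±1 patterns with random target labels can all be realized by some w ∈ {0,1}^n, and the success probability collapses once p/n passes a critical ratio. The crossover in this small-n run need not match the 0.585 of the abstract, which is derived under a minimum-distance condition.

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(0)
n = 10
# All 2**n candidate binary weight vectors, enumerated once.
W = np.array(list(product((0, 1), repeat=n)))   # shape (2**n, n)

def separable(X, y):
    """True if some w in {0,1}^n satisfies sign(X @ w) == y for every pattern."""
    return bool(np.any(np.all(np.sign(X @ W.T) == y[:, None], axis=0)))

trials = 50
for p in range(2, 13):
    ok = sum(separable(rng.choice((-1, 1), size=(p, n)),
                       rng.choice((-1, 1), size=p)) for _ in range(trials))
    print(f"p = {p:2d} patterns: stored in {ok}/{trials} random trials")
# The storage probability falls from ~1 toward 0 as p/n grows past the capacity.
```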
For a unified description of power-law behaviors such as chaos, fractals and scale-free networks, Tsallis entropy has been applied to the generalization of the traditional Boltzmann-Gibbs statistics as a fundamental information measure. Tsallis entropy S_q is a one-parameter generalization of Shannon entropy S_1 in the sense that…
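One concrete way to see the "one-parameter generalization" (my illustration, not from the abstract): with the q-logarithm ln_q(x) = (x^(1-q) - 1)/(1 - q), which tends to ln x as q → 1, Tsallis entropy can be written S_q = Σ p_i ln_q(1/p_i), exactly mirroring Shannon's S_1 = Σ p_i ln(1/p_i).

```python
import numpy as np

def q_log(x, q):
    """q-logarithm: (x**(1-q) - 1) / (1 - q); tends to ln(x) as q -> 1."""
    return np.log(x) if np.isclose(q, 1.0) else (x**(1 - q) - 1) / (1 - q)

def tsallis_via_qlog(p, q):
    """S_q = sum_i p_i * ln_q(1/p_i), mirroring Shannon's S_1 = sum p ln(1/p)."""
    p = np.asarray(p, dtype=float)
    return np.sum(p * q_log(1.0 / p, q))

p = [0.5, 0.3, 0.2]
print(tsallis_via_qlog(p, 2.0))   # equals (1 - sum(p**2)) / (2 - 1) = 0.62
print(tsallis_via_qlog(p, 1.0))   # Shannon entropy in nats ≈ 1.0297
```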
H. Suyari (2007)
We prove that the generalized Shannon additivity determines a lower bound on the average description length for the q-generalized Z3-ary code tree. To clarify our main result, we first show that the original Shannon additivity determines a lower bound on the average code length of a Z3-ary code tree. As its generalization, we present our main result…
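For orientation, the classical statement being generalized is the noiseless coding bound; the sketch below (my addition, reading the Z3-ary tree as a ternary code tree, which is an assumption) checks that a concrete ternary prefix code satisfies the Kraft inequality and that its average length is bounded below by H(p)/ln 3.

```python
import numpy as np

def shannon_bound_3ary(p):
    """Lower bound H(p)/ln(3) on the average length of a 3-ary prefix code."""
    p = np.asarray(p, dtype=float)
    return -np.sum(p * np.log(p)) / np.log(3)

# A concrete ternary prefix code over five symbols, with codeword lengths below
# (e.g. codewords 0, 1, 20, 21, 22 on the ternary tree):
p       = np.array([0.4, 0.3, 0.1, 0.1, 0.1])
lengths = np.array([1,   1,   2,   2,   2])

print("Kraft sum :", np.sum(3.0 ** -lengths))   # = 1.0, so the tree is full
print("avg length:", np.dot(p, lengths))        # = 1.3
print("bound     :", shannon_bound_3ary(p))     # ≈ 1.291 <= 1.3, bound holds
```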
The quantum mutual entropy was introduced by one of the present authors in 1983 as a quantum extension of the Shannon mutual information. It has been used in several studies, such as quantum information transmission in optical communication and quantum irreversible processes. In this paper, a nonlinear channel for a quantum teleportation process is…
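As a small numerical companion (my sketch), the code below computes the standard quantum mutual information I(A:B) = S(ρ_A) + S(ρ_B) - S(ρ_AB) with the von Neumann entropy for a Bell pair; note this is a related but not identical quantity to Ohya's compound-state mutual entropy used in the paper.

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho ln rho), computed from the eigenvalues (in nats)."""
    w = np.linalg.eigvalsh(rho)
    w = w[w > 1e-12]
    return -np.sum(w * np.log(w))

# Bell state |phi+> = (|00> + |11>) / sqrt(2)
psi = np.zeros(4)
psi[0] = psi[3] = 1 / np.sqrt(2)
rho_ab = np.outer(psi, psi)

# Partial traces of the 2-qubit state: reshape to indices (a, b, a', b')
r = rho_ab.reshape(2, 2, 2, 2)
rho_a = np.trace(r, axis1=1, axis2=3)   # trace out subsystem B
rho_b = np.trace(r, axis1=0, axis2=2)   # trace out subsystem A

I = (von_neumann_entropy(rho_a) + von_neumann_entropy(rho_b)
     - von_neumann_entropy(rho_ab))
print(I, 2 * np.log(2))   # both ≈ 1.3863: maximal correlation for a Bell pair
```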
One promising approach to deriving a non-Gaussian distribution is to generalize the log-likelihood function in Gauss' law of error. In this contribution, it is shown that a generalization of the log-likelihood function in Gauss' law of error is equivalent to a generalization of the average. The proof is given for the case of the two-parameter…
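To illustrate the "generalized average" idea (my sketch under stated assumptions, not the paper's two-parameter proof): replacing ln by the q-logarithm in the likelihood Σ ln f(x_i - θ), with f Gaussian, the stationarity condition becomes Σ (x_i - θ) f(x_i - θ)^(1-q) = 0, i.e. θ is a self-consistent weighted average with weights w_i = f(x_i - θ)^(1-q); at q = 1 the weights are constant and θ is the ordinary arithmetic mean.

```python
import numpy as np

def generalized_mean(x, q, sigma=1.0, iters=200):
    """Fixed point of sum_i (x_i - theta) * f(x_i - theta)**(1-q) = 0 for Gaussian f:
    theta = sum(w_i x_i) / sum(w_i) with w_i = exp(-(1-q)(x_i-theta)^2 / (2 sigma^2)).
    (The Gaussian normalization constant cancels in the ratio, so it is dropped.)
    """
    x = np.asarray(x, dtype=float)
    theta = x.mean()                   # start from the arithmetic mean
    for _ in range(iters):
        w = np.exp(-(1 - q) * (x - theta) ** 2 / (2 * sigma ** 2))
        theta = np.sum(w * x) / np.sum(w)
    return theta

x = np.array([0.9, 1.1, 1.0, 0.8, 1.2, 8.0])   # one gross outlier
print(generalized_mean(x, q=1.0))   # = arithmetic mean ≈ 2.1667
print(generalized_mean(x, q=0.5))   # outlier down-weighted, result near 1.0
```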