Hiroki Suyari

Tsallis entropy, a one-parameter generalization of Shannon entropy, has often been discussed in statistical physics as a new information measure. This new information measure has provided many satisfactory physical interpretations in nonextensive systems exhibiting chaos or fractals. We present the generalized Shannon-Khinchin axioms for nonextensive systems…
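
For reference, the Tsallis entropy these abstracts refer to has the standard form (a textbook definition, not quoted from the abstract; Boltzmann's constant set to 1):

    S_q \;=\; \frac{1 - \sum_{i=1}^{n} p_i^{\,q}}{q - 1},
    \qquad
    \lim_{q \to 1} S_q \;=\; -\sum_{i=1}^{n} p_i \ln p_i \;=\; S_1,

so Shannon entropy is recovered in the limit q → 1.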
In order to theoretically explain the ubiquitous existence of power-law behavior in nature, such as chaos and fractals, Tsallis entropy has been successfully applied to generalize the traditional Boltzmann-Gibbs statistics, whose fundamental information measure is Shannon entropy. Tsallis entropy S_q is a one-parameter generalization…
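
The tie to power laws can be made concrete with the q-exponential e_q(x) = [1 + (1 − q)x]_+^{1/(1−q)}, the deformed exponential of Tsallis statistics, whose tail decays polynomially rather than exponentially for q > 1. A minimal Python comparison (our sketch, with q = 1.5 chosen only for illustration):

    import math

    def exp_q(x, q):
        # q-exponential: [1 + (1-q)x]_+^(1/(1-q)); reduces to exp(x) as q -> 1
        base = 1.0 + (1.0 - q) * x
        return base ** (1.0 / (1.0 - q)) if base > 0.0 else 0.0

    # For q = 1.5, exp_q(-x) = (1 + x/2)^(-2): a power-law tail,
    # in contrast to the exponential decay of exp(-x).
    for x in (1.0, 10.0, 100.0, 1000.0):
        print(x, exp_q(-x, 1.5), math.exp(-x))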
Based on the κ-deformed functions (κ-exponential and κ-logarithm) and the associated multiplication operation (κ-product) introduced by Kaniadakis (Phys. Rev. E 66 (2002) 056125), we present another one-parameter generalization of Gauss’ law of error. The likelihood function in Gauss’ law of error is generalized by means of the κ-product. This κ-generalized…
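
A minimal Python sketch of these κ-deformed functions, using the definitions from the cited Kaniadakis paper (ln_κ(x) = (x^κ − x^{−κ})/(2κ) and its inverse exp_κ); the final numeric check, that ln_κ turns the κ-product into ordinary addition, is our illustration:

    import math

    KAPPA = 0.3  # deformation parameter; kappa -> 0 recovers ln and exp

    def ln_kappa(x, k=KAPPA):
        # kappa-logarithm: (x^k - x^(-k)) / (2k), defined for x > 0
        return (x**k - x**(-k)) / (2.0 * k)

    def exp_kappa(x, k=KAPPA):
        # kappa-exponential, the inverse function of ln_kappa
        return (math.sqrt(1.0 + (k * x)**2) + k * x) ** (1.0 / k)

    def kappa_product(x, y, k=KAPPA):
        # kappa-product: ln_kappa maps it to ordinary addition by construction
        return exp_kappa(ln_kappa(x, k) + ln_kappa(y, k), k)

    x, y = 2.0, 5.0
    print(ln_kappa(kappa_product(x, y)))   # matches the next line numerically
    print(ln_kappa(x) + ln_kappa(y))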
The storage capacity of the perceptron with binary weights w_i ∈ {0, 1} is derived by introducing the minimum distance d between input patterns. The approach presented in this paper is based on results from information theory, and the obtained storage capacity of 0.585 is in good agreement with the well-known value of 0.59 from the replica method in…
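
The paper's information-theoretic derivation is not reproduced here, but a brute-force toy in the same spirit can illustrate what storage capacity measures: how the fraction of realizable random pattern sets drops as the load α = P/N grows. The ±1 pattern distribution and strict-margin criterion below are our assumptions, and results at N = 10 are far from the thermodynamic limit, so no agreement with 0.585 is implied:

    import itertools, random

    N = 10        # input dimension, small enough to enumerate all 2^N weight vectors
    TRIALS = 10   # random pattern sets tried at each load alpha = P/N

    def realizable(patterns, labels):
        # True if some binary weight vector w in {0,1}^N stores every pattern
        # with a strict margin: label * (w . x) > 0 for all pairs
        for w in itertools.product((0, 1), repeat=N):
            if all(y * sum(wi * xi for wi, xi in zip(w, x)) > 0
                   for x, y in zip(patterns, labels)):
                return True
        return False

    random.seed(0)
    for P in range(2, 2 * N, 2):
        stored = sum(
            realizable([[random.choice((-1, 1)) for _ in range(N)] for _ in range(P)],
                       [random.choice((-1, 1)) for _ in range(P)])
            for _ in range(TRIALS))
        print(f"alpha = {P / N:.1f}: stored {stored}/{TRIALS} pattern sets")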
  • H. Suyari
  • 2007 IEEE International Symposium on Information…
  • 2007
We prove that the generalized Shannon additivity determines a lower bound on the average description length for the q-generalized 3-ary code tree. To clarify our main result, we first show that the original Shannon additivity determines a lower bound on the average code length of a 3-ary code tree. As its generalization, we present our main result…
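
For the classical (q = 1) half of this statement, the lower bound on the average length of a 3-ary prefix code is the base-3 Shannon entropy. A small Python illustration (ours), with an optimal 3-ary Huffman code supplying the achievable side:

    import math, heapq

    def ternary_entropy_bound(p):
        # Shannon lower bound for a 3-ary code: H(p) in base-3 units
        return -sum(pi * math.log(pi, 3) for pi in p if pi > 0)

    def ternary_huffman_avg_length(p):
        # Optimal 3-ary Huffman code; pad with zero-weight dummy symbols so
        # every merge combines exactly 3 nodes (the symbol count must be odd).
        heap = list(p)
        while len(heap) % 2 == 0:
            heap.append(0.0)
        heapq.heapify(heap)
        avg_len = 0.0
        while len(heap) > 1:
            w = sum(heapq.heappop(heap) for _ in range(3))
            avg_len += w   # each merge pushes its subtree one level deeper
            heapq.heappush(heap, w)
        return avg_len

    p = [0.4, 0.2, 0.15, 0.1, 0.08, 0.07]
    print(ternary_entropy_bound(p))       # ~1.449 trits
    print(ternary_huffman_avg_length(p))  # 1.55 trits: above the bound, as required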
For a unified description of power-law behaviors such as chaos, fractals and scale-free networks, Tsallis entropy has been applied, as a fundamental information measure, to the generalization of the traditional Boltzmann-Gibbs statistics. Tsallis entropy S_q is a one-parameter generalization of Shannon entropy S_1 in the sense that…
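
The sense can be made explicit via the q-logarithm ln_q(x) = (x^{1−q} − 1)/(1 − q): S_q = −Σ_i p_i^q ln_q(p_i), and ln_q → ln as q → 1, so S_q → S_1. A quick numeric check (our sketch):

    import math

    def ln_q(x, q):
        # q-logarithm: (x^(1-q) - 1) / (1-q); tends to ln(x) as q -> 1
        if abs(q - 1.0) < 1e-12:
            return math.log(x)
        return (x ** (1.0 - q) - 1.0) / (1.0 - q)

    def tsallis_entropy(p, q):
        # S_q = -sum_i p_i^q ln_q(p_i), the Tsallis entropy
        return -sum(pi ** q * ln_q(pi, q) for pi in p if pi > 0)

    p = [0.5, 0.3, 0.2]
    for q in (2.0, 1.5, 1.1, 1.001):
        print(q, tsallis_entropy(p, q))
    print("S_1 =", -sum(pi * math.log(pi) for pi in p))  # the q -> 1 limit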
In discrete systems, Shannon entropy is well known to be characterized by the Shannon-Khinchin axioms. Recently, this set of axioms was generalized for Tsallis entropy, a one-parameter generalization of Shannon entropy. In continuous systems, Shannon differential entropy has been introduced as a natural extension of the above Shannon entropy without using an…
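
For reference, Shannon differential entropy is h(f) = −∫ f(x) ln f(x) dx. A minimal numeric sketch (ours) checks this against the known closed form ½ ln(2πeσ²) for a Gaussian density:

    import math

    def gaussian_pdf(x, sigma):
        return math.exp(-x * x / (2.0 * sigma * sigma)) / (sigma * math.sqrt(2.0 * math.pi))

    def differential_entropy(sigma, lo=-20.0, hi=20.0, n=100_000):
        # h(f) = -integral of f ln f, approximated by the midpoint rule
        dx = (hi - lo) / n
        h = 0.0
        for i in range(n):
            f = gaussian_pdf(lo + (i + 0.5) * dx, sigma)
            if f > 0.0:
                h -= f * math.log(f) * dx
        return h

    sigma = 2.0
    print(differential_entropy(sigma))                        # numeric
    print(0.5 * math.log(2.0 * math.pi * math.e * sigma**2))  # closed form, ~2.112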
As is well known in statistical mechanics, Ludwig Boltzmann clarified the concept of entropy. He considered a macroscopic system consisting of a large number of particles. Each particle is assumed to be in one of the energy levels E_i (i = 1, …, k), and the number of particles in the energy level E_i is denoted by n_i. The total number…
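
The counting this refers to is the multiplicity W = N!/(n_1! ⋯ n_k!), with Boltzmann entropy S = k_B ln W; Stirling's formula turns ln W into N times the Shannon entropy of the occupation frequencies n_i/N. A small check (our sketch, with k_B = 1):

    import math

    def multiplicity(ns):
        # W = N! / (n_1! ... n_k!): microstates with occupation numbers n_i
        W = math.factorial(sum(ns))
        for n in ns:
            W //= math.factorial(n)   # each partial quotient is an exact integer
        return W

    ns = [50, 30, 20]                 # particles in energy levels E_1, E_2, E_3
    N = sum(ns)
    S_exact = math.log(multiplicity(ns))
    S_stirling = -N * sum((n / N) * math.log(n / N) for n in ns)
    print(S_exact, S_stirling)        # close for large N, by Stirling's formula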
The generalized binomial distribution in Tsallis statistics (a power-law system) is explicitly formulated from the precise q-Stirling’s formula. The α-divergence (or q-divergence) is uniquely derived from the generalized binomial distribution, in the sense that as α → −1 (i.e., q → 1) it recovers the KL divergence obtained from the standard binomial…
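
The q → 1 end of this correspondence is classical: for the standard binomial distribution, −(1/n) ln[C(n, k) p^k (1 − p)^{n−k}] tends to the KL divergence D(r ∥ p) between Bernoulli distributions with r = k/n, again by Stirling's formula. A numeric check (ours):

    import math

    def kl_bernoulli(r, p):
        # KL divergence D(r || p) between Bernoulli(r) and Bernoulli(p)
        return r * math.log(r / p) + (1.0 - r) * math.log((1.0 - r) / (1.0 - p))

    n, p, r = 2000, 0.3, 0.45
    k = round(r * n)
    log_prob = (math.lgamma(n + 1) - math.lgamma(k + 1) - math.lgamma(n - k + 1)
                + k * math.log(p) + (n - k) * math.log(1.0 - p))
    print(-log_prob / n)          # ~D(r || p) for large n
    print(kl_bernoulli(k / n, p))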
One promising approach to deriving a non-Gaussian distribution is to generalize the log-likelihood function in Gauss' law of error. In this contribution, it is shown that a generalization of the log-likelihood function in Gauss' law of error is equivalent to a generalization of the average. The proof is given for the case of the two-parameter…
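
In the classical case, Gauss' law of error ties the Gaussian to the arithmetic average: the location parameter that maximizes the Gaussian log-likelihood is exactly the sample mean. A grid-search sketch (ours; the paper's two-parameter generalization of the logarithm and of the average is not reproduced here):

    import math

    def gauss_log_likelihood(theta, xs, sigma=1.0):
        # log-likelihood of location theta under i.i.d. Gaussian errors
        return sum(-0.5 * ((x - theta) / sigma) ** 2
                   - math.log(sigma * math.sqrt(2.0 * math.pi)) for x in xs)

    xs = [1.2, -0.4, 2.5, 0.9, 1.7]
    grid = [i / 1000.0 for i in range(-1000, 3001)]
    theta_hat = max(grid, key=lambda t: gauss_log_likelihood(t, xs))
    print(theta_hat)            # 1.18
    print(sum(xs) / len(xs))    # the arithmetic mean: 1.18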