Beyond the Shannon-Khinchin Formulation: The Composability Axiom and the Universal Group Entropy

@article{Tempesta2014BeyondTS,
  title={Beyond the Shannon-Khinchin Formulation: The Composability Axiom and the Universal Group Entropy},
  author={Piergiulio Tempesta},
  journal={arXiv: Mathematical Physics},
  year={2014}
}
  • P. Tempesta
  • Published 14 July 2014
  • Computer Science
  • arXiv: Mathematical Physics

Formal groups and Z-entropies

  • P. Tempesta
  • Computer Science
    Proceedings of the Royal Society A: Mathematical, Physical and Engineering Sciences
  • 2016
It is proved that the celebrated Rényi entropy is the first example of a new family of infinitely many multi-parametric entropies called the Z-entropies, which arise as new entropic functions possessing the mathematical properties necessary for information-theoretical applications, in both classical and quantum contexts.
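As a concrete illustration of the entropy named above (not taken from the paper itself; the function name is ours), the Rényi entropy of a discrete distribution is $H_\alpha(p) = \frac{1}{1-\alpha}\log\sum_i p_i^\alpha$, recovering Shannon entropy as $\alpha \to 1$:

```python
import math

def renyi_entropy(p, alpha):
    """Renyi entropy H_alpha(p) = log(sum_i p_i**alpha) / (1 - alpha).

    The Shannon entropy is the alpha -> 1 limit; here alpha == 1 is
    handled by computing the Shannon entropy directly.
    """
    if alpha == 1:
        return -sum(pi * math.log(pi) for pi in p if pi > 0)
    return math.log(sum(pi ** alpha for pi in p)) / (1 - alpha)

# For the uniform distribution over n outcomes, H_alpha = log(n)
# for every order alpha -- a quick sanity check.
p_uniform = [0.25] * 4
```

A useful property visible in this sketch: on uniform distributions all orders $\alpha$ agree, so the family only separates distributions that are non-uniform.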

Group Entropies: From Phase Space Geometry to Entropy Functionals via Group Theory

The group-theoretic entropies make use of formal group theory to replace the additivity axiom with a more general composability axiom, and it is explained why group entropies may be particularly relevant from an information-theoretical perspective.
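To make the axiom swap concrete, here is the standard statement of composability from the group-entropy literature (a summary, not a quotation from this abstract):

```latex
% Composability: for two statistically independent subsystems A and B,
% the entropy of the composed system depends only on S(A) and S(B):
\[
  S(A \times B) = \Phi\bigl(S(A),\, S(B)\bigr),
\]
% where $\Phi(x,y)$ is a formal group law, i.e.
% $\Phi(x,0) = x$ (composing with a trivial system changes nothing),
% $\Phi(x,y) = \Phi(y,x)$ (symmetry), and
% $\Phi\bigl(\Phi(x,y),z\bigr) = \Phi\bigl(x,\Phi(y,z)\bigr)$ (associativity).
% Additivity is the special case $\Phi(x,y) = x + y$; the Tsallis family
% instead composes via $\Phi(x,y) = x + y + (1-q)\,x y$.
```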

A theorem on the existence of trace-form generalized entropies

  • P. Tempesta
  • Computer Science
    Proceedings of the Royal Society A: Mathematical, Physical and Engineering Sciences
  • 2015
An analytic technique is proposed which allows one to generate many new examples of entropic functionals generalizing the standard Boltzmann–Gibbs entropy. Our approach is based on the existence of a

Uniqueness and characterization theorems for generalized entropies

It is proved that, under mild regularity assumptions, the only composable generalized entropy in trace form is the one-parameter Tsallis family (which contains Boltzmann–Gibbs as a particular case); this motivates the use of generalized entropies that are not of trace form in the study of complex systems.
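The Tsallis family singled out above can be checked numerically. The sketch below (our own illustration; the function name is not from the paper) computes $S_q(p) = \frac{1 - \sum_i p_i^q}{q-1}$ and verifies its composition law, the pseudo-additivity $S_q(A \times B) = S_q(A) + S_q(B) + (1-q)\,S_q(A)\,S_q(B)$ for independent subsystems:

```python
import math

def tsallis_entropy(p, q):
    """Tsallis entropy S_q(p) = (1 - sum_i p_i**q) / (q - 1).

    The Boltzmann-Gibbs/Shannon entropy is recovered as q -> 1;
    q == 1 is handled by computing it directly.
    """
    if q == 1:
        return -sum(pi * math.log(pi) for pi in p if pi > 0)
    return (1 - sum(pi ** q for pi in p)) / (q - 1)

# Pseudo-additivity for independent subsystems A and B:
#   S_q(A x B) = S_q(A) + S_q(B) + (1 - q) * S_q(A) * S_q(B)
pA = [0.7, 0.3]
pB = [0.5, 0.25, 0.25]
q = 1.5
joint = [a * b for a in pA for b in pB]  # product distribution
lhs = tsallis_entropy(joint, q)
rhs = (tsallis_entropy(pA, q) + tsallis_entropy(pB, q)
       + (1 - q) * tsallis_entropy(pA, q) * tsallis_entropy(pB, q))
```

The extra $(1-q)\,S_q(A)\,S_q(B)$ term is exactly the departure from additivity that the composability axiom accommodates.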

A System of Billiard and Its Application to Information-Theoretic Entropy

The Ihara entropy is a weakly decomposable entropy whose composition law is given by the Lazard formal group law and fulfils the generalized Shannon-Khinchin axioms.

Beyond Boltzmann–Gibbs–Shannon in Physics and Elsewhere

The present review focuses on nonadditive entropies generalizing Boltzmann–Gibbs statistical mechanics and their predictions, verifications, and applications in physics and elsewhere.

Change the coefficients of conditional entropies in extensivity

The impossibility of replacing the coefficients with a non-power function of the probabilities of the events $X=n$ is proved, and the difference between the value of a general functional at the joint law of $(X,Y)$ and its value at the law of $X$ is estimated.

A family of generalized quantum entropies: definition and properties

A quantum version of the generalized entropies introduced by Salicrú et al is presented, and it is shown that majorization plays a key role in explaining most of their common features.

On the equivalence between four versions of thermostatistics based on strongly pseudo-additive entropies

This paper establishes the equivalence between four different thermostatistics formalisms based on Rényi and strongly pseudo-additive (SPA) entropies coupled with linear and escort constraints, and provides the transformation formulas, yielding a general framework applicable to the wide class of entropies and constraints previously discussed in the literature.

References

SHOWING 1-10 OF 61 REFERENCES

Group entropies, correlation laws, and zeta functions.

  • P. Tempesta
  • Computer Science
    Physical review. E, Statistical, nonlinear, and soft matter physics
  • 2011
The notion of group entropy enables the unification and generalization of many different definitions of entropy known in the literature, such as those of Boltzmann–Gibbs, Tsallis, Abe, and Kaniadakis, and generalizations of the Kullback–Leibler divergence are proposed.

A comprehensive classification of complex statistical systems and an axiomatic derivation of their entropy and distribution functions

Motivated by the hope that the thermodynamical framework might be extended to strongly interacting statistical systems —complex systems in particular— a number of generalized entropies has been

Generalized entropies and logarithms and their duality relations

It is shown that this duality fixes a unique escort probability, which allows us to derive a complete theory of the generalized logarithms that naturally arise from the violation of this axiom, and to show how the functional forms of these generalized logarithms are related to the asymptotic scaling behavior of the entropy.

A theorem on the existence of trace-form generalized entropies

  • P. Tempesta
  • Computer Science
    Proceedings of the Royal Society A: Mathematical, Physical and Engineering Sciences
  • 2015
An analytic technique is proposed which allows one to generate many new examples of entropic functionals generalizing the standard Boltzmann–Gibbs entropy. Our approach is based on the existence of a

Maximum entropy principle and power-law tailed distributions

The question of whether, and how, it is possible to select generalized statistical theories in which the above-mentioned twofold link between entropy and the distribution function continues to hold, as in ordinary statistical mechanics, is reconsidered.

Black hole thermodynamical entropy

As early as 1902, Gibbs pointed out that systems whose partition function diverges, e.g. gravitation, lie outside the validity of the Boltzmann–Gibbs (BG) theory. Consistently, since the pioneering

How multiplicity determines entropy and the derivation of the maximum entropy principle for complex systems

This paper proves that a MEP indeed exists for complex systems, derives the generalized entropy, finds that it belongs to the class of the recently proposed (c,d)-entropies, and shows that path-dependent random processes with memory naturally require specific generalized entropies.

Generalized entropy optimized by a given arbitrary distribution

An ultimate generalization of the maximum entropy principle is presented. An entropic measure, which is optimized by a given arbitrary distribution with the finite linear expectation value of a

Lambert function and a new non-extensive form of entropy

We propose a new way of defining the entropy of a system, which gives a general form that is non-extensive like the Tsallis entropy, but is linearly dependent on component entropies, like the Rényi entropy,

Statistical mechanics in the context of special relativity. II.

  • Giorgio Kaniadakis
  • Physics
    Physical review. E, Statistical, nonlinear, and soft matter physics
  • 2005
It is shown that the Lorentz transformations also impose a proper one-parameter generalization of the classical Boltzmann–Gibbs–Shannon entropy, allowing a coherent and self-consistent relativistic statistical theory to be constructed that preserves the main features of the ordinary statistical theory, which is recovered in the classical limit.
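The one-parameter generalization referred to here is the Kaniadakis $\kappa$-entropy, built from the $\kappa$-deformed logarithm $\ln_\kappa(x) = \frac{x^\kappa - x^{-\kappa}}{2\kappa}$. A minimal sketch (our own illustration; the function names are not from the paper) showing that the ordinary Shannon entropy reappears as $\kappa \to 0$:

```python
import math

def kappa_log(x, kappa):
    """Kaniadakis kappa-logarithm: (x**k - x**(-k)) / (2k).

    Reduces to the ordinary natural logarithm as kappa -> 0;
    kappa == 0 is handled directly.
    """
    if kappa == 0:
        return math.log(x)
    return (x ** kappa - x ** (-kappa)) / (2 * kappa)

def kaniadakis_entropy(p, kappa):
    """Kaniadakis entropy S_kappa(p) = -sum_i p_i * ln_kappa(p_i)."""
    return -sum(pi * kappa_log(pi, kappa) for pi in p if pi > 0)
```

For a fair coin, `kaniadakis_entropy([0.5, 0.5], kappa)` approaches $\ln 2$ as `kappa` shrinks toward zero, matching the classical limit described in the abstract.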
...