# The Homological Nature of Entropy

```bibtex
@article{Baudot2015TheHN,
  title={The Homological Nature of Entropy},
  author={Pierre Baudot and Daniel Bennequin},
  journal={Entropy},
  year={2015},
  volume={17},
  pages={3253-3318}
}
```
• Published 13 May 2015
• Mathematics, Computer Science
• Entropy
We propose that entropy is a universal co-homological class in a theory associated to a family of observable quantities and a family of probability distributions. Three cases are presented: (1) classical probabilities and random variables; (2) quantum probabilities and observable operators; (3) dynamic probabilities and observation trees. This gives rise to a new kind of topology for information processes, which accounts for the main information functions: entropy, mutual-informations at all…
Information structures and their cohomology
We introduce the category of information structures, whose objects are suitable diagrams of measurable sets that encode the possible outputs of a given family of observables and their mutual…
The structure of information: from probability to homology
This text serves as a detailed introduction to information cohomology, containing the necessary background in probability theory and homological algebra, and proves that information cohomology is invariant under isomorphisms of generalized structures.
A New Perspective of Entropy
• Computer Science, Mathematics
The Journal of The Math3ma Institute
• 2022
It is shown that entropy, abstract algebra, and topology are inextricably linked through a version of a well-known formula from calculus known as the Leibniz rule.
The Group Theoretic Roots of Information I: permutations, symmetry, and entropy
• D. Galas
• Mathematics, Computer Science
ArXiv
• 2019
A combinatorial measure of information and disorder, defined in terms of integers and discrete functions, is proposed; it is shown that this integer entropy converges uniformly to the Shannon entropy when the group includes all permutations (the symmetric group) and the number of objects increases without bound.
A homological characterization of generalized multinomial coefficients related to the entropic chain rule
The asymptotic relationship between the multiplicative relations among multinomial coefficients and the (additive) recurrence property of Shannon entropy known as the chain rule is extended to a correspondence between certain generalized multinomial coefficients and any $\alpha$-entropy, shedding new light on the meaning of the chain rule and its deformations.
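In the classical case ($\alpha = 1$) this correspondence reduces to the familiar asymptotic $\frac{1}{n}\log\binom{n}{n_1,\dots,n_k} \to H(p)$ for counts $n_i \approx n p_i$, which is easy to check numerically. A minimal sketch in Python (helper names are illustrative, not taken from the paper):

```python
from math import factorial, log

def multinomial(counts):
    """Multinomial coefficient n! / (n_1! ... n_k!) for n = sum(counts)."""
    out = factorial(sum(counts))
    for c in counts:
        out //= factorial(c)
    return out

def shannon_entropy(p):
    """Shannon entropy in nats."""
    return -sum(x * log(x) for x in p if x > 0)

p = [0.5, 0.3, 0.2]
for n in (100, 1000, 10000):
    counts = [round(n * pi) for pi in p]
    approx = log(multinomial(counts)) / sum(counts)
    print(n, approx)  # approaches H(p) ≈ 1.0297 nats
```

The gap shrinks like $\log n / n$, consistent with Stirling's approximation.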
Entropy as a Topological Operad Derivation
The main result is that Shannon entropy defines a derivation of the operad of topological simplices, and that for every derivation of this operad there exists a point at which it is given by a constant multiple of Shannon entropy.
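The derivation property in question is the grouping/chain rule: composing a distribution $p$ with distributions $q_1,\dots,q_n$ in the operad of simplices satisfies $H(p \circ (q_1,\dots,q_n)) = H(p) + \sum_i p_i H(q_i)$, the Leibniz-rule shape of the result. A quick numerical check (a sketch; names are not from the paper):

```python
from math import log

def H(p):
    """Shannon entropy (nats) of a probability vector."""
    return -sum(x * log(x) for x in p if x > 0)

def compose(p, qs):
    """Operadic composition: substitute the simplex q_i into the i-th slot of p."""
    return [pi * qij for pi, qi in zip(p, qs) for qij in qi]

p = [0.4, 0.6]
qs = [[0.5, 0.5], [0.2, 0.3, 0.5]]
lhs = H(compose(p, qs))
rhs = H(p) + sum(pi * H(qi) for pi, qi in zip(p, qs))
print(lhs, rhs)  # equal: the Leibniz/chain-rule identity
```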
A functorial characterization of von Neumann entropy
The von Neumann entropy is characterized as a certain concave functor from finite-dimensional non-commutative probability spaces and state-preserving $*$-homomorphisms to real numbers; the existence of disintegrations for classical probability spaces plays a crucial role in this classification.
The Poincaré-Shannon Machine: Statistical Physics and Machine Learning Aspects of Information Cohomology
The computationally tractable subcase of simplicial information cohomology, represented by entropy $H_k$ and information $I_k$ landscapes and their respective paths, is developed, allowing investigation of Shannon's information in the multivariate case without the assumptions of independence or of identically distributed variables.
Unifying formalism for multivariate information-related measures: Möbius operators on subset lattices
• Mathematics, Computer Science
ArXiv
• 2016
Information-related measures are important tools of multivariate data analysis, serving as measures of dependence among variables and as descriptors of order in biological and physical systems.
Foliations-Webs-Hessian Geometry-Information Geometry-Entropy and Cohomology
The theory of homology of Koszul-Vinberg algebroids and their modules (KV homology for short) is used to investigate links between differential information geometry and differential topology, as well as links between the classical theory of models, the new theory, and vanishing theorems in the theory of homological statistical models.

## References

Topological forms of information
• Mathematics
• 2015
We propose that entropy is a universal co-homological class in a theory associated to a family of observable quantities and a family of probability distributions. Three cases are presented: 1)…
A Characterization of Entropy in Terms of Information Loss
• Computer Science
Entropy
• 2011
It is shown that Shannon entropy gives the only concept of information loss that is functorial, convex-linear, and continuous, and that this characterization naturally generalizes to Tsallis entropy.
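Functoriality here means that information loss is additive along composites of measure-preserving maps: $\mathrm{loss}(g \circ f) = \mathrm{loss}(f) + \mathrm{loss}(g)$. A small Python sketch of this additivity (function names are hypothetical):

```python
from math import log

def H(p):
    """Shannon entropy (nats) of a distribution given as {outcome: prob}."""
    return -sum(v * log(v) for v in p.values() if v > 0)

def pushforward(p, f):
    """Image distribution of p under the map f."""
    q = {}
    for x, px in p.items():
        q[f(x)] = q.get(f(x), 0.0) + px
    return q

def info_loss(p, f):
    """Information lost by applying f: H(source) - H(image)."""
    return H(p) - H(pushforward(p, f))

p = {0: 0.1, 1: 0.2, 2: 0.3, 3: 0.4}
f = lambda x: x // 2   # merge outcomes pairwise
g = lambda y: 0        # merge everything
composite = info_loss(p, lambda x: g(f(x)))
stepwise = info_loss(p, f) + info_loss(pushforward(p, f), g)
print(composite, stepwise)  # equal: loss is additive along composites
```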
In a Search for a Structure, Part 1: On Entropy
Mathematics is about "interesting structures". What makes a structure interesting is an abundance of interesting problems; we study a structure by solving these problems. The worlds of science, as…
TRIPLES, ALGEBRAS AND COHOMOLOGY
It is with great pleasure that the editors of Theory and Applications of Categories make this dissertation generally available. Although the date on the thesis is 1967, there was a nearly complete…
Elements of Information Theory
• Computer Science
• 1991
The authors examine the role of entropy, inequality, and randomness in the design and construction of codes in a rapidly changing environment.
Towards a Group-Theoretical Interpretation of Mechanics
We argue that the classical description of a symplectic manifold endowed with a Hamiltonian action of an abelian Lie group G and the corresponding quantum theory can be understood as different…