Topological Information Data Analysis

@article{Baudot2019TopologicalID,
  title={Topological Information Data Analysis},
  author={Pierre Baudot and M{\'o}nica Tapia and Daniel Bennequin and Jean-Marc Goaillard},
  journal={Entropy},
  year={2019},
  volume={21}
}
This paper presents methods that quantify the structure of statistical interactions within a given data set; these methods were applied in a previous article. It establishes new results on the k-multivariate mutual information (I_k), inspired by the topological formulation of information introduced in a series of studies. In particular, we show that the vanishing of all I_k for 2≤k≤n of n random variables is equivalent to their statistical independence. Pursuing the work of Hu Kuo Ting and Te Sun Han, we…
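To make the central quantity concrete, the following is a minimal sketch (not the authors' published pipeline) of the alternating-sum definition of I_k going back to Hu Kuo Ting, estimated from empirical frequencies; the function names and the toy XOR example are illustrative assumptions.

```python
# A minimal sketch (not the authors' published code) of the k-variate mutual
# information I_k via Hu Kuo Ting's alternating sum of subset entropies:
#   I_k(X_1; ...; X_k) = sum over nonempty S of {1..k} of (-1)^(|S|+1) * H(X_S).
# Function names and the toy XOR data below are illustrative assumptions.
from collections import Counter
from itertools import combinations
from math import log2
import random

def entropy(samples, idx):
    """Empirical Shannon entropy H(X_S) of the variables indexed by idx (in bits)."""
    counts = Counter(tuple(row[i] for i in idx) for row in samples)
    n = len(samples)
    return -sum((c / n) * log2(c / n) for c in counts.values())

def mutual_information_k(samples, idx):
    """Alternating sum of subset entropies over the variables indexed by idx."""
    return sum(
        (-1) ** (len(subset) + 1) * entropy(samples, subset)
        for r in range(1, len(idx) + 1)
        for subset in combinations(idx, r)
    )

# Independent fair bits: every I_k with k >= 2 should be close to 0.
random.seed(0)
indep = [(random.randint(0, 1), random.randint(0, 1), random.randint(0, 1))
         for _ in range(20000)]
# Synergistic triple: X3 = X1 XOR X2 gives the classical value I_3 = -1 bit.
xor = [(a, b, a ^ b) for a, b, _ in indep]

print(mutual_information_k(indep, (0, 1, 2)))  # ~ 0
print(mutual_information_k(xor, (0, 1, 2)))    # ~ -1
```

Under the independence result stated in the abstract, all I_k with 2≤k≤n vanish; the XOR triple illustrates that a purely synergistic dependence is invisible at k = 2 and detected only at k = 3.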
The Poincaré-Shannon Machine: Statistical Physics and Machine Learning Aspects of Information Cohomology
TLDR
The computationally tractable subcase of simplicial information cohomology, represented by entropy (H_k) and information (I_k) landscapes and their respective paths, is developed, allowing investigation of Shannon information in the multivariate case without assuming independence or identically distributed variables.
The Poincaré-Boltzmann Machine: from Statistical Physics to Machine Learning and back
TLDR
The computational methods of information cohomology applied to genetic expression are presented here and in the companion paper, and their interpretations in terms of statistical physics and machine learning are proposed.
Message-Passing Algorithms and Homology.
This PhD thesis lays out algebraic and topological structures relevant for the study of probabilistic graphical models. Marginal estimation algorithms are introduced as diffusion equations of the…
A Path-Based Partial Information Decomposition
TLDR
A novel mutual information measure is proposed that gives rise to intuitive, non-negative, and additive path-based information components (redundant, unique, and synergistic information) in the sense of Williams and Beer.
Alleviating the independence assumptions of averaged one-dependence estimators by model weighting
TLDR
A novel weighted AODE algorithm, called AWODE, is proposed that adaptively selects weights to alleviate the independence assumption, fit the learned probability distribution to each instance, and achieve a bias-variance trade-off.
The Reciprocal Influence Criterion: An Upgrade of the Information Quality Ratio
TLDR
The reciprocal influence criterion is proposed; it is conceptually simple, makes no assumption about the statistics of the stochastic variables involved, and provides much better resilience to noise and much higher stability with respect to the estimation of the underlying probability distribution functions.
Characterizing stochastic time series with ordinal networks.
TLDR
It is argued that ordinal networks can detect sudden changes in Earth's seismic activity caused by large earthquakes and be used for estimating the Hurst exponent of time series with accuracy comparable with state-of-the-art methods.
A hands-on tutorial on network and topological neuroscience
TLDR
The goal is to provide computational tools for exploring neuroimaging data using topological data analysis and graph-theory frameworks, and to facilitate their accessibility, data visualisation, and comprehension for newcomers to the field.
Homotopy Theoretic and Categorical Models of Neural Information Networks
TLDR
A novel mathematical formalism is developed for modeling neural information networks endowed with additional structure in the form of assignments of resources, whether computational, metabolic, or informational.
Elements of qualitative cognition: An information topology perspective.
  • P. Baudot
  • Computer Science
    Physics of life reviews
  • 2019

References

SHOWING 1-10 OF 102 REFERENCES
Topological Information Data Analysis: Poincare-Shannon Machine and Statistical Physic of Finite Heterogeneous Systems
TLDR
Methods are established that quantify the structure of statistical interactions within a given data set, using the characterization of information theory in cohomology by finite methods, and their expression in terms of statistical physics and machine learning is provided.
The Poincaré-Boltzmann Machine: from Statistical Physics to Machine Learning and back
TLDR
The computational methods of information cohomology applied to genetic expression are presented here and in the companion paper, and their interpretations in terms of statistical physics and machine learning are proposed.
Nonnegative Decomposition of Multivariate Information
TLDR
This work reconsiders from first principles the general structure of the information that a set of sources provides about a given variable and proposes a definition of partial information atoms that exhaustively decompose the Shannon information in a multivariate system in terms of the redundancy between synergies of subsets of the sources.
Topological data analysis
Inverse problems can be defined as the area of mathematics that attempts to reconstruct a physical or mathematical object from derived data. Frequently, this means the evaluation of parameters or…
The Co-Information Lattice
TLDR
The co-information lattice sheds light on the problem of approximating a joint density with a set of marginal densities, though as usual the authors run into the partition function.
The Homological Nature of Entropy
TLDR
It is proposed that entropy is a universal cohomological class in a theory associated to a family of observable quantities and a family of probability distributions, which gives rise to a new kind of topology for information processes that accounts for the main information functions.
Predictive information, multi-information, and binding information
We introduce an information theoretic measure of dependency between multiple random variables, called 'binding information', and compare it with several previously proposed measures of statistical…
Information Theoretical Analysis of Multivariate Correlation
TLDR
The present paper gives various theorems according to which C_tot(λ), the total correlation of the variable set λ (a standard definition is recalled after the listed references), can be decomposed in terms of the partial correlations existing in subsets of λ, and of quantities derivable therefrom.
On the Sufficiency of Pairwise Interactions in Maximum Entropy Models of Networks
TLDR
It is argued that this reduction in complexity can be thought of as a natural property of densely interacting networks in certain regimes, and not necessarily as a special property of living systems.
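For the multivariate-correlation reference above, the total correlation C_tot(λ) of a set of variables λ = {X_1, …, X_n} is conventionally defined as follows (a standard textbook form, not a formula quoted from that paper):

```latex
% Standard definition of the total correlation (notation assumed for this note):
C_{\mathrm{tot}}(\lambda) \;=\; \sum_{i=1}^{n} H(X_i) \;-\; H(X_1, \ldots, X_n),
\qquad \lambda = \{X_1, \ldots, X_n\}.
```

It is non-negative and vanishes exactly when the X_i are jointly independent, which is why its decomposition over subsets of λ connects naturally to the I_k quantities studied in the main paper.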