Information geometry on hierarchy of probability distributions

@article{Amari2001InformationGO,
  title={Information geometry on hierarchy of probability distributions},
  author={S. Amari},
  journal={IEEE Trans. Inf. Theory},
  year={2001},
  volume={47},
  pages={1701--1711}
}
An exponential family or mixture family of probability distributions has a natural hierarchical structure. This paper gives an "orthogonal" decomposition of such a system based on information geometry. A typical example is the decomposition of stochastic dependency among a number of random variables. In general, such variables have a complex structure of dependencies. Pairwise dependency is easily represented by correlation, but it is more difficult to measure the effects of pure triplewise or higher-order…
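The contrast the abstract draws between pairwise and purely higher-order dependence can be made concrete with a small sketch. The XOR example below is illustrative and not taken from the paper: three binary variables whose pairs are all independent, yet whose triple is deterministic.

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a probability array (zeros ignored)."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Joint distribution p(x, y, z) with X, Y uniform and Z = X XOR Y.
p = np.zeros((2, 2, 2))
for x in (0, 1):
    for y in (0, 1):
        p[x, y, x ^ y] = 0.25

# Two- and one-dimensional marginals.
pxy, pxz, pyz = p.sum(axis=2), p.sum(axis=1), p.sum(axis=0)
px, py, pz = pxy.sum(axis=1), pxy.sum(axis=0), pxz.sum(axis=0)

# Pairwise mutual information I(X;Y): zero, the pair looks independent.
I_xy = entropy(px) + entropy(py) - entropy(pxy.ravel())

# Interaction information (co-information) picks up the pure triplewise
# effect that no pairwise measure sees:
# I(X;Y;Z) = H(X)+H(Y)+H(Z) - H(XY)-H(XZ)-H(YZ) + H(XYZ)
I_xyz = (entropy(px) + entropy(py) + entropy(pz)
         - entropy(pxy.ravel()) - entropy(pxz.ravel()) - entropy(pyz.ravel())
         + entropy(p.ravel()))

print(I_xy)   # 0.0
print(I_xyz)  # -1.0 (a purely third-order, synergistic interaction)
```

Co-information is one classical measure of such higher-order effects (cf. Han's entropy measures in the references); the paper's contribution is a different, orthogonal decomposition, but the example already shows why correlation alone cannot capture the third-order term.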
Structure Analysis of a Probabilistic Network in an Information Geometric Framework
  • T. Zheng, C. Guest
  • Mathematics, Computer Science
  • The 2006 IEEE International Joint Conference on Neural Network Proceedings
  • 2006
By adopting a generalized definition of mutual information derived from information geometry, one can, in theory, extend Chow's method of constructing trees based on pairwise mutual information to an arbitrary clique size.
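Chow's method, which the summary above proposes to generalize beyond pairs, builds a maximum-weight spanning tree over empirical pairwise mutual information. A minimal sketch, with function names and toy data of my own choosing, not from the paper:

```python
import numpy as np
from itertools import combinations

def mutual_info(a, b):
    """Empirical mutual information (bits) of two discrete samples."""
    joint = np.zeros((a.max() + 1, b.max() + 1))
    for ai, bi in zip(a, b):
        joint[ai, bi] += 1
    joint /= joint.sum()
    pa, pb = joint.sum(axis=1), joint.sum(axis=0)
    mi = 0.0
    for i in range(joint.shape[0]):
        for j in range(joint.shape[1]):
            if joint[i, j] > 0:
                mi += joint[i, j] * np.log2(joint[i, j] / (pa[i] * pb[j]))
    return mi

def chow_liu_tree(data):
    """data: (n_samples, n_vars) ints. Returns spanning-tree edges (i, j)."""
    n = data.shape[1]
    edges = sorted(((mutual_info(data[:, i], data[:, j]), i, j)
                    for i, j in combinations(range(n), 2)), reverse=True)
    parent = list(range(n))
    def find(x):                    # union-find with path halving
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    tree = []
    for w, i, j in edges:           # Kruskal, largest MI first
        ri, rj = find(i), find(j)
        if ri != rj:
            parent[ri] = rj
            tree.append((i, j))
    return tree

# Toy chain: X1 drives X2, X2 drives X3 (10% noise); X1 and X3 are
# only indirectly linked, so the tree keeps edges (0,1) and (1,2).
rng = np.random.default_rng(0)
x1 = rng.integers(0, 2, 2000)
x2 = x1 ^ (rng.random(2000) < 0.1)
x3 = x2 ^ (rng.random(2000) < 0.1)
data = np.column_stack([x1, x2, x3]).astype(int)
print(chow_liu_tree(data))
```

The cited paper replaces the pairwise MI weights with a generalized, information-geometric mutual information so that cliques larger than edges can be scored the same way.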
Hierarchical Quantification of Synergy in Channels
A new point of view on the decomposition of channel information into synergies of different order is proposed, which models a multi-input channel as a Markov kernel and can be evaluated by an iterative scaling algorithm.
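Iterative scaling of the kind mentioned above can be sketched in its simplest form: fitting a table to prescribed marginals, where each scaling step projects onto one marginal constraint. The target marginals below are made up for illustration.

```python
import numpy as np

def ipf(p, row_target, col_target, iters=200):
    """Iterative proportional fitting: rescale a 2D table until its
    row and column sums match the prescribed target marginals."""
    p = p.copy()
    for _ in range(iters):
        p *= (row_target / p.sum(axis=1))[:, None]   # match row sums
        p *= (col_target / p.sum(axis=0))[None, :]   # match column sums
    return p

p0 = np.full((2, 2), 0.25)            # start from the uniform table
q = ipf(p0, np.array([0.7, 0.3]), np.array([0.6, 0.4]))
print(q.sum(axis=1))                  # close to [0.7, 0.3]
print(q.sum(axis=0))                  # close to [0.6, 0.4]
```

In general the procedure converges to the I-projection of the starting table, the minimizer of the KL divergence subject to the marginal constraints, which is what ties it to the Pythagorean geometry running through these papers.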
Notes on information geometry and evolutionary processes
In these notes I briefly review the essentials of various coordinate bases and of information geometry to give an overview and make the approaches comparable.
Orthogonal decompositions of multivariate statistical dependence measures
  • I. Goodman, Don H. Johnson
  • Mathematics, Computer Science
  • 2004 IEEE International Conference on Acoustics, Speech, and Signal Processing
  • 2004
We describe two multivariate statistical dependence measures which can be orthogonally decomposed to separate the effects of pairwise, triplewise, and higher-order interactions between the random variables.
Information geometry, simulation and complexity in Gaussian random fields
  • A. Levada
  • Mathematics, Computer Science
  • Monte Carlo Methods Appl.
  • 2016
This paper proposes to quantify how changes in the spatial dependence structure affect the Riemannian metric tensor that equips the model's parametric space, and defines Fisher curves, which resemble mathematical models of hysteresis in which the natural orientation is given by an arrow of time.
Information Geometry of Multiple Spike Trains
Information geometry studies a family of probability distributions by using modern geometry. Since a stochastic model of multiple spike trains is described by a family of probability distributions, …
Extension of information geometry for modelling non-statistical systems
In this dissertation, an abstract formalism extending information geometry is introduced. This framework encompasses a broad range of modelling problems, including possible applications in machine learning…
Finite Information Geometry
This chapter investigates probability distributions on a finite sample space and takes advantage of the more elementary nature of this setting. There are two complementary ways to view a probability distribution…
Information-theoretic inference of common ancestors
This work proves an information-theoretic inequality that allows for the inference of common ancestors of observed parts in any DAG representing some unknown larger system, and shows that a large amount of dependence in terms of mutual information among the observations implies the existence of a common ancestor that distributes this information.
Fields of Application of Information Geometry
1. Complexity measures can be built geometrically by using the information distance (Kullback–Leibler divergence) from families with restricted statistical dependencies. The Pythagorean geometry…

References

Showing 1–10 of 39 references
An Infinite-Dimensional Geometric Structure on the Space of all the Probability Measures Equivalent to a Given One
Let M_μ be the set of all probability densities equivalent to a given reference probability measure μ. This set is thought of as the maximal regular (i.e., with strictly positive densities)…
Nonnegative Entropy Measures of Multivariate Symmetric Correlations
  • T. Han
  • Computer Science, Mathematics
  • Inf. Control.
  • 1978
A "hierarchical structure" of probabilistic dependence relations is proposed where it is shown that any symmetric correlation associated with a nonnegative entropy is decomposed into pairwise conditional and/or nonconditional correlations.
Statistical inference under multiterminal rate restrictions: A differential geometric approach
  • S. Amari, T. Han
  • Mathematics, Computer Science
  • IEEE Trans. Inf. Theory
  • 1989
It is shown that the differential geometry of the manifold of all probability distributions plays a fundamental role in this type of multiterminal problem connecting Shannon information and statistical information.
Dualistic geometry of the manifold of higher-order neurons
  • S. Amari
  • Mathematics, Computer Science
  • Neural Networks
  • 1991
An information geometrical method, which can be applied to more general neural network manifolds, is proposed, and the accuracy of statistical estimation is shown in terms of the dimensionality of a model and the number of examples.
Information geometry of Boltzmann machines
Using the new theory of information geometry, a natural invariant Riemannian metric and a dual pair of affine connections on the Boltzmann neural network manifold are established, and the meaning of geometrical structures is elucidated from the stochastic and the statistical point of view.
The relation between information theory and the differential geometry approach to statistics
It is shown that the Riemannian metric on the probability simplex ∑ᵢ xᵢ = 1 defined by (ds)² = ∑ᵢ (dxᵢ)²/xᵢ has an invariance property under certain probabilistically natural mappings.
I-Divergence Geometry of Probability Distributions and Minimization Problems
On the rationale of maximum-entropy methods
  • E. Jaynes
  • Mathematics
  • Proceedings of the IEEE
  • 1982
We discuss the relations between maximum-entropy (MAXENT) and other methods of spectral analysis such as the Schuster, Blackman–Tukey, maximum-likelihood, Bayesian, and autoregressive (AR, ARMA, or…
Fisher information under restriction of Shannon information in multi-terminal situations
Fisher information generally decreases by summarizing observed data into encoded messages. The present paper studies the amount of Fisher information included in independently summarized messages…
Information geometry of the EM and em algorithms for neural networks
  • S. Amari
  • Mathematics, Computer Science
  • Neural Networks
  • 1995
A unified information geometrical framework for studying stochastic models of neural networks is presented, focusing on the EM and em algorithms, and a condition that guarantees their equivalence is proved.