Pattern Learning and Recognition on Statistical Manifolds: An Information-Geometric Review

@inproceedings{Nielsen2013PatternLA,
  title={Pattern Learning and Recognition on Statistical Manifolds: An Information-Geometric Review},
  author={Frank Nielsen},
  booktitle={SIMBAD},
  year={2013}
}
  • F. Nielsen
  • Published in SIMBAD, 3 July 2013
  • Computer Science
We review the information-geometric framework for statistical pattern recognition: First, we explain the role of statistical similarity measures and distances in fundamental statistical pattern recognition problems. We then concisely review the main statistical distances and report a novel versatile family of divergences. Depending on their intrinsic complexity, the statistical patterns are learned by either atomic parametric distributions, semi-parametric finite mixtures, or non-parametric… 
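As a concrete (assumed) illustration of such statistical distances, the sketch below evaluates the closed-form Kullback-Leibler divergence between two univariate Gaussians, the simplest of the atomic parametric models mentioned above; the function name is hypothetical.

import math

def kl_gaussian(mu1, sigma1, mu2, sigma2):
    # Closed-form KL divergence KL(N(mu1, sigma1^2) : N(mu2, sigma2^2)).
    return (math.log(sigma2 / sigma1)
            + (sigma1 ** 2 + (mu1 - mu2) ** 2) / (2 * sigma2 ** 2)
            - 0.5)

# KL is asymmetric, so the two orderings generally differ:
print(kl_gaussian(0.0, 1.0, 1.0, 2.0))  # ~0.4431
print(kl_gaussian(1.0, 2.0, 0.0, 1.0))  # ~1.3069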
Histogram-based embedding for learning on statistical manifolds
Abstract: A novel binning and learning framework is presented for analyzing and applying large data sets that have no explicit knowledge of distribution parameterizations, and can only be assumed…
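As a rough sketch of what a binning-based embedding can look like (a generic illustration under assumed details, not the paper's framework), samples are mapped to normalized histograms and compared with a distance on the probability simplex such as the Hellinger distance:

import numpy as np

def histogram_embed(samples, bins, value_range):
    # Embed a sample set as a normalized histogram, i.e. a point on the simplex.
    counts, _ = np.histogram(samples, bins=bins, range=value_range)
    return counts / counts.sum()

def hellinger(p, q):
    # Hellinger distance between two discrete distributions.
    return np.sqrt(0.5 * np.sum((np.sqrt(p) - np.sqrt(q)) ** 2))

rng = np.random.default_rng(0)
p = histogram_embed(rng.normal(0.0, 1.0, 10_000), bins=32, value_range=(-6, 6))
q = histogram_embed(rng.normal(1.0, 1.0, 10_000), bins=32, value_range=(-6, 6))
print(hellinger(p, q))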
On information projections between multivariate elliptical and location-scale families
TLDR: It is shown how to reduce the calculation of f-divergences between any two location-scale densities to canonical settings involving standard densities, and how to derive fast Monte Carlo estimators of f-divergences with good properties.
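A minimal sketch of such a Monte Carlo estimator (assumed, illustrating the idea rather than the paper's estimators) for the Kullback-Leibler divergence, a prototypical f-divergence, between two location-scale densities:

import numpy as np
from scipy.stats import norm

def mc_kl(p, q, n=100_000, seed=0):
    # Estimate KL(p : q) = E_p[log p(X) - log q(X)] by sampling from p.
    rng = np.random.default_rng(seed)
    x = p.rvs(size=n, random_state=rng)
    return np.mean(p.logpdf(x) - q.logpdf(x))

p = norm(loc=0.0, scale=1.0)
q = norm(loc=1.0, scale=2.0)
print(mc_kl(p, q))  # should be close to the closed form ~0.4431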
Information geometry: Dualistic manifold structures and their uses
Frank Nielsen, Sony Computer Science Laboratories Inc., Japan. Slides: FrankNielsen.github.com, 15 July 2018, GiMLi.
A geometric learning approach on the space of complex covariance matrices
TLDR: A geometric learning approach on the space of complex covariance matrices, based on a new distribution called the Riemannian Gaussian distribution, is introduced, and an application to texture recognition on the VisTex database is proposed.
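For context, a standard choice of distance on this space (an assumed sketch, not the paper's Riemannian Gaussian machinery) is the affine-invariant Riemannian metric on positive-definite matrices:

import numpy as np
from scipy.linalg import eigvalsh

def airm_distance(A, B):
    # Affine-invariant Riemannian distance on the positive-definite cone:
    # d(A, B) = ||log(A^{-1/2} B A^{-1/2})||_F = sqrt(sum_i log(lambda_i)^2),
    # where lambda_i are the generalized eigenvalues of the pencil (B, A).
    lam = eigvalsh(B, A)  # solves B v = lambda A v; positive for HPD pairs
    return np.sqrt(np.sum(np.log(lam) ** 2))

A = np.array([[2.0, 0.5], [0.5, 1.0]])
B = np.eye(2)
print(airm_distance(A, B))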
Optimal transport vs. Fisher-Rao distance between copulas for clustering multivariate time series
TLDR: This work compares well-known distances between distributions, namely the Fisher-Rao geodesic distance, related divergences, and optimal transport, and discusses their advantages and disadvantages.
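To make the comparison tangible in the simplest setting (univariate Gaussians rather than the paper's copulas; an assumed illustration), both distances admit closed forms: the Fisher-Rao distance via the hyperbolic geometry of the (mu, sigma) half-plane, and the 2-Wasserstein distance of optimal transport.

import math

def fisher_rao_gaussian(mu1, s1, mu2, s2):
    # Fisher-Rao distance between N(mu1, s1^2) and N(mu2, s2^2): hyperbolic
    # distance in the half-plane with coordinates (mu/sqrt(2), sigma), scaled by sqrt(2).
    a = math.hypot((mu1 - mu2) / math.sqrt(2), s1 - s2)
    b = math.hypot((mu1 - mu2) / math.sqrt(2), s1 + s2)
    return math.sqrt(2) * math.log((b + a) / (b - a))

def wasserstein2_gaussian(mu1, s1, mu2, s2):
    # 2-Wasserstein distance between univariate Gaussians.
    return math.hypot(mu1 - mu2, s1 - s2)

print(fisher_rao_gaussian(0.0, 1.0, 1.0, 2.0))   # curvature-aware geodesic distance
print(wasserstein2_gaussian(0.0, 1.0, 1.0, 2.0)) # flat transport distance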
Sentiment Classification Based on Information Geometry and Deep Belief Networks
TLDR: A sophisticated algorithm based on deep learning and information geometry is proposed, in which the distribution of all training samples in the space is treated as prior knowledge and is encoded by deep belief networks (DBNs).
An Elementary Introduction to Information Geometry
TLDR: The fundamental differential-geometric structures of information manifolds are described, the fundamental theorem of information geometry is stated, and some use cases of these information manifolds in the information sciences are illustrated.
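For reference, the central object underlying all of the manifolds above is the Fisher information metric; the standard definition (not specific to this tutorial) reads

\[
g_{ij}(\theta) = \mathbb{E}_{p_\theta}\!\left[\partial_i \log p_\theta(x)\, \partial_j \log p_\theta(x)\right],
\]

which equips a parametric family $\{p_\theta\}$ with a Riemannian structure; the dualistic structures are obtained by pairing this metric with torsion-free affine connections that are dual with respect to it.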

References

Showing 1-10 of 112 references
Multivariate Normal Distributions Parametrized as a Riemannian Symmetric Space
The construction of a distance function between probability distributions is of importance in mathematical statistics and its applications. The distance function based on the Fisher information…
Entropies and cross-entropies of exponential families
  • F. Nielsen, R. Nock
  • Computer Science, Mathematics
    2010 IEEE International Conference on Image Processing
  • 2010
TLDR: This paper considers a versatile class of distributions, the exponential families, that encompasses many well-known distributions such as the Gaussian, Poisson, multinomial, Gamma/Beta, and Dirichlet distributions, to name a few, and derives mathematical expressions for their Shannon entropy and cross-entropy.
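For a natural exponential family written as $p_\theta(x) = \exp(\langle t(x), \theta \rangle - F(\theta) + k(x))$ with log-normalizer $F$, sufficient statistic $t$, and carrier term $k$, the resulting expressions take the form (sketched here from the standard derivation; notation assumed):

\[
H^{\times}(p_{\theta_1} : p_{\theta_2}) = F(\theta_2) - \langle \theta_2, \nabla F(\theta_1) \rangle - \mathbb{E}_{p_{\theta_1}}[k(x)],
\qquad
H(p_\theta) = H^{\times}(p_\theta : p_\theta),
\]

so that both quantities reduce to evaluations of $F$ and its gradient plus an expected carrier term.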
An Information-Geometric Characterization of Chernoff Information
  • F. Nielsen
  • Computer Science
    IEEE Signal Processing Letters
  • 2013
TLDR: This work proves analytically that computing the Chernoff distance amounts to calculating an equivalent but simpler Bregman divergence defined on the distribution parameters, and proposes three novel information-theoretic symmetric distances and middle distributions, two of which always admit closed-form expressions.
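For two members $p_{\theta_1}, p_{\theta_2}$ of the same exponential family with log-normalizer $F$, the characterization can be sketched as follows (standard derivation, stated here for orientation):

\[
C(p_{\theta_1}, p_{\theta_2}) = \max_{\alpha \in (0,1)} \Big( \alpha F(\theta_1) + (1-\alpha) F(\theta_2) - F\big(\alpha \theta_1 + (1-\alpha)\theta_2\big) \Big)
= B_F\big(\theta_1 : \alpha^{*}\theta_1 + (1-\alpha^{*})\theta_2\big),
\]

where $B_F$ is the Bregman divergence induced by $F$ and $\alpha^{*}$ is the optimal exponent attaining the maximum.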
$\alpha$-Divergence Is Unique, Belonging to Both $f$-Divergence and Bregman Divergence Classes
  • S. Amari
  • Computer Science
    IEEE Transactions on Information Theory
  • 2009
TLDR: It is proved that the α-divergences constitute a unique class belonging to both the f-divergence and Bregman divergence classes when the space of positive measures or positive arrays is considered, and that this is the only such class in the space of probability distributions.
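In Amari's parameterization, the $\alpha$-divergence in question reads

\[
D_\alpha(p : q) = \frac{4}{1-\alpha^2} \left( 1 - \int p(x)^{\frac{1-\alpha}{2}}\, q(x)^{\frac{1+\alpha}{2}}\, dx \right), \qquad \alpha \neq \pm 1,
\]

with the Kullback-Leibler divergence and its reverse recovered in the limits $\alpha \to \mp 1$.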
A New Closed-Form Information Metric for Shape Analysis
TLDR: A new Riemannian metric based on generalized φ-entropy measures is proposed, which is available in closed form for the mixture model, and the discriminative capabilities of this new metric are studied by pairwise matching of corpus callosum shapes.
Markov invariant geometry on manifolds of states
This paper is devoted to certain differential-geometric constructions in classical and noncommutative statistics, invariant with respect to the category of Markov maps, which have recently been…
Emerging Trends in Visual Computing, LIX Fall Colloquium, ETVC 2008, Palaiseau, France, November 18-20, 2008. Revised Invited Papers
TLDR: Includes the invited papers Information Theoretic Methods for Diffusion-Weighted MRI Analysis and Statistical Computing on Manifolds: From Riemannian Geometry to Computational Anatomy.
Interactions between Symmetric Cone and Information Geometries: Bruhat-Tits and Siegel Spaces Models for High Resolution Autoregressive Doppler Imagery
TLDR: For the complex autoregressive model, a Kähler metric on the reflection coefficients, based on a Kähler potential function given by the Doppler signal entropy, is introduced; it is closely related to Kähler-Einstein manifolds and the complex Monge-Ampère equation.
Exponential statistical manifold
We consider the non-parametric statistical model $\mathcal{E}(p)$ of all positive densities q that are connected to a given positive density p by an open exponential arc, i.e. a one-parameter exponential model…
...