# Pattern Learning and Recognition on Statistical Manifolds: An Information-Geometric Review

@inproceedings{Nielsen2013PatternLA, title={Pattern Learning and Recognition on Statistical Manifolds: An Information-Geometric Review}, author={Frank Nielsen}, booktitle={SIMBAD}, year={2013} }

We review the information-geometric framework for statistical pattern recognition. First, we explain the role of statistical similarity measures and distances in fundamental statistical pattern recognition problems. We then concisely review the main statistical distances and report a novel versatile family of divergences. Depending on their intrinsic complexity, the statistical patterns are learned by either atomic parametric distributions, semi-parametric finite mixtures, or non-parametric…

## 7 Citations

Histogram-based embedding for learning on statistical manifolds

- Computer Science, Pattern Analysis and Applications
- 2014

A novel binning and learning framework is presented for analyzing and applying large data sets that have no explicit knowledge of distribution parameterizations, and can only be assumed…

On information projections between multivariate elliptical and location-scale families

- Mathematics, ArXiv
- 2021

It is shown how to reduce the calculation of f-divergences between any two location-scale densities to canonical settings involving standard densities, and how to derive fast Monte Carlo estimators of f-divergences with good properties.
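As a hedged illustration of the reduction-and-estimation idea (not the paper's own estimators; all function names here are mine), the sketch below compares a plain Monte Carlo estimate of the Kullback-Leibler divergence, i.e. the f-divergence generated by f(t) = t log t, between two univariate Gaussians against its well-known closed form:

```python
import math
import random

def gauss_logpdf(x, mu, sigma):
    """Log-density of the univariate Gaussian N(mu, sigma^2)."""
    return -0.5 * ((x - mu) / sigma) ** 2 - math.log(sigma * math.sqrt(2.0 * math.pi))

def kl_monte_carlo(mu1, s1, mu2, s2, n=100_000, seed=0):
    """Plain Monte Carlo estimate of KL(p || q): sample x from p = N(mu1, s1^2)
    and average log p(x) - log q(x)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = rng.gauss(mu1, s1)
        total += gauss_logpdf(x, mu1, s1) - gauss_logpdf(x, mu2, s2)
    return total / n

def kl_closed_form(mu1, s1, mu2, s2):
    """Closed-form KL divergence between N(mu1, s1^2) and N(mu2, s2^2)."""
    return math.log(s2 / s1) + (s1 ** 2 + (mu1 - mu2) ** 2) / (2.0 * s2 ** 2) - 0.5
```

With 10^5 samples the naive estimator typically agrees with the closed form to about two decimal places; variance-reduced estimators of the kind discussed in the paper converge faster.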

Information geometry: Dualistic manifold structures and their uses

- Computer Science
- 2018

Slide presentation by Frank Nielsen (Sony Computer Science Laboratories Inc., Japan, @FrnkNlsn); slides at FrankNielsen.github.com, 15th July 2018, GiMLi…

A geometric learning approach on the space of complex covariance matrices

- Mathematics, Computer Science, 2017 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
- 2017

A geometric learning approach on the space of complex covariance matrices based on a new distribution called Riemannian Gaussian distribution is introduced and an application to texture recognition on the VisTex database is proposed.

Optimal transport vs. Fisher-Rao distance between copulas for clustering multivariate time series

- Computer Science, 2016 IEEE Statistical Signal Processing Workshop (SSP)
- 2016

This work compares renowned distances between distributions: the Fisher-Rao geodesic distance, related divergences and optimal transport, and discusses their advantages and disadvantages.
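For intuition about the Fisher-Rao geodesic distance only (shown here for univariate Gaussians, not the copulas studied in this paper), a closed form exists via hyperbolic geometry; this sketch assumes the standard identification of (mu / sqrt(2), sigma) with a point of the hyperbolic upper half-plane:

```python
import math

def fisher_rao_gaussian(mu1, s1, mu2, s2):
    """Fisher-Rao geodesic distance between N(mu1, s1^2) and N(mu2, s2^2).
    Under the Fisher metric, (mu / sqrt(2), sigma) lives in the hyperbolic
    upper half-plane, so the distance is sqrt(2) times the hyperbolic
    distance acosh(1 + |z1 - z2|^2 / (2 * y1 * y2))."""
    num = (mu1 - mu2) ** 2 / 2.0 + (s1 - s2) ** 2
    return math.sqrt(2.0) * math.acosh(1.0 + num / (2.0 * s1 * s2))
```

When the means coincide the expression reduces to sqrt(2) * |log(s2 / s1)|, the geodesic distance along the scale axis.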

Sentiment Classification Based on Information Geometry and Deep Belief Networks

- Computer Science, IEEE Access
- 2018

A sophisticated algorithm based on deep learning and information geometry is presented, in which the distribution of all training samples in the space is treated as prior knowledge and is encoded by deep belief networks (DBNs).

An Elementary Introduction to Information Geometry

- Mathematics, Entropy
- 2020

The fundamental differential-geometric structures of information manifolds are described, the fundamental theorem of information geometry is stated, and some use cases of these information manifolds in the information sciences are illustrated.

## References

Showing 1-10 of 112 references.

Multivariate Normal Distributions Parametrized as a Riemannian Symmetric Space

- Mathematics
- 2000

The construction of a distance function between probability distributions is of importance in mathematical statistics and its applications. The distance function based on the Fisher information…

Entropies and cross-entropies of exponential families

- Computer Science, Mathematics, 2010 IEEE International Conference on Image Processing
- 2010

This paper considers a versatile class of distributions called exponential families, which encompasses many well-known distributions such as the Gaussian, Poisson, multinomial, Gamma/Beta, and Dirichlet distributions, to name a few, and derives mathematical expressions for their Shannon entropy and cross-entropy.
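As a minimal sketch of such closed-form expressions, restricted to the univariate Gaussian member of the exponential families (function names are mine, not the paper's):

```python
import math

def gaussian_entropy(sigma):
    """Differential Shannon entropy of N(mu, sigma^2):
    H = (1/2) * log(2 * pi * e * sigma^2); it does not depend on the mean."""
    return 0.5 * math.log(2.0 * math.pi * math.e * sigma ** 2)

def gaussian_cross_entropy(mu1, s1, mu2, s2):
    """Cross-entropy H(p, q) = H(p) + KL(p || q) for univariate Gaussians
    p = N(mu1, s1^2) and q = N(mu2, s2^2), using the closed-form KL."""
    kl = math.log(s2 / s1) + (s1 ** 2 + (mu1 - mu2) ** 2) / (2.0 * s2 ** 2) - 0.5
    return gaussian_entropy(s1) + kl
```

Both quantities are available without any numerical integration, which is the practical payoff of staying inside an exponential family.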

An Information-Geometric Characterization of Chernoff Information

- Computer Science, IEEE Signal Processing Letters
- 2013

This work proves analytically that computing the Chernoff distance amounts to calculating an equivalent but simpler Bregman divergence defined on the distribution parameters, and proposes three novel information-theoretic symmetric distances and middle distributions, two of which always admit closed-form expressions.
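The Bregman-divergence machinery referred to here can be sketched generically; with the negative Shannon entropy as generator, the induced divergence on probability vectors is the Kullback-Leibler divergence (the identifiers below are illustrative, not taken from the paper):

```python
import math

def bregman_divergence(F, grad_F, x, y):
    """Bregman divergence B_F(x, y) = F(x) - F(y) - <x - y, grad F(y)>
    for a strictly convex generator F on real vectors."""
    return F(x) - F(y) - sum((xi - yi) * gi
                             for xi, yi, gi in zip(x, y, grad_F(y)))

def neg_entropy(p):
    """Negative Shannon entropy, a strictly convex generator."""
    return sum(pi * math.log(pi) for pi in p)

def grad_neg_entropy(p):
    """Gradient of the negative Shannon entropy."""
    return [math.log(pi) + 1.0 for pi in p]
```

Evaluated on probability vectors, `bregman_divergence(neg_entropy, grad_neg_entropy, p, q)` recovers KL(p || q) exactly, which is the kind of parameter-space reformulation exploited above.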

$\alpha$-Divergence Is Unique, Belonging to Both $f$-Divergence and Bregman Divergence Classes

- Computer Science, IEEE Transactions on Information Theory
- 2009

It is proved that the $\alpha$-divergences constitute a unique class belonging to both classes when the space of positive measures or positive arrays is considered, and that this is the only such class in the space of probability distributions.
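One common parameterization of the $\alpha$-divergence on discrete distributions can be sketched as follows (a convention choice, not necessarily this paper's normalization); the limits $\alpha \to \pm 1$ recover the two directed Kullback-Leibler divergences:

```python
import math

def alpha_divergence(p, q, alpha):
    """Amari alpha-divergence between discrete probability vectors p and q,
    in the convention D_a = 4 / (1 - a^2) * (1 - sum p^((1-a)/2) q^((1+a)/2));
    the branches at a = +1 and a = -1 implement the directed KL limits."""
    if abs(alpha - 1.0) < 1e-12:
        return sum(qi * math.log(qi / pi) for pi, qi in zip(p, q))
    if abs(alpha + 1.0) < 1e-12:
        return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    s = sum(pi ** ((1.0 - alpha) / 2.0) * qi ** ((1.0 + alpha) / 2.0)
            for pi, qi in zip(p, q))
    return 4.0 / (1.0 - alpha ** 2) * (1.0 - s)
```

At $\alpha = 0$ this reduces to four times the squared Hellinger distance, the symmetric member of the family.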

A New Closed-Form Information Metric for Shape Analysis

- Engineering, MICCAI
- 2006

A new Riemannian metric based on generalized $\phi$-entropy measures is proposed, which is available in closed form for the mixture model; the discriminative capabilities of this new metric are studied by pairwise matching of corpus callosum shapes.

Markov invariant geometry on manifolds of states

- Mathematics
- 1991

This paper is devoted to certain differential-geometric constructions in classical and noncommutative statistics, invariant with respect to the category of Markov maps, which have recently been…

Simplification and hierarchical representations of mixtures of exponential families

- Computer Science, Signal Process.
- 2010

Emerging Trends in Visual Computing, LIX Fall Colloquium, ETVC 2008, Palaiseau, France, November 18-20, 2008. Revised Invited Papers

- Computer Science, ETVC
- 2009

Information Theoretic Methods for Diffusion-Weighted MRI Analysis and Statistical Computing on Manifolds: From Riemannian Geometry to Computational Anatomy.

Interactions between Symmetric Cone and Information Geometries: Bruhat-Tits and Siegel Spaces Models for High Resolution Autoregressive Doppler Imagery

- Mathematics, ETVC
- 2008

For the complex autoregressive model, a Kähler metric on reflection coefficients is introduced, based on a Kähler potential function given by the Doppler signal entropy; it is closely related to Kähler-Einstein manifolds and the complex Monge-Ampère equation.

Exponential statistical manifold

- Mathematics
- 2006

We consider the non-parametric statistical model ε(p) of all positive densities q that are connected to a given positive density p by an open exponential arc, i.e. a one-parameter exponential model…