Corpus ID: 15177472

Metrics for Probabilistic Geometries

@article{Tosi2014MetricsFP,
  title={Metrics for Probabilistic Geometries},
  author={Alessandra Tosi and S{\o}ren Hauberg and Alfredo Vellido and Neil D. Lawrence},
  journal={ArXiv},
  year={2014},
  volume={abs/1411.7432}
}
We investigate the geometrical structure of probabilistic generative dimensionality reduction models using the tools of Riemannian geometry. We explicitly define a distribution over the natural metric given by the models. We provide the necessary algorithms to compute expected metric tensors where the distribution over mappings is given by a Gaussian process. We treat the corresponding latent variable model as a Riemannian manifold and we use the expectation of the metric under the Gaussian… 
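
A minimal sketch of the central quantity, under the usual GP-LVM assumptions (latent dimension q, data dimension D, and a shared GP prior across output dimensions, so that the rows of the Jacobian J of the mapping share a covariance \Sigma_J): the pulled-back metric is G = J^\top J, and because J is Gaussian its expectation is available in closed form,

\mathbb{E}[G] = \mathbb{E}[J]^\top \mathbb{E}[J] + D\, \Sigma_J .

The second term is what separates the expected metric from simply plugging in the mean mapping: the metric, and hence distances, grow wherever the model is uncertain.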

Expected path length on random manifolds

This work endows the latent space of a large class of generative models with a random Riemannian metric, which provides them with elementary operators, and develops deterministic approximations with tight error bounds on expected distances.
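
As an illustrative reading (notation assumed here, not quoted from the paper): under a random metric G, the length of a latent curve c : [0,1] \to \mathcal{Z},

L[c] = \int_0^1 \sqrt{\dot{c}(t)^\top G(c(t))\, \dot{c}(t)}\; dt ,

is itself a random variable. A natural deterministic approximation replaces G with its expectation inside the integral, and the cited work analyses how tight such approximations of expected curve lengths and distances can be made.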

Geometrically Enriched Latent Spaces

This work considers the ambient space to be a Riemannian manifold, which allows domain knowledge to be encoded through the associated Riemannian metric, and defines shortest paths in the latent space accordingly, so that they both follow the learned manifold and respect the ambient geometry.
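
A sketch of how such an ambient metric typically enters (symbols are assumptions, not the paper's notation): if the generator is f : \mathcal{Z} \to \mathcal{X} with Jacobian J_f(z), and the ambient space carries a Riemannian metric M_{\mathcal{X}}(x), then the latent space inherits the pullback metric

G(z) = J_f(z)^\top M_{\mathcal{X}}(f(z))\, J_f(z) ,

so shortest paths computed under G bend both with the learned manifold (through J_f) and with the encoded domain knowledge (through M_{\mathcal{X}}).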

A prior-based approximate latent Riemannian metric

This work proposes a surrogate conformal Riemannian metric in the latent space of a generative model that is simple, efficient and robust, and shows the applicability of the proposed methodology for data analysis in the life sciences.

Bayesian Quadrature on Riemannian Data Manifolds

This work focuses on Bayesian quadrature (BQ) to numerically compute integrals over normal laws on Riemannian manifolds learned from data and shows that, by leveraging both prior knowledge and an active exploration scheme, BQ outperforms Monte Carlo methods on a wide range of integration problems.
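
For orientation, the standard Bayesian quadrature identity in its generic form (not the specifics of this paper): with a GP prior on the integrand f and evaluations f(x_1), \dots, f(x_n), the integral Z = \int f(x)\, p(x)\, dx is Gaussian a posteriori with mean

\mathbb{E}[Z \mid f(x_{1:n})] = z^\top K^{-1} f(x_{1:n}), \qquad z_i = \int k(x, x_i)\, p(x)\, dx ,

where k is the GP kernel and K its Gram matrix; the cited work carries this machinery over to normal laws defined on Riemannian manifolds learned from data.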

Only Bayes should learn a manifold (on the estimation of differential geometric structure from data)

This work first analyzes kernel-based algorithms and shows that under the usual regularizations, non-probabilistic methods cannot recover the differential geometric structure, but instead find mostly linear manifolds or spaces equipped with teleports, thereby highlighting geometric and probabilistic shortcomings of current deep generative models.

Only Bayes should learn a manifold

This document investigates learning of the differential geometric structure of a data manifold embedded in a high-dimensional Euclidean space, and partly extends the analysis to models based on neural networks, thereby highlighting geometric and probabilistic shortcomings of current deep generative models.

Maximum Likelihood Estimation of Riemannian Metrics from Euclidean Data

This work proposes to re-normalize likelihoods with respect to the usual Lebesgue measure of the data space, and to bound the likelihood when its exact value is unattainable.
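
One way to read this, with assumed notation: a density built from a Riemannian distance, p(x) \propto \exp(-d_G(x, \mu)^2 / 2\sigma^2), only becomes a proper density with respect to the Lebesgue measure of the data space after dividing by

C(\mu, \sigma, G) = \int \exp\!\big(-d_G(x, \mu)^2 / 2\sigma^2\big)\, dx ,

an integral that generally has no closed form; bounding the likelihood then amounts to bounding this normalization constant when it cannot be evaluated exactly.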

A Locally Adaptive Normal Distribution

This work develops a maximum likelihood algorithm that infers the distribution parameters through a combination of gradient descent and Monte Carlo integration, extends the LAND to mixture models, and provides the corresponding EM algorithm.
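
A hedged sketch of the construction (notation assumed): the LAND replaces the Euclidean difference x - \mu with the Riemannian logarithm map,

p(x \mid \mu, \Sigma) \propto \exp\!\big(-\tfrac{1}{2}\, \mathrm{Log}_\mu(x)^\top \Sigma^{-1} \mathrm{Log}_\mu(x)\big) ,

whose normalization constant has no closed form on a curved manifold; it is estimated by Monte Carlo while \mu and \Sigma are updated by gradient descent, and the same ingredients give rise to the EM algorithm for mixtures of LANDs.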

Isometric Gaussian Process Latent Variable Model for Dissimilarity Data

A fully generative model where the latent variable respects both the distances and the topology of the modeled data and can encode invariances in the learned manifolds is proposed.

Fast Approximate Geodesics for Deep Generative Models

This work proposes finding shortest paths in a finite graph of samples from the aggregate approximate posterior, that can be solved exactly, at greatly reduced runtime, and without a notable loss in quality.
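
A minimal sketch of this kind of graph-based shortcut (illustrative only: the function and variable names are assumptions, and decode stands for any trained decoder mapping a batch of latent points to data space): build a k-nearest-neighbour graph over latent samples, weight each edge by the length of the decoded straight-line segment between its endpoints, and read off shortest paths exactly with Dijkstra.

import numpy as np
from scipy.sparse import lil_matrix
from scipy.sparse.csgraph import dijkstra

def segment_length(decode, z_a, z_b, n_steps=8):
    """Approximate the length of the decoded straight line from z_a to z_b."""
    ts = np.linspace(0.0, 1.0, n_steps + 1)[:, None]
    curve = decode((1.0 - ts) * z_a + ts * z_b)   # points along the decoded segment
    return np.sum(np.linalg.norm(np.diff(curve, axis=0), axis=1))

def latent_geodesic_graph(decode, Z, k=10):
    """Sparse symmetric graph over latent samples Z (n x q) with decoded edge lengths."""
    n = Z.shape[0]
    W = lil_matrix((n, n))
    for i in range(n):
        d_latent = np.linalg.norm(Z - Z[i], axis=1)
        for j in np.argsort(d_latent)[1:k + 1]:   # k nearest neighbours, skipping i itself
            w = segment_length(decode, Z[i], Z[j])
            W[i, j] = w
            W[j, i] = w
    return W.tocsr()

# Usage sketch: Z holds samples from the aggregate approximate posterior.
# W = latent_geodesic_graph(decode, Z, k=10)
# dist, pred = dijkstra(W, directed=False, indices=source, return_predecessors=True)
# dist[t] then approximates the geodesic distance from Z[source] to Z[t].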

References

Showing 1-10 of 35 references

A Geometric take on Metric Learning

It is proved that, with appropriate changes, multi-metric learning corresponds to learning the structure of a Riemannian manifold, and it is shown that this structure gives a principled way to perform dimensionality reduction and regression according to the learned metrics.
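
One way to picture the construction (notation assumed): given local metrics M_1, \dots, M_K learned around anchor points, define a smoothly varying tensor

M(x) = \sum_j \tilde{w}_j(x)\, M_j , \qquad \tilde{w}_j(x) = \frac{w_j(x)}{\sum_k w_k(x)} ,

with smooth positive weights w_j; the resulting Riemannian structure yields geodesic distances under which dimensionality reduction and regression can be carried out consistently with the learned metrics.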

Probabilistic Solutions to Differential Equations and their Application to Riemannian Statistics

This work studies a probabilistic numerical method for the solution of both boundary and initial value problems that returns a joint Gaussian process posterior over the solution, which permits marginalising the uncertainty of the numerical solution so that statistics are less sensitive to inaccuracies.

Riemannian Geometry

The recent physical interpretation of intrinsic differential geometry of spaces has stimulated the study of this subject. Riemann proposed the generalisation, to spaces of any order, of Gauss's…

Local Distance Functions: A Taxonomy, New Algorithms, and an Evaluation

  • D. Ramanan, S. Baker
  • Computer Science
  • IEEE Transactions on Pattern Analysis and Machine Intelligence
  • 2011
A taxonomy for local distance functions, in which most existing algorithms can be regarded as approximations of the geodesic distance defined by a metric tensor, is introduced, together with hybrid algorithms that use a combination of techniques to ameliorate overfitting.

A Unifying Probabilistic Perspective for Spectral Dimensionality Reduction: Insights and New Models

A new perspective on spectral dimensionality reduction is introduced which views these methods as Gaussian Markov random fields (GRFs), together with a variant of LLE that performs maximum likelihood exactly: Acyclic LLE (ALLE).

Nonlinear Dimensionality Reduction

The purpose of the book is to summarize clear facts and ideas about well-known methods as well as recent developments in the topic of nonlinear dimensionality reduction, which encompasses many recently developed methods.

Probabilistic Non-linear Principal Component Analysis with Gaussian Process Latent Variable Models

A novel probabilistic interpretation of principal component analysis (PCA) is proposed, based on a Gaussian process latent variable model (GP-LVM) and related to popular spectral techniques such as kernel PCA and multidimensional scaling.
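
For reference, the backbone of the model in its standard form (stated as background, not quoted from the paper): with latent points X and centred data Y \in \mathbb{R}^{N \times D}, the nonlinear mapping is marginalized under a GP prior, giving the likelihood

p(Y \mid X) = \prod_{d=1}^{D} \mathcal{N}\!\big(y_{:,d} \mid 0,\; K_{XX} + \sigma^2 I\big) ,

which is optimized (or treated in a Bayesian fashion) with respect to X; choosing a linear kernel recovers probabilistic PCA, which is what links the GP-LVM to the spectral techniques mentioned above.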

Variational Gaussian Process Dynamical Systems

This work builds on recent variational approximations for Gaussian process latent variable models to allow for nonlinear dimensionality reduction simultaneously with learning a dynamical prior in the latent space.

Priors for people tracking from small training sets

It is shown that the SGPLVM sufficiently constrains the problem such that tracking can be accomplished with straightforward deterministic optimization.

Gaussian Process Dynamical Models for Human Motion

This work marginalizes out the model parameters in closed form by using Gaussian process priors for both the dynamical and the observation mappings, which results in a nonparametric model for dynamical systems that accounts for uncertainty in the model.