A Locally Adaptive Normal Distribution
@inproceedings{Arvanitidis2016ALA,
  title     = {A Locally Adaptive Normal Distribution},
  author    = {Georgios Arvanitidis and Lars Kai Hansen and S{\o}ren Hauberg},
  booktitle = {NIPS},
  year      = {2016}
}
The multivariate normal density is a monotonic function of the distance to the mean, and its ellipsoidal shape is due to the underlying Euclidean metric. We suggest replacing this metric with a locally adaptive, smoothly changing (Riemannian) metric that favors regions of high local density. The resulting locally adaptive normal distribution (LAND) is a generalization of the normal distribution to the "manifold" setting, where data is assumed to lie near a potentially low-dimensional manifold…
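As a rough illustration of the idea (a toy sketch, not the authors' algorithm): a kernel-weighted local covariance can define a position-dependent metric under which a step along densely sampled directions is short and a step away from the data is long. The bandwidth `sigma` and regularizer `rho` below are illustrative choices, and the parabola dataset is invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy data lying near a curved one-dimensional manifold (a parabola) in 2-D.
t = rng.uniform(-2, 2, size=200)
X = np.c_[t, t ** 2] + 0.05 * rng.normal(size=(200, 2))

def local_metric(x, X, sigma=0.5, rho=1e-2):
    """Inverse of a kernel-weighted local covariance at x: directions with
    little nearby data receive large metric weight, i.e. large distances."""
    w = np.exp(-np.sum((X - x) ** 2, axis=1) / (2 * sigma ** 2))
    d = X - x
    cov = (w[:, None] * d).T @ d / w.sum() + rho * np.eye(X.shape[1])
    return np.linalg.inv(cov)

# At a point on the parabola, compare the metric length of a unit Euclidean
# step along the data with a unit step away from it.
x = np.array([1.0, 1.0])
M = local_metric(x, X)
along = np.array([1.0, 2.0]) / np.sqrt(5.0)  # tangent of y = x^2 at x = 1
off = np.array([-2.0, 1.0]) / np.sqrt(5.0)   # normal direction
len_along = np.sqrt(along @ M @ along)
len_off = np.sqrt(off @ M @ off)
# len_off > len_along: the metric favors the region of high local density.
```

Note that LAND measures geodesic distances under a smoothly varying metric; this toy evaluates the metric only at a single point.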
17 Citations
Maximum Likelihood Estimation of Riemannian Metrics from Euclidean Data
- Mathematics, Computer Science · GSI
- 2017
This work proposes to re-normalize likelihoods with respect to the usual Lebesgue measure of the data space, and to bound the likelihood when its exact value is unattainable.
Directional Statistics with the Spherical Normal Distribution
- Mathematics · 2018 21st International Conference on Information Fusion (FUSION)
- 2018
This work develops efficient inference techniques for data distributed according to the curvature-aware spherical normal distribution, and derives closed-form expressions for the normalization constant when the distribution is isotropic, and a fast and accurate approximation for the anisotropic case on the two-sphere.
Bayesian Quadrature on Riemannian Data Manifolds
- Mathematics, Computer Science · ICML
- 2021
This work focuses on Bayesian quadrature (BQ) to numerically compute integrals over normal laws on Riemannian manifolds learned from data, and shows that by leveraging both prior knowledge and an active exploration scheme, BQ outperforms Monte Carlo methods on a wide range of integration problems.
A prior-based approximate latent Riemannian metric
- Computer Science · AISTATS
- 2022
This work proposes a surrogate conformal Riemannian metric in the latent space of a generative model that is simple, efficient and robust, and shows the applicability of the proposed methodology for data analysis in the life sciences.
Least-Squares Log-Density Gradient Clustering for Riemannian Manifolds
- Computer Science · AISTATS
- 2017
This paper proposes a novel mode-seeking algorithm for Riemannian manifolds with direct log-density gradient estimation and provides a mathematically sound algorithm and demonstrates its usefulness through experiments.
Variational Autoencoders with Riemannian Brownian Motion Priors
- Computer Science · ICML
- 2020
This work assumes a Riemannian structure over the latent space, which constitutes a more principled geometric view of the latent codes, replaces the standard Gaussian prior with a Riemannian Brownian motion prior, and demonstrates that this prior significantly increases model capacity using only one additional scalar parameter.
Computational and statistical methods for trajectory analysis in a Riemannian geometry setting. (Méthodes numériques et statistiques pour l'analyse de trajectoire dans un cadre de géométrie riemannienne)
- Mathematics
- 2019
This PhD thesis proposes new Riemannian geometry tools for the analysis of longitudinal observations of neuro-degenerative subjects, introducing a numerical scheme to compute the parallel transport along geodesics and tackling the issue of Riemannian manifold learning.
Riemannian Metric Learning via Optimal Transport
- Computer Science, Mathematics · ArXiv
- 2022
An optimal transport-based model for learning a metric tensor from cross-sectional samples of evolving probability measures on a common Riemannian manifold is introduced, and it is shown that metrics learned using this method improve the quality of trajectory inference on scRNA and bird migration data at the cost of only a few additional cross-sections.
Latent Space Oddity: on the Curvature of Deep Generative Models
- Computer Science · ICLR
- 2018
This work shows that the nonlinearity of the generator implies that the latent space gives a distorted view of the input space, that this distortion can be characterized by a stochastic Riemannian metric, and that distances and interpolants are significantly improved under this metric.
Fast and Robust Shortest Paths on Manifolds Learned from Data
- Computer Science · AISTATS
- 2019
A fast, simple, and robust algorithm for computing shortest paths and distances on Riemannian manifolds learned from data that enhances the stability of the solver while reducing the computational cost.
References
SHOWING 1-10 OF 25 REFERENCES
Geodesic Finite Mixture Models
- Computer Science, Mathematics · BMVC
- 2014
We present a novel approach for learning a finite mixture model on a Riemannian manifold in which Euclidean metrics are not applicable and one needs to resort to geodesic distances consistent with…
Metrics for Probabilistic Geometries
- Computer Science, Mathematics · UAI
- 2014
The geometrical structure of probabilistic generative dimensionality reduction models is investigated using the tools of Riemannian geometry, and it is shown that distances respecting the expected metric lead to more appropriate generation of new data.
Intrinsic Statistics on Riemannian Manifolds: Basic Tools for Geometric Measurements
- Mathematics · Journal of Mathematical Imaging and Vision
- 2006
This paper provides a new proof of the characterization of Riemannian centers of mass and an original gradient descent algorithm to efficiently compute them, and develops the notions of mean value and covariance matrix of a random element, normal law, Mahalanobis distance, and χ² law.
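A minimal sketch of the kind of gradient descent for the Riemannian center of mass (Fréchet mean) described above, specialized to the unit sphere. The `log_map`/`exp_map` helpers are the standard sphere formulas, the example points are invented, and the iteration count is an illustrative choice, not code from the paper.

```python
import numpy as np

def log_map(p, q):
    """Sphere log map: tangent vector at p whose exp reaches q."""
    v = q - (p @ q) * p
    nv = np.linalg.norm(v)
    if nv < 1e-12:
        return np.zeros_like(p)
    theta = np.arccos(np.clip(p @ q, -1.0, 1.0))
    return theta * v / nv

def exp_map(p, v):
    """Sphere exp map: follow the geodesic from p in direction v."""
    nv = np.linalg.norm(v)
    if nv < 1e-12:
        return p
    return np.cos(nv) * p + np.sin(nv) * (v / nv)

def frechet_mean(points, iters=50):
    """Gradient descent on the sum of squared geodesic distances."""
    mu = points[0]
    for _ in range(iters):
        grad = np.mean([log_map(mu, q) for q in points], axis=0)
        mu = exp_map(mu, grad)  # unit step: the classical fixed-point iteration
    return mu

pts = np.array([[0.1, 0.0, 1.0], [-0.1, 0.0, 1.0],
                [0.0, 0.1, 1.0], [0.0, -0.1, 1.0]])
pts /= np.linalg.norm(pts, axis=1, keepdims=True)
mu = frechet_mean(pts)  # by symmetry, close to the north pole (0, 0, 1)
```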
Probabilistic Solutions to Differential Equations and their Application to Riemannian Statistics
- Computer Science, Mathematics · AISTATS
- 2014
This work studies a probabilistic numerical method for the solution of both boundary and initial value problems that returns a joint Gaussian process posterior over the solution, permitting the uncertainty of the numerical solution to be marginalised so that statistics are less sensitive to inaccuracies.
Manifold Regularization: A Geometric Framework for Learning from Labeled and Unlabeled Examples
- Computer Science · J. Mach. Learn. Res.
- 2006
A semi-supervised framework that incorporates labeled and unlabeled data in a general-purpose learner is proposed and properties of reproducing kernel Hilbert spaces are used to prove new Representer theorems that provide theoretical basis for the algorithms.
Riemannian Geometry
- Mathematics · Nature
- 1927
The recent physical interpretation of intrinsic differential geometry of spaces has stimulated the study of this subject. Riemann proposed the generalisation, to spaces of any order, of Gauss's…
Laplacian Eigenmaps for Dimensionality Reduction and Data Representation
- Computer Science · Neural Computation
- 2003
This work proposes a geometrically motivated algorithm for representing high-dimensional data that provides a computationally efficient approach to nonlinear dimensionality reduction with locality-preserving properties and a natural connection to clustering.
Nonlinear dimensionality reduction by locally linear embedding.
- Computer Science · Science
- 2000
Locally linear embedding (LLE) is introduced, an unsupervised learning algorithm that computes low-dimensional, neighborhood-preserving embeddings of high-dimensional inputs and learns the global structure of nonlinear manifolds.
An F-Measure for Evaluation of Unsupervised Clustering with Non-Determined Number of Clusters
- Computer Science
- 2008
A method is suggested that adapts the F-measure for supervised classification to the unsupervised case and introduces a mapping matrix that is first constructed using the onset matching technique presented in [1], adapted to multiple classes.
A tutorial on spectral clustering
- Computer Science · Stat. Comput.
- 2007
This tutorial describes different graph Laplacians and their basic properties, presents the most common spectral clustering algorithms, and derives those algorithms from scratch by several different approaches.
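To make the Laplacian recipe concrete, a tiny sketch (the six-node toy adjacency matrix is my own construction, not from the tutorial): build the unnormalized Laplacian L = D − A and split the graph by the sign of the second-smallest eigenvector (the Fiedler vector).

```python
import numpy as np

# Two triangles joined by one weak (0.01) bridge edge.
A = np.array([
    [0.00, 1.00, 1.00, 0.01, 0.00, 0.00],
    [1.00, 0.00, 1.00, 0.00, 0.00, 0.00],
    [1.00, 1.00, 0.00, 0.00, 0.00, 0.00],
    [0.01, 0.00, 0.00, 0.00, 1.00, 1.00],
    [0.00, 0.00, 0.00, 1.00, 0.00, 1.00],
    [0.00, 0.00, 0.00, 1.00, 1.00, 0.00],
])

D = np.diag(A.sum(axis=1))
L = D - A                          # unnormalized graph Laplacian
eigvals, eigvecs = np.linalg.eigh(L)  # eigh returns ascending eigenvalues
fiedler = eigvecs[:, 1]            # second-smallest eigenvector
labels = (fiedler > 0).astype(int)  # sign pattern separates the two clusters
```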