• Corpus ID: 211096566

Variational Autoencoders with Riemannian Brownian Motion Priors

@article{Kalatzis2020VariationalAW,
  title={Variational Autoencoders with Riemannian Brownian Motion Priors},
  author={Dimitris Kalatzis and David Eklund and Georgios Arvanitidis and S{\o}ren Hauberg},
  journal={ArXiv},
  year={2020},
  volume={abs/2002.05227}
}
Variational Autoencoders (VAEs) represent the given data in a low-dimensional latent space, which is generally assumed to be Euclidean. This assumption naturally leads to the common choice of a standard Gaussian prior over continuous latent variables. Recent work has, however, shown that this prior has a detrimental effect on model capacity, leading to subpar performance. We propose that the Euclidean assumption lies at the heart of this failure mode. To counter this, we assume a Riemannian… 
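The following is a minimal sketch, not the authors' code, of the two ingredients the abstract points to: a Riemannian metric on the latent space obtained by pulling the Euclidean data-space metric back through the decoder, and a crude short-time approximation of a Riemannian Brownian motion transition density used as a prior. The toy decoder, the function names, and the Euclidean stand-in for the geodesic distance are all illustrative assumptions.

import jax
import jax.numpy as jnp

def decoder(z):
    # Toy stand-in for the VAE decoder mean, mapping R^2 -> R^5.
    W = jnp.arange(1.0, 11.0).reshape(5, 2) / 10.0
    return jnp.tanh(W @ z)

def pullback_metric(z):
    # G(z) = J(z)^T J(z): the metric the decoder induces on the latent space.
    J = jax.jacfwd(decoder)(z)
    return J.T @ J

def log_brownian_prior(z, z0=jnp.zeros(2), t=1.0):
    # Heat-kernel-style short-time approximation of a Brownian motion started
    # at z0 and run for time t, written w.r.t. the Lebesgue measure:
    #   log p ~ -(d/2) log(2*pi*t) - dist(z0, z)^2 / (2t) + (1/2) log det G(z).
    # The geodesic distance is replaced by the Euclidean one here for brevity;
    # the paper itself works with a more careful approximation.
    d = z.shape[0]
    G = pullback_metric(z)
    sq_dist = jnp.sum((z - z0) ** 2)
    return (-0.5 * d * jnp.log(2 * jnp.pi * t)
            - sq_dist / (2 * t)
            + 0.5 * jnp.linalg.slogdet(G)[1])

z = jnp.array([0.3, -0.2])
print(pullback_metric(z))    # 2x2 symmetric positive (semi-)definite matrix
print(log_brownian_prior(z))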

Figures and Tables from this paper

Citations

A Metric Space for Point Process Excitations
TLDR
A Hidden Hawkes Geometry model is proposed to uncover the hidden geometry between event excitations in a multivariate point process, and it is demonstrated that learning the embedding alongside a point process uncovers salient interactions in a broad range of applications.
Preventing Manifold Intrusion with Locality: Local Mixup
TLDR
In constrained settings it is demonstrated that Local Mixup can create a trade-off between bias and variance, with the extreme cases reducing to vanilla training and classical Mixup.
Reactive Motion Generation on Learned Riemannian Manifolds
TLDR
It is argued that Riemannian manifolds may be learned via human demonstrations in which geodesics are natural motion skills, and a technique is proposed for facilitating on-the-fly end-effector/multiple-limb obstacle avoidance by reshaping the learned manifold using an obstacle-aware ambient metric.
Visualizing Riemannian data with Rie-SNE
TLDR
The classic stochastic neighbor embedding algorithm is extended to data on general Riemannian manifolds and it is demonstrated that the approach also allows for mapping data from one manifold to another, e.g. from a high-dimensional sphere to a low-dimensional one.
Boltzmann Tuning of Generative Models
TLDR
The proposed approach, called Boltzmann Tuning of Generative Models (BTGM), applies to a wide range of applications, covers conditional generative modelling as a particular case, and offers an affordable alternative to rejection sampling.
Data Augmentation in High Dimensional Low Sample Size Setting Using a Geometry-Based Variational Autoencoder
TLDR
A new method to perform data augmentation reliably in the High Dimensional Low Sample Size (HDLSS) setting using a geometry-based variational autoencoder: the latent space of the VAE is modeled as a Riemannian manifold, and a new generation scheme produces more meaningful samples, especially in the context of small data sets.
Data Generation in Low Sample Size Setting Using Manifold Sampling and a Geometry-Aware VAE
TLDR
Two prior-independent generation procedures based on the geometry of the latent space, seen as a Riemannian manifold, which greatly improve classification results on the OASIS database: balanced accuracy jumps from 80.7% for a classifier trained on the raw data to 89.1% when trained only on the synthetic data generated by the method.
EditVAE: Unsupervised Part-Aware Controllable 3D Point Cloud Shape Generation
TLDR
A latent representation of the point cloud, which can be decomposed into a disentangled representation for each part of the shape, is introduced, and the inductive bias introduced by the joint modeling approach yields state-of-the-art experimental results on the ShapeNet dataset.
GELATO: Geometrically Enriched Latent Model for Offline Reinforcement Learning
TLDR
This work imposes a latent representation of states and actions, leverages its intrinsic Riemannian geometry to measure the distance of latent samples to the data, and integrates this metric into a model-based offline optimization framework in which proximity and uncertainty can be carefully controlled.
Interventional Assays for the Latent Space of Autoencoders
TLDR
This work investigates "holes" in the representation to quantitatively ascertain to what extent the latent space of a trained VAE is consistent with the chosen prior, and uses the identified structure to improve interpolation between latent vectors.

References

SHOWING 1-10 OF 45 REFERENCES
Reliable training and estimation of variance networks
TLDR
A locally aware mini-batching scheme that results in sparse, robust gradients is derived, and a heuristic for robustly fitting both the mean and variance networks post hoc is formulated.
Directional Statistics with the Spherical Normal Distribution
  • Søren Hauberg
  • Mathematics
    2018 21st International Conference on Information Fusion (FUSION)
  • 2018
TLDR
This work develops efficient inference techniques for data distributed by the curvature-aware spherical normal distribution, and derives closed-form expressions for the normalization constant when the distribution is isotropic, and a fast and accurate approximation for the anisotropic case on the two-sphere.
Latent Space Oddity: on the Curvature of Deep Generative Models
TLDR
This work shows that the nonlinearity of the generator implies that the latent space gives a distorted view of the input space, shows that this distortion can be characterized by a stochastic Riemannian metric, and demonstrates that distances and interpolants are significantly improved under this metric.
Only Bayes should learn a manifold (on the estimation of differential geometric structure from data)
TLDR
This work first analyzes kernel-based algorithms and shows that under the usual regularizations, non-probabilistic methods cannot recover the differential geometric structure, but instead find mostly linear manifolds or spaces equipped with teleports, thereby highlighting geometric and probabilistic shortcomings of current deep generative models.
Back to the Future: Radial Basis Function Network Revisited
TLDR
This paper revisits some of the older approaches to training RBF networks from a more modern perspective and provides a theoretical analysis of two common regularization procedures, one based on the square norm of the coefficients in the network and another one using centers obtained by k-means clustering.
Variational Diffusion Autoencoders with Random Walk Sampling
TLDR
A principled measure for recognizing the mismatch between data and latent distributions and a method that combines the advantages of variational inference and diffusion maps to learn a homeomorphic generative model are proposed.
2019) highlighted the importance of optimizing the mean and variance components separately when training VAEs with Gaussian generative models
  • 2019
A Differentiable Gaussian-like Distribution on Hyperbolic Space for Gradient-Based Learning
TLDR
A novel hyperbolic distribution called pseudo-hyperbolic Gaussian, a Gaussian-like distribution on hyperbolic space whose density can be evaluated analytically and differentiated with respect to the parameters, enables the gradient-based learning of the probabilistic models on hyperbolic space that could never have been considered before.
A Wrapped Normal Distribution on Hyperbolic Space for Gradient-Based Learning
TLDR
A novel hyperbolic distribution called pseudo-hyperbolic Gaussian, a Gaussian-like distribution on hyperbolic space whose density can be evaluated analytically and differentiated with respect to the parameters, enables the gradient-based learning of the probabilistic models on hyperbolic space that could never have been considered before. (A sketch of this wrapped-normal construction is given after this reference list.)
Continuous Hierarchical Representations with Poincaré Variational Auto-Encoders
TLDR
This work endows VAEs with a Poincaré ball model of hyperbolic geometry as a latent space and rigorously derives the necessary methods to work with two main Gaussian generalisations on that space.
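For the two hyperbolic-space entries above, which share a summary, the construction they describe can be sketched as follows. This is a reconstruction from standard hyperbolic geometry, not text from either paper; the symbols PT (parallel transport from the hyperboloid origin mu_0 to mu), exp_mu (exponential map at mu), and the Lorentzian norm follow common usage.

\begin{align*}
  \tilde{v} &\sim \mathcal{N}(0, \Sigma) \quad \text{in the tangent space } T_{\mu_0}\mathbb{H}^n, \\
  u &= \mathrm{PT}_{\mu_0 \to \mu}(\tilde{v}), \qquad z = \exp_\mu(u), \\
  \log p(z) &= \log \mathcal{N}(\tilde{v}; 0, \Sigma)
              - (n-1)\,\log \frac{\sinh \lVert u \rVert_{\mathcal{L}}}{\lVert u \rVert_{\mathcal{L}}} .
\end{align*}

The correction term is the log-determinant of the exponential map's Jacobian, which is what makes the density analytically evaluable and differentiable, as the shared summary states.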