Geometric Variational Inference

Philipp Frank, Reimar H. Leike, Torsten A. Ensslin
Efficiently accessing the information contained in non-linear and high-dimensional probability distributions remains a core challenge in modern statistics. Traditionally, estimators that go beyond point estimates are categorized as either Variational Inference (VI) or Markov chain Monte Carlo (MCMC) techniques. While MCMC methods that exploit the geometric properties of continuous probability distributions to increase their efficiency have been proposed, VI methods rarely make use of this geometry.

Geometric Variational Inference and Its Application to Bayesian Imaging

    P. Frank
    MaxEnt 2022 (2022)
GeoVI has recently been introduced as an accurate Variational Inference technique for nonlinear, unimodal probability distributions, enabling its application to real-world astrophysical imaging problems in millions of dimensions.

Information Field Theory and Artificial Intelligence

This paper reformulates the process of inference in IFT in terms of GNN training, suggesting that IFT is well suited to address many problems in AI and ML research and applications.

Butterfly Transforms for Efficient Representation of Spatially Variant Point Spread Functions in Bayesian Imaging

This work combines butterfly transforms in several ways into butterfly networks, compares the resulting architectures with respect to their performance, and identifies a representation suitable for efficiently representing a synthetic spatially variant point spread function to within 1% error.

Reconstructing the universe with variational self-boosted sampling

    C. Modi, Yin Li, D. Blei
    Journal of Cosmology and Astroparticle Physics (2023)
A hybrid scheme called variational self-boosted sampling (VBS) is developed to mitigate the drawbacks of both these algorithms by learning a variational approximation for the proposal distribution of Monte Carlo sampling and combining it with HMC.

Efficient Representations of Spatially Variant Point Spread Functions with Butterfly Transforms in Bayesian Imaging Algorithms

This work discusses the application of butterfly transforms: linear neural network structures whose size scales subquadratically with the number of data points and whose shape is inspired by the structure of the Cooley–Tukey fast Fourier transform.
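As a hedged illustration of the underlying structure (not the trained networks of the paper), a butterfly transform on n points can be written as log2(n) sparse mixing layers, each combining pairs of entries at one stride of the Cooley–Tukey access pattern; with every 2×2 block fixed to [[1, 1], [1, -1]] the product reduces to the Hadamard transform. All names below are illustrative:

```python
import numpy as np

def butterfly_apply(layers, x):
    """Apply log2(n) sparse butterfly layers to x.

    Each layer holds n/2 independent 2x2 blocks that mix pairs of entries
    at one stride, so parameters scale as O(n log n) -- subquadratic."""
    y = x.astype(float).copy()
    n, s = y.size, 1
    for W in layers:                      # W has shape (n//2, 2, 2)
        idx = np.arange(n)
        top = idx[(idx & s) == 0]         # pair partners differ in bit log2(s)
        bot = top + s
        u, v = y[top], y[bot]
        y[top] = W[:, 0, 0] * u + W[:, 0, 1] * v
        y[bot] = W[:, 1, 0] * u + W[:, 1, 1] * v
        s *= 2
    return y

# sanity check: identical [[1, 1], [1, -1]] blocks give the Hadamard transform
n = 8
H2 = np.array([[1.0, 1.0], [1.0, -1.0]])
layers = [np.broadcast_to(H2, (n // 2, 2, 2)) for _ in range(3)]
dense = np.stack([butterfly_apply(layers, e) for e in np.eye(n)], axis=1)
```

In a learned butterfly network the 2×2 blocks would be free parameters; the fixed Hadamard blocks here only verify that the sparsity pattern reproduces a known fast transform.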

Sparse Kernel Gaussian Processes through Iterative Charted Refinement (ICR)

A new generative method named Iterative Charted Refinement (ICR) is presented to model GPs on nearly arbitrarily spaced points in O(N) time for decaying kernels without nested optimizations; its accuracy is comparable to state-of-the-art GP methods.

Probabilistic Autoencoder Using Fisher Information

In this work, an extension to the autoencoder architecture is introduced, the FisherNet, which has theoretical advantages: it provides direct uncertainty quantification derived from the model and also accounts for uncertainty cross-correlations.

A Geometric Variational Approach to Bayesian Inference

A Riemannian geometric framework for variational inference in Bayesian models, based on the nonparametric Fisher–Rao metric on the manifold of probability density functions, is proposed, together with a novel gradient-based algorithm for the variational problem built on Fréchet derivative operators.

Metric Gaussian Variational Inference

The proposed Metric Gaussian Variational Inference (MGVI) is an iterative method that performs a series of Gaussian approximations to the posterior; it achieves linear scaling by never storing the covariance explicitly.
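A minimal sketch of the matrix-free idea, assuming the metric is available only as a matrix–vector product (the toy metric and all names below are illustrative, not the actual MGVI implementation): the covariance, taken as the inverse metric, is applied by a conjugate-gradient solve, so the matrix itself is never built or stored.

```python
import numpy as np

def cg_solve(matvec, b, tol=1e-10):
    """Conjugate gradients with the matrix given only as a matvec closure."""
    x = np.zeros_like(b)
    r = b - matvec(x)
    p = r.copy()
    rs = r @ r
    for _ in range(b.size):
        Ap = matvec(p)
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol * np.linalg.norm(b):
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

# covariance = inverse metric, applied by solving M x = v matrix-free
n = 1000
a = np.random.default_rng(0).normal(size=n) / np.sqrt(n)
metric_vp = lambda v: v + a * (a @ v)   # toy metric M = I + a a^T

v = np.ones(n)
x = cg_solve(metric_vp, v)              # x = M^{-1} v, no n x n matrix stored
```

Storing the dense covariance would cost O(n^2) memory; the solve above needs only a handful of metric-vector products.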

Variational Inference: A Review for Statisticians

Variational inference (VI), a method from machine learning that approximates probability densities through optimization, is reviewed and a variant that uses stochastic optimization to scale up to massive data is derived.

Variational Inference with Normalizing Flows

It is demonstrated that the theoretical advantages of having posteriors that better match the true posterior, combined with the scalability of amortized variational approaches, provides a clear improvement in performance and applicability of variational inference.
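For illustration, a single planar flow layer, one of the flow families considered in this line of work, transforms base samples and tracks the log-density change through the matrix determinant lemma; the parameter values below are arbitrary, hedged choices:

```python
import numpy as np

def planar_flow(z, u, w, b):
    """One planar flow step f(z) = z + u * tanh(w.z + b).

    The Jacobian is I + u psi(z)^T, so by the matrix determinant lemma
    log|det J| = log|1 + psi(z).u| with psi(z) = tanh'(w.z + b) * w."""
    a = np.tanh(z @ w + b)                  # (N,)
    f = z + np.outer(a, u)                  # (N, d)
    psi = (1 - a**2)[:, None] * w           # (N, d)
    log_det = np.log(np.abs(1 + psi @ u))   # (N,)
    return f, log_det

rng = np.random.default_rng(0)
d, N = 2, 5
z = rng.normal(size=(N, d))                 # samples from the base N(0, I)
u, w, b = np.array([0.5, -0.3]), np.array([1.0, 1.0]), 0.1

# change of variables: log q1(f(z)) = log q0(z) - log|det J|
log_q0 = -0.5 * (z**2).sum(1) - 0.5 * d * np.log(2 * np.pi)
f, log_det = planar_flow(z, u, w, b)
log_q1 = log_q0 - log_det
```

Stacking many such layers yields the richer posterior families the paper argues for; here w·u = 0.2 > -1, which keeps the single layer invertible.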

Auto-Encoding Variational Bayes

A stochastic variational inference and learning algorithm that scales to large datasets and, under some mild differentiability conditions, even works in the intractable case is introduced.
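The key device here is the reparameterization trick: writing z ~ N(mu, sigma^2) as z = mu + sigma*eps with eps ~ N(0, 1), so that gradients flow through the sampling step. A minimal numpy sketch with an illustrative objective f(z) = z^2 (not the paper's encoder/decoder setup):

```python
import numpy as np

rng = np.random.default_rng(1)

# target: gradients of E_{z ~ N(mu, sigma^2)}[z^2] w.r.t. mu and sigma
mu, sigma = 1.5, 0.8

# reparameterize: z = mu + sigma * eps with eps ~ N(0, 1), so each sample
# is a differentiable function of (mu, sigma)
eps = rng.normal(size=200_000)
z = mu + sigma * eps

# pathwise (reparameterization) gradient estimates: E[f'(z) * dz/dtheta]
grad_mu = np.mean(2.0 * z)           # dz/dmu = 1
grad_sigma = np.mean(2.0 * z * eps)  # dz/dsigma = eps

# analytic values for comparison: 2*mu = 3.0 and 2*sigma = 1.6
```

These low-variance pathwise gradients are what make stochastic optimization of the variational objective practical at scale.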

Automatic Differentiation Variational Inference

Automatic differentiation variational inference (ADVI) is developed: the scientist provides only a probabilistic model and a dataset, nothing else, and the algorithm automatically derives an efficient variational inference scheme, freeing the scientist to refine and explore many models.

Stochastic variational inference

Stochastic variational inference lets us apply complex Bayesian models to massive data sets, and it is shown that the Bayesian nonparametric topic model outperforms its parametric counterpart.

A stochastic estimator of the trace of the influence matrix for laplacian smoothing splines

An unbiased stochastic estimator of tr(I-A), where A is the influence matrix associated with the calculation of Laplacian smoothing splines, is described.
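The estimator can be sketched in a few lines: for a matrix B accessible only through matrix–vector products, z^T B z with random ±1 (Rademacher) entries in z is an unbiased estimate of tr(B). The toy matrix below stands in for I - A and is purely illustrative:

```python
import numpy as np

def hutchinson_trace(matvec, n, n_probes, seed=0):
    """Unbiased stochastic trace estimate: E[z^T B z] = tr(B) for
    Rademacher probes z, needing only matrix-vector products with B."""
    rng = np.random.default_rng(seed)
    est = 0.0
    for _ in range(n_probes):
        z = rng.choice([-1.0, 1.0], size=n)
        est += z @ matvec(z)
    return est / n_probes

# toy influence-style matrix: B = I - A, accessed only via its matvec
n = 50
A = np.random.default_rng(42).normal(size=(n, n)) / n
matvec = lambda z: z - A @ z

estimate = hutchinson_trace(matvec, n, n_probes=2000)
exact = n - np.trace(A)   # available here only because the toy A is dense
```

In the spline setting the matvec amounts to one smoothing-spline fit per probe, which is exactly why the estimator avoids ever forming A.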

A General Metric for Riemannian Manifold Hamiltonian Monte Carlo

A new metric for RMHMC without these limitations is proposed, and its success is verified on a distribution that emulates many hierarchical and latent models.