Geometric Variational Inference
@article{Frank2021GeometricVI,
  title   = {Geometric Variational Inference},
  author  = {Philipp Frank and Reimar H. Leike and Torsten A. Ensslin},
  journal = {Entropy},
  year    = {2021},
  volume  = {23}
}
Efficiently accessing the information contained in non-linear and high-dimensional probability distributions remains a core challenge in modern statistics. Traditionally, estimators that go beyond point estimates are categorized as either Variational Inference (VI) or Markov Chain Monte Carlo (MCMC) techniques. While MCMC methods that utilize the geometric properties of continuous probability distributions to increase their efficiency have been proposed, VI methods rarely use the geometry. This…
8 Citations
Geometric Variational Inference and Its Application to Bayesian Imaging
- 2022
Computer Science, Mathematics
MaxEnt 2022
GeoVI has recently been introduced as an accurate variational inference technique for nonlinear, unimodal probability distributions; its efficiency enables application to real-world astrophysical imaging problems in millions of dimensions.
Information Field Theory and Artificial Intelligence
- 2022
Computer Science
Entropy
This paper reformulates inference in information field theory (IFT) in terms of GNN training, suggesting that IFT is well suited to address many problems in AI and ML research and application.
Butterfly Transforms for Efficient Representation of Spatially Variant Point Spread Functions in Bayesian Imaging
- 2023
Computer Science
Entropy
This work combines butterfly transforms in several ways into butterfly networks, compares the resulting architectures with respect to their performance, and identifies one that efficiently represents a synthetic spatially variant point spread function to within a 1% error.
Reconstructing the universe with variational self-boosted sampling
- 2023
Computer Science
Journal of Cosmology and Astroparticle Physics
A hybrid scheme, called variational self-boosted sampling (VBS), is developed to mitigate the drawbacks of both these algorithms by learning a variational approximation for the proposal distribution of Monte Carlo sampling and combining it with HMC.
Efficient Representations of Spatially Variant Point Spread Functions with Butterfly Transforms in Bayesian Imaging Algorithms
- 2022
Computer Science
MaxEnt 2022
This work discusses the application of butterfly transforms, which are linear neural network structures whose sizes scale subquadratically with the number of data points and whose shapes are inspired by the structure of the Cooley–Tukey fast Fourier transform.
Sparse Kernel Gaussian Processes through Iterative Charted Refinement (ICR)
- 2022
Computer Science
ArXiv
A new generative method named Iterative Charted Refinement (ICR) is presented that models GPs on nearly arbitrarily spaced points in O(N) time for decaying kernels, without nested optimizations, with accuracy comparable to state-of-the-art GP methods.
Probabilistic Autoencoder Using Fisher Information
- 2021
Computer Science
Entropy
In this work, an extension to the autoencoder architecture is introduced, the FisherNet, which has advantages from a theoretical point of view as it provides a direct uncertainty quantification derived from the model and also accounts for uncertainty cross-correlations.
40 References
A Geometric Variational Approach to Bayesian Inference
- 2020
Computer Science
Journal of the American Statistical Association
A Riemannian geometric framework for variational inference in Bayesian models is proposed, based on the nonparametric Fisher–Rao metric on the manifold of probability density functions, together with a novel gradient-based algorithm for the variational problem built on Fréchet derivative operators motivated by this geometry.
Metric Gaussian Variational Inference
- 2019
Computer Science
ArXiv
The proposed Metric Gaussian Variational Inference (MGVI) is an iterative method that performs a series of Gaussian approximations to the posterior and achieves linear scaling by never storing the covariance explicitly.
Variational Inference: A Review for Statisticians
- 2016
Computer Science
ArXiv
Variational inference (VI), a method from machine learning that approximates probability densities through optimization, is reviewed and a variant that uses stochastic optimization to scale up to massive data is derived.
Variational Inference with Normalizing Flows
- 2015
Computer Science, Mathematics
ICML
It is demonstrated that the theoretical advantage of posteriors that better match the true posterior, combined with the scalability of amortized variational approaches, provides a clear improvement in the performance and applicability of variational inference.
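The core mechanism behind normalizing flows is the change-of-variables formula: each invertible layer transforms a sample and contributes a tractable log-determinant of its Jacobian, so the density of the transformed sample stays computable. A minimal sketch of one planar-flow layer (the simple layer family from the Rezende & Mohamed paper), using NumPy rather than any particular flow library:

```python
import numpy as np

def planar_flow(z, u, w, b):
    """One planar-flow layer f(z) = z + u * tanh(w.z + b).

    Returns the transformed point and log|det J|, the correction term
    that the change-of-variables formula subtracts from the log-density.
    """
    a = np.tanh(w @ z + b)
    f = z + u * a
    psi = (1.0 - a**2) * w               # gradient of tanh(w.z + b) w.r.t. z
    log_det = np.log(np.abs(1.0 + u @ psi))
    return f, log_det

# Push a standard-normal sample through the flow and track its log-density.
rng = np.random.default_rng(1)
d = 2
z = rng.normal(size=d)
u, w, b = np.array([0.5, -0.3]), np.array([1.0, 0.2]), 0.1

log_q0 = -0.5 * (z @ z) - 0.5 * d * np.log(2.0 * np.pi)  # base density
f, log_det = planar_flow(z, u, w, b)
log_q1 = log_q0 - log_det                 # density of f under the flow
```

In a real flow, many such layers are stacked and their parameters are optimized to minimize the KL divergence to the target posterior; the log-det terms simply accumulate.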
Auto-Encoding Variational Bayes
- 2014
Computer Science
ICLR
A stochastic variational inference and learning algorithm that scales to large datasets and, under some mild differentiability conditions, even works in the intractable case is introduced.
Automatic Differentiation Variational Inference
- 2017
Computer Science
J. Mach. Learn. Res.
Automatic differentiation variational inference (ADVI) is developed: the scientist provides only a probabilistic model and a dataset, and the algorithm automatically derives an efficient variational inference procedure, freeing the scientist to refine and explore many models.
Stochastic variational inference
- 2013
Computer Science
J. Mach. Learn. Res.
Stochastic variational inference lets us apply complex Bayesian models to massive data sets, and it is shown that the Bayesian nonparametric topic model outperforms its parametric counterpart.
A stochastic estimator of the trace of the influence matrix for laplacian smoothing splines
- 1989
Mathematics
An unbiased stochastic estimator of tr(I-A), where A is the influence matrix associated with the calculation of Laplacian smoothing splines, is described. The estimator is similar to one recently…
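The estimator described here is what is now commonly called the Hutchinson trace estimator: for a linear operator M accessible only through matrix-vector products, E[zᵀMz] = tr(M) when the probe vectors z have zero-mean, unit-variance i.i.d. entries. A minimal sketch with Rademacher probes, applied to tr(I − A) as in the paper's smoothing-spline setting (the matrix A below is just a placeholder, not an actual influence matrix):

```python
import numpy as np

rng = np.random.default_rng(0)

def hutchinson_trace(matvec, n, num_samples=200):
    """Unbiased stochastic estimate of tr(M) for a linear operator.

    Only matrix-vector products matvec(v) = M @ v are required, so M
    never has to be formed explicitly. Rademacher probes (entries +/-1)
    satisfy E[z.T @ M @ z] = tr(M) and have low variance.
    """
    estimates = []
    for _ in range(num_samples):
        z = rng.choice([-1.0, 1.0], size=n)   # Rademacher probe vector
        estimates.append(z @ matvec(z))
    return float(np.mean(estimates))

# Example: estimate tr(I - A) for a small symmetric A without ever
# forming I - A as a matrix.
n = 50
A = rng.normal(size=(n, n))
A = 0.05 * (A + A.T)
est = hutchinson_trace(lambda v: v - A @ v, n)
exact = np.trace(np.eye(n) - A)
```

The estimate concentrates around the exact trace as the number of probes grows; in large-scale settings (such as the covariance traces in MGVI-style methods) matvec would be an implicit operator rather than a stored matrix.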
A General Metric for Riemannian Manifold Hamiltonian Monte Carlo
- 2013
Mathematics
GSI
A new metric for RMHMC that avoids the limitations of previous choices is proposed, and its success is verified on a distribution that emulates many hierarchical and latent models.