Corpus ID: 211011067

Automatic structured variational inference

@article{Ambrogioni2021AutomaticSV,
  title={Automatic structured variational inference},
  author={Luca Ambrogioni and Max Hinne and Marcel van Gerven},
  journal={ArXiv},
  year={2021},
  volume={abs/2002.00643}
}
The aim of probabilistic programming is to automate every aspect of probabilistic inference in arbitrary probabilistic models (programs), so that the user can focus on modeling without dealing with ad-hoc inference methods. Gradient-based automatic-differentiation stochastic variational inference is an attractive default method for (differentiable) probabilistic programming, as it combines high performance with high computational efficiency. However, the…
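
To make the abstract's starting point concrete, here is a minimal sketch of gradient-based stochastic variational inference with the reparameterization trick; it is not the paper's ASVI construction, and all names and the toy Gaussian target are illustrative. Gradients are written out by hand to keep it dependency-free; a real probabilistic programming system would obtain them by automatic differentiation.

import numpy as np

rng = np.random.default_rng(0)

# Toy unnormalized target: log p(z) for N(target_mean, target_std^2).
target_mean, target_std = 3.0, 0.5
def grad_log_p(z):
    return -(z - target_mean) / target_std**2

# Variational parameters of q(z) = N(mu, exp(log_sigma)^2).
mu, log_sigma = 0.0, 0.0
lr, n_steps, n_samples = 0.05, 2000, 16

for _ in range(n_steps):
    sigma = np.exp(log_sigma)
    eps = rng.standard_normal(n_samples)
    z = mu + sigma * eps                              # reparameterization
    g = grad_log_p(z)
    grad_mu = g.mean()                                # d ELBO / d mu
    grad_log_sigma = (g * sigma * eps).mean() + 1.0   # + d entropy / d log_sigma
    mu += lr * grad_mu
    log_sigma += lr * grad_log_sigma

print(mu, np.exp(log_sigma))  # roughly (3.0, 0.5)
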

Citations

Nested Variational Inference
TLDR
NVI is developed, a family of methods that learn proposals for nested importance samplers by minimizing a forward or reverse KL divergence at each level of nesting; optimizing these nested objectives is observed to improve sample quality in terms of log average weight and effective sample size.
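
For reference, the two divergences mentioned are, in standard notation (textbook definitions, not specific to NVI):

\mathrm{KL}(p \,\|\, q) = \mathbb{E}_{p(z)}\big[\log p(z) - \log q(z)\big] \quad \text{(forward, mass-covering)}
\mathrm{KL}(q \,\|\, p) = \mathbb{E}_{q(z)}\big[\log q(z) - \log p(z)\big] \quad \text{(reverse, mode-seeking; the divergence standard VI minimizes)}
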
Automatic Backward Filtering Forward Guiding for Markov processes and graphical models
TLDR
This work backpropagates the information provided by observations through the model, transforming the generative (forward) model into a data-guided model; the guided model approximates the true conditional model, with a known likelihood ratio between the two.
ADAVI: Automatic Dual Amortized Variational Inference Applied To Pyramidal Bayesian Models
TLDR
A novel methodology that automatically produces a variational family dual to a target hierarchical Bayesian model (HBM), represented as a neural network whose parameterization is orders of magnitude smaller than that of a typical flow-based representation while maintaining expressivity.
Automatic variational inference with cascading flows
TLDR
Cascading flows are introduced, a new family of variational programs that can be constructed automatically from an input probabilistic program and can also be amortized automatically; they achieve much higher performance than both normalizing flows and ASVI on a large set of structured inference problems.
Embedded-model flows: Combining the inductive biases of model-free deep learning and explicit probabilistic modeling
TLDR
Embedded-model flows (EMF) are introduced, which alternate general-purpose transformations with structured layers that embed domain-specific inductive biases; they enable a high-performance form of variational inference in which the structure of the prior model is embedded in the variational architecture.
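
The general-purpose flow layers that EMF interleaves with structured layers can be illustrated by a single affine transform tracked with its log-determinant; this is a minimal sketch of a flow layer, not the EMF architecture itself, and all names are illustrative.

import numpy as np

def affine_flow(z, scale, shift):
    """Transform z -> x = z * exp(scale) + shift and return the
    log |det Jacobian| needed by the change-of-variables formula."""
    x = z * np.exp(scale) + shift
    log_det = np.sum(scale)   # Jacobian is diagonal with entries exp(scale)
    return x, log_det

# log q(x) = log q0(z) - log_det under the change of variables.
z = np.random.default_rng(0).standard_normal(4)
x, log_det = affine_flow(z, scale=np.full(4, 0.5), shift=np.zeros(4))
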
Black Box Variational Bayesian Model Averaging
TLDR
A variational Bayesian inference approach to Bayesian model averaging (BMA) is presented as a viable alternative to the standard solutions; it avoids many of the aforementioned pitfalls and can be readily applied to many models with little to no model-specific derivation.
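
The quantity BMA targets is the model-averaged posterior, a standard definition independent of the black-box method above:

p(\Delta \mid \mathcal{D}) \;=\; \sum_{k=1}^{K} p(\Delta \mid M_k, \mathcal{D})\, p(M_k \mid \mathcal{D}), \qquad p(M_k \mid \mathcal{D}) \propto p(\mathcal{D} \mid M_k)\, p(M_k),

where \Delta is the quantity of interest and M_1, \dots, M_K are the candidate models.
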
Knowledge Distillation via Constrained Variational Inference
TLDR
This paper proposes a framework for distilling the knowledge of a powerful discriminative model, such as a neural network, into more interpretable graphical models by constraining variational inference over the posterior variables with a similarity-preserving constraint.
Amortized Variational Inference for Simple Hierarchical Models
TLDR
This paper suggests an amortized approach in which shared parameters simultaneously represent all local distributions; this is similarly accurate to using a given joint distribution but remains feasible on datasets that are several orders of magnitude larger.
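
The amortization idea in this summary can be sketched as replacing per-group variational parameters with one shared map from local data to local parameters; this is a toy sketch under illustrative names, not the paper's architecture.

import numpy as np

# Non-amortized: one (mu_i, log_sigma_i) per group -> parameter count grows with data.
# Amortized: a single shared map f_W(x_i) -> parameter count fixed by dim(W).
def amortized_local_params(x_i, W, b):
    """Shared linear 'encoder' mapping group statistics to (mu, log_sigma)."""
    out = W @ x_i + b
    return out[0], out[1]

rng = np.random.default_rng(0)
W, b = rng.standard_normal((2, 3)) * 0.1, np.zeros(2)
x_group = rng.standard_normal(3)          # summary statistics of one group
mu, log_sigma = amortized_local_params(x_group, W, b)
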
Neuroscience-inspired perception-action in robotics: applying active inference for state estimation, control and self-perception
TLDR
This review discusses how neuroscience findings open up opportunities to improve current estimation and control algorithms in robotics, and argues that active inference, a mathematical formulation of how the brain resists a natural tendency to disorder, provides a unified recipe for potentially solving some of the major challenges in robotics.

References

Showing 1-10 of 46 references
Automatic Variational Inference in Stan
TLDR
An automatic variational inference algorithm, automatic differentiation variational inference (ADVI), is developed and implemented in Stan, a probabilistic programming system; it can be used on any model written in Stan.
Auto-Encoding Variational Bayes
TLDR
A stochastic variational inference and learning algorithm is introduced that scales to large datasets and, under some mild differentiability conditions, works even in the intractable case.
Automatic Differentiation Variational Inference
TLDR
Automatic differentiation variational inference (ADVI) is developed: the scientist provides only a probabilistic model and a dataset, nothing else, and the algorithm automatically derives an efficient variational inference scheme, freeing the scientist to refine and explore many models.
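
A key mechanical step in ADVI is mapping constrained parameters to an unconstrained space and correcting the density with the log-Jacobian of the transform; for a positive parameter theta = exp(zeta) that correction is simply zeta. This is a minimal sketch of the transform, not Stan's implementation, and the names are illustrative.

import numpy as np

def log_joint_unconstrained(zeta, log_joint_constrained):
    """ADVI-style change of variables for a positive parameter:
    theta = exp(zeta), log |d theta / d zeta| = zeta."""
    theta = np.exp(zeta)
    return log_joint_constrained(theta) + zeta

# Example: an Exponential(1) prior on theta > 0, expressed on the real line.
log_p_theta = lambda theta: -theta   # log density up to a constant
print(log_joint_unconstrained(0.5, log_p_theta))
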
Black Box Variational Inference
TLDR
This paper presents a "black box" variational inference algorithm that can be quickly applied to many models with little additional derivation, based on stochastic optimization of the variational objective, where the noisy gradient is computed from Monte Carlo samples drawn from the variational distribution.
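
The noisy gradient in that summary is the score-function (REINFORCE) estimator, grad_lambda ELBO = E_q[(log p(z) - log q(z)) grad_lambda log q(z | lambda)]. Here is a minimal sketch for a Gaussian q with the score written analytically; the names and toy target are illustrative, not the paper's code.

import numpy as np

rng = np.random.default_rng(0)
mu, sigma = 0.0, 1.0                 # variational parameters of q = N(mu, sigma^2)

def log_p(z):                        # unnormalized toy target, here N(2, 1)
    return -0.5 * (z - 2.0) ** 2

def log_q(z):
    return -0.5 * ((z - mu) / sigma) ** 2 - np.log(sigma)

z = mu + sigma * rng.standard_normal(1000)
score_mu = (z - mu) / sigma**2       # d log q / d mu, analytic for a Gaussian
grad_mu_elbo = np.mean((log_p(z) - log_q(z)) * score_mu)   # noisy ELBO gradient
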
Copula variational inference
TLDR
A general variational inference method is proposed that preserves dependencies among the latent variables by using copulas to augment the families of distributions used in mean-field and structured approximations.
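
A copula separates the marginals of a joint distribution from its dependence structure; the simplest case is a Gaussian copula, which the sketch below uses to draw correlated samples with arbitrary marginals. This illustrates the copula mechanism only, not the paper's variational construction.

import numpy as np
from scipy.stats import norm, expon

rng = np.random.default_rng(0)
rho = 0.8
cov = np.array([[1.0, rho], [rho, 1.0]])

# 1) Correlated standard normals -> 2) uniforms via the normal CDF.
g = rng.multivariate_normal(np.zeros(2), cov, size=1000)
u = norm.cdf(g)
# 3) Arbitrary marginals via inverse CDFs; dependence comes only from the copula.
x1 = expon.ppf(u[:, 0])           # Exponential(1) marginal
x2 = norm.ppf(u[:, 1], loc=5.0)   # N(5, 1) marginal
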
Hierarchical Variational Models
TLDR
This work develops hierarchical variational models (HVMs), which augment a variational approximation with a prior on its parameters, allowing them to capture complex structure for both discrete and continuous latent variables.
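
Concretely, an HVM replaces a fixed-parameter approximation with a mixture over its own variational parameters:

q_{\mathrm{HVM}}(z) \;=\; \int q(z \mid \lambda)\, q(\lambda; \theta)\, d\lambda ,

so that even if each q(z | lambda) factorizes, marginalizing over lambda induces dependence among the latent variables.
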
Stochastic variational inference for hidden Markov models
TLDR
An SVI algorithm is developed that harnesses the memory decay of the chain to adaptively bound errors arising from edge effects; its effectiveness is demonstrated on synthetic experiments and a large genomics dataset where a batch algorithm is computationally infeasible.
Stochastic variational inference
TLDR
Stochastic variational inference lets us apply complex Bayesian models to massive data sets, and it is shown that the Bayesian nonparametric topic model outperforms its parametric counterpart.
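
The scalability in this summary comes from estimating the full-data gradient from a minibatch: the sum of per-datapoint terms over a batch B is rescaled by n/|B| to keep the estimate unbiased. Below is a generic sketch of that idea under illustrative names; SVI additionally uses natural gradients for conditionally conjugate models, which is omitted here.

import numpy as np

def noisy_grad(grad_log_lik_i, data, batch_size, rng, grad_log_prior):
    """Unbiased estimate of the full-data gradient from a random minibatch."""
    n = len(data)
    idx = rng.choice(n, size=batch_size, replace=False)
    batch_term = sum(grad_log_lik_i(data[i]) for i in idx)
    return grad_log_prior() + (n / batch_size) * batch_term

rng = np.random.default_rng(0)
data = rng.standard_normal(10_000)
g = noisy_grad(lambda x: x, data, batch_size=100, rng=rng, grad_log_prior=lambda: 0.0)
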
Automated Variational Inference in Probabilistic Programming
TLDR
A new algorithm for approximate inference in probabilistic programs is presented, based on stochastic gradients for variational programs; it is efficient without restrictions on the probabilistic program and improves inference efficiency over other algorithms.
Adam: A Method for Stochastic Optimization
TLDR
This work introduces Adam, an algorithm for first-order gradient-based optimization of stochastic objective functions based on adaptive estimates of lower-order moments, and provides a regret bound whose implied convergence rate is comparable to the best known results in the online convex optimization framework.
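
The update rule summarized above, written out (this follows the published algorithm; the hyperparameter defaults are the paper's):

import numpy as np

def adam_step(theta, g, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update: exponential moving averages of the gradient and its
    square, bias correction, then a per-coordinate step scaled by their ratio."""
    m = beta1 * m + (1 - beta1) * g
    v = beta2 * v + (1 - beta2) * g * g
    m_hat = m / (1 - beta1 ** t)       # bias-corrected first moment
    v_hat = v / (1 - beta2 ** t)       # bias-corrected second moment
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v
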