Corpus ID: 236171259

Neural Variational Gradient Descent

@article{Langosco2021NeuralVG,
  title={Neural Variational Gradient Descent},
  author={Lauro Langosco di Langosco and Vincent Fortuin and Heiko Strathmann},
  journal={ArXiv},
  year={2021},
  volume={abs/2107.10731}
}
Particle-based approximate Bayesian inference approaches such as Stein Variational Gradient Descent (SVGD) combine the flexibility and convergence guarantees of sampling methods with the computational benefits of variational inference. In practice, SVGD relies on the choice of an appropriate kernel function, which impacts its ability to model the target distribution, a challenging problem with only heuristic solutions. We propose Neural Variational Gradient Descent (NVGD), which is based on…
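To make the particle-update mechanism the abstract refers to concrete, below is a minimal NumPy sketch of a standard SVGD step with an RBF kernel and the median bandwidth heuristic on a toy 1-D Gaussian target. Everything here (function names, the bandwidth rule, the step size) is an illustrative assumption and not code from the paper; the kernel choice that the abstract identifies as the difficult part enters through the Gaussian form of `k` and the bandwidth `h`.

```python
# Minimal SVGD sketch (NumPy). Illustrative only: kernel, bandwidth heuristic,
# target, and step size are assumptions, not the paper's implementation.
import numpy as np

def score_gaussian(x, mean=0.0, var=1.0):
    """Score (gradient of log density) of a 1-D Gaussian target, per particle."""
    return -(x - mean) / var

def svgd_step(particles, score_fn, step_size=0.1):
    """One SVGD update with an RBF kernel and the median bandwidth heuristic."""
    n = particles.shape[0]
    diffs = particles[:, None] - particles[None, :]      # pairwise x_i - x_j, shape (n, n)
    sq_dists = diffs ** 2
    h = np.median(sq_dists) / np.log(n + 1) + 1e-8        # median heuristic for the bandwidth
    k = np.exp(-sq_dists / h)                             # kernel matrix k(x_i, x_j)
    grad_k = -2.0 * diffs / h * k                         # d k(x_i, x_j) / d x_i
    scores = score_fn(particles)                          # grad log p at each particle
    # phi(x_j) = (1/n) * sum_i [ k(x_i, x_j) * score(x_i) + d k(x_i, x_j) / d x_i ]
    phi = (k.T @ scores + grad_k.sum(axis=0)) / n
    return particles + step_size * phi

particles = np.random.default_rng(0).normal(loc=5.0, scale=3.0, size=50)  # init far from target
for _ in range(500):
    particles = svgd_step(particles, score_gaussian)
print(particles.mean(), particles.std())  # should approach the target's (0, 1)
```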


References

Showing 1-10 of 49 references
Stein Variational Gradient Descent: A General Purpose Bayesian Inference Algorithm
We propose a general purpose variational inference algorithm that forms a natural counterpart of gradient descent for optimization. Our method iteratively transports a set of particles to match the target distribution.
A Stein variational Newton method
Accelerates and generalizes the SVGD algorithm by including second-order information, thereby approximating a Newton-like iteration in function space, and shows how second-order information can lead to more effective choices of kernel.
Function Space Particle Optimization for Bayesian Neural Networks
Demonstrates through extensive experiments that performing particle optimization in function space overcomes the limitations of weight-space particle optimization for Bayesian neural networks, and outperforms strong baselines in a variety of tasks including prediction, defense against adversarial examples, and reinforcement learning.
On Stein Variational Neural Network Ensembles
Finds that SVGD with functional and hybrid kernels can overcome the limitations of deep ensembles, improving functional diversity and uncertainty estimation and approaching the true Bayesian posterior more closely.
Kernel Stein Generative Modeling
Proposes noise-conditional kernel SVGD (NCK-SVGD), which works in tandem with the recently introduced Noise Conditional Score Network estimator and offers flexible control between sample quality and diversity in gradient-based explicit generative modeling.
Message Passing Stein Variational Gradient Descent
Experimental results show that MP-SVGD prevents the vanishing repulsive force that SVGD suffers from in high-dimensional spaces, and offers better particle efficiency and approximation flexibility than other inference methods on graphical models.
Stein Variational Gradient Descent as Gradient Flow
Q. Liu, NIPS 2017
Develops the first theoretical analysis of SVGD, discussing its weak convergence properties and showing that its asymptotic behavior is captured by a gradient flow of the KL divergence functional under a new metric structure induced by the Stein operator.
What Are Bayesian Neural Network Posteriors Really Like?
Shows that BNNs can achieve significant performance gains over standard training and deep ensembles, that a single long HMC chain can provide a representation of the posterior comparable to multiple shorter chains, and that posterior tempering is not needed for near-optimal performance.
Priors in Bayesian Deep Learning: A Review
Presents an overview of different priors that have been proposed for (deep) Gaussian processes, variational autoencoders, and Bayesian neural networks, and outlines different methods of learning priors for these models from data.
Bayesian Learning via Stochastic Gradient Langevin Dynamics
In this paper we propose a new framework for learning from large scale datasets based on iterative learning from small mini-batches. By adding the right amount of noise to a standard stochastic gradient optimization algorithm, the iterates converge to samples from the true posterior distribution as the step size is annealed.
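To make the noise-injection idea in this snippet concrete, here is a minimal NumPy sketch of the Langevin update on a toy 1-D Gaussian posterior. The full-batch score function, fixed step size, and burn-in length are illustrative assumptions; the paper itself uses minibatch gradient estimates and an annealed step size.

```python
# Minimal SGLD-style sketch (NumPy). Illustrative assumptions: full-batch score
# of a toy 1-D Gaussian posterior, fixed step size, no minibatching.
import numpy as np

rng = np.random.default_rng(0)

def posterior_score(theta, mean=0.0, var=1.0):
    """Gradient of the log posterior; stands in for prior + likelihood gradients."""
    return -(theta - mean) / var

def langevin_step(theta, step_size=1e-2):
    """Half a gradient step on log p(theta) plus N(0, step_size) Gaussian noise."""
    noise = rng.normal(scale=np.sqrt(step_size))
    return theta + 0.5 * step_size * posterior_score(theta) + noise

theta, samples = 5.0, []
for t in range(20000):
    theta = langevin_step(theta)
    if t > 2000:                          # discard burn-in
        samples.append(theta)
print(np.mean(samples), np.std(samples))  # should be close to the posterior's (0, 1)
```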