Compositional uncertainty in deep Gaussian processes
@inproceedings{Ustyuzhaninov2020CompositionalUI,
  title     = {Compositional uncertainty in deep Gaussian processes},
  author    = {Ivan Ustyuzhaninov and Ieva Kazlauskaite and M. Kaiser and Erik Bodin and N. Campbell and C. Ek},
  booktitle = {UAI},
  year      = {2020}
}
Gaussian processes (GPs) are nonparametric priors over functions. Fitting a GP implies computing a posterior distribution of functions consistent with the observed data. Similarly, deep Gaussian processes (DGPs) should allow us to compute a posterior distribution of compositions of multiple functions giving rise to the observations. However, exact Bayesian inference is intractable for DGPs, motivating the use of various approximations. We show that the application of simplifying mean-field…
9 Citations
- Beyond the Mean-Field: Structured Deep Gaussian Processes Improve the Predictive Uncertainties (NeurIPS 2020)
- Global inducing point variational posteriors for Bayesian neural networks and deep Gaussian processes (arXiv 2020)
- Characterizing Deep Gaussian Processes via Nonlinear Recurrence Systems (arXiv 2020)