Corpus ID: 208006088

Scalable Exact Inference in Multi-Output Gaussian Processes

@inproceedings{Bruinsma2020ScalableEI,
  title={Scalable Exact Inference in Multi-Output Gaussian Processes},
  author={Wessel P. Bruinsma and Eric Perim and Will Tebbutt and J. Scott Hosking and Arno Solin and Richard E. Turner},
  booktitle={ICML},
  year={2020}
}
Multi-output Gaussian processes (MOGPs) leverage the flexibility and interpretability of GPs while capturing structure across outputs, which is desirable, for example, in spatio-temporal modelling. The key problem with MOGPs is their computational scaling $O(n^3 p^3)$, which is cubic in the number of both inputs $n$ (e.g., time points or locations) and outputs $p$. For this reason, a popular class of MOGPs assumes that the data live around a low-dimensional linear subspace, reducing the …
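The low-dimensional construction referred to in the (truncated) abstract can be sketched in standard mixing-model notation; the symbols $H$, $x$, and $m$ below are illustrative rather than quoted from the paper. The $p$ outputs are modelled as a linear mixing of $m \ll p$ latent Gaussian processes,

$y(t) = H\,x(t) + \epsilon(t), \qquad H \in \mathbb{R}^{p \times m}, \quad x_1, \ldots, x_m \sim \mathcal{GP}, \quad \epsilon(t) \sim \mathcal{N}(0, \Sigma),$

so that, roughly, the cost of exact inference is governed by the small number of latent processes $m$ rather than by $p$; if the columns of $H$ are additionally constrained to be orthogonal, inference decouples into $m$ independent single-output GP problems (the route taken by the OSLMM paper cited below).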
Citations

Bayesian Inference in High-Dimensional Time-Series with the Orthogonal Stochastic Linear Mixing Model
TLDR: A new regression framework, the Orthogonal Stochastic Linear Mixing Model (OSLMM), is proposed; it introduces an orthogonality constraint amongst the mixing coefficients that reduces the computational burden of inference while retaining the capability to handle complex output dependence.
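A minimal numerical sketch of why the orthogonality constraint helps, assuming a single shared kernel and isotropic noise purely to keep the example short; function and variable names are illustrative and this is not the OSLMM implementation:

    # With a mixing matrix H whose columns are orthonormal (H^T H = I), the
    # p-dimensional observations can be projected onto m decoupled
    # single-output GP regression problems.
    import numpy as np

    def rbf(x1, x2, lengthscale=1.0):
        d = x1[:, None] - x2[None, :]
        return np.exp(-0.5 * (d / lengthscale) ** 2)

    def project_and_fit(t, Y, H, noise=0.1):
        """t: (n,) inputs, Y: (n, p) outputs, H: (p, m) with orthonormal columns."""
        Z = Y @ H                          # (n, m) projected pseudo-observations
        n = len(t)
        K = rbf(t, t) + noise * np.eye(n)  # shared kernel: a simplifying assumption
        L = np.linalg.cholesky(K)
        # Each latent process is now an independent single-output GP:
        # one O(n^3) Cholesky plus O(n^2 m) solves, instead of working with
        # the full np-by-np multi-output covariance.
        return np.linalg.solve(L.T, np.linalg.solve(L, Z))

    # Toy usage: p = 6 outputs mixed from m = 2 latent processes.
    rng = np.random.default_rng(0)
    t = np.linspace(0, 5, 50)
    H, _ = np.linalg.qr(rng.normal(size=(6, 2)))       # orthonormal columns
    X = np.stack([np.sin(t), np.cos(2 * t)], axis=1)   # latent signals
    Y = X @ H.T + 0.1 * rng.normal(size=(50, 6))
    alpha = project_and_fit(t, Y, H)                   # (n, m) representer weights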
Modeling massive highly-multivariate nonstationary spatial data with the basis graphical lasso
TLDR: A new modeling framework for highly-multivariate spatial processes is proposed that synthesizes ideas from recent multiscale and spectral approaches with graphical models, and motivates a model where the basis functions represent different levels of resolution and the graphical vectors for each level are assumed to be independent.
Leveraging Probabilistic Circuits for Nonparametric Multi-Output Regression
TLDR: It is shown that inference can be performed exactly and efficiently in the model, and that it can capture correlations between output dimensions and hence often outperforms approaches that do not incorporate inter-output correlations, as demonstrated on several data sets in terms of the negative log predictive density.
Modeling massive multivariate spatial data with the basis graphical lasso
TLDR: This paper proposes a new modeling framework for highly multivariate spatial processes that synthesizes ideas from recent multiscale and spectral approaches with graphical models, and motivates a model where the basis functions represent different levels of resolution and the graphical vectors for each level are assumed to be independent.
Bayesian Optimization with High-Dimensional Outputs
TLDR: An efficient technique is devised that combines exploiting Kronecker structure in the covariance matrices with Matheron’s identity, allowing Bayesian optimization to be performed with exact multi-task GP models over tens of thousands of correlated outputs, achieving substantial improvements in sample efficiency.
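A minimal sketch of Matheron’s rule, the identity mentioned above, for a single-output GP (the Kronecker structure over tasks is omitted to keep the example short; this is not the paper’s implementation):

    # Matheron's rule: a posterior GP sample is obtained by correcting a joint
    # prior sample with the observed residuals, instead of factorizing the
    # posterior covariance directly.
    import numpy as np

    def rbf(a, b, lengthscale=1.0):
        return np.exp(-0.5 * ((a[:, None] - b[None, :]) / lengthscale) ** 2)

    def posterior_sample(x_train, y_train, x_test, noise=1e-2, rng=None):
        if rng is None:
            rng = np.random.default_rng(0)
        x_all = np.concatenate([x_train, x_test])
        K_all = rbf(x_all, x_all) + 1e-6 * np.eye(len(x_all))
        f_all = np.linalg.cholesky(K_all) @ rng.normal(size=len(x_all))  # joint prior draw
        f_tr, f_te = f_all[: len(x_train)], f_all[len(x_train):]
        eps = np.sqrt(noise) * rng.normal(size=len(x_train))             # simulated obs noise
        K_tr = rbf(x_train, x_train) + noise * np.eye(len(x_train))
        K_te_tr = rbf(x_test, x_train)
        # Pathwise update: prior sample + cross-covariance * (data residual).
        return f_te + K_te_tr @ np.linalg.solve(K_tr, y_train - (f_tr + eps))

    x = np.linspace(0, 3, 20)
    y = np.sin(2 * x) + 0.1 * np.random.default_rng(1).normal(size=20)
    sample = posterior_sample(x, y, np.linspace(0, 3, 100))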
Gaussian Processes for Probabilistic Electricity Price Forecasting
Probabilistic electricity price forecasting (PEPF) has become a crucial component of energy systems planning and decision making. Point predictions are unable to quantify the …
RECOWNs: Probabilistic Circuits for Trustworthy Time Series Forecasting
TLDR: Recurrent neural networks are shown to be accurate and trustworthy time series predictors, able to “know when they do not know”, and a log-likelihood-ratio score is formulated as a better estimate of uncertainty that is tailored to time series and Whittle likelihoods.
Scalable Multi-Task Gaussian Processes with Neural Embedding of Coregionalization
  • Haitao Liu, Jiaqi Ding, Xinyu Xie, Xiaomo Jiang, Yusong Zhao, Xiaofang Wang
  • Computer Science, Mathematics
  • ArXiv
  • 2021
TLDR: A neural embedding of coregionalization is developed that transforms the latent GPs into a high-dimensional latent space to induce rich yet diverse behaviors, and advanced variational inference together with sparse approximation is used to devise a tight and compact evidence lower bound (ELBO) for high-quality, scalable model inference.
Conditional Deep Gaussian Processes: multi-fidelity kernel learning
TLDR: This work proposes the conditional DGP model, in which the latent GPs are directly supported by the fixed lower-fidelity data and the effective kernel encodes the inductive bias for the true function, allowing the compositional freedom discussed in [3,4].
Deep Moment Matching Kernel for Multi-source Gaussian Processes
TLDR: Results show that GP regression with the DMM kernels is effective when applied to standard synthetic and real-world multi-fidelity data sets.

References

Showing 1–10 of 83 references
Variational Learning of Inducing Variables in Sparse Gaussian Processes
  • M. Titsias
  • Mathematics, Computer Science
  • AISTATS
  • 2009
TLDR: A variational formulation for sparse approximations is proposed that jointly infers the inducing inputs and the kernel hyperparameters by maximizing a lower bound on the true log marginal likelihood.
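For reference, the collapsed form of this bound, in standard notation rather than copied from the paper, is

$\mathcal{F}(Z, \theta) = \log \mathcal{N}\big(\mathbf{y} \mid \mathbf{0},\, Q_{nn} + \sigma^2 I\big) - \tfrac{1}{2\sigma^2} \operatorname{tr}\big(K_{nn} - Q_{nn}\big), \qquad Q_{nn} = K_{nm} K_{mm}^{-1} K_{mn},$

where $Z$ are the $m$ inducing inputs and $\theta$ the kernel hyperparameters; maximizing $\mathcal{F}$ jointly over $Z$ and $\theta$ is the procedure summarized in the TLDR.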
Temporal Parallelization of Bayesian Smoothers
TLDR: Algorithms for the temporal parallelization of Bayesian smoothers are presented; their advantage is that they reduce the span complexity of standard smoothing algorithms with respect to the number of time steps from linear to logarithmic.
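A toy sketch of the underlying idea (not the paper’s algorithm): if each time step is represented by an element that combines associatively, the whole forward recursion can be evaluated in O(log n) rounds of pairwise combinations, each round fully parallel. The affine maps below are a deterministic stand-in for the Gaussian filtering/smoothing elements used in the paper; all names are illustrative.

    # Inclusive Hillis-Steele prefix scan over affine maps x_t = A_t x_{t-1} + b_t.
    import numpy as np

    def affine_prefix_scan(A, b):
        """A: (n, d, d), b: (n, d, 1). The returned offsets equal the states x_t
        of the recursion x_t = A_t x_{t-1} + b_t started from x_{-1} = 0."""
        n = A.shape[0]
        A, b = A.copy(), b.copy()
        step = 1
        while step < n:  # O(log n) rounds; each round is embarrassingly parallel
            A_new, b_new = A.copy(), b.copy()
            # combine(prefix[t - step], prefix[t]): apply the earlier map first
            A_new[step:] = A[step:] @ A[:-step]
            b_new[step:] = A[step:] @ b[:-step] + b[step:]
            A, b = A_new, b_new
            step *= 2
        return A, b

    # Check against the sequential recursion on a toy system.
    rng = np.random.default_rng(0)
    n, dim = 64, 2
    A = np.repeat(0.95 * np.eye(dim)[None], n, axis=0)
    b = rng.normal(size=(n, dim, 1))
    _, x_scan = affine_prefix_scan(A, b)
    x_seq = np.zeros((dim, 1))
    for t in range(n):
        x_seq = A[t] @ x_seq + b[t]
    assert np.allclose(x_scan[-1], x_seq)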
Applied Stochastic Differential Equations
  • Institute of Mathematical Statistics Textbooks
  • 2019
TLDR: The topic of this book is stochastic differential equations (SDEs), which are differential equations that produce a different “answer” or solution trajectory each time they are solved; the emphasis is on applied rather than theoretical aspects of SDEs.
Grouped Gaussian processes for solar power prediction
TLDR: This work considers multi-task regression models where the observations are assumed to be a linear combination of several latent node functions and weight functions, both drawn from Gaussian process priors, and proposes coupled priors over groups of processes to exploit spatial dependence between functions.
Historical annual real-time LMPs
  • 2019
Scalable High-Order Gaussian Process Regression
TLDR: This work tensorizes the high-dimensional outputs, introducing latent coordinate features to index each tensor element and to capture their correlations, and generalizes a multilinear model to a hybrid of a GP and a latent GP model, endowed with a Kronecker product structure over the inputs and the latent features.
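For context, the reason Kronecker structure makes such models scalable is a standard identity (not specific to this paper): $(K_1 \otimes K_2)^{-1} = K_1^{-1} \otimes K_2^{-1}$ and $(K_1 \otimes K_2)\,\mathrm{vec}(X) = \mathrm{vec}(K_2 X K_1^\top)$, so the full covariance over all input–feature combinations never has to be formed or factorized explicitly.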
Temporal Parallelization of Bayesian Filters and Smoothers
TLDR: Algorithms are presented for the temporal parallelization of the general Bayesian filtering and smoothing equations, of the specific linear/Gaussian models, and of discrete hidden Markov models.
The Gaussian Process Autoregressive Regression Model (GPAR)
TLDR: GPAR is presented, a scalable multi-output GP model that captures nonlinear, possibly input-varying, dependencies between outputs in a simple and tractable way, outperforming existing GP models and achieving state-of-the-art performance on established benchmarks.
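The decomposition GPAR exploits can be written, in standard notation rather than quoted from the paper, as

$p(y_1, \ldots, y_p \mid x) = \prod_{i=1}^{p} p\big(y_i \mid x, y_1, \ldots, y_{i-1}\big), \qquad y_i = f_i\big(x, y_{1:i-1}\big) + \epsilon_i, \quad f_i \sim \mathcal{GP},$

so each conditional is an ordinary single-output GP regression and the outputs can be fitted one at a time.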
Bayesian Alignments of Warped Multi-Output Gaussian Processes
TLDR: An efficient variational approximation based on nested variational compression is presented, and it is shown how the model can be used to extract shared information between dependent time series, recovering an interpretable functional decomposition of the learning problem.