# Compositionally-Warped Gaussian Processes

@article{Tobar2019CompositionallyWarpedGP, title={Compositionally-Warped Gaussian Processes}, author={Felipe A. Tobar and Gonzalo Rios}, journal={Neural Networks: The Official Journal of the International Neural Network Society}, year={2019}, volume={118}, pages={235--246} }

The Gaussian process (GP) is a nonparametric prior distribution over functions indexed by time, space, or other high-dimensional index sets. Although the GP is a flexible model, it is limited by its very definition: it can only model Gaussian marginal distributions. To model non-Gaussian data, a GP can be composed with a nonlinear transformation (a warping), as done by warped GPs (WGPs) and by more computationally demanding alternatives such as Bayesian WGPs and deep GPs. However, the WGP requires…
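The warping idea in the abstract can be sketched in a few lines: draw a sample path from a GP, then push it through a composition of simple, strictly increasing layers so the marginals become non-Gaussian. This is an illustrative sketch only; the layer names and parameter values below are assumptions, not the paper's exact parameterisation.

```python
import numpy as np

def rbf_kernel(x, lengthscale=1.0, variance=1.0):
    # Squared-exponential covariance between all pairs of inputs.
    d = x[:, None] - x[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

def warp(f):
    # Illustrative compositional warping: an affine layer followed by a
    # sinh layer, each elementwise and strictly increasing, so the
    # transformation is invertible. (Hypothetical choice of layers.)
    f = 0.5 + 1.2 * f   # affine layer
    return np.sinh(f)   # sinh layer: skews and heavy-tails the marginals

rng = np.random.default_rng(0)
x = np.linspace(0.0, 5.0, 200)
K = rbf_kernel(x) + 1e-8 * np.eye(len(x))        # jitter for stability
f = np.linalg.cholesky(K) @ rng.standard_normal(len(x))  # GP sample path
y = warp(f)  # sample path with non-Gaussian marginals
```

Because every layer is monotone, the composition has a well-defined inverse, which is what lets a WGP evaluate an exact likelihood via the change-of-variables formula.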


## 15 Citations

Transport Gaussian Processes for Regression

- Computer Science, Mathematics (ArXiv)
- 2020

This work proposes a methodology to construct stochastic processes, which includes GPs, warped GPs, Student-t processes, and several others under a single unified approach, and provides formulas and algorithms for training and inference of the proposed models in the regression problem.

Warped Bayesian Linear Regression for Normative Modelling of Big Data

- Biology
- 2021

A novel framework based on Bayesian linear regression with likelihood warping is presented that allows normative modelling to scale elegantly to big-data cohorts and to correctly model non-Gaussian predictive distributions; it also provides likelihood-based statistics, which are useful for model selection.

MOGPTK: The Multi-Output Gaussian Process Toolkit

- Computer Science, Mathematics (Neurocomputing)
- 2021

The aim of this toolkit is to make multi-output GP (MOGP) models accessible to researchers, data scientists, and practitioners alike, and facilitates implementing the entire pipeline of GP modelling, including data loading, parameter initialization, model learning, parameter interpretation, up to data imputation and extrapolation.

Bayesian Reconstruction of Fourier Pairs

- Computer Science, Mathematics (IEEE Trans. Signal Process.)
- 2021

The aim is to address the lack of a principled treatment of data acquired indistinctly in the temporal and frequency domains in a way that is robust to missing or noisy observations, and that at the same time models uncertainty effectively.

Data-Driven Wireless Communication Using Gaussian Processes

- Computer Science (ArXiv)
- 2021

This paper presents a promising family of nonparametric Bayesian machine learning methods, Gaussian processes (GPs), and their applications in wireless communication, motivated by their interpretable learning with uncertainty quantification; it also reviews distributed GPs, which offer promising scalability.

Gaussian Process Imputation of Multiple Financial Series

- Economics, Computer Science (ICASSP 2020 - IEEE International Conference on Acoustics, Speech and Signal Processing)
- 2020

This work focuses on learning the relationships among financial time series by modelling them through a multi-output Gaussian process (MOGP) with expressive covariance functions that is validated experimentally on two real-world financial datasets.

PATA: Probabilistic Active Task Acquisition

- 2019

Humans are remarkably well equipped to learn new concepts from only a few examples and to adapt quickly to unforeseen situations by leveraging experience. Meta-learning, a recent research field in…

Charting Brain Growth and Aging at High Spatial Precision

- Biology
- 2021

A reference cohort of neuroimaging data is assembled and normative modeling is used to characterize lifespan trajectories of cortical thickness and subcortical volume and it is shown that these models can be used to quantify variability underlying multiple disorders whilst also refining case-control inferences.

Appendix for ‘Transforming Gaussian Processes With Normalizing Flows’ Contents

- 2021

A Mathematical Appendix: A.1 Definitions and Notation; A.2 Variational Lower Bound Derivation…

Transformation-based generalized spatial regression using the spmoran package: Case study examples

- Mathematics
- 2021

This study presents application examples of generalized spatial regression modeling for count data and continuous non-Gaussian data using the spmoran package (version 0.2.2 onward). Section 2…

## References


Learning non-Gaussian Time Series using the Box-Cox Gaussian Process

- Computer Science, Mathematics (2018 International Joint Conference on Neural Networks, IJCNN)
- 2018

This work trains the model using derivative-free global-optimisation techniques and proposes a warping function based on the celebrated Box-Cox transformation, which requires minimal numerical approximations, unlike existing warped GP models; the approach is validated experimentally.
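The appeal of a Box-Cox-style warping is that both the transformation and its inverse are available in closed form, so no numerical inversion is needed at inference time. Below is a minimal sketch of a signed Box-Cox warping (a common extension to real-valued GP outputs, assuming λ > 0); this is not necessarily the paper's exact parameterisation.

```python
import numpy as np

def signed_box_cox(f, lam=0.5):
    # Signed Box-Cox warping: monotone on all of R for lam > 0.
    # (Hypothetical parameterisation for illustration.)
    return (np.sign(f) * np.abs(f) ** lam - 1.0) / lam

def signed_box_cox_inv(y, lam=0.5):
    # Closed-form inverse: no numerical approximation required,
    # which is what makes the warped likelihood exact and cheap.
    z = lam * y + 1.0
    return np.sign(z) * np.abs(z) ** (1.0 / lam)
```

A warped GP evaluated with this transformation only needs the inverse and its derivative for the change-of-variables likelihood, both of which are available analytically here.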

Bayesian Warped Gaussian Processes

- Computer Science, Mathematics (NIPS)
- 2012

It is shown that it is possible to use a non-parametric nonlinear transformation in the WGP and variationally integrate it out, and that the resulting Bayesian WGP works in scenarios in which the maximum-likelihood WGP failed.

Learning Stationary Time Series using Gaussian Processes with Nonparametric Kernels

- Computer Science, Mathematics (NIPS)
- 2015

A novel variational free-energy approach based on inter-domain inducing variables that efficiently learns the continuous-time linear filter and infers the driving white-noise process is developed, leading to new Bayesian nonparametric approaches to spectrum estimation.

Warped Gaussian Processes

- Computer Science (NIPS)
- 2003

We generalise the Gaussian process (GP) framework for regression by learning a nonlinear transformation of the GP outputs. This allows for non-Gaussian processes and non-Gaussian noise. The learning…

Design of Covariance Functions using Inter-Domain Inducing Variables

- 2015

We introduced the Gaussian Process Convolution Model (GPCM) in [1], a time-series model for stationary signals based on the convolution between a continuous-time white-noise process and a…

Deep Gaussian Processes

- Mathematics, Computer Science (AISTATS)
- 2013

Deep Gaussian process (GP) models are introduced and model selection by the variational bound shows that a five layer hierarchy is justified even when modelling a digit data set containing only 150 examples.

Recovering Latent Signals From a Mixture of Measurements Using a Gaussian Process Prior

- Computer Science, Mathematics (IEEE Signal Processing Letters)
- 2017

The Gaussian process mixture of measurements (GPMM) is proposed, which models the latent signal as a Gaussian process (GP) and allows Bayesian inference on the signal conditioned on a set of noisy mixed measurements.

Computationally Efficient Convolved Multiple Output Gaussian Processes

- Mathematics, Computer Science (J. Mach. Learn. Res.)
- 2011

This paper presents different efficient approximations for dependent-output Gaussian processes constructed through the convolution formalism, exploits the conditional independencies present naturally in the model, and shows experimental results with synthetic and real data.

Spectral Mixture Kernels for Multi-Output Gaussian Processes

- Computer Science, Mathematics (NIPS)
- 2017

A parametric family of complex-valued cross-spectral densities is proposed, building on Cramér's Theorem to provide a principled approach to designing multivariate covariance functions.

Bayesian Nonparametric Spectral Estimation

- Computer Science, Mathematics (NeurIPS)
- 2018

A joint probabilistic model for signals, observations and spectra is proposed, where SE is addressed as an inference problem and Bayes' rule is applied to find the analytic posterior distribution of the spectrum given a set of observations.