Parametric Gaussian process regression for big data

@article{Raissi2019ParametricGP,
  title={Parametric Gaussian process regression for big data},
  author={Maziar Raissi},
  journal={Computational Mechanics},
  year={2019},
  pages={1-8}
}
  • M. Raissi
  • Published 11 April 2017
  • Computer Science
  • Computational Mechanics
This work introduces the concept of parametric Gaussian processes (PGP), which is built upon the seemingly self-contradictory idea of making Gaussian processes parametric. The resulting framework is capable of encoding massive amounts of data into a small number of “hypothetical” data points. Moreover, parametric Gaussian processes are well aware of their imperfections and are capable of properly quantifying the uncertainty in their predictions associated with such limitations. The effectiveness…
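To make the idea concrete, below is a minimal NumPy sketch of the prediction step, assuming the standard sparse-GP form in which the model is summarized by M hypothetical points Z with mean m and covariance S (the names, the RBF kernel, and the hyperparameters are illustrative choices, not the paper's code):

```python
import numpy as np

def rbf(A, B, ls=1.0, var=1.0):
    """Squared-exponential kernel matrix k(A, B) for inputs of shape (n, d)."""
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return var * np.exp(-0.5 * d2 / ls**2)

def pgp_predict(Xs, Z, m, S, jitter=1e-6):
    """Predictive mean and variance of a GP conditioned on M "hypothetical"
    points Z with summary statistics (m, S) instead of the full data set."""
    M = len(Z)
    Kzz = rbf(Z, Z) + jitter * np.eye(M)
    A = rbf(Xs, Z) @ np.linalg.solve(Kzz, np.eye(M))  # k(x*, Z) Kzz^{-1}
    mean = A @ m
    cov = rbf(Xs, Xs) - A @ (Kzz - S) @ A.T
    return mean, np.diag(cov)
```

Training would then consist of streaming minibatches and updating Z, m, and S together with the kernel hyperparameters; the paper's specific update rules are omitted here.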
Numerical Gaussian Processes for Time-Dependent and Nonlinear Partial Differential Equations
TLDR
The method circumvents the need for spatial discretization of the differential operators by proper placement of Gaussian process priors and is an attempt to construct structured and data-efficient learning machines, which are explicitly informed by the underlying physics that possibly generated the observed data.
Forecasting of Commercial Sales with Large Scale Gaussian Processes
TLDR
This paper argues that applications of Gaussian processes in the fast-moving consumer goods industry have received too little discussion, and shows the value of this type of model as a decision-making tool for management.
Hidden physics models: Machine learning of nonlinear partial differential equations
Machine Learning of Space-Fractional Differential Equations
TLDR
This work provides a user-friendly and feasible way to compute fractional derivatives of kernels, via a unified set of d-dimensional Fourier integral formulas amenable to generalized Gauss–Laguerre quadrature.
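For orientation, generalized Gauss–Laguerre quadrature handles integrals of the form ∫₀^∞ x^α e^{−x} f(x) dx, the building block the TLDR alludes to; a minimal SciPy sanity check (not the paper's actual kernel integrals):

```python
import numpy as np
from scipy.special import roots_genlaguerre, gamma

# Nodes and weights for the weight function x^alpha * exp(-x) on [0, inf).
alpha, n = 0.5, 40
x, w = roots_genlaguerre(n, alpha)

# Sanity check with f(x) = 1: the integral equals Gamma(alpha + 1).
print(np.sum(w), gamma(alpha + 1))  # the two values should agree closely

# A Fourier-type integrand, f(x) = cos(omega * x), as appears in such formulas.
omega = 2.0
print(np.sum(w * np.cos(omega * x)))
```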
The pitfalls of using Gaussian Process Regression for normative modeling
TLDR
It is shown that the uncertainty obtained directly from Gaussian process regression is, in general, unrelated to the cohort heterogeneity.
Deep Hidden Physics Models: Deep Learning of Nonlinear Partial Differential Equations
  • M. Raissi
  • Computer Science
    J. Mach. Learn. Res.
  • 2018
TLDR
This work puts forth a deep learning approach for discovering nonlinear partial differential equations from scattered and potentially noisy observations in space and time by approximating the unknown solution as well as the nonlinear dynamics with two deep neural networks.
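A minimal PyTorch sketch of the two-network idea (layer sizes and names are illustrative assumptions): one network approximates the solution u(t, x), a second approximates the unknown dynamics N(u, u_x, u_xx), and automatic differentiation supplies the derivatives entering the residual f = u_t − N(u, u_x, u_xx):

```python
import torch

# Network approximating the solution u(t, x).
u_net = torch.nn.Sequential(torch.nn.Linear(2, 50), torch.nn.Tanh(),
                            torch.nn.Linear(50, 1))
# Network approximating the unknown dynamics N(u, u_x, u_xx).
N_net = torch.nn.Sequential(torch.nn.Linear(3, 50), torch.nn.Tanh(),
                            torch.nn.Linear(50, 1))

def residual(t, x):
    """PDE residual f = u_t - N(u, u_x, u_xx), with all derivatives
    obtained by automatic differentiation; t and x have shape (batch, 1)."""
    t = t.clone().requires_grad_(True)
    x = x.clone().requires_grad_(True)
    u = u_net(torch.cat([t, x], dim=1))
    u_t = torch.autograd.grad(u.sum(), t, create_graph=True)[0]
    u_x = torch.autograd.grad(u.sum(), x, create_graph=True)[0]
    u_xx = torch.autograd.grad(u_x.sum(), x, create_graph=True)[0]
    return u_t - N_net(torch.cat([u, u_x, u_xx], dim=1))
```

Training would minimize this residual at collocation points together with a data misfit on u, fitting both networks jointly.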
Machine Learning of Space-Fractional Differential Equations (SIAM Journal on Scientific Computing, Vol. 41, No. 4)
TLDR
The proposed method has several benefits compared to previous works on data-driven discovery of differential equations: the user is not required to assume a “dictionary” of derivatives of various orders, and directly controls the parsimony of the models being discovered.
Forward-Backward Stochastic Neural Networks: Deep Learning of High-dimensional Partial Differential Equations
  • M. Raissi
  • Computer Science, Mathematics
    ArXiv
  • 2018
TLDR
This work approximates the unknown solution by a deep neural network, which essentially enables the author to benefit from the merits of automatic differentiation in solving partial differential equations.
Shared Gaussian Process Latent Variable Model for Incomplete Multiview Clustering
TLDR
A shared Gaussian process (GP) latent variable model for incomplete multiview clustering that gains the merits of both worlds by jointly learning a set of intentionally aligned representative auxiliary points in the individual views, not only compensating for missing instances but also implementing the group-level constraint.

References

Showing 1–10 of 48 references
Gaussian Processes for Big Data
TLDR
Stochastic variational inference for Gaussian process models is introduced, and it is shown how GPs can be variationally decomposed to depend on a set of globally relevant inducing variables that factorize the model in the manner necessary to perform variational inference.
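As a point of reference, this style of inference is available off the shelf; a minimal minibatch-training sketch assuming GPflow 2's SVGP model (data, sizes, and learning rate are placeholders):

```python
import numpy as np
import tensorflow as tf
import gpflow

# Toy "big data" regression set and a handful of inducing inputs.
X = np.random.rand(100_000, 1)
Y = np.sin(10 * X) + 0.1 * np.random.randn(*X.shape)
Z = X[:100].copy()  # the globally relevant inducing variables

model = gpflow.models.SVGP(kernel=gpflow.kernels.SquaredExponential(),
                           likelihood=gpflow.likelihoods.Gaussian(),
                           inducing_variable=Z, num_data=len(X))

dataset = tf.data.Dataset.from_tensor_slices((X, Y)).repeat().shuffle(10_000).batch(256)
opt = tf.optimizers.Adam(0.01)

for batch in dataset.take(2_000):
    with tf.GradientTape() as tape:
        loss = -model.elbo(batch)  # stochastic estimate of the evidence lower bound
    opt.apply_gradients(zip(tape.gradient(loss, model.trainable_variables),
                            model.trainable_variables))
```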
Distributed Gaussian Processes
TLDR
This paper introduces the robust Bayesian committee machine, a practical and scalable product-of-experts model for large-scale distributed GP regression that can be used on heterogeneous computing infrastructures, ranging from laptops to clusters.
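The aggregation step itself is only a few lines; a minimal NumPy sketch of the robust BCM combination rule, assuming each of K experts returns a Gaussian prediction (mean, variance) at the test points and var0 is the prior variance there:

```python
import numpy as np

def rbcm_combine(mus, vars_, var0):
    """Robust Bayesian committee machine: fuse per-expert Gaussian predictions.

    mus, vars_: arrays of shape (K, n) holding each expert's predictive mean
    and variance at n test points; var0: prior variance k(x*, x*) there.
    """
    beta = 0.5 * (np.log(var0) - np.log(vars_))  # differential-entropy weights
    prec = np.sum(beta / vars_, axis=0) + (1.0 - np.sum(beta, axis=0)) / var0
    var = 1.0 / prec
    mu = var * np.sum(beta * mus / vars_, axis=0)
    return mu, var
```

Each expert can be trained on its own data shard and machine; only per-point means and variances need to be communicated for the combination.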
Distributed Variational Inference in Sparse Gaussian Process Regression and Latent Variable Models
TLDR
A novel re-parametrisation of variational inference for sparse GP regression and latent variable models is presented that allows for an efficient distributed algorithm; GPs are shown to perform better than many common models often used for big data.
Local and global sparse Gaussian process approximations
TLDR
This paper develops a new sparse GP approximation that combines the global and local approaches, and shows that it arises as a natural extension of the framework developed by Quiñonero-Candela and Rasmussen for sparse GP approximations.
Variational Fourier Features for Gaussian Processes
TLDR
This work hinges on a key result that there exist spectral features related to a finite domain of the Gaussian process which exhibit almost-independent covariances; these expressions are derived for Matérn kernels in one dimension and generalized to more dimensions using kernels with specific structures.
Sparse On-Line Gaussian Processes
TLDR
An approach to sparse representations of Gaussian process (GP) models (which are Bayesian types of kernel machines) is developed to overcome their limitations for large data sets, based on combining a Bayesian on-line algorithm with a sequential construction of a relevant subsample of the data that fully specifies the prediction of the GP model.
Scalable transformed additive signal decomposition by non-conjugate Gaussian process inference
TLDR
This work extends methods on Generalized Additive Models to the additive GP case, thus achieving scalable marginal posterior inference over each latent function in settings such as those above.
Fast Forward Selection to Speed Up Sparse Gaussian Process Regression
TLDR
A method for sparse greedy approximation of Bayesian Gaussian process regression, featuring a novel heuristic for very fast forward selection, which leads to a sufficiently stable approximation of the log marginal likelihood of the training data that can be optimised to adjust a large number of hyperparameters automatically.
Bayesian Gaussian Process Latent Variable Model
TLDR
A variational inference framework for training the Gaussian process latent variable model, and thus performing Bayesian nonlinear dimensionality reduction, is introduced; maximizing the variational lower bound provides a Bayesian training procedure that is robust to overfitting and can automatically select the dimensionality of the nonlinear latent space.
Gaussian Processes for Machine Learning
TLDR
The treatment is comprehensive and self-contained, targeted at researchers and students in machine learning and applied statistics, and deals with the supervised learning problem for both regression and classification.