Parametric Gaussian process regression for big data
@article{Raissi2019ParametricGP,
  title   = {Parametric Gaussian process regression for big data},
  author  = {Maziar Raissi},
  journal = {Computational Mechanics},
  year    = {2019},
  pages   = {1--8}
}
This work introduces the concept of parametric Gaussian processes (PGPs), built upon the seemingly self-contradictory idea of making Gaussian processes parametric. The resulting framework is capable of encoding massive amounts of data into a small number of “hypothetical” data points. Moreover, parametric Gaussian processes are aware of their imperfections and capable of properly quantifying the uncertainty in their predictions associated with such limitations. The effectiveness…
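The “hypothetical” data points play the role of inducing points: the massive data set is summarized by a small set (Z, u), and predictions condition only on that summary. In the paper the locations and values of these points are learned by stochastic optimization; the numpy sketch below instead fixes them heuristically (a grid for Z, interpolated targets for u) purely to illustrate the conditioning step, so it should be read as a sketch of the idea, not the paper's method:

```python
import numpy as np

def rbf(a, b, ell=1.0, var=1.0):
    """Squared-exponential kernel between 1-D input arrays."""
    d = a[:, None] - b[None, :]
    return var * np.exp(-0.5 * (d / ell) ** 2)

rng = np.random.default_rng(0)
X = np.linspace(0, 10, 2000)                 # "big" training inputs
y = np.sin(X) + 0.1 * rng.standard_normal(X.size)

# A handful of hypothetical points summarizing the data. The paper learns
# (Z, u) by stochastic optimization; here they are fixed for illustration.
Z = np.linspace(0, 10, 15)
u = np.interp(Z, X, y)                       # crude stand-in for learned values

Xs = np.linspace(0, 10, 200)                 # test inputs
Kzz = rbf(Z, Z) + 1e-6 * np.eye(Z.size)      # jitter for stability
Ksz = rbf(Xs, Z)
A = np.linalg.solve(Kzz, Ksz.T).T            # K_*z @ Kzz^{-1}

# Predict by conditioning the GP on the summary points only.
mean = A @ u
var = rbf(Xs, Xs).diagonal() - np.einsum('ij,ij->i', A, Ksz)
```

The predictive variance grows away from the hypothetical points, which is how the framework signals the "imperfections" the abstract mentions: where the summary carries no information, uncertainty reverts toward the prior.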
27 Citations
Numerical Gaussian Processes for Time-Dependent and Nonlinear Partial Differential Equations
- Computer Science, Mathematics · SIAM J. Sci. Comput.
- 2018
The method circumvents the need for spatial discretization of the differential operators by proper placement of Gaussian process priors and is an attempt to construct structured and data-efficient learning machines, which are explicitly informed by the underlying physics that possibly generated the observed data.
Forecasting of Commercial Sales with Large Scale Gaussian Processes
- Computer Science · 2017 IEEE International Conference on Data Mining Workshops (ICDMW)
- 2017
This paper argues that there has not been enough discussion of applications of Gaussian processes in the fast-moving consumer goods industry, and shows the value of this type of model as a decision-making tool for management.
Hidden physics models: Machine learning of nonlinear partial differential equations
- Computer Science · J. Comput. Phys.
- 2018
Machine Learning of Space-Fractional Differential Equations
- Computer Science, Mathematics · SIAM J. Sci. Comput.
- 2019
This work provides a user-friendly and feasible way to compute fractional derivatives of kernels, via a unified set of d-dimensional Fourier integral formulas amenable to generalized Gauss–Laguerre quadrature.
The pitfalls of using Gaussian Process Regression for normative modeling
- Computer Science · PLoS ONE
- 2021
It is shown that the uncertainty obtained directly from Gaussian process regression is, in general, unrelated to cohort heterogeneity.
Deep Hidden Physics Models: Deep Learning of Nonlinear Partial Differential Equations
- Computer Science · J. Mach. Learn. Res.
- 2018
This work puts forth a deep learning approach for discovering nonlinear partial differential equations from scattered and potentially noisy observations in space and time, approximating the unknown solution as well as the nonlinear dynamics with two deep neural networks.
Machine Learning of Space-Fractional Differential Equations
- Computer Science, Mathematics · SIAM J. Sci. Comput.
- 2019
The proposed method has several benefits compared to previous works on data-driven discovery of differential equations; the user is not required to assume a “dictionary” of derivatives of various orders and directly controls the parsimony of the models being discovered.
Forward-Backward Stochastic Neural Networks: Deep Learning of High-dimensional Partial Differential Equations
- Computer Science, Mathematics · ArXiv
- 2018
This work approximates the unknown solution by a deep neural network, which enables the authors to benefit from the merits of automatic differentiation when handling partial differential equations.
Shared Gaussian Process Latent Variable Model for Incomplete Multiview Clustering
- Computer Science · IEEE Transactions on Cybernetics
- 2020
A shared Gaussian process (GP) latent variable model for incomplete multiview clustering that gains the merits of both worlds by jointly learning a set of intentionally aligned representative auxiliary points in individual views, which not only compensate for missing instances but also implement the group-level constraint.
References
Showing 1–10 of 48 references
Gaussian Processes for Big Data
- Computer Science · UAI
- 2013
Stochastic variational inference for Gaussian process models is introduced, and it is shown how GPs can be variationally decomposed to depend on a set of globally relevant inducing variables which factorize the model in the manner necessary to perform variational inference.
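The factorization is what makes this scale: once the bound decomposes into a sum of per-datapoint terms coupled only through the global inducing variables, that sum can be estimated without bias from a minibatch. A minimal numpy illustration of the minibatch estimator (the `terms` array is a stand-in for the per-point expected log-likelihood terms of the bound, not the actual sparse-GP objective):

```python
import numpy as np

rng = np.random.default_rng(1)
N, B = 10_000, 100                  # dataset size, minibatch size

# stand-ins for the per-datapoint terms of the factorized variational bound
terms = rng.standard_normal(N)
full = terms.sum()                  # the full-data term of the objective

# A minibatch sum scaled by N/B is an unbiased estimate of the full sum;
# this is exactly what lets stochastic gradient methods train the model
# without ever touching all N points at once.
ests = [N / B * terms[rng.choice(N, size=B, replace=False)].sum()
        for _ in range(2000)]
```

Averaged over many draws, the scaled minibatch sums recover the full-data term, while each individual draw costs only O(B) work.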
Distributed Gaussian Processes
- Computer Science · ICML
- 2015
The robust Bayesian Committee Machine is introduced, a practical and scalable product-of-experts model for large-scale distributed GP regression and can be used on heterogeneous computing infrastructures, ranging from laptops to clusters.
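The product-of-experts combination at the heart of the robust Bayesian Committee Machine is compact enough to write down directly. The sketch below follows the rBCM combination rule (precision-weighted experts plus a prior-correction term, with entropy-based weights), assuming a zero-mean GP prior; the expert predictions here are synthetic stand-ins rather than outputs of GP experts actually trained on data shards:

```python
import numpy as np

def rbcm_combine(mu, var, prior_var):
    """Robust Bayesian Committee Machine combination of per-expert
    Gaussian predictions, assuming a zero-mean GP prior.
    mu, var: arrays of shape (n_experts, n_test)."""
    # entropy-difference weights: confident experts count more
    beta = 0.5 * (np.log(prior_var) - np.log(var))
    # precision-weighted sum plus a correction toward the prior precision
    prec = (beta / var).sum(axis=0) + (1.0 - beta.sum(axis=0)) / prior_var
    mean = (beta * mu / var).sum(axis=0) / prec
    return mean, 1.0 / prec

# three experts: they agree at the first test point (confidently),
# and disagree at the second (with near-prior variance)
mu = np.array([[2.0, 1.0], [2.0, 3.0], [2.0, 2.0]])
var = np.array([[0.5, 0.9], [0.5, 0.9], [0.5, 0.9]])
mean, v = rbcm_combine(mu, var, prior_var=1.0)
```

Where experts agree and are confident, the combined prediction is sharper than any single expert; where their variances approach the prior, the correction term shrinks the combined mean back toward the (zero) prior mean, which is the "robust" part of the rBCM.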
Distributed Variational Inference in Sparse Gaussian Process Regression and Latent Variable Models
- Computer Science · NIPS
- 2014
A novel re-parametrisation of variational inference for sparse GP regression and latent variable models that allows for an efficient distributed algorithm and shows that GPs perform better than many common models often used for big data.
Local and global sparse Gaussian process approximations
- Computer Science · AISTATS
- 2007
This paper develops a new sparse GP approximation that combines the global and local approaches, and shows that it arises as a natural extension of the framework developed by Quiñonero-Candela and Rasmussen for sparse GP approximations.
Variational Fourier Features for Gaussian Processes
- Computer Science · J. Mach. Learn. Res.
- 2017
This work hinges on a key result that there exist spectral features related to a finite domain of the Gaussian process which exhibit almost-independent covariances; it derives these expressions for Matérn kernels in one dimension and generalizes to more dimensions using kernels with specific structures.
Sparse On-Line Gaussian Processes
- Computer Science · Neural Computation
- 2002
An approach for sparse representations of Gaussian process (GP) models (which are Bayesian types of kernel machines) is developed to overcome their limitations for large data sets, based on combining a Bayesian on-line algorithm with a sequential construction of a relevant subsample of the data that fully specifies the prediction of the GP model.
Scalable transformed additive signal decomposition by non-conjugate Gaussian process inference
- Computer Science · 2016 IEEE 26th International Workshop on Machine Learning for Signal Processing (MLSP)
- 2016
This work extends methods for generalized additive models to the additive GP case, thus achieving scalable marginal posterior inference over each latent function in settings such as those above.
Fast Forward Selection to Speed Up Sparse Gaussian Process Regression
- Computer Science · AISTATS
- 2003
A method for the sparse greedy approximation of Bayesian Gaussian process regression, featuring a novel heuristic for very fast forward selection that leads to a sufficiently stable approximation of the log marginal likelihood of the training data, which can be optimised to adjust a large number of hyperparameters automatically.
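The paper's forward selection scores candidates with an information-gain heuristic; as a much simpler illustration of the same greedy forward-selection pattern for sparse GP regression, the sketch below grows the active set by repeatedly adding the training point with the largest absolute residual under the current subset predictor (the selection rule and all names here are illustrative, not the paper's criterion):

```python
import numpy as np

def rbf(a, b, ell=1.0):
    """Squared-exponential kernel between 1-D input arrays."""
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / ell) ** 2)

def greedy_inducing(X, y, M, ell=1.0, jitter=1e-8):
    """Greedily pick M active points by largest-residual forward selection."""
    idx = [int(np.argmax(np.abs(y)))]        # seed with the largest target
    for _ in range(M - 1):
        Z, u = X[idx], y[idx]
        Kzz = rbf(Z, Z, ell) + jitter * np.eye(len(idx))
        # prediction of the current subset-of-data interpolant
        pred = rbf(X, Z, ell) @ np.linalg.solve(Kzz, u)
        resid = np.abs(y - pred)
        resid[idx] = -np.inf                 # never re-pick a chosen point
        idx.append(int(np.argmax(resid)))    # add the worst-fit point
    return idx

X = np.linspace(0, 10, 200)
y = np.sin(X)
sel = greedy_inducing(X, y, 8)
```

Each step costs one small linear solve against the current active set, which is the structural reason forward selection is fast: the expensive full-data fit is never formed.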
Bayesian Gaussian Process Latent Variable Model
- Computer Science · AISTATS
- 2010
A variational inference framework for training the Gaussian process latent variable model, and thus performing Bayesian nonlinear dimensionality reduction, in which maximization of the variational lower bound provides a Bayesian training procedure that is robust to overfitting and can automatically select the dimensionality of the nonlinear latent space.
Gaussian Processes for Machine Learning
- Computer Science · Adaptive Computation and Machine Learning
- 2009
The treatment is comprehensive and self-contained, targeted at researchers and students in machine learning and applied statistics, and deals with the supervised learning problem for both regression and classification.