# The Renyi Gaussian Process: Towards Improved Generalization.

```bibtex
@article{Yue2020TheRG,
  title   = {The Renyi Gaussian Process: Towards Improved Generalization},
  author  = {Xubo Yue and Raed Kontar},
  journal = {arXiv: Machine Learning},
  year    = {2020}
}
```

We introduce an alternative closed-form lower bound on the Gaussian process ($\mathcal{GP}$) likelihood based on the R\'enyi $\alpha$-divergence. This new lower bound can be viewed as a convex combination of the Nystr\"om approximation and the exact $\mathcal{GP}$. The key advantage of this bound is its ability to control and tune the regularization enforced on the model; it thus generalizes traditional variational $\mathcal{GP}$ regression. From a theoretical perspective, we…
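The convex-combination idea in the abstract can be illustrated with a toy sketch: blend the exact kernel matrix $K$ with its Nystr\"om approximation $K_{nm} K_{mm}^{-1} K_{mn}$ via a weight $\alpha$, and evaluate a Gaussian log-likelihood under the blended covariance. This is an assumption-laden illustration, not the paper's derived R\'enyi bound; the function names, the RBF kernel choice, and the direct blend of covariances are all ours.

```python
import numpy as np

def rbf_kernel(A, B, lengthscale=1.0):
    # Squared-exponential kernel matrix between row sets A (n x d) and B (m x d).
    d2 = (np.sum(A**2, axis=1)[:, None]
          + np.sum(B**2, axis=1)[None, :]
          - 2.0 * A @ B.T)
    return np.exp(-0.5 * d2 / lengthscale**2)

def blended_log_likelihood(X, y, Z, alpha, noise=0.1):
    """Gaussian log-likelihood whose covariance is a convex combination of the
    exact kernel K and its Nystrom approximation built from inducing points Z.
    alpha = 1 recovers the exact GP; alpha = 0 uses the pure Nystrom surrogate.
    (Illustrative only -- not the Renyi lower bound derived in the paper.)"""
    n, m = len(X), len(Z)
    K = rbf_kernel(X, X)
    Knm = rbf_kernel(X, Z)
    Kmm = rbf_kernel(Z, Z) + 1e-8 * np.eye(m)        # jitter for stability
    K_nys = Knm @ np.linalg.solve(Kmm, Knm.T)        # Nystrom: K_nm K_mm^{-1} K_mn
    C = alpha * K + (1.0 - alpha) * K_nys + noise**2 * np.eye(n)
    _, logdet = np.linalg.slogdet(C)
    quad = y @ np.linalg.solve(C, y)
    return -0.5 * (logdet + quad + n * np.log(2.0 * np.pi))
```

Sliding $\alpha$ from 0 to 1 interpolates between the cheap low-rank surrogate and the exact marginal likelihood, which mirrors the abstract's claim that the bound's regularization strength can be tuned.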

