Corpus ID: 211146740

The Renyi Gaussian Process: Towards Improved Generalization.

@article{Yue2020TheRG,
  title={The Renyi Gaussian Process: Towards Improved Generalization.},
  author={Xubo Yue and Raed Kontar},
  journal={arXiv: Machine Learning},
  year={2020}
}
We introduce an alternative closed-form lower bound on the Gaussian process ($\mathcal{GP}$) likelihood based on the R\'enyi $\alpha$-divergence. This new lower bound can be viewed as a convex combination of the Nystr\"om approximation and the exact $\mathcal{GP}$. The key advantage of this bound is its ability to control and tune the regularization enforced on the model; it thus generalizes traditional variational $\mathcal{GP}$ regression. From a theoretical perspective, we…
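The abstract's central idea — a likelihood bound interpolating between the Nyström approximation and the exact $\mathcal{GP}$ — can be sketched numerically. The blend `alpha * K + (1 - alpha) * Q` below is an illustrative assumption, not the paper's exact derivation; the function names, the RBF kernel, and the placement of $\alpha$ are choices made here for the sketch.

```python
import numpy as np

def rbf_kernel(A, B, lengthscale=1.0):
    # Squared-exponential kernel matrix between row sets A and B.
    sq = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-0.5 * sq / lengthscale**2)

def nystrom_approx(X, Z, lengthscale=1.0):
    # Nystrom approximation Q = K_nm K_mm^{-1} K_mn from inducing points Z.
    K_nm = rbf_kernel(X, Z, lengthscale)
    K_mm = rbf_kernel(Z, Z, lengthscale) + 1e-8 * np.eye(len(Z))
    return K_nm @ np.linalg.solve(K_mm, K_nm.T)

def blended_log_likelihood(y, X, Z, alpha, noise=0.1):
    # Gaussian log-likelihood under a convex combination of the exact
    # kernel matrix K and its Nystrom approximation Q. Illustrative only:
    # the paper derives the precise alpha-dependent Renyi bound.
    K = rbf_kernel(X, X)
    Q = nystrom_approx(X, Z)
    C = alpha * K + (1 - alpha) * Q + noise * np.eye(len(X))
    _, logdet = np.linalg.slogdet(C)
    quad = y @ np.linalg.solve(C, y)
    return -0.5 * (logdet + quad + len(X) * np.log(2 * np.pi))
```

At `alpha = 1` the blend reduces to the exact-kernel likelihood, and at `alpha = 0` to the pure Nyström (sparse) surrogate; intermediate values trade off fidelity against the implicit regularization of the low-rank term.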
