Competing with Gaussian linear experts


We study the problem of online regression, making no assumptions about the input vectors or the outcomes. We prove a theoretical bound on the square loss of Ridge Regression, and we show that Bayesian Ridge Regression can be viewed as an online algorithm competing with all Gaussian linear experts. We then consider the case of infinite-dimensional Hilbert spaces and prove relative loss bounds for the popular non-parametric kernelized Bayesian Ridge Regression and kernelized Ridge Regression. Our main theoretical guarantees have the form of equalities.
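The online protocol the abstract refers to can be sketched as follows: at each round the learner sees an input vector, predicts the outcome, then observes the true outcome and updates. Below is a minimal illustration of online Ridge Regression in that protocol; the function name, parameter names, and the toy data stream are ours for illustration, not taken from the paper.

```python
import numpy as np

def online_ridge(stream, a=1.0, dim=2):
    """Online Ridge Regression sketch (illustrative, not the paper's code).

    Maintains A = a*I + sum_s x_s x_s^T and b = sum_s y_s x_s;
    before outcome y_t is revealed, predicts x_t^T A^{-1} b.
    """
    A = a * np.eye(dim)
    b = np.zeros(dim)
    preds = []
    for x, y in stream:
        x = np.asarray(x, dtype=float)
        preds.append(float(x @ np.linalg.solve(A, b)))  # predict before seeing y
        A += np.outer(x, x)                             # update after y arrives
        b += y * x
    return preds

# Toy stream: outcomes follow y = 2*x1 - x2 exactly.
stream = [((1.0, 0.0), 2.0), ((0.0, 1.0), -1.0),
          ((1.0, 1.0), 1.0), ((2.0, 1.0), 3.0)]
predictions = online_ridge(stream)
```

The regularization parameter `a` plays the role of the prior variance in the Bayesian interpretation; the paper's bounds relate the learner's cumulative square loss on such a stream to that of the best linear expert.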

Cite this paper

@article{Zhdanov2009CompetingWG,
  title   = {Competing with Gaussian linear experts},
  author  = {Fedor Zhdanov and Vladimir Vovk},
  journal = {CoRR},
  year    = {2009},
  volume  = {abs/0910.4683}
}