We develop an approach for sparse representations of Gaussian process (GP) models (a Bayesian type of kernel machine) in order to overcome their limitations for large data sets. The method is based on a combination of a Bayesian on-line algorithm and a sequential construction of a relevant subsample of the data that fully specifies the …
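A minimal sketch of the general idea, not the paper's algorithm: build a GP regression predictor from a small subsample of inducing inputs so the dominant cost scales with the subsample size rather than the full data set. The kernel, the subset-of-regressors form, and the naive "every 10th point" selection rule are assumptions made for illustration.

import numpy as np

def rbf_kernel(A, B, lengthscale=1.0, variance=1.0):
    # Squared-exponential kernel matrix between the rows of A and B.
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def sparse_gp_posterior(X, y, Z, noise=0.1):
    # Subset-of-regressors style predictor built on the inducing inputs Z.
    Kzz = rbf_kernel(Z, Z) + 1e-8 * np.eye(len(Z))
    Kxz = rbf_kernel(X, Z)
    A = Kzz + Kxz.T @ Kxz / noise**2            # m x m system instead of n x n
    alpha = np.linalg.solve(A, Kxz.T @ y) / noise**2
    def predict(Xs):
        return rbf_kernel(Xs, Z) @ alpha        # approximate posterior mean
    return predict

# Toy usage: a crude "relevant subsample" (here simply every 10th point).
X = np.random.randn(500, 1)
y = np.sin(3 * X[:, 0]) + 0.1 * np.random.randn(500)
predict = sparse_gp_posterior(X, y, Z=X[::10])
print(predict(np.array([[0.0], [1.0]])))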
Thesis Summary: In recent years there has been an increased interest in applying …
We present three simple approximations for the calculation of the posterior mean in Gaussian Process classification. The first two methods are related to mean field ideas known in Statistical Physics. The third is a Bayesian online approach motivated by recent results in the Statistical Mechanics of Neural Networks. We present …
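As a hedged illustration of what a Bayesian online (assumed-density-filtering) update looks like for a probit-likelihood classifier, here is a sketch for a finite-dimensional linear model rather than the full GP; the data, prior, and update order are assumptions, not the paper's method.

import numpy as np
from scipy.stats import norm

def adf_update(m, S, x, y):
    # One online step: Gaussian approximation N(m, S) on the weights,
    # label y in {-1, +1}, probit likelihood Phi(y * x.w).
    s2 = x @ S @ x
    denom = np.sqrt(1.0 + s2)
    z = y * (x @ m) / denom
    ratio = norm.pdf(z) / norm.cdf(z)           # derivative of log marginal
    m_new = m + y * (S @ x) * ratio / denom     # moment-matched posterior mean
    Sx = S @ x
    S_new = S - np.outer(Sx, Sx) * ratio * (z + ratio) / (1.0 + s2)
    return m_new, S_new

# Toy usage: separable 2-D data processed one example at a time.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = np.sign(X[:, 0] + 0.5 * X[:, 1])
m, S = np.zeros(2), np.eye(2)
for xi, yi in zip(X, y):
    m, S = adf_update(m, S, xi, yi)
print(m)   # approximate posterior mean of the weights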
Robot manipulation tasks rely on robot models. When exact physical parameters of the robot are not available, learning robot models from data becomes an appealing alternative. Most learning approaches are formulated in a supervised learning framework and are based on clearly defined training sets. We propose a method that improves the learning process by …
Gradient-based policy search algorithms benefit greatly from the availability of a properly estimated state or state-action value function, which can be used to reduce the variance of the gradient estimates. Additionally, the use of Gaussian processes for value function approximation provides a fully probabilistic model where, using the uncertainty in the …
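To make the variance-reduction point concrete, here is a small sketch (not the paper's method) of a REINFORCE-style gradient estimate in which subtracting a state-value baseline leaves the expectation unchanged but shrinks the variance; the synthetic returns and score vectors are assumptions.

import numpy as np

def policy_grad_estimate(grad_logp, returns, baseline):
    # grad_logp: (T, d) score vectors; returns: (T,) Monte Carlo returns.
    advantages = returns - baseline             # baseline keeps the estimate unbiased
    return grad_logp.T @ advantages / len(returns)

# Toy comparison: a large constant offset in the returns inflates the variance
# of the raw estimate; the baseline-corrected estimate removes that offset.
rng = np.random.default_rng(1)
grad_logp = rng.normal(size=(50, 3))
returns = 10.0 + rng.normal(size=50)
print(policy_grad_estimate(grad_logp, returns, baseline=0.0))
print(policy_grad_estimate(grad_logp, returns, baseline=returns.mean()))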
In recent years several models have been proposed for text categorization. Among these, one of the most widely applied is the vector space model (VSM), in which independence between indexing terms, usually words, is assumed. Since training corpus sizes are relatively small compared to what would be required for a realistic number of words, the …
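A minimal sketch of a vector space model for categorization under the term-independence assumption: documents become bag-of-words vectors and a query is assigned to the category with the most similar centroid. The tiny corpus, raw term counts, and nearest-centroid rule are assumptions made for the example.

import numpy as np
from collections import Counter

docs = {"sports": ["the team won the game", "a fast match and a great goal"],
        "tech":   ["the new gpu trains the model", "a fast chip runs the code"]}
vocab = sorted({w for texts in docs.values() for t in texts for w in t.split()})

def to_vec(text):
    # Bag-of-words vector over the fixed vocabulary (unknown words are ignored).
    counts = Counter(text.split())
    return np.array([counts[w] for w in vocab], dtype=float)

def cosine(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12)

# Nearest-centroid categorization in the vector space.
centroids = {c: np.mean([to_vec(t) for t in texts], axis=0)
             for c, texts in docs.items()}
query = "a fast model on the new chip"
print(max(centroids, key=lambda c: cosine(centroids[c], to_vec(query))))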
Gaussian process (GP) priors have been successfully used in non-parametric Bayesian regression and classification models. Inference can be performed analytically only for the regression model with Gaussian noise. For all other likelihood models, inference is intractable and various approximation techniques have been proposed. In recent years …
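For reference, a short sketch of the analytically tractable case mentioned above, GP regression with Gaussian noise, where the posterior mean and variance have closed forms; the squared-exponential kernel and the toy data are assumptions for the example.

import numpy as np

def rbf(A, B, ls=1.0):
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-0.5 * d2 / ls**2)

def gp_regression(X, y, Xs, noise=0.1):
    # Exact GP posterior via a Cholesky factorisation of K + noise^2 * I.
    K = rbf(X, X) + noise**2 * np.eye(len(X))
    Ks = rbf(Xs, X)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    mean = Ks @ alpha                                    # posterior mean
    v = np.linalg.solve(L, Ks.T)
    var = np.diag(rbf(Xs, Xs)) - np.sum(v**2, axis=0)    # posterior variance
    return mean, var

X = np.linspace(-3, 3, 30).reshape(-1, 1)
y = np.sin(X[:, 0]) + 0.1 * np.random.randn(30)
mean, var = gp_regression(X, y, Xs=np.array([[0.0], [2.0]]))
print(mean, var)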