
- Carl E. Rasmussen, Christopher K. I. Williams
- Adaptive computation and machine learning
- 2009

Gaussian processes (GPs) provide a principled, practical, probabilistic approach to learning in kernel machines. GPs have received growing attention in the machine learning community over the past…

- Mark Everingham, Luc Van Gool, Christopher K. I. Williams, John M. Winn, Andrew Zisserman
- International Journal of Computer Vision
- 2009

The Pascal Visual Object Classes (VOC) challenge is a benchmark in visual object category recognition and detection, providing the vision and machine learning communities with a standard dataset of…

- Mark Everingham, S. M. Ali Eslami, Luc Van Gool, Christopher K. I. Williams, John M. Winn, Andrew Zisserman
- International Journal of Computer Vision
- 2014

The Pascal Visual Object Classes (VOC) challenge consists of two components: (i) a publicly available dataset of images together with ground truth annotation and standardised evaluation software; and…

- Christopher M. Bishop, Markus Svensén, Christopher K. I. Williams
- Neural Computation
- 1998

Latent variable models represent the probability density of data in a space of several dimensions in terms of a smaller number of latent, or hidden, variables. A familiar example is factor analysis,…
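The latent variable setup this abstract describes can be sketched generatively: a low-dimensional latent vector is mapped linearly into the data space and corrupted by diagonal Gaussian noise (the factor-analysis form). Dimensions and parameter values below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Linear-Gaussian latent variable model (factor-analysis form):
# latent z ~ N(0, I_q) in R^q, observed x = W z + eps in R^d,
# with diagonal noise eps ~ N(0, diag(psi)).
rng = np.random.default_rng(1)
q, d, n = 2, 5, 20000
W = rng.standard_normal((d, q))          # factor loadings (illustrative)
psi = 0.1 * np.ones(d)                   # per-dimension noise variances

z = rng.standard_normal((n, q))          # latent samples
eps = rng.standard_normal((n, d)) * np.sqrt(psi)
x = z @ W.T + eps                        # observed data

# The marginal covariance of x approaches W W^T + diag(psi)
emp_cov = np.cov(x, rowvar=False)
model_cov = W @ W.T + np.diag(psi)
```

The check at the end illustrates why such models are "density models": the few latent dimensions plus diagonal noise fully determine the observed covariance.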

In this paper we investigate multi-task learning in the context of Gaussian Processes (GP). We propose a model that learns a shared covariance function on input-dependent features and a “free-form”…
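A common way to realise the structure this abstract describes is the intrinsic coregionalization form: a free-form positive-semidefinite task covariance combined with an input covariance via a Kronecker product. The sketch below is a minimal illustration under that assumption; the specific values are made up.

```python
import numpy as np

# Multi-task GP covariance sketch: joint covariance over (task, input) pairs
# is K = Kf ⊗ Kx, where Kf is a free-form PSD task covariance and Kx is an
# ordinary kernel matrix over inputs. All values here are illustrative.
X = np.linspace(0.0, 1.0, 4)
Kx = np.exp(-0.5 * (X[:, None] - X[None, :]) ** 2)   # RBF kernel over inputs

L = np.array([[1.0, 0.0],
              [0.5, 0.8]])                           # free-form task factor
Kf = L @ L.T                                         # PSD 2x2 task covariance

K = np.kron(Kf, Kx)                                  # joint 8x8 covariance
```

Parameterising the task covariance through a factor `L` keeps `Kf` positive semidefinite by construction, so the Kronecker product of two PSD matrices remains a valid GP covariance.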

- Christopher K. I. Williams, David Barber
- IEEE Trans. Pattern Anal. Mach. Intell.
- 1998

We consider the problem of assigning an input vector to one of m classes by predicting P(c|x) for c = 1, …, m. For a two-class problem, the probability of class one given x is estimated by s(y(x)),…
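The two-class construction mentioned above squashes a latent value y(x) through a sigmoid to obtain a class probability. A minimal illustration, with made-up latent values standing in for y(x):

```python
import numpy as np

def sigmoid(y):
    # Logistic function s(y) = 1 / (1 + e^{-y}): maps a latent value
    # y(x) in R to a probability P(class 1 | x) in (0, 1).
    return 1.0 / (1.0 + np.exp(-y))

latent = np.array([-2.0, 0.0, 3.0])   # hypothetical y(x) at three inputs
p_class1 = sigmoid(latent)
# P(class 1) = 0.5 exactly where the latent function crosses zero
```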

The Bayesian analysis of neural networks is difficult because a simple prior over weights implies a complex prior distribution over functions. In this paper we investigate the use of Gaussian…
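Working directly with a prior over functions, as the abstract suggests, is straightforward to demonstrate: a GP prior can be sampled by drawing a Gaussian vector with the kernel matrix as covariance. The kernel choice and evaluation grid below are assumptions for illustration.

```python
import numpy as np

# One draw from a zero-mean GP prior over functions, evaluated on a grid,
# using a squared-exponential kernel (illustrative choices).
rng = np.random.default_rng(0)
X = np.linspace(-2.0, 2.0, 50)
K = np.exp(-0.5 * (X[:, None] - X[None, :]) ** 2) + 1e-8 * np.eye(50)

L = np.linalg.cholesky(K)             # K = L L^T
f = L @ rng.standard_normal(50)       # f ~ N(0, K): one random smooth function
```

The small diagonal jitter keeps the Cholesky factorisation numerically stable; each new standard-normal draw through `L` yields another function sample from the same prior.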

The main aim of this paper is to provide a tutorial on regression with Gaussian processes. We start from Bayesian linear regression and show how, by a change of viewpoint, one can see this method as a…

- Matthias W. Seeger, Christopher K. I. Williams, Neil D. Lawrence
- AISTATS
- 2003

We present a method for the sparse greedy approximation of Bayesian Gaussian process regression, featuring a novel heuristic for very fast forward selection. Our method is essentially as fast as an…
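The greedy forward-selection loop the abstract refers to can be sketched in a few lines. The paper's contribution is a fast information-gain heuristic for scoring candidates; the sketch below substitutes a simpler criterion (pick the point with the largest predictive variance under the current active set) purely to illustrate the greedy structure.

```python
import numpy as np

def rbf(a, b):
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2)

rng = np.random.default_rng(0)
X = np.sort(rng.uniform(-3.0, 3.0, 40))   # candidate training inputs

active = [0]                              # start the active set arbitrarily
for _ in range(4):
    A = X[np.array(active)]
    Ka = rbf(A, A) + 1e-8 * np.eye(len(active))
    k = rbf(A, X)                                           # (|active|, 40)
    # Predictive variance at every candidate given the active set
    var = 1.0 - np.einsum('ij,ij->j', k, np.linalg.solve(Ka, k))
    var[np.array(active)] = -np.inf                         # never re-select
    active.append(int(np.argmax(var)))                      # greedy pick
```

Because the selected active set stays small, the dominant cost per step is solving an |active|-sized system rather than the full 40-point one, which is what makes this family of sparse approximations fast.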