
- Christopher M. Bishop, Nasser M. Nasrabadi
- J. Electronic Imaging
- 2007

This book provides an introduction to the field of pattern recognition and machine learning. It gives an overview of several basic and advanced topics in machine learning theory. The book is definitely valuable to scientists and engineers who are involved in developing machine learning tools applied to signal and image processing applications. This book is also…

Principal component analysis (PCA) is a ubiquitous technique for data analysis and processing, but one which is not based upon a probability model. In this paper we demonstrate how the principal axes of a set of observed data vectors may be determined through maximum-likelihood estimation of parameters in a latent variable model closely related to factor…
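
The maximum-likelihood solution this abstract refers to has a well-known closed form: the noise variance is the mean of the discarded eigenvalues of the sample covariance, and the weight matrix is built from the leading eigenpairs. A minimal NumPy sketch on synthetic data (a toy illustration, not the paper's derivation; all variable names here are my own):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: 500 points in 5-D generated from a 2-D latent subspace
# plus small isotropic noise.
N, d, q = 500, 5, 2
W_true = rng.normal(size=(d, q))
X = rng.normal(size=(N, q)) @ W_true.T + 0.1 * rng.normal(size=(N, d))

# Closed-form ML estimates for probabilistic PCA:
#   sigma^2 = mean of the d - q smallest eigenvalues of the sample covariance
#   W_ML    = U_q (L_q - sigma^2 I)^{1/2}   (top-q eigenpairs; W is only
#             identified up to a rotation within the principal subspace)
Xc = X - X.mean(axis=0)
S = Xc.T @ Xc / N
eigvals, eigvecs = np.linalg.eigh(S)                 # ascending order
eigvals, eigvecs = eigvals[::-1], eigvecs[:, ::-1]   # descending
sigma2 = eigvals[q:].mean()
W_ml = eigvecs[:, :q] * np.sqrt(eigvals[:q] - sigma2)
```

With the noise level above, `sigma2` recovers roughly the injected noise variance and the columns of `W_ml` span the same subspace as `W_true`.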

- Christopher M. Bishop, Markus Svensén, Christopher K. I. Williams
- Neural Computation
- 1998

Latent variable models represent the probability density of data in a space of several dimensions in terms of a smaller number of latent, or hidden, variables. A familiar example is factor analysis, which is based on a linear transformation between the latent space and the data space. In this article, we introduce a form of nonlinear latent variable model…

- Michael E. Tipping, Christopher M. Bishop
- Neural Computation
- 1999

Principal component analysis (PCA) is one of the most popular techniques for processing, compressing, and visualizing data, although its effectiveness is limited by its global linearity. While nonlinear variants of PCA have been proposed, an alternative paradigm is to capture data complexity by a combination of local linear PCA projections. However,…
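
The paper derives a single EM algorithm for a mixture of probabilistic PCA models; as a much cruder stand-in that still illustrates why local linear projections beat one global projection, the sketch below partitions the data with a hand-rolled k-means and fits an ordinary PCA line in each cluster (an assumption-laden simplification, not the paper's method):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy data on a curved 1-D manifold in 2-D: a noisy half-circle, which one
# global PCA line fits poorly but two local linear pieces fit well.
t = rng.uniform(0.0, np.pi, size=400)
X = np.c_[np.cos(t), np.sin(t)] + 0.02 * rng.normal(size=(400, 2))

def kmeans(X, k, iters=50):
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        labels = ((X[:, None] - centers[None]) ** 2).sum(-1).argmin(axis=1)
        # Keep the old center if a cluster happens to empty out.
        centers = np.array([X[labels == j].mean(axis=0) if (labels == j).any()
                            else centers[j] for j in range(k)])
    return labels, centers

def local_pca_residual(X, labels, centers, q=1):
    # Mean squared reconstruction error when each cluster keeps its
    # top-q principal directions.
    err = 0.0
    for j, c in enumerate(centers):
        Xj = X[labels == j] - c
        _, _, Vt = np.linalg.svd(Xj, full_matrices=False)
        err += ((Xj - Xj @ Vt[:q].T @ Vt[:q]) ** 2).sum()
    return err / len(X)

labels1, centers1 = kmeans(X, 1)   # k = 1 reduces to global PCA
labels2, centers2 = kmeans(X, 2)
err_global = local_pca_residual(X, labels1, centers1)
err_local = local_pca_residual(X, labels2, centers2)
```

On this curved data the two local fits give a markedly lower reconstruction error than the single global one.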

- John M. Winn, Christopher M. Bishop
- Journal of Machine Learning Research
- 2005

Bayesian inference is now widely established as one of the principal foundations for machine learning. In practice, exact inference is rarely possible, and so a variety of approximation techniques have been developed, one of the most widely used being a deterministic framework called variational inference. In this paper we introduce Variational Message…
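
Variational Message Passing itself operates on general factor graphs, but the deterministic variational framework the abstract mentions is easy to see in the textbook special case: a mean-field posterior for the mean and precision of a Gaussian, updated by coordinate ascent. The sketch below is that minimal example only (the priors and variable names are my choices, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(2)

# Data from N(mu=3, sigma=2), so the true precision tau is 0.25.
x = rng.normal(3.0, 2.0, size=2000)
N, xbar = len(x), x.mean()

# Factorised posterior q(mu, tau) = q(mu) q(tau) with
#   q(mu)  = Normal(mu_N, 1/lam_N)
#   q(tau) = Gamma(a_N, b_N)
# under broad conjugate priors mu ~ N(mu0, (lam0 tau)^-1), tau ~ Gamma(a0, b0).
mu0, lam0, a0, b0 = 0.0, 1e-3, 1e-3, 1e-3

E_tau = 1.0  # initial guess; the updates below are iterated to convergence
for _ in range(100):
    # Update q(mu) given the current E[tau].
    mu_N = (lam0 * mu0 + N * xbar) / (lam0 + N)
    lam_N = (lam0 + N) * E_tau
    # Update q(tau) given E[mu] and E[mu^2] under q(mu).
    E_mu, E_mu2 = mu_N, mu_N**2 + 1.0 / lam_N
    a_N = a0 + 0.5 * (N + 1)
    b_N = b0 + 0.5 * (np.sum(x**2) - 2 * E_mu * np.sum(x) + N * E_mu2
                      + lam0 * (E_mu2 - 2 * E_mu * mu0 + mu0**2))
    E_tau = a_N / b_N

post_mean = mu_N            # close to the true mean 3
post_precision = a_N / b_N  # close to the true precision 0.25
```

Each update only needs expectations of the other factor, which is exactly the structure VMP exploits when it generalizes this to arbitrary conjugate-exponential graphical models.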

One of the central issues in the use of principal component analysis (PCA) for data modelling is that of choosing the appropriate number of retained components. This problem was recently addressed through the formulation of a Bayesian treatment of PCA (Bishop, 1999a) in terms of a probabilistic latent variable model. A central feature of this approach is…

- Christopher M. Bishop
- Information science and statistics
- 2007

The Support Vector Machine (SVM) of Vapnik [9] has become widely established as one of the leading approaches to pattern recognition and machine learning. It expresses predictions in terms of a linear combination of kernel functions centred on a subset of the training data, known as support vectors. Despite its widespread success, the SVM suffers from some…

It is well known that the addition of noise to the input data of a neural network during training can, in some circumstances, lead to significant improvements in generalization performance. Previous work has shown that such training with noise is equivalent to a form of regularization in which an extra term is added to the error function. However, the…
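
The equivalence is exact in the simplest case: for a linear model with squared error and isotropic Gaussian input noise eps ~ N(0, sigma^2 I), the noise-averaged loss satisfies E[(y - w.(x + eps))^2] = (y - w.x)^2 + sigma^2 ||w||^2, i.e. training with input noise adds a weight-decay (ridge) penalty. A Monte Carlo check of that identity (toy numbers of my choosing):

```python
import numpy as np

rng = np.random.default_rng(3)

# One training example for a linear model y_hat = w . x.
d, sigma = 4, 0.3
w = rng.normal(size=d)
x = rng.normal(size=d)
y = 1.7

clean_loss = (y - w @ x) ** 2
penalty = sigma**2 * (w @ w)   # the extra regularization term

# Monte Carlo estimate of the expected loss under input noise.
eps = sigma * rng.normal(size=(200_000, d))
noisy_loss = ((y - (x + eps) @ w) ** 2).mean()
```

With enough noise samples, `noisy_loss` matches `clean_loss + penalty` to within sampling error, which is the regularization equivalence this paper builds on.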