Probabilistic Non-linear Principal Component Analysis with Gaussian Process Latent Variable Models

Abstract

Summarising a high dimensional data set with a low dimensional embedding is a standard approach for exploring its structure. In this paper we provide an overview of some existing techniques for discovering such embeddings. We then introduce a novel probabilistic interpretation of principal component analysis (PCA) that we term dual probabilistic PCA (DPPCA). The DPPCA model has the additional advantage that the linear mappings from the embedded space can easily be non-linearised through Gaussian processes. We refer to this model as a Gaussian process latent variable model (GP-LVM). Through analysis of the GP-LVM objective function, we relate the model to popular spectral techniques such as kernel PCA and multidimensional scaling. We then review a practical algorithm for GP-LVMs in the context of large data sets and develop it to also handle discrete valued data and missing attributes. We demonstrate the model on a range of real-world and artificially generated data sets.
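The core idea sketched in the abstract — treating each output dimension of the data as a Gaussian process over a learned latent space, and optimising the latent coordinates by maximising the GP marginal likelihood — can be illustrated with a minimal numerical sketch. This is not the paper's implementation: it assumes an RBF kernel with fixed hyperparameters, a fixed noise level, and a generic gradient-free-of-hand optimiser (SciPy's L-BFGS-B with finite-difference gradients); all names (`rbf_kernel`, `neg_log_marginal`, `X_latent`) are illustrative.

```python
import numpy as np
from scipy.optimize import minimize

def rbf_kernel(X, lengthscale=1.0, variance=1.0):
    """RBF (squared-exponential) kernel matrix over latent points X (N x q)."""
    sq = np.sum(X**2, 1)[:, None] + np.sum(X**2, 1)[None, :] - 2 * X @ X.T
    return variance * np.exp(-0.5 * sq / lengthscale**2)

def neg_log_marginal(x_flat, Y, q, noise=0.1):
    """Negative GP log marginal likelihood of data Y given latent X.

    Up to an additive constant: (D/2) log|K| + (1/2) tr(K^{-1} Y Y^T),
    with K = k(X, X) + noise * I, shared across the D output dimensions.
    """
    N, D = Y.shape
    X = x_flat.reshape(N, q)
    K = rbf_kernel(X) + noise * np.eye(N)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, Y))  # K^{-1} Y
    logdet = 2 * np.sum(np.log(np.diag(L)))
    return 0.5 * D * logdet + 0.5 * np.sum(Y * alpha)

# Toy data: 15 points in a 5-dimensional observation space, centred.
rng = np.random.default_rng(0)
Y = rng.standard_normal((15, 5))
Y -= Y.mean(0)

# Initialise the 2-D latent coordinates with PCA, then optimise them
# by maximising the marginal likelihood (minimising its negative).
q = 2
U, S, Vt = np.linalg.svd(Y, full_matrices=False)
X0 = U[:, :q] * S[:q]
res = minimize(neg_log_marginal, X0.ravel(), args=(Y, q), method="L-BFGS-B")
X_latent = res.x.reshape(-1, q)  # learned low-dimensional embedding
```

With a linear kernel in place of the RBF this objective recovers the dual probabilistic PCA solution mentioned in the abstract; the non-linear kernel is what turns the model into a GP-LVM.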

