Think Globally, Fit Locally: Unsupervised Learning of Low Dimensional Manifolds

Abstract

The problem of dimensionality reduction arises in many fields of information processing, including machine learning, data compression, scientific visualization, pattern recognition, and neural computation. Here we describe locally linear embedding (LLE), an unsupervised learning algorithm that computes low dimensional, neighborhood preserving embeddings of high dimensional data. The data, assumed to be sampled from an underlying manifold, are mapped into a single global coordinate system of lower dimensionality. The mapping is derived from the symmetries of locally linear reconstructions, and the actual computation of the embedding reduces to a sparse eigenvalue problem. Notably, the optimizations in LLE--though capable of generating highly nonlinear embeddings--are simple to implement, and they do not involve local minima. In this paper, we describe the implementation of the algorithm in detail and discuss several extensions that enhance its performance. We present results of the algorithm applied to data sampled from known manifolds, as well as to collections of images of faces, lips, and handwritten digits. These examples are used to provide extensive illustrations of the algorithm’s performance--both successes and failures--and to relate the algorithm to previous and ongoing work in nonlinear dimensionality reduction.
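The abstract's pipeline (local linear reconstructions, then a sparse eigenvalue problem) can be sketched in a few lines of NumPy. This is an illustrative reading of the algorithm, not the authors' reference code: the function name `lle`, the neighbor count, and the regularization constant `reg` are assumptions chosen for the example.

```python
import numpy as np

def lle(X, n_neighbors=10, n_components=2, reg=1e-3):
    """Minimal locally linear embedding sketch (illustrative, NumPy only)."""
    n = X.shape[0]

    # Step 1: find each point's k nearest neighbors by Euclidean distance.
    dists = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    np.fill_diagonal(dists, np.inf)          # a point is not its own neighbor
    neighbors = np.argsort(dists, axis=1)[:, :n_neighbors]

    # Step 2: reconstruction weights that sum to one for each point,
    # minimizing the local reconstruction error.
    W = np.zeros((n, n))
    for i in range(n):
        Z = X[neighbors[i]] - X[i]           # neighbors shifted to the origin
        C = Z @ Z.T                          # local Gram (covariance) matrix
        C += reg * np.trace(C) * np.eye(n_neighbors)  # regularize (assumed constant)
        w = np.linalg.solve(C, np.ones(n_neighbors))
        W[i, neighbors[i]] = w / w.sum()     # enforce sum-to-one constraint

    # Step 3: the embedding comes from the bottom eigenvectors of
    # M = (I - W)^T (I - W); the constant eigenvector is discarded.
    M = (np.eye(n) - W).T @ (np.eye(n) - W)
    eigvals, eigvecs = np.linalg.eigh(M)
    return eigvecs[:, 1:n_components + 1]
```

Because `M` is sparse and symmetric positive semidefinite, a practical implementation would use a sparse eigensolver rather than the dense `eigh` call above; the dense form is kept here only for brevity.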


Cite this paper

@inproceedings{Saul2003ThinkGF,
  title={Think Globally, Fit Locally: Unsupervised Learning of Low Dimensional Manifolds},
  author={Lawrence K. Saul and Sam T. Roweis and Yoram Singer},
  year={2003}
}