Nonlinear Dimension Reduction via Local Tangent Space Alignment

Zhenyue Zhang and Hongyuan Zha
In this paper we present a new algorithm for manifold learning and nonlinear dimension reduction. Based on a set of unorganized data points sampled with noise from the manifold, we represent the local geometry of the manifold using tangent spaces learned by fitting an affine subspace in a neighborhood of each data point. Those tangent spaces are aligned to give the internal global coordinates of the data points with respect to the underlying manifold by way of a partial eigendecomposition of… 
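The two steps the abstract describes — fitting an affine subspace to each neighborhood to obtain local tangent coordinates, then aligning them through a partial eigendecomposition — can be sketched in NumPy. This is an illustrative reimplementation under simplifying assumptions (brute-force neighbor search, a dense eigensolver, swiss-roll test data), not the authors' reference code:

```python
import numpy as np

def ltsa(X, n_neighbors=8, d=2):
    """Local Tangent Space Alignment: embed N points of X into d dimensions."""
    N = X.shape[0]
    # k nearest neighbors by brute force (each neighborhood includes the point itself)
    dist = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    nbrs = np.argsort(dist, axis=1)[:, :n_neighbors]
    B = np.zeros((N, N))                        # alignment matrix
    for i in range(N):
        idx = nbrs[i]
        Xi = X[idx] - X[idx].mean(axis=0)       # center the neighborhood
        U, _, _ = np.linalg.svd(Xi, full_matrices=False)
        # local tangent coordinates = top-d left singular vectors
        Gi = np.hstack([np.full((n_neighbors, 1), 1 / np.sqrt(n_neighbors)),
                        U[:, :d]])
        B[np.ix_(idx, idx)] += np.eye(n_neighbors) - Gi @ Gi.T
    # global coordinates: eigenvectors of B for the d smallest nonzero eigenvalues
    _, V = np.linalg.eigh(B)
    return V[:, 1:d + 1]                        # skip the constant eigenvector

# usage on a noiseless swiss roll
rng = np.random.default_rng(0)
t = 1.5 * np.pi * (1 + 2 * rng.random(300))
h = 10 * rng.random(300)
X = np.column_stack([t * np.cos(t), h, t * np.sin(t)])
Y = ltsa(X, n_neighbors=10, d=2)                # Y.shape == (300, 2)
```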
Tangent Bundle Manifold Learning via Grassmann&Stiefel Eigenmaps
This work proposes an extension of manifold learning (ML), called Tangent Bundle ML, which requires proximity not only between the original manifold and its estimator but also between their tangent spaces.
Robust Local Tangent Space Alignment
A robust version of LTSA called RLTSA is proposed, which makes LTSA more robust in several respects: a robust PCA algorithm is used in place of the standard SVD to reduce the influence of noise on the local tangent space coordinates, and RLTSA chooses neighborhoods that are well approximated by the local tangent space coordinates to align with the global coordinates.
Active Neighborhood Selection for Locally Linear Embedding
  • Xiumin Yu, Hongyu Li
  • Computer Science
    2009 Second International Symposium on Knowledge Acquisition and Modeling
  • 2009
Experimental results demonstrate that metric LLE usually performs better than LLE in feature extraction; the strategy of active neighborhood selection is used to extend LLE.
Neighborhood smoothing embedding for noisy manifold learning
This paper proposes neighbor smoothing embedding (NSE) for noisy points sampled from a nonlinear manifold based on LLE and local linear surface estimator, which smoothes the neighbors of each sample data point and then computes the reconstruction matrix of the projections on the estimated surface.
In this paper, we develop methods for outlier removal and noise reduction based on weighted local linear smoothing for a set of noisy points sampled from a nonlinear manifold. The methods can be used…
Matrix perturbation analysis of local tangent space alignment
Analysis of an alignment algorithm for nonlinear dimensionality reduction
An analysis of the alignment process is presented, giving conditions under which the null space of the aligned matrix recovers the global coordinate system up to an affine transformation, and it is shown that Local Tangent Space Alignment method (LTSA) can recover a locally isometric embedding up to a rigid motion.
Data-based Manifold Reconstruction via Tangent Bundle Manifold Learning
A new geometrically motivated method for the TBML problem is presented, in which the manifold, its tangent spaces, and a low-dimensional representation are accurately reconstructed from a sample.
Iterative Hyperplane Merging: A Framework for Manifold Learning
A Minimum Spanning Tree provides the skeleton needed to traverse the manifold so that the local hyperplanes can be used to build a global, locally stable, embedding.


Efficient Simplicial Reconstructions of Manifolds from Their Samples
  • D. Freedman
  • Mathematics
    IEEE Trans. Pattern Anal. Mach. Intell.
  • 2002
An algorithm for manifold learning is presented and an important property of the algorithm is that its complexity depends on the dimension of the manifold, rather than that of the embedding space.
Nonlinear dimensionality reduction by locally linear embedding.
Locally linear embedding (LLE) is introduced, an unsupervised learning algorithm that computes low-dimensional, neighborhood-preserving embeddings of high-dimensional inputs that learns the global structure of nonlinear manifolds.
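The two LLE steps summarized here — solve for reconstruction weights within each neighborhood, then find low-dimensional coordinates best reconstructed by the same weights — admit a compact NumPy sketch. The brute-force neighbor search and the regularization constant `reg` are implementation choices for this illustration, not part of the original formulation:

```python
import numpy as np

def lle(X, n_neighbors=10, d=2, reg=1e-3):
    """Locally Linear Embedding: embed N points of X into d dimensions."""
    N = X.shape[0]
    dist = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    np.fill_diagonal(dist, np.inf)              # exclude each point from its own neighbors
    nbrs = np.argsort(dist, axis=1)[:, :n_neighbors]
    W = np.zeros((N, N))
    for i in range(N):
        Z = X[nbrs[i]] - X[i]                   # neighbors shifted to the origin
        C = Z @ Z.T                             # local Gram matrix
        C += reg * np.trace(C) * np.eye(n_neighbors)   # regularize for stability
        w = np.linalg.solve(C, np.ones(n_neighbors))
        W[i, nbrs[i]] = w / w.sum()             # reconstruction weights sum to one
    M = (np.eye(N) - W).T @ (np.eye(N) - W)
    _, V = np.linalg.eigh(M)
    return V[:, 1:d + 1]                        # skip the constant eigenvector

rng = np.random.default_rng(0)
t = 1.5 * np.pi * (1 + 2 * rng.random(300))
X = np.column_stack([t * np.cos(t), 10 * rng.random(300), t * np.sin(t)])
Y = lle(X, n_neighbors=12, d=2)                 # Y.shape == (300, 2)
```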
Laplacian Eigenmaps for Dimensionality Reduction and Data Representation
This work proposes a geometrically motivated algorithm for representing the high-dimensional data that provides a computationally efficient approach to nonlinear dimensionality reduction that has locality-preserving properties and a natural connection to clustering.
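The algorithm builds a neighborhood graph, weights its edges with a heat kernel, and embeds the data using the bottom eigenvectors of the graph Laplacian. A minimal NumPy sketch follows; the use of the normalized Laplacian and the fixed kernel bandwidth `sigma` are choices made for this illustration:

```python
import numpy as np

def laplacian_eigenmaps(X, n_neighbors=10, d=2, sigma=2.0):
    """Embed N points of X into d dimensions via the graph Laplacian."""
    N = X.shape[0]
    dist = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    np.fill_diagonal(dist, np.inf)
    nbrs = np.argsort(dist, axis=1)[:, :n_neighbors]
    W = np.zeros((N, N))
    for i in range(N):                          # heat-kernel weights on kNN edges
        W[i, nbrs[i]] = np.exp(-dist[i, nbrs[i]] ** 2 / (2 * sigma ** 2))
    W = np.maximum(W, W.T)                      # symmetrize the graph
    dinv = 1.0 / np.sqrt(W.sum(axis=1))         # degrees are positive on a kNN graph
    Lsym = np.eye(N) - (dinv[:, None] * W) * dinv[None, :]
    _, V = np.linalg.eigh(Lsym)
    return dinv[:, None] * V[:, 1:d + 1]        # drop the trivial eigenvector

rng = np.random.default_rng(0)
t = 1.5 * np.pi * (1 + 2 * rng.random(300))
X = np.column_stack([t * np.cos(t), 10 * rng.random(300), t * np.sin(t)])
Y = laplacian_eigenmaps(X, n_neighbors=10, d=2)  # Y.shape == (300, 2)
```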
Global Coordination of Local Linear Models
The regularizer takes the form of a Kullback-Leibler divergence and illustrates an unexpected application of variational methods: not to perform approximate inference in intractable probabilistic models, but to learn more useful internal representations in tractable ones.
Grouping and dimensionality reduction by locally linear embedding
A variant of LLE that can simultaneously group the data and calculate local embedding of each group is studied, and an estimate for the upper bound on the intrinsic dimension of the data set is obtained automatically.
Stochastic Neighbor Embedding
This probabilistic framework makes it easy to represent each object by a mixture of widely separated low-dimensional images, which allows ambiguous objects, like the document count vector for the word "bank", to have versions close to the images of both "river" and "finance" without forcing the image of outdoor concepts to be located close to those of corporate concepts.
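The probabilistic framework can be illustrated with a stripped-down symmetric SNE in NumPy: Gaussian neighbor probabilities P in the input space are matched to probabilities Q in the embedding by gradient descent on their KL divergence. The fixed bandwidth `sigma` replaces the paper's per-point perplexity calibration, and the symmetric formulation and plain gradient step are simplifications for this sketch:

```python
import numpy as np

def sym_sne(X, d=2, sigma=1.0, lr=10.0, n_iter=300, seed=0):
    """Symmetric SNE: match pairwise Gaussian affinities in input and embedding."""
    N = X.shape[0]
    D2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    P = np.exp(-D2 / (2 * sigma ** 2))
    np.fill_diagonal(P, 0.0)
    P /= P.sum()                                # joint distribution over pairs
    rng = np.random.default_rng(seed)
    Y = 1e-2 * rng.standard_normal((N, d))      # small random initialization
    for _ in range(n_iter):
        E2 = ((Y[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
        Q = np.exp(-E2)
        np.fill_diagonal(Q, 0.0)
        Q /= Q.sum()
        # gradient of KL(P || Q) for symmetric SNE
        diff = Y[:, None, :] - Y[None, :, :]
        grad = 4 * ((P - Q)[:, :, None] * diff).sum(axis=1)
        Y -= lr * grad
    return Y

rng = np.random.default_rng(1)
X = np.vstack([rng.standard_normal((50, 5)),    # two well-separated clusters
               rng.standard_normal((50, 5)) + 8.0])
Y = sym_sne(X, d=2)                             # Y.shape == (100, 2)
```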
Think globally, fit locally: unsupervised learning of non-linear manifolds
Topology representing networks
Think globally...
In an era of globalization, Google Earth and transcontinental air travel, all of us should learn a little about spherical geometry and its modern generalization, differential geometry, which underpins such imposing intellectual edifices as Einstein's general theory of relativity.
Applied Functional Data Analysis
This is a great book for a first graduate course in multivariate analysis, because it covers the standard topics in the "classical" normal-theory approach to multivariate analysis.