# Incremental Manifold Learning Via Tangent Space Alignment

@inproceedings{Liu2006IncrementalML, title={Incremental Manifold Learning Via Tangent Space Alignment}, author={Xiaoming Liu and Jianwei Yin and Zhilin Feng and Jinxiang Dong}, booktitle={IAPR International Workshop on Artificial Neural Networks in Pattern Recognition}, year={2006} }

Several algorithms have been proposed to analyze the structure of high-dimensional data based on the notion of manifold learning. They have been used to extract the intrinsic characteristics of different types of high-dimensional data by performing nonlinear dimensionality reduction. Most of them operate in a “batch” mode and cannot be efficiently applied when data are collected sequentially. In this paper, we propose an incremental version (ILTSA) of LTSA (Local Tangent Space Alignment), which…
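The first step of LTSA, which the incremental variant builds on, is estimating a local tangent space at each point by PCA over its nearest neighbors. A minimal sketch of that step (the function name `local_tangent_space` and the toy data are illustrative, not from the paper):

```python
import numpy as np

def local_tangent_space(X, i, k, d):
    """Estimate the d-dimensional tangent space at point i from its
    k nearest neighbors via local PCA (the local step of LTSA).
    X: (n, D) data matrix. Returns (center, basis) with basis of shape (D, d)."""
    dists = np.linalg.norm(X - X[i], axis=1)
    nbrs = np.argsort(dists)[:k]              # k nearest neighbors (incl. point i)
    Xi = X[nbrs]
    center = Xi.mean(axis=0)
    # Leading right singular vectors of the centered neighborhood span the tangent space
    _, _, Vt = np.linalg.svd(Xi - center, full_matrices=False)
    return center, Vt[:d].T

# Toy usage: points on a slightly noisy 1-D curve embedded in R^3
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 200)
X = np.c_[t, np.sin(2 * t), np.cos(2 * t)] + 1e-3 * rng.standard_normal((200, 3))
center, basis = local_tangent_space(X, i=100, k=10, d=1)
print(basis.shape)  # (3, 1)
```

LTSA then aligns these per-point local coordinate systems into a single global embedding; the incremental version updates this alignment as new points arrive rather than recomputing it from scratch.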

## 23 Citations

### A Manifold Learning Algorithm Based on Incremental Tangent Space Alignment

- Computer Science · ICCCS
- 2016

A new manifold learning algorithm based on incremental tangent space alignment (LITSA for short) is proposed that can achieve a more accurate low-dimensional representation of the data than state-of-the-art incremental algorithms.

### An improved incremental nonlinear dimensionality reduction for isometric data embedding

- Computer Science · Inf. Process. Lett.
- 2015

### Incremental manifold learning by spectral embedding methods

- Computer Science · Pattern Recognit. Lett.
- 2011

### Incremental Laplacian eigenmaps by preserving adjacent information between data points

- Computer Science · Pattern Recognit. Lett.
- 2009

### Incremental Construction of Low-Dimensional Data Representations

- Computer Science · ANNPR
- 2016

An incremental version of the Grassmann & Stiefel Eigenmaps manifold learning algorithm, which has asymptotically minimal reconstruction error, is proposed in this paper; it has significantly smaller computational complexity than the original algorithm.

### Manifold learning: Dimensionality reduction and high dimensional data reconstruction via dictionary learning

- Computer Science · Neurocomputing
- 2016

### A Dictionary-Based Algorithm for Dimensionality Reduction and Data Reconstruction

- Computer Science · 2014 22nd International Conference on Pattern Recognition
- 2014

A dictionary-based algorithm is proposed to deal with the out-of-sample extension problem for large-scale dimensionality-reduction tasks; it is shown that, for both dimensionality reduction and data reconstruction, the algorithm is accurate and fast.

### The Estimation Algorithm of Laplacian Eigenvalues for the Tangent Bundle

- Computer Science · 2011 Seventh International Conference on Computational Intelligence and Security
- 2011

An algorithm for estimating Laplacian eigenvalues is presented, extending the LE algorithm's strength of preserving local geometry to the global setting through the global coordinates of the tangent bundle.

### Learning to detect concepts with Approximate Laplacian Eigenmaps in large-scale and online settings

- Computer Science · International Journal of Multimedia Information Retrieval
- 2015

It is demonstrated that Approximate Laplacian Eigenmaps, which constitute a latent representation of the manifold underlying a set of images, offer a compact yet effective feature representation for the problem of concept detection.

### Information preserving and locally isometric&conformal embedding via Tangent Manifold Learning

- Computer Science, Mathematics · 2015 IEEE International Conference on Data Science and Advanced Analytics (DSAA)
- 2015

A new geometrically motivated locally isometric and conformal representation method is proposed, which employs a tangent manifold learning technique consisting of sample-based estimation of tangent spaces to the unknown data manifold.

## References

Showing 1–10 of 14 references

### Incremental nonlinear dimensionality reduction by manifold learning

- Computer Science · IEEE Transactions on Pattern Analysis and Machine Intelligence
- 2006

An incremental version of ISOMAP, one of the key manifold learning algorithms, is described and it is demonstrated that this modified algorithm can maintain an accurate low-dimensional representation of the data in an efficient manner.

### Selecting Landmark Points for Sparse Manifold Learning

- Computer Science · NIPS
- 2005

This paper presents an algorithm for selecting landmarks based on LASSO regression, which is well known to favor sparse approximations because it uses regularization with an l1 norm; from the selected landmarks a continuous manifold parameterization is found.

### Principal Manifolds and Nonlinear Dimensionality Reduction via Tangent Space Alignment

- Computer Science, Mathematics
- 2004

We present a new algorithm for manifold learning and nonlinear dimensionality reduction. Based on a set of unorganized data points sampled with noise from a parameterized manifold, the local…

### Nonlinear dimensionality reduction by locally linear embedding.

- Computer Science · Science
- 2000

Locally linear embedding (LLE) is introduced, an unsupervised learning algorithm that computes low-dimensional, neighborhood-preserving embeddings of high-dimensional inputs and learns the global structure of nonlinear manifolds.
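The local step of LLE solves a small constrained least-squares problem per point: find the weights that best reconstruct the point from its neighbors, subject to the weights summing to one. A hedged sketch of that step (the function name `lle_weights` and the regularization choice are illustrative, not the authors' code):

```python
import numpy as np

def lle_weights(X, i, nbrs, reg=1e-3):
    """Weights reconstructing X[i] from its neighbors X[nbrs] (local LLE step).
    Weights sum to 1; reg adds Tikhonov regularization for numerical stability."""
    Z = X[nbrs] - X[i]                         # center neighbors on the query point
    G = Z @ Z.T                                # local Gram matrix
    G = G + reg * np.trace(G) * np.eye(len(nbrs))
    w = np.linalg.solve(G, np.ones(len(nbrs))) # solve G w = 1
    return w / w.sum()                         # enforce sum-to-one constraint

# Usage: the middle of three collinear points is the average of its neighbors
X = np.array([[0.0], [1.0], [2.0]])
w = lle_weights(X, i=1, nbrs=[0, 2])
print(w)  # approximately [0.5, 0.5]
```

The full algorithm then finds the low-dimensional embedding that preserves these reconstruction weights globally, via an eigenproblem on the resulting sparse weight matrix.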

### A global geometric framework for nonlinear dimensionality reduction.

- Computer Science · Science
- 2000

An approach to solving dimensionality reduction problems that uses easily measured local metric information to learn the underlying global geometry of a data set and efficiently computes a globally optimal solution, and is guaranteed to converge asymptotically to the true structure.

### Hessian eigenmaps: Locally linear embedding techniques for high-dimensional data

- Computer Science, Mathematics · Proceedings of the National Academy of Sciences of the United States of America
- 2003

The Hessian-based locally linear embedding method for recovering the underlying parametrization of scattered data (m_i) lying on a manifold M embedded in high-dimensional Euclidean space is described, where the isometric coordinates can be recovered up to a linear isometry.

### A Domain Decomposition Method for Fast Manifold Learning

- Computer Science · NIPS
- 2005

A fast manifold learning algorithm based on the methodology of domain decomposition is proposed that can glue the embeddings on the two subdomains into an embedding on the whole domain.

### Least angle regression

- Computer Science
- 2004

A publicly available algorithm that requires only the same order of magnitude of computational effort as ordinary least squares applied to the full set of covariates is described.

### Numerical methods for computing angles between linear subspaces

- Mathematics · Milestones in Matrix Computation
- 2007

Experimental results are given, which indicate that MGS gives $\theta_k$ with equal precision and fewer arithmetic operations than HT; however, HT gives principal vectors that are orthogonal to working accuracy, which is not in general true for MGS.
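The principal angles this reference computes are exactly what tangent-space alignment needs to compare local subspaces. A minimal sketch of the standard SVD-based computation over orthonormal bases (the function name `principal_angles` is illustrative):

```python
import numpy as np

def principal_angles(A, B):
    """Principal angles (radians, ascending) between the column spaces of A and B,
    computed from the singular values of Q_A^T Q_B for orthonormal bases Q_A, Q_B."""
    Qa, _ = np.linalg.qr(A)
    Qb, _ = np.linalg.qr(B)
    s = np.linalg.svd(Qa.T @ Qb, compute_uv=False)
    # Clip guards against tiny floating-point overshoot outside [-1, 1]
    return np.arccos(np.clip(s, -1.0, 1.0))

# Two planes in R^3 sharing the x-axis, differing by 90 degrees otherwise
A = np.array([[1.0, 0.0], [0.0, 1.0], [0.0, 0.0]])  # span{e1, e2}
B = np.array([[1.0, 0.0], [0.0, 0.0], [0.0, 1.0]])  # span{e1, e3}
angles = principal_angles(A, B)
print(np.degrees(angles))  # approximately [0, 90]
```

Note that for very small angles, arccos of a singular value near 1 loses precision; sine-based variants of this computation are used when that matters.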