Spectral Properties of the Alignment Matrices in Manifold Learning

@article{Zha2009SpectralPO,
  title={Spectral Properties of the Alignment Matrices in Manifold Learning},
  author={Hongyuan Zha and Zhenyue Zhang},
  journal={SIAM Rev.},
  year={2009},
  volume={51},
  pages={545-566}
}
Local methods for manifold learning generate a collection of local parameterizations which is then aligned to produce a global parameterization of the underlying manifold. The alignment procedure is carried out through the computation of a partial eigendecomposition of a so-called alignment matrix. In this paper, we present an analysis of the eigenstructure of the alignment matrix giving both necessary and sufficient conditions under which the null space of the alignment matrix recovers the… 
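
To make the alignment step concrete, the sketch below shows how a global parameterization can be read off from the approximate null space of a symmetric positive semidefinite alignment matrix Phi (as produced, for example, by LTSA). It is a minimal illustration in Python/SciPy, which is not part of the paper; the function name and the use of scipy.linalg.eigh with subset_by_index (SciPy >= 1.5) are assumptions of this sketch.

```python
from scipy.linalg import eigh

def global_coords_from_alignment(Phi, d):
    # Phi: (n, n) symmetric positive semidefinite alignment matrix,
    # accumulated from local neighborhood terms (e.g., as in LTSA).
    # Its (approximate) null space contains the constant vector plus,
    # ideally, the d global coordinate functions.
    vals, vecs = eigh(Phi, subset_by_index=[0, d])  # d+1 smallest eigenpairs
    return vecs[:, 1:]                              # drop the constant vector
```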

Local tangent space alignment via nuclear norm regularization for incomplete data

Adaptive Manifold Learning

TLDR
Algorithms are developed that address two key issues in manifold learning: the adaptive selection of the local neighborhood sizes when imposing a connectivity structure on the given set of high-dimensional data points and the adaptive bias reduction in the local low-dimensional embedding by accounting for the variations in the curvature of the manifold.

Tangent space estimation for smooth embeddings of Riemannian manifolds

TLDR
This work proposes a theoretical analysis of local sampling conditions for the estimation of the tangent space at a point P lying on an m-dimensional Riemannian manifold S in R^n, and observes that the local sampling conditions are highly dependent on the correlation between the components in the second-order local approximation of the manifold.

Robust local tangent space alignment via iterative weighted PCA

Manifold learning in local tangent space via extreme learning machine

Robust Local Tangent Space Alignment

TLDR
A robust version of LTSA called RLTSA is proposed, which makes LTSA more robust in three respects: a robust PCA algorithm is used instead of the standard SVD to reduce the influence of noise on the local tangent space coordinates, and RLTSA chooses neighborhoods that are well approximated by the local tangent space coordinates to align with the global coordinates.

Perturbation Bounds for Procrustes, Classical Scaling, and Trilateration, with Applications to Manifold Learning

TLDR
Perturbation bounds for classical scaling and trilateration are obtained and applied to derive performance bounds for Isomap, Landmark Isomap, and Maximum Variance Unfolding; a new perturbation bound for Procrustes analysis plays a key role.
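
As a reminder of the building block these bounds revolve around, here is a tiny sketch of the classical orthogonal Procrustes solution via the SVD; it is not taken from the paper, and the function name and NumPy dependency are illustrative.

```python
import numpy as np

def orthogonal_procrustes_align(A, B):
    # Returns the orthogonal Q minimizing ||A @ Q - B||_F for two
    # already-centered (n, d) point configurations A and B.
    U, _, Vt = np.linalg.svd(A.T @ B)
    return U @ Vt  # optimal rotation/reflection
```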

SAR Target Recognition via Local Sparse Representation of Multi-Manifold Regularized Low-Rank Approximation

TLDR
A low-dimensional representation model that incorporates a multi-manifold regularization term into the low-rank matrix factorization framework, together with local sparse representation for classification, is proposed to improve the discriminative ability of target recognition under EOCs.

Discrete Hessian Eigenmaps method for dimensionality reduction

References

SHOWING 1-10 OF 20 REFERENCES

Spectral analysis of alignment in manifold learning

  • H. Zha, Zhenyue Zhang
  • Computer Science
    Proceedings. (ICASSP '05). IEEE International Conference on Acoustics, Speech, and Signal Processing, 2005.
  • 2005
TLDR
This paper presents an analysis of the eigenstructure of the alignment matrix, giving both necessary and sufficient conditions under which the null space of the alignment matrix recovers the global coordinate system.

Principal Manifolds and Nonlinear Dimensionality Reduction via Tangent Space Alignment

We present a new algorithm for manifold learning and nonlinear dimensionality reduction. Based on a set of unorganized data points sampled with noise from a parameterized manifold, the local…
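
A compact, illustrative reimplementation of an LTSA-style pipeline (brute-force neighbors, local SVD, alignment matrix, null-space eigenvectors) is sketched below in Python/NumPy; it assumes a neighborhood size k > d and dense matrices, and it is not the authors' reference code.

```python
import numpy as np
from scipy.linalg import eigh, svd

def ltsa(X, d, k):
    # X: (n, D) data matrix (one sample per row); d: target dimension; k > d.
    n = X.shape[0]
    # Brute-force k-nearest neighbors; each neighborhood includes the point itself.
    dist2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    nbrs = np.argsort(dist2, axis=1)[:, :k]

    Phi = np.zeros((n, n))                       # alignment matrix
    for i in range(n):
        idx = nbrs[i]
        Xi = X[idx] - X[idx].mean(axis=0)        # centered neighborhood, (k, D)
        U, _, _ = svd(Xi, full_matrices=False)   # left singular vectors = local PCA basis
        G = np.hstack([np.ones((k, 1)) / np.sqrt(k), U[:, :d]])
        W = np.eye(k) - G @ G.T                  # local misalignment term
        Phi[np.ix_(idx, idx)] += W

    # Null-space (smallest) eigenvectors of Phi; discard the constant vector.
    _, vecs = eigh(Phi, subset_by_index=[0, d])
    return vecs[:, 1:d + 1]                      # (n, d) global coordinates
```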

From Subspace to Submanifold Methods

TLDR
Some of the obstacles to "industrial-strength" manifold modeling are discussed, and methods are introduced that offer significant improvements in robustness and accuracy while eschewing the common assumptions about manifold geometry.

Charting a Manifold

  • M. Brand
  • Mathematics, Computer Science
    NIPS
  • 2002
TLDR
A nonlinear mapping from a high-dimensional sample space to a low-dimensional vector space is constructed, effectively recovering a Cartesian coordinate system for the manifold from which the data is sampled, and is pseudo-invertible.

Conical dimension as an intrinsic dimension estimator and its applications

TLDR
This paper introduces a novel local intrinsic dimension estimator, conical dimension, for estimating the intrinsic dimension of a data set consisting of points lying in the proximity of a manifold, and develops algorithms for computing this dimension paying special attention to the numerical robustness issues.

A global geometric framework for nonlinear dimensionality reduction.

TLDR
An approach to solving dimensionality reduction problems that uses easily measured local metric information to learn the underlying global geometry of a data set and efficiently computes a globally optimal solution, and is guaranteed to converge asymptotically to the true structure.
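
For comparison with the local methods above, a rough Isomap-style sketch (k-NN graph, shortest-path geodesic distances, classical MDS) might look like the following; the brute-force distance computation, parameter names, and the assumption of a connected neighborhood graph are simplifications of this sketch, not details from the paper.

```python
import numpy as np
from scipy.sparse.csgraph import shortest_path
from scipy.linalg import eigh

def isomap(X, d, k):
    # X: (n, D) data; d: embedding dimension; k: neighborhood size.
    n = X.shape[0]
    D = np.sqrt(((X[:, None, :] - X[None, :, :]) ** 2).sum(-1))
    # Keep only each point's k nearest neighbors; np.inf marks missing edges.
    G = np.full((n, n), np.inf)
    nbrs = np.argsort(D, axis=1)[:, 1:k + 1]
    for i in range(n):
        G[i, nbrs[i]] = D[i, nbrs[i]]
    # Geodesic distance estimates (assumes the k-NN graph is connected).
    geo = shortest_path(G, method="D", directed=False)

    # Classical MDS on the squared geodesic distances.
    J = np.eye(n) - np.ones((n, n)) / n
    B = -0.5 * J @ (geo ** 2) @ J
    vals, vecs = eigh(B, subset_by_index=[n - d, n - 1])  # d largest eigenpairs
    return vecs[:, ::-1] * np.sqrt(np.maximum(vals[::-1], 0.0))
```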

Semi-supervised nonlinear dimensionality reduction

TLDR
The sensitivity analysis of the algorithms shows that prior information will improve stability of the solution, and some insight is given on what kind of prior information best improves the solution.

Laplacian Eigenmaps and Spectral Techniques for Embedding and Clustering

TLDR
The algorithm provides a computationally efficient approach to nonlinear dimensionality reduction that has locality preserving properties and a natural connection to clustering.
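
A compact sketch of the Laplacian Eigenmaps recipe (heat-kernel-weighted k-NN graph, generalized eigenproblem on the graph Laplacian) is given below; the kernel width t, the dense-matrix implementation, and the function name are illustrative choices, not the authors' code.

```python
import numpy as np
from scipy.linalg import eigh

def laplacian_eigenmaps(X, d, k, t=1.0):
    # X: (n, D) data; d: embedding dimension; k: neighbors; t: heat-kernel width.
    n = X.shape[0]
    dist2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    nbrs = np.argsort(dist2, axis=1)[:, 1:k + 1]

    W = np.zeros((n, n))
    for i in range(n):
        W[i, nbrs[i]] = np.exp(-dist2[i, nbrs[i]] / t)   # heat-kernel weights
    W = np.maximum(W, W.T)                               # symmetrize the graph

    Deg = np.diag(W.sum(axis=1))
    L = Deg - W                                          # unnormalized graph Laplacian
    # Generalized problem L y = lambda * Deg y; skip the trivial constant eigenvector.
    _, vecs = eigh(L, Deg, subset_by_index=[0, d])
    return vecs[:, 1:d + 1]
```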

A Domain Decomposition Method for Fast Manifold Learning

TLDR
A fast manifold learning algorithm based on the methodology of domain decomposition is proposed that glues the embeddings computed on two subdomains into an embedding of the whole domain.

Nonlinear dimensionality reduction by locally linear embedding.

TLDR
Locally linear embedding (LLE) is introduced, an unsupervised learning algorithm that computes low-dimensional, neighborhood-preserving embeddings of high-dimensional inputs that learns the global structure of nonlinear manifolds.
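
A bare-bones LLE sketch (regularized local reconstruction weights, then the smallest eigenvectors of (I - W)^T (I - W)) could look like this; the regularization constant and the brute-force neighbor search are illustrative simplifications, not details from the paper.

```python
import numpy as np
from scipy.linalg import solve, eigh

def lle(X, d, k, reg=1e-3):
    # X: (n, D) data; d: embedding dimension; k: neighbors; reg: Tikhonov factor.
    n = X.shape[0]
    dist2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    nbrs = np.argsort(dist2, axis=1)[:, 1:k + 1]

    W = np.zeros((n, n))
    for i in range(n):
        Z = X[nbrs[i]] - X[i]                    # neighbors shifted to the origin
        C = Z @ Z.T                              # local Gram matrix, (k, k)
        C += reg * np.trace(C) * np.eye(k)       # regularize (C may be singular if k > D)
        w = solve(C, np.ones(k))
        W[i, nbrs[i]] = w / w.sum()              # reconstruction weights sum to one

    M = (np.eye(n) - W).T @ (np.eye(n) - W)
    _, vecs = eigh(M, subset_by_index=[0, d])    # smallest eigenpairs; drop constant vector
    return vecs[:, 1:d + 1]
```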