Flexible Manifold Embedding: A Framework for Semi-Supervised and Unsupervised Dimension Reduction

@article{Nie2010FlexibleME,
  title={Flexible Manifold Embedding: A Framework for Semi-Supervised and Unsupervised Dimension Reduction},
  author={Feiping Nie and Dong Xu and Ivor Wai-Hung Tsang and Changshui Zhang},
  journal={IEEE Transactions on Image Processing},
  year={2010},
  volume={19},
  pages={1921--1932}
}
We propose a unified manifold learning framework for semi-supervised and unsupervised dimension reduction by employing a simple but effective linear regression function to map the new data points. For semi-supervised dimension reduction, we aim to find the optimal prediction labels F for all the training samples X, the linear regression function h(X) and the regression residue F0 = F - h(X) simultaneously. Our new objective function integrates two terms related to label fitness and manifold… 
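The abstract above describes an objective that jointly trades off label fitness, manifold smoothness over a graph Laplacian, and a flexible regression residue F0 = F - h(X) with a linear h(X). As a rough illustration only, the sketch below alternately solves the two convex subproblems (a ridge-regression step for h and a label-propagation step for F); the function name fme_style_fit, the label-weighting matrix U, and the hyperparameters mu, gamma, u_l are assumptions made for this sketch, not the authors' exact closed-form derivation.

```python
# Hedged sketch of a Flexible-Manifold-Embedding-style objective, solved by
# alternating minimization; illustrative, not the paper's exact algorithm.
import numpy as np

def fme_style_fit(X, Y, labeled, L, mu=1.0, gamma=1.0, n_iter=30, u_l=1e6):
    """X: (n, d) features, Y: (n, c) one-hot labels (zero rows for unlabeled),
    labeled: boolean mask of length n, L: (n, n) graph Laplacian."""
    n, d = X.shape
    U = np.diag(np.where(labeled, u_l, 0.0))        # label-fitness weights
    F = Y.astype(float)                             # soft labels to estimate
    for _ in range(n_iter):
        # --- regression step: fit h(x) = W^T x + b to the current F (ridge) ---
        x_mean, f_mean = X.mean(axis=0), F.mean(axis=0)
        Xc, Fc = X - x_mean, F - f_mean
        W = np.linalg.solve(Xc.T @ Xc + (1.0 / gamma) * np.eye(d), Xc.T @ Fc)
        b = f_mean - x_mean @ W
        H = X @ W + b                               # h(X), the linear prediction
        # --- label step: balance label fitness, smoothness, and residue ---
        A = U + L + mu * gamma * np.eye(n)
        F = np.linalg.solve(A, U @ Y + mu * gamma * H)
    return F, W, b
```

Unseen samples can then be mapped through the learned linear function h(x) = W^T x + b, which is what gives this family of methods its natural out-of-sample extension.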
Accelerating Flexible Manifold Embedding for Scalable Semi-Supervised Learning
TLDR
This paper addresses large-scale graph-based semi-supervised learning for multi-class classification by jointly combining a regression residue term and a manifold smoothness term, which naturally provides a prediction model for handling unseen samples.
Kernel flexible manifold embedding for pattern classification
TLDR
The proposed Kernel version of the Flexible Manifold Embedding can outperform FME as well as many state-of-the-art semi-supervised learning methods.
Adaptive Loss Minimization for Semi-Supervised Elastic Embedding
TLDR
A novel adaptive loss minimization method is studied that combines the advantages of the L1 and L2 norms: it is robust to data outliers under a Laplacian distribution and efficiently models normal data under a Gaussian distribution.
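The mixed L1/L2 behavior mentioned above (quadratic for well-fit points, linear for outliers) is the same qualitative behavior as a Huber-style penalty; the sketch below is a generic illustration of that idea only, not the paper's adaptive loss, and the parameter name delta is an assumption.

```python
import numpy as np

def huber_like_loss(residual, delta=1.0):
    """L2-like (quadratic) for |r| <= delta, L1-like (linear) beyond it.
    delta sets the assumed crossover point between the two regimes."""
    r = np.abs(residual)
    return np.where(r <= delta, 0.5 * r ** 2, delta * (r - 0.5 * delta))
```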
Semi-Supervised Dimension Reduction Using Trace Ratio Criterion
  • Yi Huang, Dong Xu, F. Nie
  • Mathematics, Computer Science
  • IEEE Transactions on Neural Networks and Learning Systems
  • 2012
TLDR
A flexible regularizer that models the regression residual is introduced into a trace-ratio (TR) reformulation of the objective function of semi-supervised discriminant analysis, and an iterative algorithm is developed to simultaneously solve for the low-dimensional data representation F and the projection matrix W.
Sparse feature space representation: A unified framework for semi-supervised and domain adaptation learning
TLDR
This paper proposes a novel multi-source adaptation learning framework based on Sparse Feature Space Representation (SFSR), called SFSR-MSAL, which is general and reduces to semi-supervised learning simply by tuning the regularization parameter.
Learning from normalized local and global discriminative information for semi-supervised regression and dimensionality reduction
TLDR
This paper shows that SDA and LapRLS can be unified into a constrained manifold-regularized least-squares framework and introduces a new and effective semi-supervised dimensionality reduction method, called Learning from Local and Global Discriminative Information (LLGDI).
Semi-Supervised Classifications via Elastic and Robust Embedding
TLDR
An efficient optimization algorithm is proposed to solve a more general problem, from which the optimal solution to the derived problem is obtained, and a non-squared loss is used in place of the traditional squared loss to learn a robust model.
Enhanced low-rank representation via sparse manifold adaption for semi-supervised learning
TLDR
This paper proposes an enhanced LRR via sparse manifold adaption, termed manifold low-rank representation (MLRR), to learn a low-rank data representation, and incorporates a regularizer into LRR so that the learned coefficients preserve the geometric constraints revealed in the data space.
Learning Flexible Graph-Based Semi-Supervised Embedding
TLDR
A graph-based semi-supervised embedding method, together with its kernelized version, is proposed for generic classification and recognition tasks; it has the notable advantage that the learnt subspace admits a direct out-of-sample extension to novel samples and is thus easily generalized to the entire high-dimensional input space.
Image classification using kernel flexible manifold embedding
In this paper we propose a kernelized version of the Flexible Manifold Embedding (FME) framework. The latter has been recently proposed as a semi-supervised graph-based label propagation method that…

References

SHOWING 1-10 OF 48 REFERENCES
Manifold Regularization: A Geometric Framework for Learning from Labeled and Unlabeled Examples
TLDR
A semi-supervised framework that incorporates labeled and unlabeled data in a general-purpose learner is proposed, and properties of reproducing kernel Hilbert spaces are used to prove new Representer theorems that provide a theoretical basis for the algorithms.
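Since the manifold regularization framework (LapRLS) underpins several of the methods listed here, a minimal sketch of its Representer-theorem solution may help; the RBF kernel choice, the precomputed graph Laplacian L, and the hyperparameter names gamma_A, gamma_I are assumptions of this sketch.

```python
# Minimal Laplacian Regularized Least Squares (LapRLS) sketch in the manifold
# regularization framework; kernel and parameter names are assumptions.
import numpy as np

def rbf_kernel(A, B, sigma=1.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def laprls_fit(X, Y, labeled, L, gamma_A=1e-2, gamma_I=1e-2, sigma=1.0):
    """X: (n, d) labeled+unlabeled points, Y: (n, c) one-hot labels
    (zero rows for unlabeled), labeled: boolean mask, L: graph Laplacian."""
    n = X.shape[0]
    l = int(labeled.sum())
    K = rbf_kernel(X, X, sigma)
    J = np.diag(labeled.astype(float))              # selects labeled rows
    # Representer theorem: f(x) = sum_i alpha_i k(x, x_i); solve for alpha.
    A = J @ K + gamma_A * l * np.eye(n) + (gamma_I * l / n ** 2) * (L @ K)
    return np.linalg.solve(A, Y)                    # note: J @ Y == Y here

def laprls_predict(alpha, X_train, X_new, sigma=1.0):
    return rbf_kernel(X_new, X_train, sigma) @ alpha
```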
Semi-Supervised Bilinear Subspace Learning
TLDR
A new semi-supervised subspace learning algorithm is presented by integrating the tensor representation and the complementary information conveyed by unlabeled data, and it is demonstrated that ARSDA/T brings significant improvement in face recognition accuracy over both conventional supervised and semi-supervised subspace learning algorithms.
Large-Scale Sparsified Manifold Regularization
TLDR
This paper integrates manifold regularization with the core vector machine and produces sparse solutions with low time and space complexities by using a sparsified manifold regularizer and formulating as a center-constrained minimum enclosing ball problem.
Discriminant Locally Linear Embedding With High-Order Tensor Data
TLDR
This work proposes a new manifold learning technique called discriminant locally linear embedding (DLLE), in which the local geometric properties within each class are preserved according to the locally linear embedding (LLE) criterion, and the separability between different classes is enforced by maximizing margins between point pairs on different classes.
Beyond the point cloud: from transductive to semi-supervised learning
TLDR
This paper constructs a family of data-dependent norms on Reproducing Kernel Hilbert Spaces (RKHS) that allow the structure of the RKHS to reflect the underlying geometry of the data.
Graph Embedding and Extensions: A General Framework for Dimensionality Reduction
TLDR
A new supervised dimensionality reduction algorithm called marginal Fisher analysis is proposed, in which the intrinsic graph characterizes the intraclass compactness and connects each data point with its neighboring points of the same class, while the penalty graph connects the marginal points and characterizes the interclass separability.
Transductive Component Analysis
TLDR
Under semi-supervised settings, the objective is to learn a subspace that is both smooth and discriminative; linear dimensionality reduction is then achieved by mapping all samples into this subspace with the transductive component analysis algorithm.
Semi-Supervised Learning Based on Semiparametric Regularization
TLDR
This paper addresses the semi-supervised learning problem by developing a semiparametric regularization based approach, which attempts to discover the marginal distribution of the data and to learn the parametric function by exploiting the geometric distribution of the data.
Discriminative Locality Alignment
TLDR
A new algorithm, termed Discriminative Locality Alignment (DLA), is proposed, which operates in the following three stages: first, in part optimization, discriminative information is imposed over patches, each of which is associated with one sample and its neighbors; then, in sample weighting, each part optimization is weighted by the margin degree, a measure of the importance of a given sample.
Semi-supervised orthogonal discriminant analysis via label propagation
TLDR
This paper proposes a novel semi-supervised orthogonal discriminant analysis via label propagation that propagates the label information from the labeled data to the unlabeled data through a specially designed label propagation scheme, so that the distribution of the unlabeled data can be explored more effectively to learn a better subspace.