Structural Laplacian Eigenmaps for Modeling Sets of Multivariate Sequences

@article{Lewandowski2014StructuralLE,
  title={Structural Laplacian Eigenmaps for Modeling Sets of Multivariate Sequences},
  author={Michal Lewandowski and D. Makris and S. Velastin and Jean-Christophe Nebel},
  journal={IEEE Transactions on Cybernetics},
  year={2014},
  volume={44},
  pages={936--949}
}
A novel embedding-based dimensionality reduction approach, called structural Laplacian Eigenmaps, is proposed to learn models representing any concept that can be defined by a set of multivariate sequences. This approach relies on expressing the intrinsic structure of the multivariate sequences in the form of structural constraints, which are imposed on the dimensionality reduction process to generate a compact, data-driven manifold in a low-dimensional space. This manifold is a…
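The constraint mechanism can be illustrated with a short sketch: standard Laplacian Eigenmaps builds a k-nearest-neighbour graph and embeds the data via the graph Laplacian's smallest generalized eigenvectors, and here extra high-weight edges stand in for the structural constraints. This is a hedged reconstruction, not the authors' exact formulation; the binary kNN weights, the struct_weight parameter, and the toy two-sequence example are assumptions made purely for illustration.

import numpy as np
from scipy.linalg import eigh
from scipy.sparse.csgraph import laplacian

def structural_laplacian_eigenmaps(X, struct_pairs, n_neighbors=5,
                                   struct_weight=1.0, n_components=2):
    # X: (n_points, n_dims); struct_pairs: (i, j) index pairs that the
    # structural constraints should keep close in the embedding.
    n = X.shape[0]
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.zeros((n, n))
    for i in range(n):                        # symmetric kNN graph
        for j in np.argsort(d2[i])[1:n_neighbors + 1]:
            W[i, j] = W[j, i] = 1.0
    for i, j in struct_pairs:                 # structural constraint edges
        W[i, j] = W[j, i] = struct_weight
    L, D = laplacian(W, normed=False, return_diag=True)
    # Solve L y = lambda D y; skip the trivial constant eigenvector.
    _, vecs = eigh(L, np.diag(D))
    return vecs[:, 1:n_components + 1]

# Toy usage: two noisy copies of one trajectory, with corresponding
# time steps constrained together so they share a single manifold.
t = np.linspace(0.0, 1.0, 50)
X = np.vstack([np.c_[np.sin(4 * t), t], np.c_[np.sin(4 * t) + 0.1, t]])
Y = structural_laplacian_eigenmaps(X, [(k, k + 50) for k in range(50)])

In the paper the constraints are derived from the intrinsic structure of the sequence set itself; the hand-made pairing above merely mimics a cross-sequence correspondence constraint.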
Learning representations from multiple manifolds
TLDR
A framework to learn a joint embedding space from multiple manifold data, which preserves the intra-manifolds' local geometric structure and the inter-manifolds' correspondence structure, and which extends current state-of-the-art spectral embedding approaches to handle multiple manifolds.
Elastic Functional Coding of Riemannian Trajectories
TLDR
The TSRVF representation, and accompanying statistical summaries of Riemannian trajectories, are utilized to extend existing coding methods such as PCA, KSVD and Label Consistent KSVD to Riemannian trajectories or, more generally, to Riemannian functions, showing that such coding efficiently captures trajectories in applications such as action recognition, stroke rehabilitation, visual speech recognition, clustering and diverse sequence sampling.
Unsupervised feature selection by combining subspace learning with feature self-representation
TLDR
A novel unsupervised feature selection algorithm that uses the properties of the data to construct a self-representation coefficient matrix, utilizes sparse representation to find the sparse structure of that coefficient matrix, and embeds a hypergraph Laplacian regularization term to represent multiple relations.
Low-rank unsupervised graph feature selection via feature self-representation
TLDR
This paper proposes a new feature-level self-representation loss function plus a sparsity regularizer (an ℓ2,1-norm regularizer) to select representative features, and imposes a low-rank constraint on the coefficient matrix to avoid the impact of noise and outliers.
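The shared core of the two feature-selection entries above (minimizing ||X − XW||_F² plus an ℓ2,1 penalty on W, then ranking features by the row norms of W) can be sketched with a standard iteratively reweighted solver. The solver choice, the lam value, and the iteration count are illustrative assumptions; the cited papers add hypergraph, graph, or low-rank terms on top of this core objective.

import numpy as np

def self_representation_scores(X, lam=1.0, n_iter=30, eps=1e-8):
    # X: (n_samples, n_features). Minimizes ||X - X W||_F^2 + lam*||W||_{2,1}
    # by iteratively reweighted least squares and returns a per-feature
    # score (the l2 norm of each row of W); high-scoring features are kept.
    d = X.shape[1]
    G = X.T @ X                                   # (d, d) Gram matrix
    W = np.eye(d)
    for _ in range(n_iter):
        row_norms = np.sqrt((W ** 2).sum(axis=1)) + eps
        Q = np.diag(1.0 / (2.0 * row_norms))      # l2,1 reweighting
        W = np.linalg.solve(G + lam * Q, G)       # solves (G + lam*Q) W = G
    return np.sqrt((W ** 2).sum(axis=1))

# Keep the top-k features by score, e.g.:
# selected = np.argsort(-self_representation_scores(X))[:k]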
Structured Manifold Broad Learning System: A Manifold Perspective for Large-Scale Chaotic Time Series Analysis and Prediction
TLDR
A unified framework for nonuniform embedding, dynamical system revealing, and time series prediction, termed the Structured Manifold Broad Learning System (SM-BLS), which provides a homogeneous way to recover the chaotic attractor from multivariate and heterogeneous time series.
Graph self-representation method for unsupervised feature selection
TLDR
A new unsupervised feature selection approach that integrates a subspace learning method (i.e., Locality Preserving Projection) into a new feature selection method and adds a graph regularization term to the resulting model, so that feature selection and subspace learning are conducted simultaneously.
Unsupervised Hypergraph Feature Selection with Low-Rank and Self-Representation Constraints
TLDR
The feature-level self-representation property, a low-rank constraint, a hypergraph regularizer, and a sparsity-inducing regularizer are integrated in a unified framework to conduct unsupervised feature selection and achieve a stable feature selection model.
The Chaotic Attractor Analysis of DJIA Based on Manifold Embedding and Laplacian Eigenmaps
By using the techniques of manifold embedding and Laplacian Eigenmaps, a novel strategy is proposed in this paper to detect chaos in the Dow Jones Industrial Average. Firstly, the chaotic…
1D representation of Laplacian eigenmaps and dual k-nearest neighbours for unified video coding
TLDR
Simulation experiments show that the proposed framework of video coding based on Laplacian eigenmaps can attain better BPP and PSNR performance than state-of-the-art methods such as High Efficiency Video Coding.
A clustered locally linear approach on face manifolds for pose estimation
TLDR
This paper proposes an unsupervised pose estimation method for face images based on clustered locally linear manifolds using discriminant analysis; within each cluster, the local neighbourhood is approximately linear and provides better between-class separation, so the classification problem becomes simpler.

References

Showing 1–10 of 68 references
Temporal Extension of Laplacian Eigenmaps for Unsupervised Dimensionality Reduction of Time Series
TLDR
A novel non-linear dimensionality reduction method, called Temporal Laplacian Eigenmaps, is introduced to efficiently process time series data; its lower computational cost and generalisation abilities suggest it is scalable to larger datasets.
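The temporal idea can be illustrated by how the neighbourhood graph changes: besides the usual spatial kNN edges, consecutive frames are linked so that temporal order shapes the embedding. The chain-edge construction and the temporal_weight parameter below are assumptions made for illustration, not the paper's exact constraint definition; the resulting W plugs into the same generalized eigenproblem as in the sketch near the top of this page.

import numpy as np

def temporal_adjacency(X, n_neighbors=5, temporal_weight=1.0):
    # X: (n_frames, n_dims), one multivariate time series. Returns a
    # weight matrix combining spatial kNN edges with temporal chain edges.
    n = X.shape[0]
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.zeros((n, n))
    for i in range(n):
        for j in np.argsort(d2[i])[1:n_neighbors + 1]:
            W[i, j] = W[j, i] = 1.0
    for i in range(n - 1):                    # link frame i to frame i+1
        W[i, i + 1] = W[i + 1, i] = max(W[i, i + 1], temporal_weight)
    return W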
Patch Alignment for Dimensionality Reduction
TLDR
A new dimensionality reduction algorithm is developed, termed discriminative locality alignment (DLA), by imposing discriminative information in the part optimization stage; thorough empirical studies demonstrate the effectiveness of DLA compared with representative dimensionality reduction algorithms.
Using Laplacian eigenmaps latent variable model and manifold learning to improve speech recognition accuracy
TLDR
This paper demonstrates the application of the Laplacian eigenmaps latent variable model (LELVM) to the task of speech recognition, and the results imply the superiority of the proposed method over standard PCA methods.
A global geometric framework for nonlinear dimensionality reduction.
TLDR
An approach to solving dimensionality reduction problems that uses easily measured local metric information to learn the underlying global geometry of a data set; it efficiently computes a globally optimal solution and is guaranteed to converge asymptotically to the true structure.
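This reference is the Isomap algorithm; a minimal demonstration using scikit-learn's implementation follows. The swiss-roll data and parameter values are stock examples, not anything taken from the reviewed paper.

from sklearn.datasets import make_swiss_roll
from sklearn.manifold import Isomap

X, _ = make_swiss_roll(n_samples=1000, random_state=0)
# Geodesic distances are approximated over a 10-NN graph, then embedded
# into two dimensions by classical MDS on those distances.
Y = Isomap(n_neighbors=10, n_components=2).fit_transform(X)
print(Y.shape)    # (1000, 2)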
Topologically-constrained latent variable models
TLDR
A range of approaches for embedding data in a non-Euclidean latent space for the Gaussian process latent variable model, which allows learning transitions between motion styles even though such transitions are not present in the data.
Nonlinear dimensionality reduction by locally linear embedding.
TLDR
Locally linear embedding (LLE) is introduced: an unsupervised learning algorithm that computes low-dimensional, neighborhood-preserving embeddings of high-dimensional inputs and thereby learns the global structure of nonlinear manifolds.
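A companion sketch for LLE, again via scikit-learn; the data and parameters are the same arbitrary demo values as in the Isomap example above.

from sklearn.datasets import make_swiss_roll
from sklearn.manifold import LocallyLinearEmbedding

X, _ = make_swiss_roll(n_samples=1000, random_state=0)
# Each point is reconstructed from its 10 nearest neighbours; the
# embedding preserves those local reconstruction weights.
Y = LocallyLinearEmbedding(n_neighbors=10, n_components=2).fit_transform(X)
print(Y.shape)    # (1000, 2)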
Separating style and content on a nonlinear manifold
Bilinear and multi-linear models have been successful in decomposing static image ensembles into perceptually orthogonal sources of variation, e.g., separation of style and content. If we consider…
Learning a Joint Manifold Representation from Multiple Data Sets
TLDR
A framework to learn an embedding of all the points on all the manifolds in a way that preserves the local structure on each manifold and collapses all the different manifolds into one manifold in the embedding space, while preserving the implicit correspondences between the points across different data sets.
Local distance preservation in the GP-LVM through back constraints
TLDR
This paper provides an overview of dimensionality reduction techniques, placing the emphasis on the kind of distance relation preserved, and shows how the GP-LVM can be generalized, through back constraints, to additionally preserve local distances.
Principal Manifolds and Nonlinear Dimensionality Reduction via Tangent Space Alignment
We present a new algorithm for manifold learning and nonlinear dimensionality reduction. Based on a set of unorganized data points sampled with noise from a parameterized manifold, the local…