Locality Preserving Projections
TLDR
These are linear projective maps obtained by solving a variational problem that optimally preserves the neighborhood structure of the data set; they are the optimal linear approximations to the eigenfunctions of the Laplace-Beltrami operator on the manifold.
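The variational problem above reduces to a generalized eigenproblem involving the data matrix and a graph Laplacian. A minimal NumPy/SciPy sketch of that reduction, not the authors' implementation; the affinity matrix W is assumed precomputed, and the small ridge term on the right-hand matrix is an added numerical-stability assumption, not part of the paper:

```python
import numpy as np
from scipy.linalg import eigh

def lpp(X, W, n_components=2):
    """Locality Preserving Projections sketch.

    X: (n_samples, n_features) data matrix.
    W: (n_samples, n_samples) symmetric affinity matrix.
    Solves X^T L X a = lam X^T D X a and returns the projection
    directions for the smallest eigenvalues.
    """
    D = np.diag(W.sum(axis=1))   # degree matrix
    L = D - W                    # graph Laplacian
    A = X.T @ L @ X
    B = X.T @ D @ X
    # small ridge for numerical stability (assumption, not in the paper)
    B = B + 1e-8 * np.eye(B.shape[0])
    vals, vecs = eigh(A, B)      # generalized symmetric eigenproblem
    return vecs[:, :n_components]
```

Embedding new points is then just a matrix product, `Y = X_new @ P`, which is the linearity that distinguishes LPP from Laplacian Eigenmaps.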
Laplacian Eigenmaps for Dimensionality Reduction and Data Representation
TLDR
This work proposes a geometrically motivated algorithm for representing high-dimensional data; it provides a computationally efficient approach to nonlinear dimensionality reduction with locality-preserving properties and a natural connection to clustering.
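The algorithm builds a neighborhood graph, forms the graph Laplacian, and embeds the data with the bottom nontrivial generalized eigenvectors. A minimal dense sketch for small data sets (not the authors' code; the neighbor count and heat-kernel bandwidth `t` are illustrative assumptions):

```python
import numpy as np
from scipy.linalg import eigh

def laplacian_eigenmaps(X, n_components=2, n_neighbors=5, t=1.0):
    """Laplacian Eigenmaps sketch (dense; small n only).

    Steps: kNN graph with heat-kernel weights, Laplacian L = D - W,
    generalized eigenproblem L y = lam D y; drop the trivial constant
    eigenvector and keep the next n_components.
    """
    n = X.shape[0]
    # pairwise squared Euclidean distances
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.zeros((n, n))
    for i in range(n):
        idx = np.argsort(d2[i])[1:n_neighbors + 1]  # skip self
        W[i, idx] = np.exp(-d2[i, idx] / t)         # heat-kernel weights
    W = np.maximum(W, W.T)                          # symmetrize
    D = np.diag(W.sum(axis=1))
    L = D - W
    vals, vecs = eigh(L, D)                         # ascending eigenvalues
    return vecs[:, 1:n_components + 1]              # drop trivial eigenvector
```

Unlike LPP, the embedding here is defined only on the sampled points; there is no explicit map for out-of-sample data.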
Manifold Regularization: A Geometric Framework for Learning from Labeled and Unlabeled Examples
TLDR
A semi-supervised framework that incorporates labeled and unlabeled data in a general-purpose learner is proposed, and properties of reproducing kernel Hilbert spaces are used to prove new Representer theorems that provide a theoretical basis for the algorithms.
Face recognition using Laplacianfaces
TLDR
Experimental results suggest that the proposed Laplacianface approach provides a better representation and achieves lower error rates in face recognition.
Laplacian Eigenmaps and Spectral Techniques for Embedding and Clustering
TLDR
The algorithm provides a computationally efficient approach to nonlinear dimensionality reduction that has locality preserving properties and a natural connection to clustering.
Laplacian Score for Feature Selection
TLDR
This paper proposes a "filter" method for feature selection that is independent of any learning algorithm; it is based on the observation that, in many real-world classification problems, data from the same class are often close to each other.
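The score ranks each feature by how well it respects the locality encoded in a graph: for feature r, it is the Rayleigh-type ratio of the Laplacian quadratic form to the degree-weighted variance, with smaller scores preferred. A minimal sketch under the assumption that an affinity matrix W has already been built:

```python
import numpy as np

def laplacian_score(X, W):
    """Laplacian Score per feature: smaller means more locality-preserving.

    X: (n_samples, n_features) data, W: symmetric affinity matrix.
    For each feature f (with its D-weighted mean removed), the score is
    (f^T L f) / (f^T D f), where L = D - W is the graph Laplacian.
    """
    d = W.sum(axis=1)            # degrees
    D = np.diag(d)
    L = D - W
    scores = np.empty(X.shape[1])
    for r in range(X.shape[1]):
        f = X[:, r] - (X[:, r] @ d) / d.sum()   # remove D-weighted mean
        den = f @ D @ f
        scores[r] = (f @ L @ f) / den if den > 0 else np.inf
    return scores
```

A feature that is constant within tightly connected regions of the graph gets a score near zero, while pure noise scores higher, so selecting features with the smallest scores keeps the ones that track the graph structure.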
Finding the Homology of Submanifolds with High Confidence from Random Samples
TLDR
This work considers the case where data are drawn by sampling a probability distribution supported on or near a submanifold of Euclidean space, and shows how to “learn” the homology of the submanifold with high confidence.
Beyond the point cloud: from transductive to semi-supervised learning
TLDR
This paper constructs a family of data-dependent norms on Reproducing Kernel Hilbert Spaces (RKHS) that allow the structure of the RKHS to reflect the underlying geometry of the data.
Semi-Supervised Learning on Riemannian Manifolds
TLDR
An algorithmic framework is proposed for classifying a partially labeled data set in a principled manner; it models the manifold using the adjacency graph of the data and approximates the Laplace-Beltrami operator by the graph Laplacian.
Tensor Subspace Analysis
TLDR
A new algorithm called Tensor Subspace Analysis (TSA) is proposed; it detects the intrinsic local geometric structure of the tensor space by learning a lower-dimensional tensor subspace, achieving a better recognition rate while being much more efficient.
...