Corpus ID: 88515787

Manifold Learning Using Kernel Density Estimation and Local Principal Components Analysis

@article{Mohammed2017ManifoldLU,
  title={Manifold Learning Using Kernel Density Estimation and Local Principal Components Analysis},
  author={Kitty Mohammed and Hariharan Narayanan},
  journal={arXiv: Statistics Theory},
  year={2017}
}
We consider the problem of recovering a $d$-dimensional manifold $\mathcal{M} \subset \mathbb{R}^n$ when provided with noiseless samples from $\mathcal{M}$. There are many algorithms (e.g., Isomap) that are used in practice to fit manifolds and thus reduce the dimensionality of a given data set. Ideally, the estimate $\mathcal{M}_\mathrm{put}$ of $\mathcal{M}$ should be an actual manifold of a certain smoothness; furthermore, $\mathcal{M}_\mathrm{put}$ should be arbitrarily close to $\mathcal{M}$ …
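As a rough illustration of the two ingredients named in the title, the sketch below (hypothetical, not the authors' algorithm; the bandwidth and neighborhood size are illustrative assumptions) estimates a kernel density on the samples and then runs a local PCA around a high-density point to approximate a tangent space:

import numpy as np
from sklearn.neighbors import KernelDensity, NearestNeighbors

# Noiseless samples from a 1-dimensional manifold (a circle) in R^3.
rng = np.random.default_rng(0)
t = rng.uniform(0.0, 2.0 * np.pi, 2000)
X = np.column_stack([np.cos(t), np.sin(t), np.zeros_like(t)])
d = 1  # intrinsic dimension of the manifold

# Step 1: kernel density estimation; high-density points lie near the manifold.
kde = KernelDensity(bandwidth=0.2).fit(X)
p = X[np.argmax(kde.score_samples(X))]  # a point of high estimated density

# Step 2: local PCA around p; the top-d right singular vectors of the
# centered neighborhood approximate the tangent space at p.
_, idx = NearestNeighbors(n_neighbors=50).fit(X).kneighbors(p[None, :])
local = X[idx[0]] - X[idx[0]].mean(axis=0)
_, _, Vt = np.linalg.svd(local, full_matrices=False)
tangent_basis = Vt[:d]  # orthonormal rows spanning the estimated tangent
print(tangent_basis)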


Non-Parametric Estimation of Manifolds from Noisy Data
TLDR: This work considers the problem of estimating a $d$-dimensional submanifold of $\mathbb{R}^D$ from a finite set of noisy samples, and presents an algorithm that takes a point $r$ from the tubular neighborhood and outputs $\hat{p}_n \in \mathbb{R}^D$ and $\hat{T}_{\hat{p}_n}\mathcal{M}$, an element of the Grassmannian $\mathrm{Gr}(d, D)$.
Reconstruction and Interpolation of Manifolds. I: The Geometric Whitney Problem
TLDR: Algorithmic procedures are developed that solve the geometric Whitney problem for a metric space and the manifold reconstruction problem in Euclidean space, and the computational complexity of these procedures is estimated.
Manifold Hypothesis in Data Analysis: Double Geometrically-Probabilistic Approach to Manifold Dimension Estimation
TLDR: This work presents a new approach to checking the manifold hypothesis and estimating the dimension of the underlying manifold, using two very different methods simultaneously, one geometric and one probabilistic, and checking whether they give the same result.
Manifold Fitting under Unbounded Noise
TLDR: This paper introduces a new manifold-fitting method in which the output manifold is constructed by directly estimating tangent spaces at points projected onto the underlying manifold, rather than at the sample points, thereby decreasing the error caused by noise.
Manifold Fitting in Ambient Space
TLDR: This work extends the idea of subsampling to noisy datasets in high-dimensional space and utilizes the Moving Least Squares (MLS) approach, together with the Laplace–Beltrami operator, to approximate the underlying manifold.
A regression approach for explaining manifold embedding coordinates
TLDR: A method is proposed to explain embedding coordinates on a manifold as compositions of functions from a user-defined dictionary; the problem can be set up as a sparse Group Lasso recovery problem, sufficient recovery conditions are found, and effectiveness is demonstrated on data.
A generic unfolding algorithm for manifolds estimated by local linear approximations
TLDR: This work presents a new generic algorithm for unfolding manifolds that have been estimated by local linear approximations; it combines ideas from principal curves and density ridge estimation with tools from classical differential geometry.
Fitting a Putative Manifold to Noisy Data
In the present work, we give a solution to the following question from manifold learning. Suppose data belonging to a high-dimensional Euclidean space is drawn independently, identically distributed …
Manifold Coordinates with Physical Meaning
TLDR: This paper proposes a method to explain embedding coordinates of a manifold as non-linear compositions of functions from a user-defined dictionary; the problem can be set up as a sparse linear Group Lasso recovery problem, sufficient recovery conditions are found, and effectiveness is demonstrated on data.
Turing approximations, toric isometric embeddings & manifold convolutions
TLDR: A theoretical framework for combining extrinsic and intrinsic approaches to manifold convolution through isometric embeddings into tori is presented, and a convolution operator for a manifold of arbitrary topology and dimension is defined.

References

Showing 1–10 of 27 references
Testing the Manifold Hypothesis
The hypothesis that high-dimensional data tend to lie in the vicinity of a low-dimensional manifold is the basis of manifold learning. The goal of this paper is to develop an algorithm (with …
Manifold Learning Theory and Applications
TLDR: Comprehensive in its coverage, this pioneering work explores manifold learning from algorithm creation to successful implementation, offering examples of applications in medicine, biometrics, multimedia, and computer vision.
Locally Defined Principal Curves and Surfaces
TLDR: A novel theoretical understanding of principal curves and surfaces, practical algorithms as general-purpose machine learning tools, and applications of these algorithms to several practical problems are presented.
Minimax Manifold Estimation
TLDR: The minimax rate depends only on the dimension of the manifold, not on the dimension of the ambient space in which $\mathcal{M}$ is embedded; the optimal rate of convergence is $n^{-2/(2+d)}$.
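Rendered schematically as a display equation (the precise model class and the Hausdorff loss $d_H$ are as in the paper; the formulation below is only a gloss of the stated rate):

\[ \inf_{\hat{\mathcal{M}}} \sup_{\mathcal{M}} \mathbb{E}\, d_H\big(\hat{\mathcal{M}}, \mathcal{M}\big) \asymp n^{-2/(2+d)} \]

For example, a surface ($d = 2$) is estimable at rate $n^{-1/2}$, regardless of the ambient dimension.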
A global geometric framework for nonlinear dimensionality reduction.
TLDR: An approach to dimensionality reduction is presented that uses easily measured local metric information to learn the underlying global geometry of a data set; it efficiently computes a globally optimal solution and is guaranteed to converge asymptotically to the true structure.
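Since the abstract above names Isomap as a practical baseline, a minimal usage sketch with scikit-learn (the hyperparameters are illustrative assumptions) is:

from sklearn.datasets import make_swiss_roll
from sklearn.manifold import Isomap

# A 2-dimensional manifold (swiss roll) sampled in R^3.
X, _ = make_swiss_roll(n_samples=1500, random_state=0)

# Isomap: geodesic distances on a k-NN graph followed by classical MDS.
Y = Isomap(n_neighbors=10, n_components=2).fit_transform(X)
print(Y.shape)  # (1500, 2)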
Laplacian Eigenmaps for Dimensionality Reduction and Data Representation
TLDR: This work proposes a geometrically motivated algorithm for representing high-dimensional data that provides a computationally efficient approach to nonlinear dimensionality reduction, has locality-preserving properties, and has a natural connection to clustering.
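A corresponding sketch for Laplacian eigenmaps, via scikit-learn's SpectralEmbedding (parameters are again assumptions):

from sklearn.datasets import make_swiss_roll
from sklearn.manifold import SpectralEmbedding

X, _ = make_swiss_roll(n_samples=1500, random_state=0)

# Laplacian eigenmaps: embed using eigenvectors of the graph Laplacian
# of a nearest-neighbor affinity graph.
Y = SpectralEmbedding(n_components=2, n_neighbors=10).fit_transform(X)
print(Y.shape)  # (1500, 2)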
Nonparametric Ridge Estimation
TLDR: Ridge estimation is an extension of mode finding and is useful for understanding the structure of a density; it can be used to find hidden structure in point cloud data.
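Since ridge estimation is described as an extension of mode finding, a minimal mode-finding sketch via mean shift (scikit-learn; the bandwidth and data are assumptions, and ridge estimation itself further constrains the shift to a local subspace) may help fix ideas:

import numpy as np
from sklearn.cluster import MeanShift

# A point cloud with two modes; mean shift ascends the kernel density
# estimate to locate them (ridges generalize modes to dimension d > 0).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 0.3, (200, 2)),
               rng.normal(3.0, 0.3, (200, 2))])
ms = MeanShift(bandwidth=1.0).fit(X)
print(ms.cluster_centers_)  # approximate density modes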
Nonlinear dimensionality reduction by locally linear embedding.
TLDR: Locally linear embedding (LLE) is introduced, an unsupervised learning algorithm that computes low-dimensional, neighborhood-preserving embeddings of high-dimensional inputs and thereby learns the global structure of nonlinear manifolds.
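And for LLE itself (scikit-learn; hyperparameters are illustrative):

from sklearn.datasets import make_swiss_roll
from sklearn.manifold import LocallyLinearEmbedding

X, _ = make_swiss_roll(n_samples=1500, random_state=0)

# LLE: reconstruct each point from its neighbors with fixed weights,
# then find low-dimensional coordinates preserving those weights.
Y = LocallyLinearEmbedding(n_neighbors=12, n_components=2).fit_transform(X)
print(Y.shape)  # (1500, 2)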
Concentration Inequalities - A Nonasymptotic Theory of Independence
TLDR: Deep connections with isoperimetric problems are revealed, whilst special attention is paid to applications to the supremum of empirical processes.
Manifold Estimation and Singular Deconvolution Under Hausdorff Loss
We find lower and upper bounds for the risk of estimating a manifold in Hausdorff distance under several models. We also show that there are close connections between manifold estimation and theExpand