# Inferring Manifolds From Noisy Data Using Gaussian Processes

```bibtex
@article{Dunson2021InferringMF,
  title   = {Inferring Manifolds From Noisy Data Using Gaussian Processes},
  author  = {David B. Dunson and Nan Wu},
  journal = {ArXiv},
  year    = {2021},
  volume  = {abs/2110.07478}
}
```

In analyzing complex datasets, it is often of interest to infer lower dimensional structure underlying the higher dimensional observations. As a flexible class of nonlinear structures, it is common to focus on Riemannian manifolds. Most existing manifold learning algorithms replace the original data with lower dimensional coordinates without providing an estimate of the manifold in the observation space or using the manifold to denoise the original data. This article proposes a new methodology…

## One Citation

Latent structure blockmodels for Bayesian spectral graph clustering

- Computer Science, Mathematics · ArXiv
- 2021

A class of models called latent structure block models (LSBMs) is proposed, allowing for graph clustering when community-specific one-dimensional manifold structure is present; the models are shown to perform well on simulated and real-world network data.

## References

Showing 1–10 of 29 references

Non-Asymptotic Analysis of Tangent Space Perturbation

- Mathematics, Physics
- 2011

This work studies the stability of the subspace estimated by PCA as a function of scale, bounds (with high probability) the angle it forms with the true tangent space, and reveals an appropriate scale for local tangent-plane recovery.
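The local-PCA scheme analyzed in that work can be sketched in a few lines: collect the points within a ball of the chosen scale, run PCA on them, and take the top d principal directions as the tangent-space estimate. A minimal sketch (the function name and default intrinsic dimension d are illustrative assumptions, not from the paper):

```python
import numpy as np

def local_tangent_basis(X, center, radius, d=1):
    """Estimate the tangent space at `center` via PCA on the points of X
    falling within `radius` (the scale whose effect the paper analyzes)."""
    nbrs = X[np.linalg.norm(X - center, axis=1) <= radius]
    # principal directions of the centered local point cloud
    U, _, _ = np.linalg.svd((nbrs - nbrs.mean(axis=0)).T, full_matrices=False)
    return U[:, :d]  # columns: orthonormal basis of the estimated tangent space
```

On a noisy unit circle, the estimate at (1, 0) should align with the true tangent direction (0, 1); too large a radius lets curvature bias the estimate, too small a radius lets noise dominate, which is exactly the trade-off such perturbation bounds quantify.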

Fitting a Putative Manifold to Noisy Data

- Computer Science · COLT
- 2018

In the present work, we give a solution to the following question from manifold learning. Suppose data belonging to a high dimensional Euclidean space is drawn independently, identically distributed…

A global geometric framework for nonlinear dimensionality reduction.

- Medicine · Science
- 2000

An approach to nonlinear dimensionality reduction that uses easily measured local metric information to learn the underlying global geometry of a data set; it efficiently computes a globally optimal solution and is guaranteed to converge asymptotically to the true structure.
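The idea behind this framework (Isomap) can be illustrated with a minimal sketch, under the usual reading of "local metric information": geodesic distances are approximated by shortest paths through a k-nearest-neighbor graph, and the resulting distance matrix is embedded by classical multidimensional scaling. The function name and parameters below are illustrative, not from the paper:

```python
import numpy as np

def isomap(X, n_neighbors=6, n_components=2):
    """Sketch of the Isomap idea: approximate geodesics by shortest paths
    through a k-NN graph, then embed with classical MDS."""
    n = X.shape[0]
    D = np.sqrt(((X[:, None, :] - X[None, :, :]) ** 2).sum(-1))
    G = np.full((n, n), np.inf)
    knn = np.argsort(D, axis=1)[:, 1:n_neighbors + 1]
    for i in range(n):                       # symmetric k-NN graph
        G[i, knn[i]] = D[i, knn[i]]
        G[knn[i], i] = D[i, knn[i]]
    np.fill_diagonal(G, 0.0)
    for k in range(n):                       # Floyd-Warshall shortest paths
        G = np.minimum(G, G[:, [k]] + G[[k], :])
    # classical MDS on the squared geodesic distance matrix
    J = np.eye(n) - np.ones((n, n)) / n
    B = -0.5 * J @ (G ** 2) @ J
    vals, vecs = np.linalg.eigh(B)
    idx = np.argsort(vals)[::-1][:n_components]
    return vecs[:, idx] * np.sqrt(np.maximum(vals[idx], 0))
```

For points on a line in ambient space, the graph shortest paths equal the Euclidean distances, so the one-dimensional embedding recovers the positions exactly (up to sign and centering).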

Regularized Principal Manifolds

- Computer Science · J. Mach. Learn. Res.
- 2001

This work proposes an algorithm for finding principal manifolds that can be regularized in a variety of ways and gives good bounds on the covering numbers, which allows it to obtain a nearly optimal learning rate of order O(m^(-1/2+α)) for certain types of regularization operators.

Nonlinear dimensionality reduction by locally linear embedding.

- Medicine, Computer Science · Science
- 2000

Locally linear embedding (LLE) is introduced, an unsupervised learning algorithm that computes low-dimensional, neighborhood-preserving embeddings of high-dimensional inputs and learns the global structure of nonlinear manifolds.
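The two-step structure of LLE — reconstruct each point as a weighted combination of its neighbors, then find low-dimensional coordinates that preserve those weights — can be sketched directly, assuming the standard formulation (this is an unoptimized illustration, not the authors' implementation):

```python
import numpy as np

def lle(X, n_neighbors=10, n_components=2, reg=1e-3):
    """Minimal locally linear embedding: local reconstruction weights,
    then the bottom nontrivial eigenvectors of (I - W)^T (I - W)."""
    n = X.shape[0]
    D2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    knn = np.argsort(D2, axis=1)[:, 1:n_neighbors + 1]  # exclude self
    W = np.zeros((n, n))
    for i in range(n):
        Z = X[knn[i]] - X[i]                   # neighbors centered on x_i
        C = Z @ Z.T                            # local Gram matrix
        C += reg * np.trace(C) * np.eye(n_neighbors)  # regularize if singular
        w = np.linalg.solve(C, np.ones(n_neighbors))
        W[i, knn[i]] = w / w.sum()             # weights sum to one
    M = (np.eye(n) - W).T @ (np.eye(n) - W)
    vals, vecs = np.linalg.eigh(M)
    return vecs[:, 1:1 + n_components]         # drop the constant eigenvector
```

The regularization term is needed whenever the number of neighbors exceeds the ambient dimension, since the local Gram matrix is then rank-deficient.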

Diffusion Based Gaussian Processes on Restricted Domains

- Computer Science
- 2020

This article proposes a new class of diffusion-based GPs (DB-GPs), which learn a covariance that respects the geometry of the input domain and approximate that covariance using finitely many eigenpairs of the graph Laplacian.
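A truncated heat-kernel construction of this kind can be sketched as follows: build a graph Laplacian from the data, keep its smallest eigenpairs, and form the covariance from exponentially damped eigenvectors. The function below is an assumed illustration of the construction, not the paper's exact DB-GP covariance:

```python
import numpy as np

def diffusion_covariance(X, t=1.0, n_eigs=20, sigma=1.0):
    """Approximate a diffusion-style GP covariance by the heat kernel of a
    normalized graph Laplacian, truncated to its n_eigs smallest eigenpairs
    (t is the diffusion time, sigma the affinity bandwidth)."""
    D2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.exp(-D2 / (2 * sigma ** 2))         # Gaussian affinity matrix
    d = W.sum(1)
    L = np.eye(len(X)) - W / np.sqrt(np.outer(d, d))  # normalized Laplacian
    vals, vecs = np.linalg.eigh(L)
    lam, phi = vals[:n_eigs], vecs[:, :n_eigs]
    return phi @ np.diag(np.exp(-t * lam)) @ phi.T    # truncated heat kernel
```

By construction the result is symmetric and positive semidefinite, so it is a valid GP covariance restricted to the observed points.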

Laplacian Eigenmaps for Dimensionality Reduction and Data Representation

- Computer Science, Mathematics · Neural Computation
- 2003

This work proposes a geometrically motivated algorithm for representing the high-dimensional data that provides a computationally efficient approach to nonlinear dimensionality reduction that has locality-preserving properties and a natural connection to clustering.
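The algorithm reduces to a short recipe: form a Gaussian-kernel affinity graph, take its Laplacian, and use the bottom nontrivial eigenvectors as coordinates. A minimal dense-matrix sketch (parameter names are illustrative):

```python
import numpy as np

def laplacian_eigenmaps(X, n_components=1, sigma=1.0):
    """Embed points via the bottom nontrivial eigenvectors of the
    unnormalized graph Laplacian of a Gaussian affinity graph."""
    D2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.exp(-D2 / (2 * sigma ** 2))   # affinity matrix
    L = np.diag(W.sum(1)) - W            # unnormalized graph Laplacian
    vals, vecs = np.linalg.eigh(L)
    return vecs[:, 1:1 + n_components]   # skip the constant eigenvector
```

For points along a line, the first nontrivial eigenvector (the Fiedler vector) varies monotonically with position, which is the locality-preserving property the abstract refers to.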

Non-Parametric Estimation of Manifolds from Noisy Data

- Mathematics, Computer Science · ArXiv
- 2021

This work considers the problem of estimating a d-dimensional submanifold of R^D from a finite set of noisy samples and presents an algorithm that takes a point r from the tubular neighborhood and outputs p̂_n ∈ R^D and T̂_{p̂_n}M, an element of the Grassmannian Gr(d, D).

Think globally, fit locally under the manifold setup: Asymptotic analysis of locally linear embedding

- Mathematics · The Annals of Statistics
- 2018

Since its introduction in 2000, the locally linear embedding (LLE) has been widely applied in data science. We provide an asymptotic analysis of the LLE under the manifold setup. We show that for…

Vector Diffusion Maps and the Connection Laplacian.

- Medicine, Mathematics · Communications on Pure and Applied Mathematics
- 2012

The relation between VDM and the connection Laplacian operator for vector fields over the manifold is proved, and it is shown that VDM equips the data with a metric, referred to as the vector diffusion distance.