• Corpus ID: 238856761

Inferring Manifolds From Noisy Data Using Gaussian Processes

@article{Dunson2021InferringMF,
  title={Inferring Manifolds From Noisy Data Using Gaussian Processes},
  author={David B. Dunson and Nan Wu},
  journal={ArXiv},
  year={2021},
  volume={abs/2110.07478}
}
  • D. Dunson, Nan Wu
  • Published 14 October 2021
  • Computer Science, Mathematics
  • ArXiv
In analyzing complex datasets, it is often of interest to infer lower dimensional structure underlying the higher dimensional observations. As a flexible class of nonlinear structures, it is common to focus on Riemannian manifolds. Most existing manifold learning algorithms replace the original data with lower dimensional coordinates without providing an estimate of the manifold in the observation space or using the manifold to denoise the original data. This article proposes a new methodology… 
Latent structure blockmodels for Bayesian spectral graph clustering
TLDR
A class of models called latent structure block models (LSBMs) is proposed, allowing graph clustering when community-specific one-dimensional manifold structure is present; the approach is shown to perform well on simulated and real-world network data.

References

Non-Asymptotic Analysis of Tangent Space Perturbation
TLDR
This work studies the stability of the subspace estimated by PCA as a function of scale, bounds (with high probability) the angle it forms with the true tangent space, and reveals an appropriate scale for local tangent plane recovery.
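As an illustrative sketch of the local-PCA idea summarized above (not the paper's exact procedure), the tangent space at a point can be estimated from the top singular vectors of a centered neighborhood; `estimate_tangent` and the radius/seed choices are assumptions for this example.

```python
import numpy as np

def estimate_tangent(X, p, radius, d):
    """Estimate the d-dimensional tangent space at point p via local PCA.

    Returns an orthonormal (D, d) basis spanned by the top right-singular
    vectors of the centered neighborhood of p.
    """
    nbrs = X[np.linalg.norm(X - p, axis=1) < radius]
    centered = nbrs - nbrs.mean(axis=0)
    # SVD of the neighborhood: leading right-singular vectors span the
    # estimated tangent plane; trailing ones capture curvature and noise.
    _, _, Vt = np.linalg.svd(centered, full_matrices=False)
    return Vt[:d].T

# Noisy samples near the unit circle (a 1-D manifold) in R^2.
rng = np.random.default_rng(0)
theta = rng.uniform(0, 2 * np.pi, 500)
X = np.c_[np.cos(theta), np.sin(theta)] + 0.01 * rng.normal(size=(500, 2))
T = estimate_tangent(X, np.array([1.0, 0.0]), 0.3, 1)
# Near (1, 0) the true tangent direction is vertical, (0, ±1).
```

The choice of `radius` is exactly the scale trade-off the paper analyzes: too small and noise dominates the SVD, too large and curvature tilts the estimated plane.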
Fitting a Putative Manifold to Noisy Data
In the present work, we give a solution to the following question from manifold learning. Suppose data belonging to a high-dimensional Euclidean space are drawn independently and identically distributed…
A global geometric framework for nonlinear dimensionality reduction.
TLDR
An approach to nonlinear dimensionality reduction is presented that uses easily measured local metric information to learn the underlying global geometry of a data set; it efficiently computes a globally optimal solution and is guaranteed to converge asymptotically to the true structure.
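The global geometric framework summarized here is Isomap, which approximates geodesic distances by graph shortest paths and then applies classical MDS. A minimal sketch using scikit-learn follows; the dataset and parameter choices are assumptions for the example, not values from the paper.

```python
import numpy as np
from sklearn.datasets import make_swiss_roll
from sklearn.manifold import Isomap

# Swiss roll: a 2-D sheet rolled up in R^3. Euclidean distances between
# distant points cut through the roll; graph shortest paths do not.
X, t = make_swiss_roll(n_samples=800, noise=0.05, random_state=0)
emb = Isomap(n_neighbors=10, n_components=2).fit_transform(X)
# One embedding axis should track the "unrolled" coordinate t.
```

Because the recovered metric is geodesic rather than Euclidean, the embedding "unrolls" the sheet rather than projecting it.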
Regularized Principal Manifolds
TLDR
This work proposes an algorithm for finding principal manifolds that can be regularized in a variety of ways and gives good bounds on the covering numbers, which allows it to obtain a nearly optimal learning rate of order O(m^(-1/2+α)) for certain types of regularization operators.
Nonlinear dimensionality reduction by locally linear embedding.
TLDR
Locally linear embedding (LLE) is introduced, an unsupervised learning algorithm that computes low-dimensional, neighborhood-preserving embeddings of high-dimensional inputs that learns the global structure of nonlinear manifolds.
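A minimal sketch of LLE as summarized above, using scikit-learn; the dataset and neighborhood size are assumptions for the example.

```python
import numpy as np
from sklearn.datasets import make_s_curve
from sklearn.manifold import LocallyLinearEmbedding

# S-curve: a 2-D sheet bent into an "S" in R^3. LLE reconstructs each
# point as a weighted combination of its neighbors, then finds a
# low-dimensional embedding that preserves those same weights.
X, t = make_s_curve(n_samples=800, random_state=0)
lle = LocallyLinearEmbedding(n_neighbors=12, n_components=2, random_state=0)
emb = lle.fit_transform(X)
```

The reconstruction weights are invariant to rotation, scaling, and translation of each neighborhood, which is what lets purely local fits stitch together into a global unfolding.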
Diffusion Based Gaussian Processes on Restricted Domains
TLDR
This article proposes a new class of diffusion-based GPs (DB-GPs), which learn a covariance that respects the geometry of the input domain, and approximate the covariance using finitely-many eigenpairs of the Graph Laplacian.
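The eigenpair approximation described above can be sketched as a graph heat-kernel covariance; `heat_kernel_covariance` and all parameter choices here are illustrative assumptions, not the authors' DB-GP implementation.

```python
import numpy as np

def heat_kernel_covariance(X, eps, t, m):
    """Approximate a diffusion-based GP covariance with m Laplacian eigenpairs.

    K ≈ sum_i exp(-t * lam_i) * phi_i phi_i^T, where (lam_i, phi_i) are the
    smallest eigenpairs of the unnormalized graph Laplacian L = D - W.
    """
    sq = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    W = np.exp(-sq / eps)          # heat-kernel affinities
    np.fill_diagonal(W, 0.0)
    L = np.diag(W.sum(axis=1)) - W
    lam, phi = np.linalg.eigh(L)
    lam, phi = lam[:m], phi[:, :m]  # keep the m smoothest modes
    return (phi * np.exp(-t * lam)) @ phi.T

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 2))
K = heat_kernel_covariance(X, eps=0.5, t=1.0, m=20)
```

Truncating to the smoothest eigenpairs both reduces cost and encodes the geometry of the input domain in the covariance.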
Laplacian Eigenmaps for Dimensionality Reduction and Data Representation
TLDR
This work proposes a geometrically motivated algorithm for representing high-dimensional data that provides a computationally efficient approach to nonlinear dimensionality reduction with locality-preserving properties and a natural connection to clustering.
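Laplacian eigenmaps can be sketched directly in NumPy: build a heat-kernel graph and solve the generalized eigenproblem L v = λ D v via the symmetric normalized Laplacian. The function name, kernel bandwidth, and test manifold are assumptions for the example.

```python
import numpy as np

def laplacian_eigenmaps(X, d, eps):
    """Embed X into d dimensions using the graph Laplacian's smoothest modes."""
    sq = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    W = np.exp(-sq / eps)                       # heat-kernel weights
    deg = W.sum(axis=1)
    d_inv_sqrt = 1.0 / np.sqrt(deg)
    # Symmetric normalized Laplacian I - D^{-1/2} W D^{-1/2}; its
    # eigenvectors map back to the generalized problem via D^{-1/2}.
    L_sym = np.eye(len(X)) - d_inv_sqrt[:, None] * W * d_inv_sqrt[None, :]
    _, vecs = np.linalg.eigh(L_sym)
    # Skip the trivial constant eigenvector (eigenvalue ~0).
    return d_inv_sqrt[:, None] * vecs[:, 1:d + 1]

# Points on a circle embed as a circle: the first nontrivial modes are
# approximately cos(theta) and sin(theta).
theta = np.linspace(0, 2 * np.pi, 200, endpoint=False)
X = np.c_[np.cos(theta), np.sin(theta)]
emb = laplacian_eigenmaps(X, 2, 0.1)
```

The locality-preserving property mentioned above comes from the objective: nearby points (large W_ij) are penalized for landing far apart in the embedding.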
Non-Parametric Estimation of Manifolds from Noisy Data
TLDR
This work considers the problem of estimating a d-dimensional submanifold M of R^D from a finite set of noisy samples and presents an algorithm that takes a point r from the tubular neighborhood and outputs p̂_n ∈ R^D and T̂_{p̂_n}M, an element of the Grassmannian Gr(d, D).
Think globally, fit locally under the manifold setup: Asymptotic analysis of locally linear embedding
Since its introduction in 2000, locally linear embedding (LLE) has been widely applied in data science. We provide an asymptotic analysis of LLE under the manifold setup. We show that for…
Vector Diffusion Maps and the Connection Laplacian.
  • A. Singer, H.-T. Wu
  • Medicine, Mathematics
    Communications on Pure and Applied Mathematics
  • 2012
TLDR
The relation between VDM and the connection Laplacian operator for vector fields over the manifold is established, and VDM is shown to equip the data with a metric, referred to as the vector diffusion distance.