Approximation of Points on Low-Dimensional Manifolds Via Random Linear Projections

@article{Iwen2012ApproximationOP,
  title={Approximation of Points on Low-Dimensional Manifolds Via Random Linear Projections},
  author={Mark A. Iwen and Mauro Maggioni},
  journal={arXiv preprint arXiv:1204.3337},
  year={2012}
}
This paper considers the approximate reconstruction of points, x \in R^D, which are close to a given compact d-dimensional submanifold, M, of R^D using a small number of linear measurements of x. In particular, it is shown that a number of measurements of x which is independent of the extrinsic dimension D suffices for highly accurate reconstruction of a given x with high probability. Furthermore, it is also proven that all vectors, x, which are sufficiently close to M can be reconstructed with… 
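The measurement model described in the abstract can be illustrated with a toy sketch. This is not the authors' algorithm: the circle manifold, the choice of D = 100 and m = 12, and the brute-force decoder below are all illustrative assumptions. A point x near a d-dimensional manifold M ⊂ R^D is observed through m ≪ D random linear measurements y = Φx and recovered by matching those measurements against candidate points on M.

```python
import numpy as np

rng = np.random.default_rng(0)
D, m = 100, 12          # ambient dimension, number of measurements (m << D)

# Toy 1-dimensional manifold M in R^D: a circle in the first two coordinates.
def point_on_M(t):
    p = np.zeros(D)
    p[0], p[1] = np.cos(t), np.sin(t)
    return p

x = point_on_M(1.0)                         # a point on (or near) M
Phi = rng.normal(size=(m, D)) / np.sqrt(m)  # random Gaussian measurement matrix
y = Phi @ x                                 # m linear measurements of x

# Decode by brute-force search over a fine sampling of M, picking the
# candidate whose measurements best match y.
ts = np.linspace(0.0, 2.0 * np.pi, 2000)
candidates = np.stack([point_on_M(t) for t in ts])
errs = np.linalg.norm(candidates @ Phi.T - y, axis=1)
x_hat = candidates[np.argmin(errs)]

print(np.linalg.norm(x_hat - x))  # small reconstruction error despite m << D
```

Although only m = 12 measurements of a D = 100-dimensional vector are observed, the recovered point lands close to x, in the spirit of the paper's claim that the required number of measurements can be independent of the extrinsic dimension D.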

Citations

Learning adaptive multiscale approximations to data and functions near low-dimensional sets
TLDR
This work constructs multiscale low-dimensional empirical approximations of M, which are adaptive when M has geometric regularity that may vary at different locations and scales, and proves guarantees showing that they attain the same learning rates as if f were defined on a Euclidean domain of dimension d instead of an unknown manifold M.
Geometric estimation of probability measures in high-dimensions
  • M. Maggioni
  • Computer Science, Mathematics
    2013 Asilomar Conference on Signals, Systems and Computers
  • 2013
TLDR
A family of estimators for probability distributions based on data-adaptive multiscale geometric approximations are discussed, particularly effective when the probability distribution concentrates near low-dimensional sets.
What Happens to a Manifold Under a Bi-Lipschitz Map?
TLDR
A lower bound on the reach of the embedded manifold is established in the case where m≤n and the bi-Lipschitz map is linear.
Testing the Manifold Hypothesis
The hypothesis that high-dimensional data tend to lie in the vicinity of a low-dimensional manifold is the basis of manifold learning. The goal of this paper is to develop an algorithm (with…
On Fast Johnson-Lindenstrauss Embeddings of Compact Submanifolds of $\mathbb{R}^N$ with Boundary
TLDR
A new class of highly structured distributions on matrices is presented which outperforms prior structured matrix distributions for embedding sufficiently low-dimensional submanifolds of R^N, with respect to both achievable embedding dimension and computationally efficient realizations.
Dictionary Learning and Non‐Asymptotic Bounds for Geometric Multi‐Resolution Analysis
TLDR
This work introduces an estimator for low‐dimensional sets supporting the data constructed from the GMRA approximations, exhibits (near optimal) finite sample bounds on its performance, and demonstrates the robustness of this estimator with respect to noise and model error.
Adaptive Geometric Multiscale Approximations for Intrinsically Low-dimensional Data
We consider the problem of efficiently approximating and encoding high-dimensional data sampled from a probability distribution $\rho$ in $\mathbb{R}^D$ that is nearly supported on a $d$-dimensional…
On Recovery Guarantees for One-Bit Compressed Sensing on Manifolds
TLDR
This paper provides a convex recovery method based on the Geometric Multi-Resolution Analysis and proves recovery guarantees with a near-optimal scaling in the intrinsic manifold dimension, the first tractable algorithm with such guarantees for this setting.
...

References

SHOWING 1-10 OF 78 REFERENCES
Random Projections of Smooth Manifolds
We propose a new approach for nonadaptive dimensionality reduction of manifold-modeled data, demonstrating that a small number of random linear projections can preserve key information about…
Tighter bounds for random projections of manifolds
TLDR
Here the case of random projection of smooth manifolds is considered, and a previous analysis is sharpened, reducing the dependence on such properties as the manifold's maximum curvature.
Nonlinear Dimension Reduction via Local Tangent Space Alignment
TLDR
A new algorithm for manifold learning and nonlinear dimension reduction is presented, based on unorganized data points sampled with noise from the manifold; it uses tangent spaces learned by fitting an affine subspace in a neighborhood of each data point.
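The "fitting an affine subspace in a neighborhood" step can be sketched as a local PCA via the SVD of centered neighborhood points. This is a generic illustration of that one step, not the LTSA algorithm itself; the sine-curve data, noise level, and dimensions are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Noisy samples from a neighborhood of a 1-d manifold (a sine curve) in R^3.
t = rng.uniform(-0.1, 0.1, size=50)
X = np.stack([t, np.sin(t), np.zeros_like(t)], axis=1)
X += 0.001 * rng.normal(size=(50, 3))

# Local tangent space at the neighborhood mean: the top-d right singular
# vectors of the centered data (equivalently, local PCA).
d = 1
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
tangent = Vt[:d]   # estimated tangent direction(s), shape (d, 3)

print(tangent)     # near t = 0 the true tangent direction is (1, 1, 0)/sqrt(2)
```

Repeating this fit in a neighborhood of every data point yields the collection of local affine approximations that tangent-space methods then align into a global embedding.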
Hessian Eigenmaps: new locally linear embedding techniques for high-dimensional data
TLDR
The Hessian-based Locally Linear Embedding (HLLE) derives from a conceptual framework of local isometry in which the manifold M, viewed as a Riemannian submanifold of the ambient Euclidean space R^n, is locally isometric to an open, connected subset Θ of Euclidean space R^d.
Finding the Homology of Submanifolds with High Confidence from Random Samples
TLDR
This work considers the case where data are drawn from sampling a probability distribution that has support on or near a submanifold of Euclidean space, and shows how to “learn” the homology of the submanifold with high confidence.
Charting a Manifold
  • M. Brand
  • Mathematics, Computer Science
    NIPS
  • 2002
TLDR
A nonlinear mapping from a high-dimensional sample space to a low-dimensional vector space is constructed, effectively recovering a Cartesian coordinate system for the manifold from which the data is sampled, and is pseudo-invertible.
Iterative projections for signal identification on manifolds: Global recovery guarantees
  • P. Shah, V. Chandrasekaran
  • Mathematics, Computer Science
    2011 49th Annual Allerton Conference on Communication, Control, and Computing (Allerton)
  • 2011
We introduce an algorithm known as Manifold Iterative Projection to solve the problem of recovering an unknown high-dimensional signal contained in a low-dimensional submanifold from a few linear…
The multiscale structure of non-differentiable image manifolds
TLDR
This paper studies families of images generated by varying a parameter that controls the appearance of the object/scene in each image, finding that IAMs generated by images with sharp edges are nowhere differentiable and have an inherent multiscale structure.
Multiscale geometric and spectral analysis of plane arrangements
TLDR
This paper proposes an efficient algorithm based on multiscale SVD analysis and spectral methods to tackle the problem in full generality and demonstrates its state-of-the-art performance on both synthetic and real data.
Manifold-Based Signal Recovery and Parameter Estimation from Compressive Measurements
TLDR
This work establishes both deterministic and probabilistic instance-optimal bounds in $\ell_2$ for manifold-based signal recovery and parameter estimation from noisy compressive measurements, and supports the growing empirical evidence that manifold-based models can be used with high accuracy in compressive signal processing.
...