Corpus ID: 18074198

Random projections of random manifolds

@article{Lahiri2016RandomPO,
  title={Random projections of random manifolds},
  author={Subhaneil Lahiri and Peiran Gao and Surya Ganguli},
  journal={ArXiv},
  year={2016},
  volume={abs/1607.04331}
}
Interesting data often concentrate on low dimensional smooth manifolds inside a high dimensional ambient space. Random projections are a simple, powerful tool for dimensionality reduction of such data. Previous works have studied bounds on how many projections are needed to accurately preserve the geometry of these manifolds, given their intrinsic dimensionality, volume and curvature. However, such works employ definitions of volume and curvature that are inherently difficult to compute… 
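The abstract above describes random linear projections as a tool for reducing dimensionality while approximately preserving geometry. A minimal numerical sketch (illustrative only, not the paper's construction; all names and parameters are chosen here) of how a Gaussian random projection approximately preserves pairwise distances between sample points:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes: ambient dimension N, projected dimension M, sample count
N, M, n_points = 1000, 200, 50
X = rng.standard_normal((n_points, N))  # points in the high-dimensional space

# Gaussian random projection, scaled so squared norms are preserved in expectation
P = rng.standard_normal((M, N)) / np.sqrt(M)
Y = X @ P.T  # projected points, shape (n_points, M)

# Compare all pairwise distances before and after projection
d_orig = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
d_proj = np.linalg.norm(Y[:, None, :] - Y[None, :, :], axis=-1)
iu = np.triu_indices(n_points, k=1)     # distinct pairs only
ratios = d_proj[iu] / d_orig[iu]

# Johnson-Lindenstrauss-style concentration: ratios cluster near 1
print(ratios.min(), ratios.max())
```

The distortion shrinks as the number of projections M grows; the works surveyed on this page bound how large M must be for manifold-structured data, in terms of intrinsic dimension, volume, and curvature.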
Expected path length on random manifolds
TLDR
This work endows the latent space of a large class of generative models with a random Riemannian metric, which provides them with elementary geometric operators, and derives deterministic approximations and tight error bounds on expected distances.
On Fast Johnson-Lindenstrauss Embeddings of Compact Submanifolds of $\mathbb{R}^N$ with Boundary
TLDR
A new class of highly structured distributions on matrices is presented which outperforms prior structured matrix distributions for embedding sufficiently low-dimensional submanifolds of ℝ^N (with d ≲ √N), with respect to both achievable embedding dimension and computationally efficient realizations.
Accurate estimation of neural population dynamics without spike sorting
TLDR
It is found that neural dynamics and scientific conclusions are quite similar using multi-unit threshold crossings in place of sorted neurons, which unlocks existing data for new analyses and informs the design of new electrode arrays for laboratory and clinical use.
Accurate Estimation of Neural Population Dynamics without Spike Sorting
TLDR
This work recorded data using Neuropixels probes in motor cortex of nonhuman primates and reanalyzed data from three previous studies and found that neural dynamics and scientific conclusions are quite similar using multiunit threshold crossings rather than sorted neurons.
Predictive learning extracts latent space representations from sensory observations
TLDR
Using a recurrent neural network model trained to predict a sequence of observations in a simulated spatial navigation task, it is shown that network dynamics exhibit low-dimensional but nonlinearly transformed representations of sensory inputs that capture the latent structure of the sensory environment.
Disease Prediction Using Metagenomic Data Visualizations Based on Manifold Learning and Convolutional Neural Network
TLDR
Several approaches based on dimensionality reduction algorithms and data density are introduced to visualize features reflecting species abundance; they allow biomedical signatures to be visualized and improve prediction performance compared to classical machine learning.

References

Showing 1–10 of 26 references
Random Projections of Smooth Manifolds
Abstract We propose a new approach for nonadaptive dimensionality reduction of manifold-modeled data, demonstrating that a small number of random linear projections can preserve key information about
Tighter bounds for random projections of manifolds
TLDR
Here the case of random projection of smooth manifolds is considered, and a previous analysis is sharpened, reducing the dependence on such properties as the manifold's maximum curvature.
Random Projections for Manifold Learning
TLDR
This work rigorously proves that with a small number M of random projections of sample points in ℝ^N belonging to an unknown K-dimensional Euclidean manifold, the intrinsic dimension (ID) of the sample set can be estimated to high accuracy.
Universality laws for randomized dimension reduction, with applications
TLDR
It is proved that there is a phase transition in the success probability of the dimension reduction map as the embedding dimension increases, and each map has the same stability properties, as quantified through the restricted minimum singular value.
Multiscale Random Projections for Compressive Classification
TLDR
This work develops the multiscale smashed filter as a compressive analog of the familiar matched filter classifier, and demonstrates it in a practical target classification problem using a single-pixel camera that directly acquires compressive image projections.
A Simple Proof of the Restricted Isometry Property for Random Matrices
Abstract We give a simple technique for verifying the Restricted Isometry Property (as introduced by Candès and Tao) for random matrices that underlies Compressed Sensing. Our approach has two main
Compressed and Privacy-Sensitive Sparse Regression
TLDR
This line of work shows that ℓ1-regularized least squares regression can accurately estimate a sparse linear model from noisy examples in high dimensions, and characterizes the number of projections required to identify the nonzero coefficients in the true model with probability approaching one, a property called "sparsistence."
Living on the edge: phase transitions in convex programs with random data
TLDR
This paper provides the first rigorous analysis that explains why phase transitions are ubiquitous in random convex optimization problems and introduces a summary parameter, called the statistical dimension, that canonically extends the dimension of a linear subspace to the class of convex cones.
Approximate nearest neighbors: towards removing the curse of dimensionality
TLDR
Two algorithms for the approximate nearest neighbor problem in high-dimensional spaces are presented, which require space that is only polynomial in n and d, while achieving query times that are sublinear in n and polynomial in d.
Random Projection, Margins, Kernels, and Feature-Selection
TLDR
It is discussed how, given a kernel as a black-box function, the authors can use various forms of random projection to extract an explicit small feature space that captures much of what the kernel is doing.