An almost optimal unrestricted fast Johnson-Lindenstrauss transform

@inproceedings{Ailon2011AnAO,
  title={An almost optimal unrestricted fast Johnson-Lindenstrauss transform},
  author={Nir Ailon and Edo Liberty},
  booktitle={SODA '11},
  year={2011}
}
The problems of random projection and sparse reconstruction have much in common and have individually received much attention. Surprisingly, until now they progressed in parallel and remained mostly separate. Here, we employ new tools from probability in Banach spaces that were successfully used in the context of sparse reconstruction to advance on an open problem in random projection. In particular, we generalize and use an intricate result by Rudelson and Vershynin for sparse reconstruction which… 
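For a concrete reference point, here is a minimal sketch of the plain (dense, Gaussian) Johnson-Lindenstrauss projection in Python/NumPy. This is the slow baseline that fast transforms like the one in this paper improve upon, not the paper's construction itself; all names and sizes are illustrative.

import numpy as np

def jl_embed(X, k, seed=0):
    """Dense Gaussian JL map: project the rows of X from d down to k dimensions.

    Entries are i.i.d. N(0, 1/k), so squared norms are preserved in
    expectation; k = O(eps^-2 log n) suffices for n points (JL lemma).
    Cost is O(n d k) -- the "slow" baseline fast transforms improve on.
    """
    rng = np.random.default_rng(seed)
    G = rng.normal(0.0, 1.0 / np.sqrt(k), size=(X.shape[1], k))
    return X @ G

# Check pairwise-distance distortion on a random point set.
X = np.random.default_rng(1).normal(size=(50, 1024))
Y = jl_embed(X, k=256)
i, j = 3, 17
print(np.linalg.norm(Y[i] - Y[j]) / np.linalg.norm(X[i] - X[j]))  # ~1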
Optimal Bounds for Johnson-Lindenstrauss Transforms and Streaming Problems with Subconstant Error
TLDR
The techniques are based on lower bounding the information cost of a novel one-way communication game and yield the first space lower bounds in a data stream model that depend on the error probability Δ, showing that for a wide range of problems this is in fact optimal.
Johnson-Lindenstrauss Transforms with Best Confidence
TLDR
This work develops Johnson-Lindenstrauss distributions with optimal, data-oblivious, statistical confidence bounds, which improve upon prior works in terms of statistical accuracy, as well as exactly determine the no-go regimes for data-oblivious approaches.
Sparser Johnson-Lindenstrauss Transforms
TLDR
These are the first constructions to provide subconstant sparsity for all values of parameters, improving upon previous works of Achlioptas and Dasgupta et al.
New and Improved Johnson-Lindenstrauss Embeddings via the Restricted Isometry Property
TLDR
The results improve the best known bounds on the necessary embedding dimension m for a wide class of structured random matrices and improve the recent bound m = O(δ^(-4) log(p) log^4(N)) appearing in Ailon and Liberty, which is optimal up to the logarithmic factors in N.
On Deterministic Sketching and Streaming for Sparse Recovery and Norm Estimation
TLDR
This work focuses on devising a fixed matrix A in R^{m×n} and a deterministic recovery/estimation procedure which work for all possible input vectors simultaneously; fast sketching and recovery algorithms are also obtained by making use of the Fast Johnson-Lindenstrauss transform.
Toward a Unified Theory of Sparse Dimensionality Reduction in Euclidean Space
TLDR
This work qualitatively unifies several results related to the Johnson-Lindenstrauss lemma, subspace embeddings, and Fourier-based restricted isometries, and introduces a new complexity parameter, which depends on the geometry of T, showing that it suffices to choose s and m such that this parameter is small.
Optimal Fast Johnson-Lindenstrauss Embeddings for Large Data Sets
TLDR
A lower bound is proved showing that subsampled Hadamard matrices alone cannot reach an optimal embedding dimension, and that the second embedding step cannot be omitted.

References

Showing 1-10 of 35 references
Optimal Bounds for Johnson-Lindenstrauss Transforms and Streaming Problems with Subconstant Error
TLDR
The techniques are based on lower bounding the information cost of a novel one-way communication game and yield the first space lower bounds in a data stream model that depend on the error probability of the Johnson-Lindenstrauss transform.
Approximate nearest neighbors and the fast Johnson-Lindenstrauss transform
TLDR
A new low-distortion embedding of ℓ_2^d into ℓ_p (p = 1, 2) is introduced, called the Fast Johnson-Lindenstrauss Transform (FJLT), based upon the preconditioning of a sparse projection matrix with a randomized Fourier transform.
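To illustrate the FJLT idea (randomize signs, mix with a fast orthogonal transform, then keep only a few coordinates), here is a sketch of the closely related subsampled randomized Hadamard transform in NumPy. This is a simplified stand-in, not Ailon and Chazelle's exact construction, which projects with a sparse Gaussian matrix rather than plain row sampling; function names are illustrative.

import numpy as np

def fwht(x):
    """Orthonormal fast Walsh-Hadamard transform; len(x) a power of 2, O(d log d)."""
    y, h, n = x.copy(), 1, len(x)
    while h < n:
        for i in range(0, n, 2 * h):
            a, b = y[i:i + h].copy(), y[i + h:i + 2 * h].copy()
            y[i:i + h], y[i + h:i + 2 * h] = a + b, a - b
        h *= 2
    return y / np.sqrt(n)

def fjlt_like(x, k, seed=0):
    """FJLT-style map x -> sqrt(d/k) * (k sampled rows of) H D x.

    D is a random +-1 diagonal and H the orthonormal Hadamard transform;
    the sign flips spread the mass of x so that sampling k of the d
    mixed coordinates preserves the norm in expectation.
    """
    rng = np.random.default_rng(seed)
    d = len(x)                                   # must be a power of 2
    y = fwht(rng.choice([-1.0, 1.0], size=d) * x)
    rows = rng.choice(d, size=k, replace=False)
    return np.sqrt(d / k) * y[rows]

x = np.random.default_rng(1).normal(size=1024)
print(np.linalg.norm(fjlt_like(x, k=128)) / np.linalg.norm(x))  # ~1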
A sparse Johnson-Lindenstrauss transform
TLDR
A sparse version of the fundamental tool in dimension reduction, the Johnson-Lindenstrauss transform, is obtained, using hashing and local densification to construct a sparse projection matrix with just Õ(1/ε) non-zero entries per column; a matching lower bound on the sparsity for a large class of projection matrices is shown.
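To make the hashing idea concrete, here is a toy sparse-JL sketch in which each input coordinate is hashed into s of the m output rows with random signs. It mirrors the spirit of the construction (few nonzeros per column) but is not the paper's exact hashing-plus-local-densification scheme; in a real implementation the hash positions would be fixed once and reused for every vector.

import numpy as np

def sparse_jl(x, m, s, seed=0):
    """Toy sparse JL map: column j of the implicit m x n matrix has s
    nonzeros, each a random sign scaled by 1/sqrt(s), at hashed rows."""
    rng = np.random.default_rng(seed)            # fixed seed = fixed matrix
    y = np.zeros(m)
    for j, xj in enumerate(x):
        rows = rng.choice(m, size=s, replace=False)
        signs = rng.choice([-1.0, 1.0], size=s)
        y[rows] += signs * xj / np.sqrt(s)
    return y

x = np.random.default_rng(1).normal(size=1000)
y = sparse_jl(x, m=256, s=8)
print(np.linalg.norm(y) / np.linalg.norm(x))  # ~1: norm preserved in expectation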
New and Improved Johnson-Lindenstrauss Embeddings via the Restricted Isometry Property
TLDR
The results improve the best known bounds on the necessary embedding dimension m for a wide class of structured random matrices and improve the recent bound m = O(δ^(-4) log(p) log^4(N)) appearing in Ailon and Liberty, which is optimal up to the logarithmic factors in N.
On variants of the Johnson–Lindenstrauss lemma
TLDR
A simple and self-contained proof is given of a version of the Johnson-Lindenstrauss lemma that subsumes a basic version by Indyk and Motwani and a version more suitable for efficient computations due to Achlioptas.
Fast Dimension Reduction Using Rademacher Series on Dual BCH Codes
TLDR
This work shows how to significantly improve the running time to O(d log k) for k = O(d^(1/2−δ)), for any arbitrarily small fixed δ, which beats the better of FJLT and JL.
On sparse reconstruction from Fourier and Gaussian measurements
TLDR
This paper improves upon the best-known guarantees for exact reconstruction of a sparse signal f from a small universal sample of Fourier measurements, by showing that there exists a set of frequencies Ω such that one can exactly reconstruct every r-sparse signal f of length n from its frequencies in Ω, using the convex relaxation.
A Simple Proof of the Restricted Isometry Property for Random Matrices
Abstract: We give a simple technique for verifying the Restricted Isometry Property (as introduced by Candès and Tao) for random matrices that underlies Compressed Sensing. Our approach has two main…
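The abstract is cut off above. As a purely empirical companion (not the paper's concentration-of-measure proof technique), the following sketch samples random r-sparse unit vectors and checks how nearly a Gaussian matrix acts as an isometry on them; all sizes are illustrative.

import numpy as np

rng = np.random.default_rng(0)
m, n, r = 120, 512, 10                    # measurements, dimension, sparsity
A = rng.normal(0.0, 1.0 / np.sqrt(m), size=(m, n))

# Record how far A is from an isometry on random r-sparse unit vectors;
# RIP says the worst-case deviation delta_r is small with high probability.
ratios = []
for _ in range(2000):
    support = rng.choice(n, size=r, replace=False)
    x = np.zeros(n)
    x[support] = rng.normal(size=r)
    x /= np.linalg.norm(x)
    ratios.append(np.linalg.norm(A @ x) ** 2)
print(f"empirical ||Ax||^2 range on r-sparse x: [{min(ratios):.3f}, {max(ratios):.3f}]")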
Improved Approximation Algorithms for Large Matrices via Random Projections
  • Tamás Sarlós
  • 2006 47th Annual IEEE Symposium on Foundations of Computer Science (FOCS'06)
TLDR
The key idea is that low-dimensional embeddings can be used to eliminate data dependence and provide more versatile, linear-time, pass-efficient matrix computations.
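As one concrete instance of this idea, here is a sketch-and-solve least-squares example: compress the tall regression problem with a random projection, then solve the small problem exactly. The Gaussian sketch here is a stand-in for the fast transforms above, and all sizes are hypothetical.

import numpy as np

rng = np.random.default_rng(0)
n, d, k = 4000, 20, 400
A = rng.normal(size=(n, d))
b = A @ rng.normal(size=d) + 0.1 * rng.normal(size=n)

# Sketch-and-solve: compress the n-row problem to k rows, then solve exactly.
S = rng.normal(0.0, 1.0 / np.sqrt(k), size=(k, n))
x_sketch, *_ = np.linalg.lstsq(S @ A, S @ b, rcond=None)
x_exact, *_ = np.linalg.lstsq(A, b, rcond=None)

# Residual of the sketched solution relative to the optimum; ~1 means near-optimal.
print(np.linalg.norm(A @ x_sketch - b) / np.linalg.norm(A @ x_exact - b))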
Sparse reconstruction by convex relaxation: Fourier and Gaussian measurements
TLDR
The first guarantees for universal measurements (i.e., ones which work for all sparse functions) with reasonable constants are proved, based on techniques from geometric functional analysis and probability in Banach spaces.
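To see the convex relaxation in action, here is a small basis-pursuit demo, min ||x||_1 subject to Ax = b, written as a linear program with SciPy. The Gaussian measurement matrix and all sizes are illustrative, not the paper's Fourier setting.

import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
m, n, r = 60, 128, 5
A = rng.normal(size=(m, n)) / np.sqrt(m)        # random Gaussian measurements
x_true = np.zeros(n)
x_true[rng.choice(n, size=r, replace=False)] = rng.normal(size=r)
b = A @ x_true

# Basis pursuit as an LP in (x, t): minimize sum(t) with |x_i| <= t_i, Ax = b.
I = np.eye(n)
c = np.concatenate([np.zeros(n), np.ones(n)])
A_ub = np.block([[I, -I], [-I, -I]])            # x - t <= 0 and -x - t <= 0
b_ub = np.zeros(2 * n)
A_eq = np.hstack([A, np.zeros((m, n))])
res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b,
              bounds=[(None, None)] * n + [(0, None)] * n, method="highs")
x_hat = res.x[:n]
print(np.max(np.abs(x_hat - x_true)))           # ~0: exact recovery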