# An almost optimal unrestricted fast Johnson-Lindenstrauss transform

@inproceedings{Ailon2011AnAO, title={An almost optimal unrestricted fast Johnson-Lindenstrauss transform}, author={Nir Ailon and Edo Liberty}, booktitle={SODA '11}, year={2011} }

The problems of random projection and sparse reconstruction have much in common and have individually received much attention. Surprisingly, until now they progressed in parallel and remained mostly separate. Here, we employ new tools from probability in Banach spaces that were successfully used in the context of sparse reconstruction to advance on an open problem in random projection. In particular, we generalize and use an intricate result by Rudelson and Vershynin for sparse reconstruction which…
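As background for the random-projection side of this connection, a dense Johnson-Lindenstrauss projection can be sketched in a few lines. This is an illustrative sketch only, not the construction of this paper; the dimensions `d`, `k` and the Gaussian matrix are assumptions chosen for exposition.

```python
import numpy as np

# Illustrative dense Johnson-Lindenstrauss projection (not this paper's
# construction): a Gaussian matrix scaled by 1/sqrt(k) preserves Euclidean
# norms up to small distortion with high probability.
rng = np.random.default_rng(0)
d, k = 1024, 256                              # illustrative original/target dimensions
x = rng.standard_normal(d)                    # a vector to embed
A = rng.standard_normal((k, d)) / np.sqrt(k)  # dense Gaussian projection
y = A @ x                                     # embedded vector in R^k

# The norm ratio concentrates around 1 as k grows.
ratio = np.linalg.norm(y) / np.linalg.norm(x)
```

Applying this dense matrix costs O(kd) per vector; the "fast" transforms studied in this line of work reduce that to near O(d log d).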

## 143 Citations

Optimal Bounds for Johnson-Lindenstrauss Transforms and Streaming Problems with Subconstant Error

- Computer Science · TALG
- 2013

The techniques are based on lower bounding the information cost of a novel one-way communication game and yield the first space lower bounds in a data stream model that depend on the error probability of the Johnson-Lindenstrauss transform.

Optimal bounds for Johnson-Lindenstrauss transforms and streaming problems with sub-constant error

- Computer Science · SODA '11
- 2011

The techniques are based on lower bounding the information cost of a novel one-way communication game and yield the first space lower bounds in a data stream model that depend on the error probability Δ, showing that for a wide range of problems this is in fact optimal.

Johnson-Lindenstrauss Transforms with Best Confidence

- Computer Science, Mathematics · COLT
- 2021

This work develops Johnson-Lindenstrauss distributions with optimal, data-oblivious, statistical confidence bounds, which improve upon prior works in terms of statistical accuracy, and exactly determines the no-go regimes for data-oblivious approaches.

Sparser Johnson-Lindenstrauss Transforms

- Computer Science · JACM
- 2014

These are the first constructions to provide subconstant sparsity for all values of parameters, improving upon previous works of Achlioptas and Dasgupta et al.

New and Improved Johnson-Lindenstrauss Embeddings via the Restricted Isometry Property

- Mathematics, Computer Science · SIAM J. Math. Anal.
- 2011

The results improve the best known bounds on the necessary embedding dimension m for a wide class of structured random matrices, and improve the recent bound m = O(δ^(-4) log(p) log^4(N)) appearing in Ailon and Liberty, which is optimal up to the logarithmic factors in N.

Random Projections with Best Confidence

- Computer Science, Mathematics
- 2021

This work develops Johnson-Lindenstrauss distributions with optimal, data-oblivious, statistical confidence bounds, which improve upon prior works in terms of statistical accuracy, and exactly determines the no-go regimes for data-oblivious approaches.

On Deterministic Sketching and Streaming for Sparse Recovery and Norm Estimation

- Computer Science · APPROX-RANDOM
- 2012

This work focuses on devising a fixed matrix A in R^{m x n} and a deterministic recovery/estimation procedure that work for all possible input vectors simultaneously; it also obtains fast sketching and recovery algorithms by making use of the Fast Johnson-Lindenstrauss transform.

On deterministic sketching and streaming for sparse recovery and norm estimation

- Computer Science
- 2014

Toward a Unified Theory of Sparse Dimensionality Reduction in Euclidean Space

- Mathematics, Computer Science · STOC
- 2015

This work qualitatively unifies several results related to the Johnson-Lindenstrauss lemma, subspace embeddings, and Fourier-based restricted isometries, introducing a new complexity parameter that depends on the geometry of T and showing that it suffices to choose s and m such that this parameter is small.

Optimal Fast Johnson-Lindenstrauss Embeddings for Large Data Sets

- Computer Science · Sampling Theory, Signal Processing, and Data Analysis
- 2017

A lower bound is proved showing that subsampled Hadamard matrices alone cannot reach an optimal embedding dimension, and it is proved that the second embedding cannot be omitted.

## References

Showing 1-10 of 35 references

Optimal Bounds for Johnson-Lindenstrauss Transforms and Streaming Problems with Subconstant Error

- Computer Science · TALG
- 2013

The techniques are based on lower bounding the information cost of a novel one-way communication game and yield the first space lower bounds in a data stream model that depend on the error probability of the Johnson-Lindenstrauss transform.

Approximate nearest neighbors and the fast Johnson-Lindenstrauss transform

- Computer Science · STOC '06
- 2006

A new low-distortion embedding of ℓ2^d into ℓp (p = 1, 2) is introduced, called the Fast Johnson-Lindenstrauss Transform (FJLT), based upon the preconditioning of a sparse projection matrix with a randomized Fourier transform.
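The FJLT pipeline described above (random sign flips, a fast orthonormal transform, then a sparse projection) can be sketched as follows. This is a hedged illustration, not the paper's exact construction: the dimensions, the sparsity parameter `q`, and the use of a Walsh-Hadamard transform in place of the Fourier transform are all choices made for exposition.

```python
import numpy as np

def fwht(x):
    # Fast Walsh-Hadamard transform, O(d log d); len(x) must be a power of 2.
    x = x.copy()
    d = len(x)
    h = 1
    while h < d:
        for i in range(0, d, 2 * h):
            a = x[i:i + h].copy()
            b = x[i + h:i + 2 * h].copy()
            x[i:i + h] = a + b
            x[i + h:i + 2 * h] = a - b
        h *= 2
    return x / np.sqrt(d)  # normalize so the transform is orthonormal

rng = np.random.default_rng(1)
d, k = 512, 64                             # illustrative dimensions
x = rng.standard_normal(d)

# D: random sign flips; H: Walsh-Hadamard transform; P: sparse Gaussian projection.
signs = rng.choice([-1.0, 1.0], size=d)
q = 0.25                                   # illustrative sparsity of P
mask = rng.random((k, d)) < q
P = np.where(mask, rng.standard_normal((k, d)), 0.0) / np.sqrt(k * q)

y = P @ fwht(signs * x)                    # FJLT-style embedding of x
ratio = np.linalg.norm(y) / np.linalg.norm(x)
```

The sign flips and orthonormal transform spread the mass of `x` across coordinates, which is what lets the projection `P` be sparse without losing concentration.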

A sparse Johnson-Lindenstrauss transform

- Computer Science · STOC '10
- 2010

A sparse version of the Johnson-Lindenstrauss transform, the fundamental tool in dimension reduction, is obtained using hashing and local densification to construct a sparse projection matrix with just ~O(1/ε) non-zero entries per column; a matching lower bound on the sparsity is shown for a large class of projection matrices.
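A hashing-based sparse projection in the spirit of the construction above can be sketched as follows. This is an illustration with one nonzero per column (a count-sketch-style simplification, not the paper's exact construction); all parameter values are assumptions.

```python
import numpy as np

# Illustrative sparse JL-style projection: each input coordinate is hashed
# to one of k rows with a random sign, so the (implicit) projection matrix
# has a single nonzero entry per column and applies in O(d) time.
rng = np.random.default_rng(2)
d, k = 2000, 200                           # illustrative dimensions
x = rng.standard_normal(d)

rows = rng.integers(0, k, size=d)          # hash each coordinate to a row
signs = rng.choice([-1.0, 1.0], size=d)    # Rademacher signs

y = np.zeros(k)
np.add.at(y, rows, signs * x)              # unbuffered signed accumulation

ratio = np.linalg.norm(y) / np.linalg.norm(x)
```

`np.add.at` is used instead of `y[rows] += ...` because the latter silently drops repeated indices; the unbuffered form accumulates every coordinate hashed to the same row.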

New and Improved Johnson-Lindenstrauss Embeddings via the Restricted Isometry Property

- Mathematics, Computer Science · SIAM J. Math. Anal.
- 2011

The results improve the best known bounds on the necessary embedding dimension m for a wide class of structured random matrices, and improve the recent bound m = O(δ^(-4) log(p) log^4(N)) appearing in Ailon and Liberty, which is optimal up to the logarithmic factors in N.

On variants of the Johnson–Lindenstrauss lemma

- Mathematics, Computer Science
- 2008

A simple and self-contained proof is given of a version of the Johnson-Lindenstrauss lemma that subsumes a basic version by Indyk and Motwani and a version more suitable for efficient computation due to Achlioptas.

Fast Dimension Reduction Using Rademacher Series on Dual BCH Codes

- Computer Science · SODA '08
- 2008

This work shows how to significantly improve the running time to O(d log k) for k = O(d^(1/2−δ)), for any arbitrarily small fixed δ, which beats the better of the FJLT and JL.

On sparse reconstruction from Fourier and Gaussian measurements

- Computer Science, Mathematics
- 2008

This paper improves upon the best-known guarantees for exact reconstruction of a sparse signal f from a small universal sample of Fourier measurements, by showing that there exists a set of frequencies Ω such that one can exactly reconstruct every r-sparse signal f of length n from its frequencies in Ω using convex relaxation.

A Simple Proof of the Restricted Isometry Property for Random Matrices

- Mathematics
- 2008

We give a simple technique for verifying the Restricted Isometry Property (as introduced by Candès and Tao) for the random matrices that underlie Compressed Sensing. Our approach has two main…
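The Restricted Isometry Property that this technique verifies can be illustrated empirically with a Monte Carlo check of near-isometry on random sparse vectors. This is not the paper's proof technique, only a demonstration of the property itself; all dimensions and the trial count are illustrative.

```python
import numpy as np

# Empirical RIP-style check: a scaled Gaussian matrix approximately
# preserves the squared norm of every r-sparse vector. We sample random
# r-sparse vectors and track the worst relative distortion observed.
rng = np.random.default_rng(3)
n, m, r = 400, 120, 5                      # illustrative ambient dim, rows, sparsity
A = rng.standard_normal((m, n)) / np.sqrt(m)

worst = 0.0
for _ in range(200):
    x = np.zeros(n)
    support = rng.choice(n, size=r, replace=False)   # random sparse support
    x[support] = rng.standard_normal(r)
    distortion = abs(np.linalg.norm(A @ x) ** 2 / np.linalg.norm(x) ** 2 - 1)
    worst = max(worst, distortion)
```

A small `worst` over many trials is consistent with (though of course weaker than) the uniform guarantee that RIP provides over all r-sparse vectors.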

Improved Approximation Algorithms for Large Matrices via Random Projections

- Computer Science · 2006 47th Annual IEEE Symposium on Foundations of Computer Science (FOCS'06)
- 2006

The key idea is that low-dimensional embeddings can be used to eliminate data dependence and provide more versatile, linear-time, pass-efficient matrix computation.

Sparse reconstruction by convex relaxation: Fourier and Gaussian measurements

- Computer Science · 2006 40th Annual Conference on Information Sciences and Systems
- 2006

The first guarantees for universal measurements (i.e., measurements that work for all sparse functions) with reasonable constants are proved, based on techniques from geometric functional analysis and probability in Banach spaces.