Corpus ID: 230437590

Random Embeddings with Optimal Accuracy

  • M. Skorski
  • Published 31 December 2020
  • Computer Science, Mathematics
  • ArXiv
This work constructs Johnson-Lindenstrauss embeddings with optimal accuracy, as measured by the variance, mean-squared error, and exponential concentration of the length distortion. Lower bounds for any data and embedding dimensions are determined and accompanied by matching, efficiently samplable constructions (built on orthogonal matrices). Novel techniques of independent interest include a unit-sphere parametrization, the use of singular-value latent variables, and Schur-convexity.
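The abstract mentions constructions built on orthogonal matrices. As a hedged illustration (not the paper's exact construction), a minimal sketch of an orthogonal-matrix-based JL embedding can be obtained by taking the top rows of a Haar-random orthogonal matrix and rescaling so that squared lengths are preserved in expectation:

```python
import numpy as np

def orthogonal_jl(d, m, rng):
    """Sample an m x d embedding whose rows are orthonormal (up to scaling).

    A QR decomposition of a Gaussian matrix yields a (Haar-distributed)
    orthogonal matrix; keeping the first m rows and scaling by sqrt(d/m)
    makes the squared length of a unit vector equal to 1 in expectation.
    This is a generic sketch, not the specific construction of the paper.
    """
    g = rng.standard_normal((d, d))
    q, _ = np.linalg.qr(g)
    return np.sqrt(d / m) * q[:m, :]

rng = np.random.default_rng(0)
d, m = 128, 16
A = orthogonal_jl(d, m, rng)

x = rng.standard_normal(d)
x /= np.linalg.norm(x)                    # unit vector
distortion = np.linalg.norm(A @ x) ** 2   # concentrates around 1
```

The length distortion `||Ax||^2 / ||x||^2` is the quantity whose variance and concentration the paper optimizes; using orthonormal rows removes the extra randomness that independent Gaussian rows would introduce.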

Optimal Bounds for Johnson-Lindenstrauss Transformations
This paper provides a precise asymptotic threshold for the embedding dimension: above the threshold there exists a projection preserving Euclidean distances, while below it no such projection exists.
The Johnson-Lindenstrauss Transform: An Empirical Study
This paper presents the first comprehensive study of the empirical behavior of algorithms for dimensionality reduction based on the JL Lemma, and answers a number of important questions about the quality of the embeddings and the performance of the algorithms used to compute them.
Nearly Tight Oblivious Subspace Embeddings by Trace Inequalities
An analysis of sparse oblivious subspace embeddings based on the "matrix Chernoff" technique is presented; the resulting bounds are much tighter than previous ones, matching known lower bounds up to a single log(d) factor in the embedding dimension.
New and Improved Johnson-Lindenstrauss Embeddings via the Restricted Isometry Property
The results improve the best known bounds on the necessary embedding dimension m for a wide class of structured random matrices, and improve the recent bound m = O(delta^(-4) log(p) log^4(N)) appearing in Ailon and Liberty, which is optimal up to the logarithmic factors in N.
Dimensionality Reduction with Subgaussian Matrices: A Unified Theory
  • S. Dirksen
  • Mathematics, Computer Science
  • Found. Comput. Math.
  • 2016
We present a theory for Euclidean dimensionality reduction with subgaussian matrices which unifies several restricted isometry property and Johnson–Lindenstrauss-type results obtained earlier.
Almost Optimal Explicit Johnson-Lindenstrauss Families
This work gives explicit constructions of linear embeddings satisfying the Johnson-Lindenstrauss property with an almost optimal use of randomness, and shows a lower bound of Ω(log(1/δ)/ε²) on the embedding dimension.
A sparse Johnson–Lindenstrauss transform
A sparse version of the fundamental tool in dimension reduction, the Johnson-Lindenstrauss transform, is obtained using hashing and local densification to construct a sparse projection matrix with just ~O(1/ε) non-zero entries per column; a matching lower bound on the sparsity is shown for a large class of projection matrices.
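A sparse sign matrix with a fixed number of non-zero entries per column is a common way to realize such sparse embeddings. The sketch below is a generic illustration under that assumption, not the hashing-and-densification construction of the cited paper:

```python
import numpy as np

def sparse_jl(d, m, s, rng):
    """Sparse sign embedding: each column has exactly s non-zero entries,
    each equal to +-1/sqrt(s), placed in rows chosen uniformly without
    replacement. A generic sparse-JL variant, for illustration only.
    """
    A = np.zeros((m, d))
    for j in range(d):
        rows = rng.choice(m, size=s, replace=False)
        A[rows, j] = rng.choice([-1.0, 1.0], size=s) / np.sqrt(s)
    return A

rng = np.random.default_rng(1)
A = sparse_jl(d=256, m=64, s=4, rng=rng)

x = rng.standard_normal(256)
# relative error of the squared length after embedding
err = abs(np.linalg.norm(A @ x) ** 2 / np.linalg.norm(x) ** 2 - 1)
```

Because each column has only s non-zeros, computing `A @ x` costs O(s·d) rather than O(m·d), which is the point of sparse embeddings.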
New bounds for circulant Johnson-Lindenstrauss embeddings
  • Hui Zhang, L. Cheng
  • Mathematics, Computer Science
  • ArXiv
  • 2013
The bounds in this paper offer a small improvement over the current best bounds for Gaussian circulant JL embeddings in certain parameter regimes, and are derived using more direct methods.
Hanson-Wright inequality and sub-gaussian concentration
In this expository note, we give a modern proof of the Hanson-Wright inequality for quadratic forms in sub-gaussian random variables. We deduce a useful concentration inequality for sub-gaussian random vectors.
Approximate nearest neighbors: towards removing the curse of dimensionality
Two algorithms for the approximate nearest neighbor problem in high-dimensional spaces are presented, which require space that is only polynomial in n and d, while achieving query times that are sub-linear in n and polynomial in d.