# Tensor Random Projection for Low Memory Dimension Reduction

```bibtex
@article{Sun2021TensorRP,
  title   = {Tensor Random Projection for Low Memory Dimension Reduction},
  author  = {Yiming Sun and Yang Guo and Joel A. Tropp and Madeleine Udell},
  journal = {ArXiv},
  year    = {2021},
  volume  = {abs/2105.00105}
}
```

Random projections reduce the dimension of a set of vectors while preserving structural information, such as distances between vectors in the set. This paper proposes a novel use of row-product random matrices [18] for random projection, which we call the Tensor Random Projection (TRP). It requires substantially less memory than existing dimension reduction maps. The TRP map is formed as the Khatri-Rao product of several smaller random projections, and is compatible with any base random…
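
For intuition, the TRP idea can be sketched in a few lines of NumPy. This is a minimal illustration, not the authors' code: it shows only the order-2 case with Gaussian factors, and the dimensions `d1`, `d2`, `k` are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)
d1, d2, k = 30, 40, 500  # ambient dim is d1 * d2; k is the sketch size

# Two small Gaussian factor matrices; only k * (d1 + d2) numbers are
# stored, instead of k * d1 * d2 for a dense random projection.
A1 = rng.standard_normal((k, d1))
A2 = rng.standard_normal((k, d2))

def trp(x):
    """Apply the row-wise Khatri-Rao map: row i of the implicit
    k x (d1*d2) matrix is kron(A1[i], A2[i])."""
    X = x.reshape(d1, d2)
    # For each row i, ((A1 @ X) * A2).sum(axis=1)[i] equals
    # A1[i] @ X @ A2[i], i.e. kron(A1[i], A2[i]) @ x.
    return ((A1 @ X) * A2).sum(axis=1) / np.sqrt(k)

x = rng.standard_normal(d1 * d2)
y = trp(x)
print(np.linalg.norm(y) / np.linalg.norm(x))  # ≈ 1: norm is preserved
```

Note that the projection is applied without ever materializing the full k × (d1·d2) matrix, which is the source of the memory savings.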

## 23 Citations

Tensorized Random Projections

- Computer Science, AISTATS
- 2020

The theoretical analysis shows that the dense Gaussian matrix in JLT can be replaced by a low-rank tensor implicitly represented in compressed form with random factors, while still approximately preserving the Euclidean distance of the projected inputs.

Rademacher Random Projections with Tensor Networks

- Computer Science, ArXiv
- 2021

It is shown both theoretically and experimentally that the tensorized RP in the Matrix Product Operator (MPO) format is not a Johnson-Lindenstrauss transform (JLT) and is therefore not a well-suited random projection map.

Tensor-structured sketching for constrained least squares

- Computer Science, Mathematics
- 2020

This work utilizes a general class of row-wise tensorized sub-Gaussian matrices as sketching matrices in constrained optimization, exploiting the sketching design's compatibility with tensor structure in optimization problems with general constraint sets.

Johnson-Lindenstrauss Embeddings with Kronecker Structure

- Mathematics, Computer Science, ArXiv
- 2021

We prove the Johnson-Lindenstrauss property for matrices ΦD_ξ, where Φ has the restricted isometry property and D_ξ is a diagonal matrix containing the entries of a Kronecker product ξ = ξ^(1) ⊗ ⋯ ⊗ ξ^(d)…
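
A minimal sketch of this construction, with assumed illustrative choices: two Kronecker factors, Rademacher vectors for ξ, and a Gaussian matrix standing in for Φ (Gaussian matrices satisfy the restricted isometry property with high probability):

```python
import numpy as np

rng = np.random.default_rng(1)
d1, d2, m = 16, 16, 128  # factor dimensions and embedding dimension
d = d1 * d2

# Rademacher factor vectors; their Kronecker product fills the
# diagonal of D_xi, so D_xi needs only d1 + d2 random signs.
xi1 = rng.choice([-1.0, 1.0], size=d1)
xi2 = rng.choice([-1.0, 1.0], size=d2)
diag_xi = np.kron(xi1, xi2)

# Phi: a Gaussian matrix used here as a stand-in RIP matrix.
Phi = rng.standard_normal((m, d)) / np.sqrt(m)

x = rng.standard_normal(d)
y = Phi @ (diag_xi * x)  # the embedding Phi @ D_xi @ x
print(np.linalg.norm(y) / np.linalg.norm(x))  # ≈ 1
```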

Low-Rank Tucker Approximation of a Tensor From Streaming Data

- Computer Science, SIAM J. Math. Data Sci.
- 2020

A new algorithm for computing a low-Tucker-rank approximation of a tensor that applies a randomized linear map to the tensor to obtain a sketch that captures the important directions within each mode, as well as the interactions among the modes.

Fast and Accurate Randomized Algorithms for Low-rank Tensor Decompositions

- Computer Science, NeurIPS
- 2021

A fast and accurate sketched ALS algorithm for Tucker decomposition that solves a sequence of sketched rank-constrained linear least-squares subproblems, and that not only converges faster but also yields more accurate CP decompositions.

Streaming Low-Rank Matrix Approximation with an Application to Scientific Simulation

- Computer Science, SIAM J. Sci. Comput.
- 2019

It is argued that randomized linear sketching is a natural tool for on-the-fly compression of data matrices that arise from large-scale scientific simulations and data collection, and that the proposed method is less sensitive to parameter choices than previous techniques.

Randomized Sketching Algorithms for Low-Memory Dynamic Optimization

- Computer Science, SIAM J. Optim.
- 2021

This paper suggests using randomized matrix approximation to compress the state as it is generated and shows how to use the compressed state to reliably solve the original dynamic optimization problem.

Parallel algorithms for computing the tensor-train decomposition

- Computer Science, ArXiv
- 2021

Four parallelizable algorithms that compute the TT format from various tensor inputs are proposed: Parallel-TTSVD for the traditional format, PSTT and its variants for streaming data, Tucker2TT for the Tucker format, and TT-fADI for solutions of Sylvester tensor equations.

## References

Showing 1–10 of 25 references.

Random projection in dimensionality reduction: applications to image and text data

- Computer Science, KDD '01
- 2001

It is shown that projecting the data onto a random lower-dimensional subspace yields results comparable to conventional dimensionality reduction methods such as principal component analysis: the similarity of data vectors is preserved well under random projection.

Very sparse random projections

- Computer Science, KDD '06
- 2006

This paper proposes very sparse random projections, an approximate algorithm for estimating distances between pairs of points in a high-dimensional vector space: it multiplies the data matrix A by a random matrix R ∈ R^(D×k), reducing the D dimensions down to just k to speed up the computation.
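
The construction can be sketched as follows. This is a minimal NumPy illustration, not the paper's code, assuming the sparsity level s = √D and arbitrary sizes D and k:

```python
import numpy as np

rng = np.random.default_rng(2)
D, k = 10_000, 300   # ambient and reduced dimensions
s = np.sqrt(D)       # sparsity parameter: ~1/s of the entries are nonzero

# Entries of R are +sqrt(s) or -sqrt(s) with probability 1/(2s) each,
# and 0 otherwise, so E[r_ij^2] = 1 and R is very sparse.
R = rng.choice(
    [np.sqrt(s), 0.0, -np.sqrt(s)],
    size=(D, k),
    p=[1 / (2 * s), 1 - 1 / s, 1 / (2 * s)],
)

a = rng.standard_normal(D)
b = rng.standard_normal(D)
pa, pb = (a @ R) / np.sqrt(k), (b @ R) / np.sqrt(k)

# The pairwise distance is approximately preserved after projection.
print(np.linalg.norm(pa - pb) / np.linalg.norm(a - b))  # ≈ 1
```

Because only about a 1/s fraction of R is nonzero, the projection can be computed far faster than with a dense Gaussian matrix.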

Database-friendly random projections: Johnson-Lindenstrauss with binary coins

- Computer Science, Mathematics, J. Comput. Syst. Sci.
- 2003

Finding Structure with Randomness: Probabilistic Algorithms for Constructing Approximate Matrix Decompositions

- Computer Science, SIAM Rev.
- 2011

This work surveys and extends recent research which demonstrates that randomization offers a powerful tool for performing low-rank matrix approximation, and presents a modular framework for constructing randomized algorithms that compute partial matrix decompositions.

Toward a Unified Theory of Sparse Dimensionality Reduction in Euclidean Space

- Mathematics, Computer Science, STOC
- 2015

This work qualitatively unifies several results related to the Johnson-Lindenstrauss lemma, subspace embeddings, and Fourier-based restricted isometries, introduces a new complexity parameter that depends on the geometry of the set T, and shows that it suffices to choose s and m such that this parameter is small.

Experiments with random projections for machine learning

- Computer Science, KDD '03
- 2003

It is found that the random projection approach yields lower predictive performance than PCA, but its computational advantages may make it attractive for certain applications.

Dimensionality reduction by random mapping: fast similarity computation for clustering

- Computer Science, 1998 IEEE International Joint Conference on Neural Networks Proceedings. IEEE World Congress on Computational Intelligence (Cat. No.98CH36227)
- 1998

It is demonstrated that the document classification accuracy obtained after the dimensionality has been reduced using a random mapping method will be almost as good as the original accuracy if the final dimensionality is sufficiently large.

Fast and Guaranteed Tensor Decomposition via Sketching

- Computer Science, NIPS
- 2015

This paper proposes fast and randomized tensor CP decomposition algorithms based on sketching that combine existing whitening and tensor power iterative techniques to obtain the fastest algorithm on both sparse and dense tensors.

Random projection-based multiplicative data perturbation for privacy preserving distributed data mining

- Computer Science, IEEE Transactions on Knowledge and Data Engineering
- 2006

This paper proposes an approximate random projection-based technique to improve the level of privacy protection while still preserving certain statistical characteristics of the data and presents extensive theoretical analysis and experimental results.