Finding Structure with Randomness: Probabilistic Algorithms for Constructing Approximate Matrix Decompositions

@article{Halko2011FindingSW,
  title={Finding Structure with Randomness: Probabilistic Algorithms for Constructing Approximate Matrix Decompositions},
  author={Nathan Halko and Per-Gunnar Martinsson and Joel A. Tropp},
  journal={SIAM Rev.},
  year={2011},
  volume={53},
  pages={217-288}
}
Low-rank matrix approximations, such as the truncated singular value decomposition and the rank-revealing QR decomposition, play a central role in data analysis and scientific computing. This work surveys and extends recent research which demonstrates that randomization offers a powerful tool for performing low-rank matrix approximation. These techniques exploit modern computational architectures more fully than classical methods and open the possibility of dealing with truly massive data sets…
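The two-stage scheme surveyed here admits a very compact implementation. Below is a minimal NumPy sketch of the basic proto-algorithm (sample the range of A with a Gaussian test matrix, then compute a small deterministic SVD); the function name, the oversampling default p=10, and the test problem are illustrative choices rather than anything prescribed by the paper.

```python
import numpy as np

def randomized_svd(A, k, p=10):
    """Rank-k approximate SVD of A via a Gaussian range sketch.

    k is the target rank and p an oversampling parameter (illustrative default).
    """
    # Stage A: find an orthonormal basis Q whose range approximates the range of A.
    Omega = np.random.randn(A.shape[1], k + p)   # Gaussian test matrix
    Y = A @ Omega                                # random sample of the range of A
    Q, _ = np.linalg.qr(Y)                       # A ~ Q Q^T A
    # Stage B: compute a standard SVD of the small matrix B = Q^T A.
    B = Q.T @ A
    U_hat, s, Vt = np.linalg.svd(B, full_matrices=False)
    U = Q @ U_hat
    return U[:, :k], s[:k], Vt[:k, :]

# Usage on a matrix that is exactly rank 20; the relative error should be tiny.
A = np.random.randn(2000, 20) @ np.random.randn(20, 500)
U, s, Vt = randomized_svd(A, k=20)
print(np.linalg.norm(A - (U * s) @ Vt) / np.linalg.norm(A))
```

When the singular values of A decay slowly, the survey recommends replacing the single sketch A @ Omega with a few steps of power iteration on (A A^T); the sketch above omits that refinement for brevity.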
Finding Structure with Randomness: Probabilistic Algorithms for Approximate Matrix Decompositions (Research Project Report, “Sparsity and Compressed Sensing” course)
Matrix factorization is a powerful tool for performing tasks efficiently in numerical linear algebra. A problem arises when we compute low-rank approximations of massive matrices: we have to reduce the…
The PowerURV algorithm for computing rank-revealing full factorizations
This work introduces a new randomized algorithm for producing rank-revealing factorizations, based on existing work by Demmel, Dumitriu, and Holtz, that yields close-to-optimal low-rank approximations of a given matrix.
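As an illustration, here is a hedged NumPy sketch of a power-iteration-based randomized URV factorization in the spirit of PowerURV; the iteration count q, the re-orthonormalization schedule, and the test matrix are assumptions, not a transcription of the published algorithm.

```python
import numpy as np

def power_urv(A, q=2):
    """Randomized URV-style full factorization A = U R V^T with V orthogonal."""
    m, n = A.shape
    G = np.random.randn(n, n)                    # square Gaussian test matrix
    Y = G
    for _ in range(q):                           # power iteration on A^T A,
        Y, _ = np.linalg.qr(A.T @ (A @ Y))       # re-orthonormalized for stability
    V = Y                                        # orthogonal n x n factor
    U, R = np.linalg.qr(A @ V)                   # A V = U R  =>  A = U R V^T
    return U, R, V

A = np.random.randn(300, 200)
U, R, V = power_urv(A)
print(np.allclose(A, U @ R @ V.T))               # exact up to round-off
```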
Compressing Rank-Structured Matrices via Randomized Sampling
  • P. Martinsson
  • Mathematics, Computer Science
  • SIAM J. Sci. Comput.
  • 2016
The proposed scheme also simplifies the implementation of certain operations on rank-structured matrices, such as matrix-matrix multiplication, low-rank updates, and addition.
Random Projections and Dimension Reduction
This work describes how randomization can be used to create more efficient algorithms for low-rank matrix approximation, and introduces a novel randomized algorithm for matrix decomposition.
Randomized Matrix Decompositions using
Matrix decompositions are fundamental tools in the areas of applied mathematics, statistical computing, and machine learning. In particular, low-rank matrix decompositions are vital and widely used…
Randomized Matrix Decompositions using R
This work presents the R package rsvd, and provides a tutorial introduction to randomized matrix decompositions, showing the computational advantage over other methods implemented in R for approximating matrices with low-rank structure.
Randomized methods for computing low-rank approximations of matrices
The dissertation describes a set of randomized techniques for rapidly constructing a low-rank approximation to a matrix and presents a parallelized randomized scheme for computing a reduced-rank singular value decomposition.
Low Rank Approximation and Decomposition of Large Matrices Using Error Correcting Codes
It is shown how matrices derived from error-correcting codes can be used to compute such low-rank approximations and matrix decompositions, and the framework is extended to linear least-squares regression problems.
Randomized Projection for Rank-Revealing Matrix Factorizations and Low-Rank Approximations
The truncated version of RQRCP provides a key initial step in the authors' truncated SVD approximation, TUXV, and opens up a new performance domain for large matrix factorizations that will support efficient problem-solving techniques for challenging applications in science, engineering, and data analysis.
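For intuition, the sketch below illustrates the core idea of randomized column pivoting: choose pivots from a small random compression of A rather than from A itself. The sketch size k + p, the use of SciPy's pivoted QR, and the test problem are illustrative assumptions; the blocked, communication-avoiding structure of the actual RQRCP algorithm is omitted.

```python
import numpy as np
from scipy.linalg import qr

def rqrcp_pivots(A, k, p=10):
    """Choose column pivots from a small random sketch of A instead of A itself."""
    Omega = np.random.randn(k + p, A.shape[0])   # short, fat Gaussian sketch
    B = Omega @ A                                # (k + p) x n compressed matrix
    _, _, piv = qr(B, pivoting=True)             # pivoted QR on the sketch only
    return piv

def truncated_qr(A, k, p=10):
    piv = rqrcp_pivots(A, k, p)
    Q, R = np.linalg.qr(A[:, piv])               # unpivoted QR of the permuted matrix
    return Q[:, :k], R[:k, :], piv               # keep the leading rank-k part

A = np.random.randn(1000, 50) @ np.random.randn(50, 400)   # exactly rank 50
Q, R, piv = truncated_qr(A, k=50)
print(np.linalg.norm(A[:, piv] - Q @ R) / np.linalg.norm(A))
```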
Randomized Techniques for Matrix Decomposition and Estimating the Approximate Rank of a Matrix
Finding low-dimensional matrix approximations of data is an essential task in data analysis and scientific computing. Recently, several randomization schemes have been demonstrated for performing…

References

Showing 1-10 of 203 references.
On the existence and computation of rank-revealing LU factorizations
By exploring properties of Schur complements, this paper presents bounds on the existence of rank-revealing LU factorizations that are comparable with those of rank-revealing QR…
A fast and efficient algorithm for low-rank approximation of a matrix
A fast and efficient algorithm that first pre-processes the matrix A to spread out the information (energy) of every column, then randomly selects some of its columns (or rows) and generates a rank-k approximation from the row space of the selected set.
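A hedged sketch of that two-step idea: mix A with a random orthogonal transform so every row carries comparable energy, then sample rows uniformly and project A onto the row space of the sample. The Hadamard-plus-random-signs mixing, the sample size, and the test problem are illustrative choices and may differ from the transform used in the cited algorithm.

```python
import numpy as np
from scipy.linalg import hadamard

def mixed_row_sample_approx(A, k, oversample=4):
    """Mix A so energy is spread across rows, then approximate A from sampled rows."""
    m, n = A.shape                               # m must be a power of two here
    D = np.random.choice([-1.0, 1.0], size=m)    # random sign flips
    H = hadamard(m) / np.sqrt(m)                 # orthogonal Hadamard transform
    A_mixed = H @ (D[:, None] * A)               # same row space as A, energy spread
    rows = np.random.choice(m, size=oversample * k, replace=False)
    V, _ = np.linalg.qr(A_mixed[rows].T)         # basis for the sampled row space
    return A @ V @ V.T                           # project A onto that row space

true_rank = 20
A = np.random.randn(1024, true_rank) @ np.random.randn(true_rank, 300)
A_k = mixed_row_sample_approx(A, k=true_rank)
print(np.linalg.norm(A - A_k) / np.linalg.norm(A))
```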
Subspace Sampling and Relative-Error Matrix Approximation: Column-Based Methods
Two polynomial-time randomized algorithms that take a matrix A as input and return a matrix C as output sample the columns of A via “subspace sampling,” so named because the sampling probabilities depend on the lengths of the rows of the top singular vectors and ensure that a certain subspace of interest is captured entirely.
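The sampling rule described above is easy to state in code: each column's probability is proportional to its squared row norm in the top-k right singular vectors (its leverage score). The sketch below is illustrative; the sample size c and the rescaling convention are assumptions.

```python
import numpy as np

def subspace_sample_columns(A, k, c):
    """Sample c columns of A with probabilities given by their top-k leverage scores."""
    _, _, Vt = np.linalg.svd(A, full_matrices=False)
    lev = np.sum(Vt[:k] ** 2, axis=0)            # leverage scores; they sum to k
    probs = lev / k
    idx = np.random.choice(A.shape[1], size=c, replace=True, p=probs)
    C = A[:, idx] / np.sqrt(c * probs[idx])      # rescale the sampled columns
    return C, idx

A = np.random.randn(500, 30) @ np.random.randn(30, 200)
C, idx = subspace_sample_columns(A, k=30, c=120)
Q, _ = np.linalg.qr(C)                           # column-based approximation A ~ Q Q^T A
print(np.linalg.norm(A - Q @ (Q.T @ A)) / np.linalg.norm(A))
```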
Error Bounds for Random Matrix Approximation Schemes
Randomized matrix sparsification has proven to be a fruitful technique for producing faster algorithms in applications ranging from graph partitioning to semidefinite programming. In the decade or…
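As a deliberately simple illustration of matrix sparsification, the sketch below keeps each entry independently with probability p and rescales by 1/p, giving an unbiased sparse estimate of A; the uniform p is an assumption, whereas the schemes analyzed in this literature typically adapt the retention probability to the entry magnitudes.

```python
import numpy as np
from scipy.sparse import csr_matrix

def sparsify(A, p=0.1):
    """Keep each entry with probability p and rescale by 1/p (unbiased estimate of A)."""
    mask = np.random.rand(*A.shape) < p
    return csr_matrix(np.where(mask, A / p, 0.0))

A = np.random.randn(400, 400)
S = sparsify(A, p=0.1)
print(S.nnz / A.size)                            # fraction of entries retained, about 0.1
print(np.linalg.norm(S.toarray() - A, 2))        # spectral-norm deviation from A
```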
Relative-Error CUR Matrix Decompositions
These two algorithms are the first polynomial-time algorithms for such low-rank matrix approximations that come with relative-error guarantees; previously, in some cases, it was not even known whether such matrix decompositions exist.
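For orientation, here is a hedged sketch of a CUR-style decomposition: select actual columns C and rows R of A using leverage-score probabilities and set U = C^+ A R^+ so that C U R approximates A. The sampling without replacement, the sample sizes, and the choice of U are illustrative simplifications, not the exact construction that carries the relative-error guarantee.

```python
import numpy as np

def leverage_probs(A, k, axis):
    """Leverage-score probabilities for columns (axis=1) or rows (axis=0) of A."""
    U, _, Vt = np.linalg.svd(A, full_matrices=False)
    B = Vt[:k] if axis == 1 else U[:, :k].T
    scores = np.sum(B ** 2, axis=0)
    return scores / scores.sum()

def cur(A, k, c, r):
    cols = np.random.choice(A.shape[1], size=c, replace=False, p=leverage_probs(A, k, 1))
    rows = np.random.choice(A.shape[0], size=r, replace=False, p=leverage_probs(A, k, 0))
    C, R = A[:, cols], A[rows, :]
    U = np.linalg.pinv(C) @ A @ np.linalg.pinv(R)    # small middle factor
    return C, U, R

A = np.random.randn(400, 25) @ np.random.randn(25, 300)
C, U, R = cur(A, k=25, c=80, r=80)
print(np.linalg.norm(A - C @ U @ R) / np.linalg.norm(A))
```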
The Power of Convex Relaxation: Near-Optimal Matrix Completion
  • E. Candès, T. Tao
  • Mathematics, Computer Science
  • IEEE Transactions on Information Theory
  • 2010
This paper shows that, under certain incoherence assumptions on the singular vectors of the matrix, recovery is possible by solving a convenient convex program as soon as the number of entries is on the order of the information-theoretic limit (up to logarithmic factors).
Randomized algorithms for the low-rank approximation of matrices
Two recently proposed randomized algorithms for the construction of low-rank approximations to matrices are described and shown to be considerably more efficient and reliable than the classical (deterministic) ones; they also parallelize naturally.
Improved Approximation Algorithms for Large Matrices via Random Projections
  • Tamás Sarlós
  • Mathematics, Computer Science
  • 2006 47th Annual IEEE Symposium on Foundations of Computer Science (FOCS'06)
  • 2006
The key idea is that low-dimensional embeddings can be used to eliminate data dependence and provide more versatile, linear-time, pass-efficient matrix computation.
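A minimal sketch of the sketch-and-solve idea for least squares: embed the problem in lower dimension with a random projection S and solve the small problem min ||SAx - Sb||. The dense Gaussian S, its size, and the test problem are illustrative assumptions; fast structured embeddings make the sketching step itself run in near-linear time.

```python
import numpy as np

def sketched_lstsq(A, b, sketch_size):
    """Solve min ||SAx - Sb|| for a random projection S (sketch-and-solve)."""
    S = np.random.randn(sketch_size, A.shape[0]) / np.sqrt(sketch_size)
    x, *_ = np.linalg.lstsq(S @ A, S @ b, rcond=None)
    return x

m, n = 10_000, 50
A = np.random.randn(m, n)
b = A @ np.random.randn(n) + 0.01 * np.random.randn(m)
x_exact, *_ = np.linalg.lstsq(A, b, rcond=None)
x_sketch = sketched_lstsq(A, b, sketch_size=500)
print(np.linalg.norm(x_sketch - x_exact) / np.linalg.norm(x_exact))
```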
Exact Matrix Completion via Convex Optimization
It is proved that one can perfectly recover most low-rank matrices from what appears to be an incomplete set of entries, and that objects other than signals and images can be perfectly reconstructed from very limited information.
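As a rough illustration of nuclear-norm-based completion, the sketch below runs a singular value thresholding (SVT) iteration on the observed entries; SVT is one standard solver for this kind of convex program, and the threshold, step size, and iteration count here are illustrative heuristics rather than anything taken from the paper.

```python
import numpy as np

def svt_complete(M_obs, mask, tau, delta=1.2, iters=1000):
    """Singular value thresholding for nuclear-norm matrix completion.

    M_obs holds the observed entries (zeros elsewhere); mask marks which are observed.
    """
    Y = np.zeros_like(M_obs)
    X = np.zeros_like(M_obs)
    for _ in range(iters):
        U, s, Vt = np.linalg.svd(Y, full_matrices=False)
        X = (U * np.maximum(s - tau, 0.0)) @ Vt   # shrink the singular values of Y
        Y += delta * mask * (M_obs - X)           # gradient step on the observed residual
    return X

# Recover a 100 x 100 rank-3 matrix from roughly 40% of its entries.
n = 100
M = np.random.randn(n, 3) @ np.random.randn(3, n)
mask = np.random.rand(n, n) < 0.4
X = svt_complete(mask * M, mask, tau=5 * n)
print(np.linalg.norm(X - M) / np.linalg.norm(M))
```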
Numerical linear algebra in the streaming model
Near-optimal space bounds are given in the streaming model for linear algebra problems that include estimation of matrix products, linear regression, low-rank approximation, and approximation of matrix rank; results for turnstile updates are proved.
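One reason linear-algebraic sketches fit the turnstile streaming model is linearity: a sketch S A can be maintained under arbitrary additive updates to single entries of A. The toy example below demonstrates only that property; the dense Gaussian S and the update loop are illustrative, whereas streaming algorithms use structured sketches with small space and fast per-update cost.

```python
import numpy as np

m, n, k = 500, 300, 40
S = np.random.randn(k, m)                        # fixed random sketching matrix

A = np.zeros((m, n))                             # kept only to verify the sketch
SA = np.zeros((k, n))                            # the maintained sketch of A

# Turnstile stream: additive updates (i, j, delta) arriving one at a time.
for _ in range(10_000):
    i, j = np.random.randint(m), np.random.randint(n)
    delta = np.random.randn()
    A[i, j] += delta
    SA[:, j] += delta * S[:, i]                  # O(k) work per update, by linearity

print(np.allclose(SA, S @ A))                    # the sketch matches the batch product
```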