Sketching as a Tool for Numerical Linear Algebra

@article{Woodruff2014SketchingAA,
  title={Sketching as a Tool for Numerical Linear Algebra},
  author={David P. Woodruff},
  journal={Found. Trends Theor. Comput. Sci.},
  year={2014},
  volume={10},
  pages={1-157}
}
This survey highlights the recent advances in algorithms for numerical linear algebra that have come from the technique of linear sketching, whereby given a matrix, one first compresses it to a much smaller matrix by multiplying it by a (usually) random matrix with certain properties. Much of the expensive computation can then be performed on the smaller matrix, thereby accelerating the solution for the original problem. In this survey we consider least squares as well as robust regression problems…
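To make the sketch-and-solve paradigm concrete, here is a minimal NumPy illustration for least squares. A dense Gaussian sketching matrix is used purely for clarity; applying it costs more than it saves at this size, and the survey's point is that structured sketches (subsampled randomized Hadamard transforms, CountSketch) can be applied much faster. Dimensions and seeds below are illustrative choices, not values from the survey.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, m = 10_000, 20, 200          # tall problem; m grows like d/eps^2 in the theory

A = rng.standard_normal((n, d))
b = A @ rng.standard_normal(d) + 0.01 * rng.standard_normal(n)

# Sketch-and-solve: compress the n x d problem to m x d, then solve exactly.
S = rng.standard_normal((m, n)) / np.sqrt(m)   # Gaussian sketching matrix
x_sketch, *_ = np.linalg.lstsq(S @ A, S @ b, rcond=None)
x_exact, *_ = np.linalg.lstsq(A, b, rcond=None)

# Relative suboptimality of the sketched solution (values near 1 are good).
print(np.linalg.norm(A @ x_sketch - b) / np.linalg.norm(A @ x_exact - b))
```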
Citations

An Empirical Evaluation of Sketching for Numerical Linear Algebra
TLDR
This work investigates least squares regression, iteratively reweighted least squares, logistic regression, robust regression with Huber and Bisquare loss functions, leverage score computation, Frobenius norm low rank approximation, and entrywise $\ell_1$-low rank approximation.
Practical Sketching Algorithms for Low-Rank Matrix Approximation
TLDR
A suite of algorithms for constructing low-rank approximations of an input matrix from a random linear image, or sketch, of the matrix; the algorithms can preserve structural properties of the input matrix, such as positive-semidefiniteness, and can produce approximations with a user-specified rank.
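The flavor of this family of methods can be shown with a simplified two-sided sketch reconstruction in NumPy: the approximation is built from two random linear images of A, so A itself only needs to be touched through matrix products (it could be streamed). This is a sketch in the spirit of these algorithms, not the paper's exact method; the sketch sizes below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
n, r = 1000, 10                      # target rank r
k, s = 2 * r, 4 * r                  # oversampled sketch sizes (illustrative)

U = rng.standard_normal((n, r))
A = U @ U.T + 0.01 * rng.standard_normal((n, n))   # approximately rank r

Omega = rng.standard_normal((n, k))  # range sketch
Psi = rng.standard_normal((s, n))    # co-range sketch
Y, W = A @ Omega, Psi @ A            # the only access to A

Q, _ = np.linalg.qr(Y)               # orthonormal basis for the sketched range
X, *_ = np.linalg.lstsq(Psi @ Q, W, rcond=None)
A_hat = Q @ X                        # rank-k approximation assembled from sketches

print(np.linalg.norm(A - A_hat) / np.linalg.norm(A))
```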
A Very Sketchy Talk
We give an overview of dimensionality reduction methods, or sketching, for a number of problems in optimization, first surveying work using these methods for classical problems, which gives near…
Deterministic matrix sketches for low-rank compression of high-dimensional simulation data
TLDR
Deterministic matrix sketches which generate coarse representations – compatible with the corresponding PDE solve – are considered in the computation of the singular value decomposition and matrix interpolative decomposition.
Tighter bound of Sketched Generalized Matrix Approximation
TLDR
This paper develops new sketching techniques that reduce the size of the original data matrix, yielding new matrix approximation algorithms, and derives a much tighter approximation bound than previous works.
Iterative Hessian Sketch in Input Sparsity Time
TLDR
This paper adopts "Iterative Hessian Sketching" (IHS) and shows that the fast CountSketch and sparse Johnson-Lindenstrauss Transforms yield state-of-the-art accuracy guarantees under IHS, while drastically improving the time cost.
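A minimal rendition of the two ingredients, under simplifying assumptions: a CountSketch matrix (one random ±1 per column, so applying it costs O(nnz(A))) and a basic IHS-style refinement loop that re-sketches the data each iteration. Sketch sizes and the iteration count below are illustrative, not the paper's tuned choices.

```python
import numpy as np
from scipy.sparse import csr_matrix

def countsketch(m, n, rng):
    """CountSketch: one random +/-1 entry per column, so S @ A costs
    O(nnz(A)) -- the 'input sparsity time' of the title."""
    rows = rng.integers(0, m, size=n)        # hash each column to one row
    signs = rng.choice([-1.0, 1.0], size=n)  # random sign per column
    return csr_matrix((signs, (rows, np.arange(n))), shape=(m, n))

rng = np.random.default_rng(2)
n, d, m = 50_000, 20, 1_000
A = rng.standard_normal((n, d))
b = A @ rng.standard_normal(d) + 0.01 * rng.standard_normal(n)

# IHS-style refinement: take Newton-type steps using a freshly sketched
# Hessian (SA)^T (SA) together with the exact gradient A^T (b - A x).
x = np.zeros(d)
for _ in range(5):
    SA = countsketch(m, n, rng) @ A
    g = A.T @ (b - A @ x)                   # exact gradient term
    x = x + np.linalg.solve(SA.T @ SA, g)   # sketched Hessian solve
print(np.linalg.norm(A @ x - b))
```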
Sketching for Geometric Problems
In this invited talk at the European Symposium on Algorithms (ESA), 2017, I will discuss a tool called sketching, which is a form of data dimensionality reduction, and its applications to several…
Spectral estimation from simulations via sketching
TLDR
This work shows that sketching can be used to compress simulation data and still accurately estimate time autocorrelation and power spectral density; for a given compression ratio, the accuracy is much higher than with previously known methods.
Near-Optimal Algorithms for Linear Algebra in the Current Matrix Multiplication Time
TLDR
This work proposes a new transform and shows that the claimed results from the previous version can be obtained using the new transform, and gives the first optimal algorithms, for the current matrix multiplication exponent, for constant factor regression and low rank approximation.
Sparse Graph Based Sketching for Fast Numerical Linear Algebra
TLDR
This paper explores two popular classes of sparse graphs, namely, expander graphs and magical graphs, and discusses the construction of sparse sketching matrices with reduced randomness using expanders based on error-correcting codes.

References (showing 1-10 of 154)
A Fast Random Sampling Algorithm for Sparsifying Matrices
TLDR
A simple random-sampling-based procedure for producing sparse matrix approximations that computes the approximation in a single pass over the data, leading to large savings in space.
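The general idea of such entry-sampling sparsifiers can be sketched as follows. The distribution below (keep entries with probability proportional to squared magnitude, rescale the survivors so the result is unbiased) is an illustrative choice and not necessarily the paper's exact scheme.

```python
import numpy as np

def sparsify(A, budget, rng):
    """Keep roughly `budget` entries of A, sampled with probability
    proportional to a_ij^2, rescaled so E[sparsified A] = A.
    Illustrative only; the cited paper's scheme differs in details."""
    p = A**2 / np.sum(A**2)              # importance-sampling weights
    p = np.minimum(1.0, budget * p)      # per-entry keep probability
    keep = rng.random(A.shape) < p
    B = np.zeros_like(A)
    B[keep] = A[keep] / p[keep]          # rescale kept entries (unbiasedness)
    return B

rng = np.random.default_rng(3)
A = rng.standard_normal((500, 500))
B = sparsify(A, budget=20_000, rng=rng)
print(np.count_nonzero(B), np.linalg.norm(A - B, 2) / np.linalg.norm(A, 2))
```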
Iterative Row Sampling
TLDR
This work shows that alternating between computing a short matrix estimate and finding more accurate approximate leverage scores leads to a series of geometrically smaller instances, giving an algorithm whose runtime is input-sparsity time plus an overhead comparable to the cost of solving a regression problem on the smaller approximation.
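For orientation, here is what plain leverage-score row sampling looks like when the scores are computed exactly from a thin QR; the cited work's contribution is precisely to avoid this expensive exact computation by iterating on approximations. Dimensions below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(4)
n, d, m = 20_000, 15, 400

A = rng.standard_normal((n, d))

# Leverage score of row i = squared norm of row i of an orthonormal
# basis Q for A's column space; the scores sum to d.
Q, _ = np.linalg.qr(A)                      # thin QR: Q is n x d
lev = np.sum(Q**2, axis=1)
p = lev / lev.sum()

idx = rng.choice(n, size=m, p=p)            # sample rows by leverage
SA = A[idx] / np.sqrt(m * p[idx, None])     # rescale for unbiasedness

# The sampled matrix preserves the quadratic form: (SA)^T (SA) ~ A^T A.
print(np.linalg.norm(A.T @ A - SA.T @ SA, 2) / np.linalg.norm(A.T @ A, 2))
```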
Numerical linear algebra in the streaming model
TLDR
Near-optimal space bounds are given in the streaming model for linear algebra problems that include estimation of matrix products, linear regression, low-rank approximation, and approximation of matrix rank; results for turnstile updates are proved.
Sketching Structured Matrices for Faster Nonlinear Regression
TLDR
This work considers a class of structured regression problems involving Vandermonde matrices, which arise naturally in various statistical modeling settings, and shows that this structure can be exploited to further accelerate the solution of the regression problem.
On Sketching Matrix Norms and the Top Singular Vector
TLDR
This paper studies the problem of sketching matrix norms, gives separations between the sketching complexity of Schatten-$p$ norms and the corresponding vector $p$-norms, and rules out a table-lookup nearest-neighbor search for $p = 1$, making progress on a question of Andoni.
Mosaic-Skeleton approximations
If a matrix has a small rank then it can be multiplied by a vector with many savings in memory and arithmetic. As was recently shown by the author, the same applies to the matrices which might be of…
A sparse Johnson–Lindenstrauss transform
TLDR
A sparse version of the Johnson–Lindenstrauss transform, the fundamental tool in dimensionality reduction, is obtained using hashing and local densification to construct a sparse projection matrix with just $\tilde{O}(1/\varepsilon)$ non-zero entries per column; a matching lower bound on the sparsity for a large class of projection matrices is also shown.
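One simple variant of a sparse JL matrix places exactly s random ±1/√s entries in distinct rows of each column, so applying it to a vector costs O(s · n). The construction below is illustrative and does not reproduce the paper's specific hashing-and-local-densification scheme; m and s are illustrative parameters.

```python
import numpy as np
from scipy.sparse import csc_matrix

def sparse_jl(m, n, s, rng):
    """Sparse JL matrix: s nonzeros per column, each +/-1/sqrt(s),
    placed in s distinct rows. A simple variant for illustration."""
    rows = np.concatenate(
        [rng.choice(m, size=s, replace=False) for _ in range(n)])
    cols = np.repeat(np.arange(n), s)
    vals = rng.choice([-1.0, 1.0], size=s * n) / np.sqrt(s)
    return csc_matrix((vals, (rows, cols)), shape=(m, n))

rng = np.random.default_rng(5)
S = sparse_jl(m=256, n=10_000, s=8, rng=rng)
x = rng.standard_normal(10_000)
print(np.linalg.norm(S @ x) / np.linalg.norm(x))   # should be close to 1
```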
Subspace Sampling and Relative-Error Matrix Approximation: Column-Row-Based Methods
TLDR
This work presents a randomized polynomial-time algorithm, an extension of the recent relative-error approximation algorithm for $\ell_2$ regression from overconstrained problems to general $\ell_2$ regression problems, which samples the columns and rows of A via the method of "subspace sampling," so named since the sampling probabilities depend on the lengths of the rows of the top singular vectors and ensure that a certain subspace of interest is captured entirely.
How robust are linear sketches to adaptive inputs?
TLDR
It is shown that no linear sketch approximates the Euclidean norm of its input to within an arbitrary multiplicative approximation factor on a polynomial number of adaptively chosen inputs.
Finding Structure with Randomness: Probabilistic Algorithms for Constructing Approximate Matrix Decompositions
TLDR
This work surveys and extends recent research which demonstrates that randomization offers a powerful tool for performing low-rank matrix approximation, and presents a modular framework for constructing randomized algorithms that compute partial matrix decompositions.
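The prototype algorithm of this framework, a randomized range finder followed by a small exact SVD, is short enough to state directly. The version below is a minimal rendition with a Gaussian test matrix and fixed oversampling; power iterations and other refinements from the framework are omitted.

```python
import numpy as np

rng = np.random.default_rng(6)
m, n, r, p = 2000, 1000, 10, 5       # target rank r, oversampling p

A = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))  # rank r

Omega = rng.standard_normal((n, r + p))
Q, _ = np.linalg.qr(A @ Omega)       # orthonormal basis for the sampled range
B = Q.T @ A                          # small (r+p) x n projected matrix
Uh, s, Vt = np.linalg.svd(B, full_matrices=False)
U = Q @ Uh                           # lift back: A ~ U diag(s) Vt

print(np.linalg.norm(A - U @ np.diag(s) @ Vt) / np.linalg.norm(A))
```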