Jackknife Variability Estimation For Randomized Matrix Computations

E.N. Epperly and Joel A. Tropp

Randomized algorithms based on sketching have become a workhorse tool in low-rank matrix approximation. To use these algorithms safely in applications, they should be coupled with diagnostics to assess the quality of approximation. To meet this need, this paper proposes a jackknife resampling method to estimate the variability of the output of a randomized matrix computation. The variability estimate can recognize that a computation requires additional data or that the computation is…
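The jackknife idea described in the abstract can be illustrated with a minimal sketch. This is not the paper's exact estimator: the randomized rangefinder, the synthetic test matrix, and all variable names below are assumptions chosen for illustration. The core move is to delete one column of the random sketch at a time, recompute the approximation, and report the spread of the replicates as a variability estimate.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic low-rank-plus-noise test matrix (hypothetical example data).
n, s = 100, 20
A = rng.standard_normal((n, 5)) @ rng.standard_normal((5, n))
A += 1e-3 * rng.standard_normal((n, n))

def rangefinder(A, Omega):
    """Randomized rangefinder: orthonormal basis Q for range(A @ Omega),
    returning the approximation Q @ Q.T @ A."""
    Q, _ = np.linalg.qr(A @ Omega)
    return Q @ (Q.T @ A)

Omega = rng.standard_normal((n, s))
A_hat = rangefinder(A, Omega)

# Leave-one-out replicates: drop one sketch column at a time and recompute.
replicates = np.stack(
    [rangefinder(A, np.delete(Omega, i, axis=1)) for i in range(s)]
)
mean_rep = replicates.mean(axis=0)

# Jackknife variability estimate: Frobenius-norm spread of the replicates.
jack = np.sqrt(((replicates - mean_rep) ** 2).sum())

true_err = np.linalg.norm(A - A_hat)
print(f"jackknife variability: {jack:.3e}, true error: {true_err:.3e}")
```

A large variability estimate signals that the sketch dimension `s` should be increased; a small one suggests the randomness in the sketch is not the dominant source of error.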

A Bootstrap Method for Error Estimation in Randomized Matrix Multiplication

This paper develops a bootstrap method for directly estimating the accuracy of randomized matrix multiplication as a function of the reduced dimension, and provides both theoretical and empirical results demonstrating the effectiveness of the proposed method.
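The bootstrap approach to error estimation for sketched matrix multiplication can be sketched as follows. This is a minimal illustration under assumptions, not the cited paper's method: the uniform row-sampling sketch, the synthetic data, and the 90% quantile choice are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical data: approximate A.T @ B by uniformly sampling t of n rows.
n, d, t = 2000, 10, 200
A = rng.standard_normal((n, d))
B = rng.standard_normal((n, d))

idx = rng.integers(0, n, size=t)
# Sketched product: rescaled sum over the sampled rows.
prod_hat = (n / t) * A[idx].T @ B[idx]

# Bootstrap: resample the t sampled indices with replacement and record
# the spread of the recomputed products around the original estimate.
boot_errs = []
for _ in range(200):
    bidx = rng.choice(idx, size=t, replace=True)
    boot = (n / t) * A[bidx].T @ B[bidx]
    boot_errs.append(np.linalg.norm(boot - prod_hat))
err_est = np.quantile(boot_errs, 0.9)

true_err = np.linalg.norm(prod_hat - A.T @ B)
print(f"bootstrap 90% error estimate: {err_est:.2f}, true error: {true_err:.2f}")
```

The key point is that the bootstrap replicates reuse only the already-sampled rows, so the error estimate costs a small multiple of the sketched product itself rather than a pass over the full data.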

Revisiting the Nyström Method for Improved Large-scale Machine Learning

This work presents an empirical evaluation of the approximation quality and running time of sampling and projection methods on a diverse suite of SPSD matrices, complemented by worst-case theoretical bounds for both random sampling and random projection methods.

Error Estimation for Sketched SVD via the Bootstrap

This paper develops a fully data-driven bootstrap method that numerically estimates the actual error of sketched singular vectors/values, and allows the user to inspect the quality of a rough initial sketched SVD, and then adaptively predict how much extra work is needed to reach a given error tolerance.

Streaming Low-Rank Matrix Approximation with an Application to Scientific Simulation

It is argued that randomized linear sketching is a natural tool for on-the-fly compression of data matrices arising from large-scale scientific simulations and data collection, and that the proposed method is less sensitive to parameter choices than previous techniques.

On the Nyström Method for Approximating a Gram Matrix for Improved Kernel-Based Learning

This work presents an algorithm to compute an easily interpretable low-rank approximation to an n × n Gram matrix G, so that computations of interest may be performed more rapidly.

Finding Structure with Randomness: Probabilistic Algorithms for Constructing Approximate Matrix Decompositions

This work surveys and extends recent research which demonstrates that randomization offers a powerful tool for performing low-rank matrix approximation, and presents a modular framework for constructing randomized algorithms that compute partial matrix decompositions.

Randomized numerical linear algebra: Foundations and algorithms

This survey describes probabilistic algorithms for linear algebraic computations, such as factorizing matrices and solving linear systems, that have a proven track record for real-world problems and treats both the theoretical foundations of the subject and practical computational issues.

Fixed-Rank Approximation of a Positive-Semidefinite Matrix from Streaming Data

This paper presents a new algorithm for fixed-rank psd matrix approximation from a sketch, combining the Nyström approximation with a novel mechanism for rank truncation that exploits the spectral decay of the input matrix.
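The Nyström-with-truncation idea can be sketched in a few lines. This is a minimal illustration under assumptions, not the cited paper's algorithm: the test matrix, sketch dimension, and target rank are hypothetical, and the numerical stabilization (a small shift before factorization) used in production implementations is omitted.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical psd test matrix with rapidly decaying spectrum.
n, s, r = 200, 30, 10
U, _ = np.linalg.qr(rng.standard_normal((n, n)))
eigs = 0.8 ** np.arange(n) + 1e-6           # strictly positive eigenvalues
G = (U * eigs) @ U.T                        # G = U diag(eigs) U.T

# Sketch-based Nystrom approximation: G ~= Y (Omega.T Y)^{-1} Y.T,
# where Y = G @ Omega is the sketch.
Omega = rng.standard_normal((n, s))
Y = G @ Omega
C = np.linalg.cholesky(Omega.T @ Y)         # lower-triangular core factor
B = np.linalg.solve(C, Y.T).T               # B = Y C^{-T}, so B @ B.T is the Nystrom approx

# Truncate to rank r via an SVD of the thin factor B.
Ub, sig, _ = np.linalg.svd(B, full_matrices=False)
G_r = (Ub[:, :r] * sig[:r] ** 2) @ Ub[:, :r].T

rel_err = np.linalg.norm(G - G_r) / np.linalg.norm(G)
print(f"relative Frobenius error of rank-{r} approximation: {rel_err:.3f}")
```

Truncating the thin factor `B` rather than the full n × n approximation keeps the cost dominated by the sketch size `s`, which is why spectral decay in the input makes a small fixed rank sufficient.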

RandNLA: randomized numerical linear algebra

RandNLA is an interdisciplinary research area that exploits randomization as a computational resource to develop improved algorithms for large-scale linear algebra problems, promising a sound algorithmic and statistical foundation for modern large-scale data analysis.

Sketching as a Tool for Numerical Linear Algebra

This survey highlights the recent advances in algorithms for numerical linear algebra that have come from the technique of linear sketching, and considers least squares as well as robust regression problems, low-rank approximation, and graph sparsification.