Randomized low-rank approximation of monotone matrix functions
@article{Persson2022RandomizedLA,
  title   = {Randomized low-rank approximation of monotone matrix functions},
  author  = {David Persson and Daniel Kressner},
  journal = {ArXiv},
  year    = {2022},
  volume  = {abs/2209.11023}
}
This work is concerned with computing low-rank approximations of a matrix function f(A) for a large symmetric positive semi-definite matrix A, a task that arises in, e.g., statistical learning and inverse problems. The application of popular randomized methods, such as the randomized singular value decomposition or the Nyström approximation, to f(A) requires multiplying f(A) with a few random vectors. A significant disadvantage of such an approach, matrix-vector products with f(A…
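To make the Nyström approximation mentioned in the abstract concrete, here is a minimal NumPy sketch of the plain Nyström method for a symmetric PSD matrix A (applied to A itself rather than to f(A); the function name `nystrom` and its arguments are chosen here for illustration, not taken from the paper):

```python
import numpy as np

def nystrom(A, k, rng=None):
    """Rank-k Nystrom approximation of a symmetric PSD matrix A."""
    rng = np.random.default_rng(rng)
    Omega = rng.standard_normal((A.shape[0], k))   # Gaussian test matrix
    Y = A @ Omega                                  # k matvecs with A
    core = Omega.T @ Y                             # small k x k core matrix
    # A  ~=  Y core^+ Y^T; a truncated pseudo-inverse guards against
    # a numerically rank-deficient core
    return Y @ np.linalg.pinv(core, rcond=1e-10) @ Y.T

# sanity check: an exactly rank-5 PSD matrix is recovered with sketch size 10
rng = np.random.default_rng(0)
B = rng.standard_normal((50, 5))
A = B @ B.T
err = np.linalg.norm(A - nystrom(A, 10, rng=1)) / np.linalg.norm(A)
```

Note that the only access to A is through the product `A @ Omega`, which is exactly why applying the method to f(A) requires matrix-vector products with f(A).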
References
Showing 1–10 of 49 references
Fast and stable randomized low-rank matrix approximation
- Computer Science, ArXiv
- 2020
This work studies a generalization of the Nyström method applicable to general matrices, and shows that it has near-optimal approximation quality comparable to competing methods and can significantly outperform state-of-the-art methods.
Finding Structure with Randomness: Probabilistic Algorithms for Constructing Approximate Matrix Decompositions
- Computer Science, SIAM Rev.
- 2011
This work surveys and extends recent research which demonstrates that randomization offers a powerful tool for performing low-rank matrix approximation, and presents a modular framework for constructing randomized algorithms that compute partial matrix decompositions.
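The randomized-SVD framework surveyed in this reference can be sketched in a few lines of NumPy (an illustrative sketch under common default choices — Gaussian test matrix, oversampling, power iterations — not the survey's reference code; the name `randomized_svd` and its parameters are chosen here):

```python
import numpy as np

def randomized_svd(A, k, p=10, q=1, rng=None):
    """Rank-k randomized SVD with oversampling p and q power iterations."""
    rng = np.random.default_rng(rng)
    Omega = rng.standard_normal((A.shape[1], k + p))
    Q, _ = np.linalg.qr(A @ Omega)          # orthonormal basis for range(A)
    for _ in range(q):                      # power iterations sharpen the basis
        Q, _ = np.linalg.qr(A.T @ Q)
        Q, _ = np.linalg.qr(A @ Q)
    # SVD of the small projected matrix Q^T A, then lift back
    U, s, Vt = np.linalg.svd(Q.T @ A, full_matrices=False)
    return (Q @ U)[:, :k], s[:k], Vt[:k]

# demo: an exactly rank-8 matrix is recovered essentially to machine precision
rng = np.random.default_rng(0)
M = rng.standard_normal((200, 8)) @ rng.standard_normal((8, 100))
U, s, Vt = randomized_svd(M, 8)
err = np.linalg.norm(M - (U * s) @ Vt) / np.linalg.norm(M)
```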
On Randomized Trace Estimates for Indefinite Matrices with an Application to Determinants
- Computer Science, Mathematics, Found. Comput. Math.
- 2022
New tail bounds for randomized trace estimates applied to indefinite B with Rademacher or Gaussian random vectors are derived, which significantly improve existing results for indefinite B, reducing the number of required samples by a factor n or even more.
Hutch++: Optimal Stochastic Trace Estimation
- Computer Science, SOSA
- 2021
A new randomized algorithm, Hutch++, is introduced, which computes a (1 ± ε) approximation to tr(A) for any positive semidefinite (PSD) A using just O(1/ε) matrix-vector products, which improves on the ubiquitous Hutchinson's estimator.
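The Hutch++ recipe described in the snippet — compute the trace exactly on a sketched dominant subspace, then run Hutchinson's estimator on the deflated remainder — admits a short NumPy sketch (illustrative, not the authors' reference implementation; `hutchpp` and its arguments are names chosen here):

```python
import numpy as np

def hutchpp(matvec, n, m, rng=None):
    """Hutch++ estimate of tr(A) using roughly m matvecs with A,
    where A is available only implicitly through `matvec`."""
    rng = np.random.default_rng(rng)
    k = m // 3
    S = rng.choice([-1.0, 1.0], size=(n, k))       # Rademacher sketch
    G = rng.choice([-1.0, 1.0], size=(n, k))
    Q, _ = np.linalg.qr(matvec(S))                 # basis for the dominant range
    t_top = np.trace(Q.T @ matvec(Q))              # exact trace on that subspace
    G_perp = G - Q @ (Q.T @ G)                     # deflate: (I - QQ^T) G
    t_rest = np.trace(G_perp.T @ matvec(G_perp)) / k   # Hutchinson on the rest
    return t_top + t_rest

# when rank(A) is below the sketch size, the deflation leaves nothing behind
# and the estimate is exact up to roundoff
rng = np.random.default_rng(0)
B = rng.standard_normal((100, 5))
A = B @ B.T                                        # rank-5 PSD matrix
est = hutchpp(lambda X: A @ X, 100, 30, rng=1)
err = abs(est - np.trace(A)) / np.trace(A)
```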
Fixed-Rank Approximation of a Positive-Semidefinite Matrix from Streaming Data
- Computer Science, NIPS
- 2017
A new algorithm for fixed-rank PSD matrix approximation from a sketch that combines the Nyström approximation with a novel mechanism for rank truncation and exploits the spectral decay of the input matrix.
Krylov-aware stochastic trace estimation
- Computer Science, ArXiv
- 2022
This work introduces an algorithm for estimating the trace of a matrix function f(A) using implicit products with a symmetric matrix A, and describes a Krylov subspace method for computing a low-rank approximation of a matrix function by a computationally efficient projection onto Krylov subspaces.
Monte Carlo Methods for Estimating the Diagonal of a Real Symmetric Matrix
- Computer Science, ArXiv
- 2022
The novel use of matrix concentration inequalities in the authors' proofs represents a systematic model for future analyses and implies that the accuracy of the estimators increases with the diagonal dominance of the matrix.
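The basic Monte Carlo diagonal estimator this reference analyzes can be illustrated with a short sketch (a generic Bekas-style estimator with Rademacher probes, assumed here as one representative instance; `diag_estimate` is a name chosen for illustration): with probes v of ±1 entries, the elementwise product v ∘ (Av) has expectation diag(A).

```python
import numpy as np

def diag_estimate(matvec, n, num_probes, rng=None):
    """Monte Carlo estimate of diag(A) from matvecs with A:
    with Rademacher probes v, E[v * (A v)] = diag(A)."""
    rng = np.random.default_rng(rng)
    d = np.zeros(n)
    for _ in range(num_probes):
        v = rng.choice([-1.0, 1.0], size=n)
        d += v * matvec(v)                 # elementwise product
    return d / num_probes

# for a diagonal matrix every probe is already exact:
# v * (D v) = diag(D) * v**2 = diag(D), since v**2 = 1
D = np.diag(np.arange(1.0, 11.0))
est = diag_estimate(lambda v: D @ v, 10, 3, rng=0)
err = np.max(np.abs(est - np.arange(1.0, 11.0)))
```

For non-diagonal A the error is driven by the off-diagonal mass hit by the probes, which is the sense in which accuracy increases with diagonal dominance.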
Fast Estimation of tr(f(A)) via Stochastic Lanczos Quadrature
- Computer Science, Mathematics, SIAM J. Matrix Anal. Appl.
- 2017
An inexpensive method to estimate the trace of f(A) for cases where f is analytic inside a closed interval and A is a symmetric positive definite matrix, which combines three key ingredients, namely, the stochastic trace estimator, Gaussian quadrature, and the Lanczos algorithm.
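The three ingredients named in the snippet — stochastic trace estimation, Gaussian quadrature, and the Lanczos algorithm — combine as follows: each Rademacher probe v yields a tridiagonal Lanczos matrix T whose eigenpairs define a Gauss quadrature rule for vᵀf(A)v. A minimal NumPy sketch (illustrative, with full reorthogonalization for simplicity; `slq_trace` and its parameters are names chosen here):

```python
import numpy as np

def lanczos(A, v, m):
    """m steps of Lanczos with full reorthogonalization; returns the
    tridiagonal coefficients (alpha, beta)."""
    n = len(v)
    Q = np.zeros((n, m))
    alpha, beta = np.zeros(m), np.zeros(m - 1)
    q, q_prev, b = v / np.linalg.norm(v), np.zeros(n), 0.0
    for j in range(m):
        Q[:, j] = q
        w = A @ q - b * q_prev
        alpha[j] = q @ w
        w -= alpha[j] * q
        w -= Q[:, :j + 1] @ (Q[:, :j + 1].T @ w)   # reorthogonalize
        if j < m - 1:
            b = np.linalg.norm(w)
            beta[j] = b
            q_prev, q = q, w / b
    return alpha, beta

def slq_trace(A, f, m, num_probes, rng=None):
    """Stochastic Lanczos quadrature estimate of tr(f(A)), A symmetric."""
    rng = np.random.default_rng(rng)
    n = A.shape[0]
    total = 0.0
    for _ in range(num_probes):
        v = rng.choice([-1.0, 1.0], size=n)        # Rademacher probe
        alpha, beta = lanczos(A, v, m)
        T = np.diag(alpha) + np.diag(beta, 1) + np.diag(beta, -1)
        theta, U = np.linalg.eigh(T)               # Ritz values and weights
        # Gauss quadrature: v^T f(A) v ~= ||v||^2 * sum_i U[0,i]^2 f(theta_i)
        total += n * np.sum(U[0, :] ** 2 * f(theta))
    return total / num_probes

# demo: tr(exp(A)) on a small symmetric matrix
rng = np.random.default_rng(0)
C = rng.standard_normal((30, 30))
A = (C + C.T) / 10
exact = np.sum(np.exp(np.linalg.eigvalsh(A)))
est = slq_trace(A, np.exp, m=30, num_probes=50, rng=1)
err = abs(est - exact) / exact
```

With m equal to the matrix dimension each per-probe quadrature is exact, so the remaining error in the demo is purely the Monte Carlo variance of the trace estimator.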
Fast Randomized Kernel Ridge Regression with Statistical Guarantees
- Computer Science, NIPS
- 2015
A version of this approach that comes with running time guarantees as well as improved guarantees on its statistical performance is described, and a fast algorithm is presented to quickly compute coarse approximations to these scores in time linear in the number of samples.