Fixed-Rank Approximation of a Positive-Semidefinite Matrix from Streaming Data
@article{Tropp2017FixedRankAO,
  title   = {Fixed-Rank Approximation of a Positive-Semidefinite Matrix from Streaming Data},
  author  = {Joel A. Tropp and Alp Yurtsever and Madeleine Udell and Volkan Cevher},
  journal = {ArXiv},
  year    = {2017},
  volume  = {abs/1706.05736}
}
Several important applications, such as streaming PCA and semidefinite programming, involve a large-scale positive-semidefinite (psd) matrix that is presented as a sequence of linear updates. Because of storage limitations, it may only be possible to retain a sketch of the psd matrix. This paper develops a new algorithm for fixed-rank psd approximation from a sketch. The approach combines the Nyström approximation with a novel mechanism for rank truncation. Theoretical analysis establishes that…
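The reconstruction outlined in the abstract (a Nyström approximation of the sketched matrix followed by rank truncation) can be illustrated in a few lines of NumPy. The snippet below follows the shifted-Nyström recipe from the paper at a high level, but the variable names, the choice of stabilizing shift, and the dense linear-algebra calls are illustrative, not the authors' reference implementation.

```python
import numpy as np

def fixed_rank_psd_approx(Y, Omega, r):
    """Recover a rank-r psd approximation of A from the sketch Y = A @ Omega.

    Illustrative shifted-Nystrom reconstruction: a small shift nu keeps the
    core matrix positive definite in floating point and is subtracted from
    the recovered eigenvalues at the end.
    """
    n = Y.shape[0]
    nu = np.sqrt(n) * np.finfo(Y.dtype).eps * np.linalg.norm(Y, 2)
    Y_nu = Y + nu * Omega                   # shifted sketch of A + nu*I
    B = Omega.T @ Y_nu
    B = (B + B.T) / 2                       # symmetrize before factoring
    C = np.linalg.cholesky(B)               # B = C @ C.T, C lower triangular
    E = np.linalg.solve(C, Y_nu.T).T        # E @ E.T = Y_nu @ inv(B) @ Y_nu.T
    U, s, _ = np.linalg.svd(E, full_matrices=False)
    lam = np.maximum(s[:r] ** 2 - nu, 0.0)  # remove the shift, clip at zero
    return U[:, :r], lam                    # A_hat = U @ diag(lam) @ U.T

# Hypothetical usage: sketch a psd matrix once, then recover a rank-5 approximation.
rng = np.random.default_rng(0)
n, k, r = 500, 20, 5
G = rng.standard_normal((n, n))
A = (G @ G.T) / n                           # psd test matrix
Omega = rng.standard_normal((n, k))         # random test matrix
Y = A @ Omega                               # the only quantity that must be stored
U, lam = fixed_rank_psd_approx(Y, Omega, r)
```

Only Y and Omega are retained; A itself is never stored, which is the point of the streaming setting.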
52 Citations
Randomized low-rank approximation for symmetric indefinite matrices
- Mathematics, Computer Science · ArXiv
- 2022
This work identifies the main challenges in developing a Nyström approximation to symmetric indefinite matrices, and establishes relative-error nuclear norm bounds of the resulting approximation that hold when the singular values decay rapidly.
Fast and stable randomized low-rank matrix approximation
- Computer Science · ArXiv
- 2020
This work studies a generalization of the Nyström method that is applicable to general matrices, and shows that it has near-optimal approximation quality comparable to competing methods and can significantly outperform state-of-the-art methods.
Randomized low-rank approximation of monotone matrix functions
- Computer Science, Mathematics · ArXiv
- 2022
This work presents and analyzes funNyström, a simple and inexpensive method that constructs a low-rank approximation of f(A) directly from a Nyström approximation of A, completely bypassing the need for matrix-vector products with f(A).
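As a rough illustration of that idea (assuming A is psd and f is a monotone function with f(0) = 0, such as the matrix square root), one can form a Nyström approximation of A, eigendecompose it, and apply f to the recovered eigenvalues. The snippet below is a sketch under those assumptions, not the cited paper's reference code, and omits stabilization details.

```python
import numpy as np

def fun_nystrom(A, f, k, seed=None):
    """Low-rank approximation of f(A) built from a rank-k Nystrom
    approximation of the psd matrix A (rough illustration only)."""
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    Omega = rng.standard_normal((n, k))
    Y = A @ Omega                                   # single pass over A
    B = Omega.T @ Y
    Q, R = np.linalg.qr(Y)                          # Y = Q @ R
    core = R @ np.linalg.pinv((B + B.T) / 2) @ R.T  # A_hat = Q @ core @ Q.T
    w, V = np.linalg.eigh((core + core.T) / 2)
    w = np.maximum(w, 0.0)                          # clip tiny negative eigenvalues
    U = Q @ V
    return (U * f(w)) @ U.T                         # approximates f(A)

# Example with the matrix square root (f(0) = 0, monotone on [0, inf)).
rng = np.random.default_rng(1)
G = rng.standard_normal((200, 200))
A = (G @ G.T) / 200
sqrtA_approx = fun_nystrom(A, np.sqrt, k=30)
```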
Streaming Low-Rank Matrix Approximation with an Application to Scientific Simulation
- Computer Science · SIAM J. Sci. Comput.
- 2019
It is argued that randomized linear sketching is a natural tool for on-the-fly compression of data matrices that arise from large-scale scientific simulations and data collection, and that the proposed method is less sensitive to parameter choices than previous techniques.
NysADMM: faster composite convex optimization via low-rank approximation
- Computer Science · ICML
- 2022
The breadth of problems on which NysADMM beats standard solvers is a surprise and suggests that ADMM is a dominant paradigm for numerical optimization across a wide range of statistical learning problems that are usually solved with bespoke methods.
Improved Fixed-Rank Nyström Approximation via QR Decomposition: Practical and Theoretical Aspects
- Computer Science · Neurocomputing
- 2019
Randomized Nystr\"om Preconditioning
- Computer Science
- 2021
Numerical tests show that Nyström PCG can rapidly solve large linear systems that arise in data analysis problems, and it surpasses several competing methods from the literature.
Randomly pivoted Cholesky: Practical approximation of a kernel matrix with few entry evaluations
- Computer Science · ArXiv
- 2022
Empirically, RPCholesky matches or improves on the performance of alternative algorithms for low-rank psd approximation and provably achieves near-optimal approximation guarantees.
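The core loop of randomly pivoted Cholesky is short enough to sketch: draw a pivot with probability proportional to the current diagonal residual, take one Cholesky step on that column, and repeat. The version below accepts a dense psd matrix purely for simplicity; in the setting of the cited paper only the pivot columns and the diagonal of A are ever evaluated. It is a sketch, not the authors' implementation.

```python
import numpy as np

def rp_cholesky(A, k, seed=None):
    """Rank-k psd approximation A ~ F @ F.T via randomly pivoted Cholesky.

    Only diag(A) and k columns of A are read; A is passed as a dense array
    here purely to keep the illustration short.
    """
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    F = np.zeros((n, k))
    d = np.array(np.diag(A), dtype=float)      # residual diagonal
    for t in range(k):
        i = rng.choice(n, p=d / d.sum())       # pivot ~ residual diagonal
        g = A[:, i] - F[:, :t] @ F[i, :t]      # residual of the pivot column
        F[:, t] = g / np.sqrt(g[i])            # one Cholesky step
        d = np.maximum(d - F[:, t] ** 2, 0.0)  # downdate the residual diagonal
    return F
```

Sampling pivots in proportion to the residual diagonal, rather than greedily taking the largest diagonal entry, is what distinguishes this method from classical pivoted Cholesky.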
Randomized Clustered Nyström for Large-Scale Kernel Machines
- Computer Science · AAAI
- 2018
This paper introduces a randomized algorithm for generating landmark points that is scalable to large high-dimensional data sets; it performs K-means clustering on low-dimensional random projections of the data and thus leads to significant savings for high-dimensional data sets.
Structured Stochastic Quasi-Newton Methods for Large-Scale Optimization Problems
- Computer Science
- 2020
Numerical experiments on logistic regression, deep autoencoder networks and deep learning problems show that the efficiency of the proposed method is at least comparable with that of state-of-the-art methods.
References
SHOWING 1-10 OF 44 REFERENCES
Randomized single-view algorithms for low-rank matrix approximation
- Computer Science · ArXiv
- 2016
A suite of algorithms is developed for constructing low-rank approximations of an input matrix from a random linear image of the matrix, called a sketch; the algorithms can preserve structural properties of the input matrix, such as positive semidefiniteness, and produce approximations with a user-specified rank.
Numerical linear algebra in the streaming model
- Computer Science · STOC '09
- 2009
Near-optimal space bounds are given in the streaming model for linear algebra problems that include estimation of matrix products, linear regression, low-rank approximation, and approximation of matrix rank; results for turnstile updates are proved.
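The streaming guarantees here rest on the linearity of the sketch: since Y = AΩ is linear in A, a turnstile update A ← A + H is absorbed as Y ← Y + HΩ without ever materializing A. A minimal illustration with hypothetical rank-one updates (the dimensions are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(2)
n, k = 300, 10
Omega = rng.standard_normal((n, k))    # fixed random test matrix
Y = np.zeros((n, k))                   # sketch of the implicit matrix A = 0

# Stream of rank-one turnstile updates A <- A + v v^T, never forming A:
for _ in range(1000):
    v = rng.standard_normal(n) / np.sqrt(n)
    Y += np.outer(v, v @ Omega)        # (v v^T) @ Omega in O(n k) work
# Y now equals A @ Omega for the accumulated A and can be fed to any
# sketch-based recovery routine (e.g. the fixed-rank Nystrom step above).
```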
Revisiting the Nyström Method for Improved Large-scale Machine Learning
- Computer Science · J. Mach. Learn. Res.
- 2016
An empirical evaluation of the performance quality and running time of sampling and projection methods on a diverse suite of SPSD matrices is complemented by a suite of worst-case theoretical bounds for both random sampling and random projection methods.
Subspace Iteration Randomization and Singular Value Problems
- Computer Science · SIAM J. Sci. Comput.
- 2015
A novel error analysis is presented that considers randomized algorithms within the subspace iteration framework and shows with very high probability that highly accurate low-rank approximations as well as singular values can be computed.
Sketchy Decisions: Convex Low-Rank Matrix Optimization with Optimal Storage
- Computer Science · AISTATS
- 2017
This paper proposes the first algorithm to offer provable convergence to an optimal point with an optimal memory footprint; it modifies a standard convex optimization method to work on a sketched version of the decision variable and recovers the solution from this sketch.
Topics in randomized numerical linear algebra
- Computer Science
- 2013
The matrix Laplace transform framework is extended to derive Chernoff and Bernstein inequalities that apply to all the eigenvalues of certain classes of random matrices, and to derive convergence rates for each individual eigenvalue of a sample covariance matrix.
Dimensionality Reduction for k-Means Clustering and Low Rank Approximation
- Computer Science · STOC
- 2015
This work shows how to approximate a data matrix A with a much smaller sketch Ã that can be used to solve a general class of constrained k-rank approximation problems to within (1+ε) error, and gives a simple alternative to known algorithms that has applications in the streaming setting.
Low-Rank PSD Approximation in Input-Sparsity Time
- Computer Science, Mathematics · SODA
- 2017
This work gives algorithms for approximation by low-rank positive semidefinite (PSD) matrices, and shows that there are asymmetric input matrices that cannot have good symmetric column-selected approximations.
Finding Structure with Randomness: Probabilistic Algorithms for Constructing Approximate Matrix Decompositions
- Computer Science · SIAM Rev.
- 2011
This work surveys and extends recent research which demonstrates that randomization offers a powerful tool for performing low-rank matrix approximation, and presents a modular framework for constructing randomized algorithms that compute partial matrix decompositions.
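For reference, the basic range-finder-plus-small-SVD pipeline surveyed in that paper can be sketched as follows; the oversampling and power-iteration parameters below are common defaults chosen for illustration, not values prescribed by the survey.

```python
import numpy as np

def randomized_svd(A, r, oversample=10, power_iters=2, seed=None):
    """Approximate rank-r truncated SVD of A via a randomized range finder."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    k = min(r + oversample, min(m, n))
    Q = np.linalg.qr(A @ rng.standard_normal((n, k)))[0]  # sample the range of A
    for _ in range(power_iters):                          # optional power iterations
        Q = np.linalg.qr(A.T @ Q)[0]                      # sharpen the basis when the
        Q = np.linalg.qr(A @ Q)[0]                        # spectrum decays slowly
    B = Q.T @ A                                           # small k x n matrix
    Ub, s, Vt = np.linalg.svd(B, full_matrices=False)
    return Q @ Ub[:, :r], s[:r], Vt[:r, :]
```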
Randomized Clustered Nyström for Large-Scale Kernel Machines
- Computer Science · AAAI
- 2018
This paper introduces a randomized algorithm for generating landmark points that is scalable to large high-dimensional data sets; it performs K-means clustering on low-dimensional random projections of the data and thus leads to significant savings for high-dimensional data sets.