# The spectral norm error of the naive Nystrom extension

@article{Gittens2011TheSN, title={The spectral norm error of the naive Nystrom extension}, author={Alex Gittens}, journal={ArXiv}, year={2011}, volume={abs/1110.5305} }

The naive Nystrom extension forms a low-rank approximation to a positive-semidefinite matrix by uniformly randomly sampling from its columns. This paper provides the first relative-error bound on the spectral norm error incurred in this process. This bound follows from a natural connection between the Nystrom extension and the column subset selection problem. The main tool is a matrix Chernoff bound for sampling without replacement.
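As a minimal sketch of the scheme the abstract describes (assuming NumPy; the function name and signature are illustrative, not taken from the paper): sample k column indices uniformly at random without replacement, gather the sampled columns C and the principal submatrix W at those indices, and form the approximation C W⁺ Cᵀ.

```python
import numpy as np

def naive_nystrom(A, k, seed=None):
    """Naive Nystrom approximation of a PSD matrix A from k uniformly
    sampled columns (without replacement): A_hat = C @ pinv(W) @ C.T."""
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    idx = rng.choice(n, size=k, replace=False)  # uniform, no replacement
    C = A[:, idx]                # n x k block of sampled columns
    W = A[np.ix_(idx, idx)]      # k x k principal submatrix
    # Truncated pseudoinverse guards against rounding noise in W.
    return C @ np.linalg.pinv(W, rcond=1e-10) @ C.T
```

When rank(A) ≤ k and the sampled columns span the range of A, the reconstruction is exact; in general, the paper's contribution is a relative-error bound on the spectral norm of A minus this approximation.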

## 67 Citations

### Randomized low-rank approximation for symmetric indefinite matrices

- Mathematics, Computer Science · ArXiv
- 2022

This work identifies the main challenges in developing a Nyström approximation to symmetric indefinite matrices, and establishes relative-error nuclear norm bounds for the resulting approximation that hold when the singular values decay rapidly.

### Randomized Approximation of the Gram Matrix: Exact Computation and Probabilistic Bounds

- Computer Science, Mathematics · SIAM J. Matrix Anal. Appl.
- 2015

Given a real matrix A with n columns, the problem is to approximate the Gram product AA^T by weighted outer products of c of its columns. Necessary and sufficient conditions for exact computation from c ≥ rank(A) columns depend on the right singular vector matrix of A. For a Monte-Carlo matrix…

### Improving CUR matrix decomposition and the Nyström approximation via adaptive sampling

- Computer Science · J. Mach. Learn. Res.
- 2013

A more general error bound is established for the adaptive column/row sampling algorithm, based on which more accurate CUR and Nystrom algorithms with expected relative-error bounds are proposed.

### Spectral Gap Error Bounds for Improving CUR Matrix Decomposition and the Nyström Method

- Computer Science · AISTATS
- 2015

Novel spectral gap error bounds are introduced that judiciously exploit the potentially rapid spectrum decay in the input matrix, a most common occurrence in machine learning and data analysis.

### Randomized Nyström Preconditioning

- Computer Science
- 2021

Numerical tests show that Nyström PCG can rapidly solve large linear systems that arise in data analysis problems, and it surpasses several competing methods from the literature.

### Improved Bounds for the Nyström Method With Application to Kernel Classification

- Computer Science · IEEE Transactions on Information Theory
- 2013

A kernel classification approach based on the Nyström method is presented, and it is shown that when the eigenvalues of the kernel matrix follow a p-power law, the number of support vectors can be reduced to N^(2p/(p^2 − 1)), which is sublinear in N when p > 1 + √2, without seriously sacrificing generalization performance.
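As a quick check of the stated threshold (a derivation supplied here, not quoted from the paper): the exponent 2p/(p^2 − 1) drops below 1 exactly when p > 1 + √2.

```latex
\frac{2p}{p^2 - 1} < 1
\iff p^2 - 2p - 1 > 0 \quad (\text{for } p > 1)
\iff p > 1 + \sqrt{2}.
```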

### Stability of Sampling for CUR Decompositions

- Mathematics, Computer Science · Foundations of Data Science
- 2020

This article studies how to form CUR decompositions of low-rank matrices via primarily random sampling, though deterministic methods due to previous works are illustrated as well. The primary problem…

### Recursive Sampling for the Nyström Method

- Computer Science · NIPS
- 2017

We give the first algorithm for kernel Nyström approximation that runs in linear time in the number of training points and is provably accurate for all kernel matrices, without dependence on…

### Fast and stable randomized low-rank matrix approximation

- Computer Science · ArXiv
- 2020

This work studies a generalization of the Nyström method applicable to general matrices, and shows that it has near-optimal approximation quality comparable to competing methods and can significantly outperform state-of-the-art methods.

### A perturbation based out-of-sample extension framework

- Computer Science · ArXiv
- 2020

It is proved that the perturbation based extension framework derived by this paper generalizes the well-known Nyström method as well as some of its variants, and suggests new forms of extension under this framework that take advantage of the structure of the kernel matrix.

## References

Showing 1–10 of 20 references

### A novel greedy algorithm for Nyström approximation

- Computer Science · AISTATS
- 2011

A novel recursive algorithm for calculating the Nyström approximation and an effective greedy criterion for column selection are presented, along with a very efficient variant of the greedy sampling that works on random partitions of data instances.

### Spectral approximations in machine learning

- Computer Science
- 2011

Two methods for reducing the computational burden of spectral decompositions are discussed: the more venerable Nyström extension and a newly introduced algorithm based on random projections.

### On sampling-based approximate spectral decomposition

- Computer Science · ICML '09
- 2009

This paper addresses the problem of approximate singular value decomposition of large dense matrices that arises naturally in many machine learning applications and proposes an efficient adaptive sampling technique to select informative columns from the original matrix.

### Matrix Coherence and the Nyström Method

- Computer Science · UAI
- 2010

This work derives novel coherence-based bounds for the Nyström method in the low-rank setting, and presents empirical results that corroborate these theoretical bounds and convincingly demonstrate the ability of matrix coherence to measure the degree to which information can be extracted from a subset of columns.

### Making Large-Scale Nyström Approximation Possible

- Computer Science · ICML
- 2010

An accurate and scalable Nyström scheme is proposed that first samples a large column subset from the input matrix, then performs an approximate SVD only on the inner submatrix, using recent randomized low-rank matrix approximation algorithms.

### Sampling Techniques for the Nystrom Method

- Computer Science · AISTATS
- 2009

This work presents novel experiments with several real-world datasets, and suggests that uniform sampling without replacement, in addition to being more efficient in both time and space, produces more effective approximations.

### On the Nyström Method for Approximating a Gram Matrix for Improved Kernel-Based Learning

- Computer Science, Mathematics · J. Mach. Learn. Res.
- 2005

An algorithm is presented to compute an easily-interpretable low-rank approximation to an n × n Gram matrix G, such that computations of interest may be performed more rapidly.

### Finding Structure with Randomness: Probabilistic Algorithms for Constructing Approximate Matrix Decompositions

- Computer Science · SIAM Rev.
- 2011

This work surveys and extends recent research which demonstrates that randomization offers a powerful tool for performing low-rank matrix approximation, and presents a modular framework for constructing randomized algorithms that compute partial matrix decompositions.

### Spectral methods in machine learning and new strategies for very large datasets

- Computer Science · Proceedings of the National Academy of Sciences
- 2009

Two new algorithms for the approximation of positive-semidefinite kernels based on the Nyström method are presented, and the improved performance of each approach relative to existing methods is demonstrated.

### Improved Analysis of the subsampled Randomized Hadamard Transform

- Computer Science, Mathematics · Adv. Data Sci. Adapt. Anal.
- 2011

An improved analysis of a structured dimension-reduction map called the subsampled randomized Hadamard transform is presented, offering optimal constants in the estimate of the number of dimensions required for the embedding.