Corpus ID: 237491530

Kernel PCA with the Nyström method

@inproceedings{Hallgren2021KernelPW,
  title={Kernel PCA with the Nystr\"om method},
  author={Fredrik Hallgren},
  year={2021}
}
Kernel methods are powerful but computationally demanding techniques for non-linear learning. A popular remedy, the Nyström method, has been shown to scale up kernel methods to very large datasets with little loss in accuracy. However, kernel PCA with the Nyström method has not been widely studied. In this paper we derive kernel PCA with the Nyström method and study its accuracy, providing a finite-sample confidence bound on the difference between the Nyström and standard empirical…
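
The paper's derivation and bound are not reproduced on this page, but a minimal sketch of Nyström kernel PCA under common conventions may help fix ideas: sample m landmark points, build the cross-kernel between all points and the landmarks, map the data through an explicit Nyström feature map, and run linear PCA on those features. The RBF kernel, the uniform sampling, and the hyperparameters m and gamma below are illustrative assumptions, not the paper's notation.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    """Gaussian RBF kernel matrix between rows of X and rows of Y."""
    sq = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def nystrom_kpca(X, m=100, n_components=2, gamma=1.0, seed=0):
    """Approximate kernel PCA via a rank-m Nystrom approximation.
    m and gamma are illustrative hyperparameters, not the paper's choices."""
    n = X.shape[0]
    rng = np.random.default_rng(seed)
    idx = rng.choice(n, size=m, replace=False)     # uniform landmark subset
    K_nm = rbf_kernel(X, X[idx], gamma)            # n x m cross-kernel block
    K_mm = K_nm[idx]                               # m x m landmark kernel

    # Symmetric inverse square root of K_mm (pseudo-inverse for stability)
    w, V = np.linalg.eigh(K_mm)
    keep = w > 1e-10
    K_mm_inv_sqrt = V[:, keep] @ np.diag(w[keep] ** -0.5) @ V[:, keep].T

    # Explicit Nystrom feature map: Phi @ Phi.T approximates the full kernel
    Phi = K_nm @ K_mm_inv_sqrt
    Phi -= Phi.mean(axis=0)                        # center in the Nystrom feature space

    # Linear PCA on the Nystrom features = approximate kernel PCA
    U, S, _ = np.linalg.svd(Phi, full_matrices=False)
    return U[:, :n_components] * S[:n_components]  # principal component scores

X = np.random.default_rng(1).normal(size=(500, 5))
scores = nystrom_kpca(X, m=50, n_components=2, gamma=0.5)
print(scores.shape)  # (500, 2)
```

The dominant costs are the n x m kernel block and the SVD of an n x m matrix, roughly O(nm²), compared with O(n³) for an exact eigendecomposition of the full kernel matrix.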

References (showing 1-10 of 80)
Gain with no Pain: Efficiency of Kernel-PCA by Nyström Sampling
This analysis shows that Nyström sampling greatly improves computational efficiency without incurring any loss of statistical accuracy in kernel PCA; it is the first such result for PCA.
Statistical Optimality and Computational Efficiency of Nyström Kernel PCA
This work theoretically studies the trade-off between computational complexity and statistical accuracy in Nyström approximate kernel principal component analysis (KPCA), showing that Nyström approximate KPCA matches the statistical performance of (non-approximate) KPCA while remaining computationally beneficial.
Nyström-based approximate kernel subspace learning
A method for determining a subspace of the feature space in kernel methods, suited to large-scale learning problems and usable for generic nonlinear pattern recognition.
Kernel Principal Component Regression with EM Approach to Nonlinear Principal Components Extraction
In kernel-based methods such as Support Vector Machines, Kernel PCA, Gaussian Processes or Regularization Networks, the computational requirements scale as O(n³), where n is the number of training samples…
FALKON: An Optimal Large Scale Kernel Method
This paper proposes FALKON, a novel algorithm that can efficiently process millions of points, derived by combining several algorithmic principles, namely stochastic subsampling, iterative solvers and preconditioning.
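
FALKON also relies on a dedicated Nyström-based preconditioner, which is not reproduced here. Purely as a hedged illustration of two of the ingredients the summary names, stochastic subsampling and an iterative solver, the sketch below fits Nyström kernel ridge regression by conjugate gradient; the kernel choice, the subsample size m and the penalty lam are placeholder assumptions, not FALKON itself.

```python
import numpy as np
from scipy.sparse.linalg import cg, LinearOperator

def rbf(X, Y, gamma=1.0):
    sq = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def nystrom_krr_cg(X, y, m=100, lam=1e-3, gamma=1.0, seed=0):
    """Nystrom kernel ridge regression fitted with conjugate gradient.
    Solves (K_nm^T K_nm + lam * n * K_mm) alpha = K_nm^T y without
    ever forming the full n x n kernel matrix."""
    n = X.shape[0]
    idx = np.random.default_rng(seed).choice(n, size=m, replace=False)
    K_nm = rbf(X, X[idx], gamma)   # n x m cross-kernel
    K_mm = K_nm[idx]               # m x m landmark kernel

    def matvec(v):
        # System matrix is symmetric PSD, so conjugate gradient applies
        return K_nm.T @ (K_nm @ v) + lam * n * (K_mm @ v)

    A = LinearOperator((m, m), matvec=matvec)
    alpha, info = cg(A, K_nm.T @ y)
    assert info == 0, "CG did not converge"
    centers = X[idx]
    return lambda X_new: rbf(X_new, centers, gamma) @ alpha

rng = np.random.default_rng(2)
X = rng.uniform(-3, 3, size=(2000, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=2000)
predict = nystrom_krr_cg(X, y, m=50, gamma=1.0)
print(np.mean((predict(X) - y) ** 2))  # training MSE
```

Each CG iteration only needs matrix-vector products with the n x m and m x m kernel blocks, which is what makes the subsampled problem tractable at large n.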
Upper and Lower Bounds on the Performance of Kernel PCA
This work contributes lower and upper bounds on the performance of kernel PCA, involving the empirical eigenvalues of the kernel Gram matrix; two bounds are for fixed estimators and two for randomized estimators, via the PAC-Bayes theory.
Less is More: Nyström Computational Regularization
A simple incremental variant of Nyström Kernel Regularized Least Squares is suggested, where the subsampling level implements a form of computational regularization, in the sense that it simultaneously controls regularization and computations.
Statistical Properties of Kernel Principal Component Analysis
This work focuses on Kernel Principal Component Analysis (KPCA) and obtains sharp excess risk bounds for the reconstruction error using local Rademacher averages; the dependence on the decay of the spectrum and on the closeness of successive eigenvalues is made explicit.
Kernel Basis Pursuit
The Kernel Basis Pursuit algorithm is introduced, which enables building an L1-regularized multiple-kernel estimator, and a fast, parameter-free method to estimate non-uniformly sampled functions is proposed.
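
The paper's actual algorithm is not reproduced here; purely as a loose illustration of an L1-regularized multiple-kernel estimator, the sketch below stacks Gaussian basis functions of several widths into one dictionary and fits the coefficients with scikit-learn's Lasso. The widths and the penalty alpha are arbitrary assumptions.

```python
import numpy as np
from sklearn.linear_model import Lasso

def gaussian_dict(X, centers, widths):
    """Stack Gaussian basis functions at several widths into one dictionary."""
    cols = []
    for w in widths:  # one block of columns per kernel width
        sq = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        cols.append(np.exp(-sq / (2 * w ** 2)))
    return np.hstack(cols)  # n x (len(widths) * n_centers)

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sinc(X[:, 0]) + 0.05 * rng.normal(size=200)

widths = [0.2, 0.5, 1.0]                 # multiple candidate kernels
D = gaussian_dict(X, X, widths)          # kernel dictionary over training points
model = Lasso(alpha=1e-3, max_iter=50_000).fit(D, y)
print("nonzero atoms:", np.count_nonzero(model.coef_))  # L1 gives a sparse selection
```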
Linearized Kernel Dictionary Learning
A new approach to incorporating kernels into dictionary learning: the kernel matrix is approximated using a cleverly sampled subset of its columns via the Nyström method and decomposed by SVD to form new "virtual samples," on which any linear dictionary learning method can be employed.
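
A hedged sketch of the pipeline this summary describes, with scikit-learn's Nystroem transformer standing in for the column sampling plus SVD that yields the "virtual samples," and MiniBatchDictionaryLearning as the off-the-shelf linear dictionary learner; the component counts and gamma are illustrative, not the paper's settings.

```python
import numpy as np
from sklearn.kernel_approximation import Nystroem
from sklearn.decomposition import MiniBatchDictionaryLearning

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 20))

# Nystrom feature map: "virtual samples" whose inner products
# approximate the kernel values of the original data
virtual = Nystroem(kernel="rbf", gamma=0.1, n_components=100,
                   random_state=0).fit_transform(X)

# Any linear dictionary learner can now run on the virtual samples
dico = MiniBatchDictionaryLearning(n_components=30, batch_size=64,
                                   random_state=0).fit(virtual)
codes = dico.transform(virtual)
print(codes.shape)  # (1000, 30) sparse codes over the learned dictionary
```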