Corpus ID: 229331976

Upper and Lower Bounds on the Performance of Kernel PCA

@article{Haddouche2020UpperAL,
  title={Upper and Lower Bounds on the Performance of Kernel PCA},
  author={Maxime Haddouche and Benjamin Guedj and Omar Rivasplata and John Shawe-Taylor},
  journal={ArXiv},
  year={2020},
  volume={abs/2012.10369}
}
Principal Component Analysis (PCA) is a popular method for dimension reduction and has attracted unfailing interest for decades. More recently, kernel PCA (KPCA) has emerged as an extension of PCA but, despite its use in practice, a sound theoretical understanding of KPCA is missing. We contribute several lower and upper bounds on the efficiency of KPCA, involving the empirical eigenvalues of the kernel Gram matrix and new quantities involving a notion of variance. These bounds show how much…
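For context on the quantities mentioned in the abstract, here is a minimal sketch (not the authors' code) of how the empirical eigenvalues of the kernel Gram matrix arise in KPCA and how their tail gives the empirical reconstruction error of a k-dimensional projection in feature space; the RBF kernel, bandwidth, and toy data are illustrative assumptions.

```python
import numpy as np

def rbf_gram(X, gamma=1.0):
    """Gram matrix K[i, j] = exp(-gamma * ||x_i - x_j||^2) (RBF kernel)."""
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * np.maximum(d2, 0.0))

def kpca_spectrum(K):
    """Empirical eigenvalues: eigenvalues of the centred Gram matrix divided by n."""
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n      # centering matrix
    Kc = H @ K @ H                           # centred Gram matrix
    lam = np.linalg.eigvalsh(Kc)[::-1] / n   # descending order
    return np.clip(lam, 0.0, None)

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))                # toy data (assumption)
lam = kpca_spectrum(rbf_gram(X, gamma=0.5))

k = 10
# Empirical reconstruction error of the best k-dimensional subspace in
# feature space = mass of the spectrum beyond the top k eigenvalues.
print(f"captured by top {k}: {lam[:k].sum():.4f}, residual: {lam[k:].sum():.4f}")
```

The residual spectrum mass computed here is, roughly, the empirical counterpart of the reconstruction quality that the paper's upper and lower bounds relate to its population version.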


References

Showing 1-10 of 51 references

Statistical properties of kernel principal component analysis

This work focuses on Kernel Principal Component Analysis (KPCA) and obtains sharp excess risk bounds for the reconstruction error using local Rademacher averages; the dependence on the decay of the spectrum and on the closeness of successive eigenvalues is made explicit.

On the eigenspectrum of the gram matrix and the generalization error of kernel-PCA

The differences between the two spectra are bounded, and a performance bound on kernel principal component analysis (PCA) is provided, showing that good performance can be expected even in very high-dimensional feature spaces provided the sample eigenvalues fall sufficiently quickly.

Nonasymptotic upper bounds for the reconstruction error of PCA

We analyse the reconstruction error of principal component analysis (PCA) and prove non-asymptotic upper bounds for the corresponding excess risk. These bounds unify and improve existing upper bounds.

Empirical Bernstein Bounds and Sample-Variance Penalization

Improved constants are given for data-dependent and variance-sensitive confidence bounds, called empirical Bernstein bounds, which are extended to hold uniformly over classes of functions whose growth function is polynomial in the sample size n; sample-variance penalization is also considered (an empirical Bernstein bound of this flavour is sketched in code after this reference list).

A Simple and Fast Algorithm for L1-Norm Kernel PCA

A novel reformulation of L1-norm kernel PCA is provided, through which an equivalent, geometrically interpretable problem is obtained, and a “fixed-point”-type algorithm that iteratively computes a binary weight for each observation is presented (a generic fixed-point iteration of this flavour is sketched after this reference list).

Machine Learning-Based Reduced Kernel PCA Model for Nonlinear Chemical Process Monitoring

A reduced KPCA (RKPCA) model for fault detection in chemical processes is developed: a novel machine learning tool that merges dimensionality reduction, supervised learning, and kernel selection. The performance of the proposed process monitoring technique is illustrated.

A Tutorial Review of RKHS Methods in Machine Learning

The present review aims to summarize the state of the art on a conceptual level for positive definite kernel estimation methods, building on various sources and adding a fair amount of recent material that helps unify the exposition.

User-friendly introduction to PAC-Bayes bounds

This paper is an attempt to provide an elementary introduction to PAC-Bayes theory; it also describes a simplified version of the localization technique of [34, 36] that was missed by the community and later rediscovered as “mutual information bounds”.

Kernels, Associated Structures and Generalizations

This paper gives a survey of results in the mathematical literature on positive definite kernels and their associated structures, and presents the general framework of Hilbertian subspaces of Schwartz, which is used to introduce kernels that are distributions.

Local Rademacher complexities

New bounds on the error of learning algorithms in terms of a data-dependent notion of complexity are proposed, and some applications to classification and prediction with convex function classes, and with kernel classes in particular, are presented (a small Monte Carlo illustration for kernel classes is sketched after this reference list).
...
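Illustration for “Empirical Bernstein Bounds and Sample-Variance Penalization”: a minimal sketch of an empirical Bernstein upper confidence bound for i.i.d. samples in [0, 1]. The constants follow the standard Maurer–Pontil-style statement and should be checked against the paper; the Beta-distributed toy data are an assumption.

```python
import numpy as np

def empirical_bernstein_ucb(x, delta):
    """Upper confidence bound on E[X] from i.i.d. samples x in [0, 1].

    Empirical Bernstein form: sample mean + variance term + range term,
    with constants as in the usual Maurer-Pontil-style statement
    (to be checked against the paper).
    """
    x = np.asarray(x, dtype=float)
    n = x.size
    log_term = np.log(2.0 / delta)
    var = x.var(ddof=1)                        # unbiased sample variance
    return x.mean() + np.sqrt(2.0 * var * log_term / n) + 7.0 * log_term / (3.0 * (n - 1))

rng = np.random.default_rng(1)
samples = rng.beta(2.0, 5.0, size=1000)        # bounded toy data (assumption)
print(empirical_bernstein_ucb(samples, delta=0.05))
```

The variance term shrinks when the sample variance is small, which is what makes such bounds useful for sample-variance penalization.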
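Illustration for “A Simple and Fast Algorithm for L1-Norm Kernel PCA”: a generic sketch of a sign/fixed-point iteration for one L1 principal direction in feature space, expressed through a centred Gram matrix. This is in the spirit of L1-PCA fixed-point methods, not the exact algorithm of the paper; the initialization and stopping rule are assumptions.

```python
import numpy as np

def l1_kpca_direction(Kc, max_iter=100, seed=0):
    """One L1-norm principal direction w = sum_j a[j] * phi(x_j) in feature space.

    Kc is a *centred* Gram matrix. Generic sign / fixed-point iteration
    (Kwak-style L1-PCA, kernelised), not the exact algorithm of the paper.
    """
    n = Kc.shape[0]
    rng = np.random.default_rng(seed)
    a = rng.normal(size=n)
    a /= np.sqrt(a @ Kc @ a)                  # normalise so ||w|| = 1
    for _ in range(max_iter):
        s = np.sign(Kc @ a)                   # one binary weight per observation
        s[s == 0.0] = 1.0
        a_new = s / np.sqrt(s @ Kc @ s)       # w_new proportional to sum_i s_i phi(x_i)
        if np.allclose(a_new, a):
            break
        a = a_new
    return a                                   # projection scores are Kc @ a
```

With the Gram-matrix helpers from the sketch after the abstract, `l1_kpca_direction(Kc)` returns the expansion coefficients of the direction, and `Kc @ a` gives each observation's score along it.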
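Illustration for “Local Rademacher complexities”: for the unit ball {x -> <w, phi(x)> : ||w|| <= 1} of an RKHS with Gram matrix K, the empirical (global) Rademacher complexity has the closed form (1/n) E_sigma[sqrt(sigma^T K sigma)]. The sketch below estimates it by Monte Carlo; the localized variants used in the sharper bounds are more refined, and the kernel and data are placeholders.

```python
import numpy as np

def empirical_rademacher_kernel_ball(K, n_draws=2000, seed=0):
    """Monte Carlo estimate of the empirical Rademacher complexity of the
    RKHS unit ball, using the closed form (1/n) * E_sigma[sqrt(sigma' K sigma)]."""
    n = K.shape[0]
    rng = np.random.default_rng(seed)
    total = 0.0
    for _ in range(n_draws):
        sigma = rng.choice([-1.0, 1.0], size=n)   # Rademacher signs
        total += np.sqrt(max(sigma @ K @ sigma, 0.0))
    return total / (n_draws * n)

# By Jensen's inequality, this quantity is at most sqrt(np.trace(K)) / n.
```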