# Nonlinear Component Analysis as a Kernel Eigenvalue Problem

```bibtex
@article{Schlkopf1998NonlinearCA,
  title={Nonlinear Component Analysis as a Kernel Eigenvalue Problem},
  author={Bernhard Sch{\"o}lkopf and Alex Smola and Klaus-Robert M{\"u}ller},
  journal={Neural Computation},
  year={1998},
  volume={10},
  pages={1299--1319}
}
```

A new method for performing a nonlinear form of principal component analysis is proposed. By the use of integral operator kernel functions, one can efficiently compute principal components in high-dimensional feature spaces related to input space by some nonlinear map; for instance, the space of all possible five-pixel products in 16 × 16 images. We give the derivation of the method and present experimental results on polynomial feature extraction for pattern recognition.
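The kernel trick described in the abstract can be sketched in a few lines of numpy: instead of mapping points into the high-dimensional feature space explicitly, one forms the kernel (Gram) matrix, centers it, and solves an ordinary eigenvalue problem. The following is a minimal illustration, not the authors' implementation; the polynomial degree and data are arbitrary choices for the example.

```python
import numpy as np

def kernel_pca(X, n_components=2, degree=2):
    """Sketch of kernel PCA with a polynomial kernel k(x, y) = (x . y)^degree."""
    n = X.shape[0]
    # Gram matrix in feature space, computed via the kernel trick
    K = (X @ X.T) ** degree
    # Center the kernel matrix (equivalent to centering in feature space)
    one_n = np.ones((n, n)) / n
    K = K - one_n @ K - K @ one_n + one_n @ K @ one_n
    # Solve the kernel eigenvalue problem
    eigvals, eigvecs = np.linalg.eigh(K)
    # Keep the top components (eigh returns ascending order)
    idx = np.argsort(eigvals)[::-1][:n_components]
    eigvals, eigvecs = eigvals[idx], eigvecs[:, idx]
    # Normalize expansion coefficients so feature-space eigenvectors have unit norm
    alphas = eigvecs / np.sqrt(np.maximum(eigvals, 1e-12))
    # Projections of the training points onto the nonlinear principal components
    return K @ alphas

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 5))
Z = kernel_pca(X, n_components=3)
```

Note that the eigenproblem is solved on the n × n kernel matrix rather than in the (possibly enormous) feature space, which is the computational point of the paper.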

## 8,053 Citations

### Kernel Principal Component Analysis

- Mathematics, ICANN
- 1997

A new method for performing a nonlinear form of Principal Component Analysis by the use of integral operator kernel functions is proposed and experimental results on polynomial feature extraction for pattern recognition are presented.

### An Expectation-Maximization Approach to Nonlinear Component Analysis

- Computer Science, Neural Computation
- 2001

This work proposes an expectation-maximization approach for performing kernel principal component analysis and shows this to be a computationally efficient method, especially when the number of data points is large.

### Kernel Hebbian Algorithm for Iterative Kernel Principal Component Analysis

- Computer Science
- 2003

A new method for performing a kernel principal component analysis is proposed. By kernelizing the generalized Hebbian algorithm, one can iteratively estimate the principal components in a reproducing…

### Face recognition using kernel principal component analysis

- Computer Science, IEEE Signal Processing Letters
- 2002

By adopting a polynomial kernel, the principal components can be computed within the space spanned by high-order correlations of the input pixels making up a facial image, thereby yielding good performance.

### Orthogonal Series Density Estimation and the Kernel Eigenvalue Problem

- Mathematics, Neural Computation
- 2002

The view is presented that the eigenvalue decomposition of a kernel matrix can also provide the discrete expansion coefficients required for a nonparametric orthogonal series density estimator.

### Fast Independent Component Analysis in Kernel Feature Spaces

- Mathematics, SOFSEM
- 2001

Nonlinearized formulae are furnished along with an illustration of the usefulness of the proposed method as an unsupervised feature extractor for the classification of Hungarian phonemes.

### Kernel Principal Component Regression in Reproducing Kernel Hilbert Space

- Computer Science
- 2003

KPCA is applied for feature selection in a high-dimensional feature space which is nonlinearly mapped from an input space by a Gaussian kernel function.

### Sparse Kernel Principal Component Analysis

- Computer Science, NIPS
- 2000

By approximating the covariance matrix in feature space by a reduced number of example vectors, using a maximum-likelihood approach, it is shown that a highly sparse form of kernel PCA can be obtained without loss of effectiveness.

### Online identification of nonlinear system using reduced kernel principal component analysis

- Computer Science, Neural Computing and Applications
- 2010

This paper proposes a new method for online identification of a nonlinear system modelled in a Reproducing Kernel Hilbert Space (RKHS); it applies Kernel PCA and then uses Reduced Kernel Principal Component Analysis to update the principal components representing the observations selected by the KPCA method.

## References

Showing 1-10 of 46 references

### Using Discriminant Eigenfeatures for Image Retrieval

- Computer Science, IEEE Trans. Pattern Anal. Mach. Intell.
- 1996

This paper describes the automatic selection of features from an image training set using the theories of multidimensional discriminant analysis and the associated optimal linear projection. We…

### RBF principal manifolds for process monitoring

- Mathematics, IEEE Trans. Neural Networks
- 1999

This paper describes a novel means for creating a nonlinear extension of principal component analysis (PCA) using radial basis function (RBF) networks. This algorithm comprises two distinct stages:…

### Principal Component Neural Networks: Theory and Applications

- Computer Science
- 1996

A review of Linear Algebra, Principal Component Analysis, and VLSI Implementation.

### Principal Component Analysis

- Mathematics, Geology, International Encyclopedia of Statistical Science
- 1986

Introduction * Properties of Population Principal Components * Properties of Sample Principal Components * Interpreting Principal Components: Examples * Graphical Representation of Data Using…

### Methods of Mathematical Physics

- Education
- 1947

Partial table of contents: THE ALGEBRA OF LINEAR TRANSFORMATIONS AND QUADRATIC FORMS. Transformation to Principal Axes of Quadratic and Hermitian Forms. Minimum-Maximum Property of Eigenvalues.…

### Eigenfaces for Recognition

- Computer Science, Journal of Cognitive Neuroscience
- 1991

A near-real-time computer system that can locate and track a subject's head and then recognize the person by comparing characteristics of the face to those of known individuals; the system is easy to implement using a neural network architecture.

### Efficient Pattern Recognition Using a New Transformation Distance

- Computer Science, NIPS
- 1992

A new distance measure which can be made locally invariant to any set of transformations of the input and can be computed efficiently is proposed.

### Input space versus feature space in kernel-based methods

- Computer Science, IEEE Trans. Neural Networks
- 1999

The geometry of feature space is reviewed, and the connection between feature space and input space is discussed by dealing with the question of how one can, given some vector in feature space, find a preimage in input space.

### Application of the Karhunen-Loeve Procedure for the Characterization of Human Faces

- Computer Science, IEEE Trans. Pattern Anal. Mach. Intell.
- 1990

The use of natural symmetries (mirror images) in a well-defined family of patterns (human faces) is discussed within the framework of the Karhunen-Loeve expansion. This results in an extension of the…

### Simplified Support Vector Decision Rules

- Computer Science, ICML
- 1996

The results show that the method can decrease the computational complexity of the decision rule by a factor of ten with no loss in generalization performance, making the SVM test speed competitive with that of other methods.