Feature space approximation for kernel-based supervised learning

@article{gelss_feature_space,
  title={Feature space approximation for kernel-based supervised learning},
  author={Patrick Gel{\ss} and Stefan Klus and Ingmar Schuster and Christof Sch{\"u}tte},
  journal={Knowl. Based Syst.},
}

Kernel learning for robust dynamic mode decomposition: linear and nonlinear disambiguation optimization
This work presents a kernel method that learns interpretable data-driven models for high-dimensional, nonlinear systems and shows that it is possible to recover the linear model contribution with this approach, thus separating the effects of the implicitly defined nonlinear terms.
Approximate Bayesian Computation Based on Maxima Weighted Isolation Kernel Mapping
This work takes on the ambitious task of obtaining the coefficients of a model that reflects the relationship of driver gene mutations and cancer hallmarks on the basis of personal data regarding variant allele frequencies.


Random Features for Large-Scale Kernel Machines
Two sets of random features are explored, convergence bounds on their ability to approximate various radial basis kernels are provided, and it is shown that, in large-scale classification and regression tasks, linear machine learning algorithms applied to these features outperform state-of-the-art large-scale kernel machines.
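A minimal sketch of the random-feature idea described above, assuming a Gaussian (RBF) kernel exp(-γ‖x−y‖²): frequencies are sampled from the kernel's spectral density so that inner products of the explicit features approximate kernel evaluations. Function names and parameters here are illustrative, not from the paper.

```python
import numpy as np

def random_fourier_features(X, n_features=100, gamma=1.0, rng=None):
    """Map X to random Fourier features z(x) with z(x).z(y) ~ exp(-gamma*||x-y||^2)."""
    rng = np.random.default_rng(rng)
    d = X.shape[1]
    # For the RBF kernel, the spectral density is Gaussian with variance 2*gamma.
    W = rng.normal(scale=np.sqrt(2.0 * gamma), size=(d, n_features))
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

# Linear algorithms applied to Z then mimic the corresponding kernel machine.
X = np.random.default_rng(0).normal(size=(5, 3))
Z = random_fourier_features(X, n_features=5000, gamma=0.5, rng=1)
K_approx = Z @ Z.T
K_exact = np.exp(-0.5 * np.sum((X[:, None] - X[None]) ** 2, axis=-1))
```

With enough features the Gram matrix of the linear map concentrates around the exact kernel matrix, which is what makes linear solvers a drop-in replacement.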
Tensor-based algorithms for image classification
It is shown that tensor-based methods developed for learning the governing equations of dynamical systems from data can, in the same way, be used for supervised learning problems and two novel approaches for image classification are proposed.
On the Nyström Method for Approximating a Gram Matrix for Improved Kernel-Based Learning
An algorithm is presented to compute an easily interpretable low-rank approximation to an n x n Gram matrix G such that computations of interest may be performed more rapidly.
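The low-rank construction behind the Nyström method can be sketched as follows, assuming a PSD Gram matrix and a caller-supplied set of landmark (sampled) columns; the helper name and landmark choice are illustrative, not the paper's sampling scheme.

```python
import numpy as np

def nystrom_approx(K, landmarks):
    """Nystrom approximation K ~ C W^+ C^T from sampled columns of a PSD Gram matrix."""
    C = K[:, landmarks]                   # n x m block of sampled columns
    W = K[np.ix_(landmarks, landmarks)]   # m x m intersection block
    return C @ np.linalg.pinv(W) @ C.T

# Toy PSD Gram matrix from an RBF kernel on random points.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 2))
K = np.exp(-np.sum((X[:, None] - X[None]) ** 2, axis=-1))
K_nys = nystrom_approx(K, landmarks=np.arange(0, 50, 2))  # every other point
```

Only the m sampled columns and the m x m block are ever factorized, which is the source of the speedup when m is much smaller than n.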
Using the Nyström Method to Speed Up Kernel Machines
It is shown that an approximation to the eigendecomposition of the Gram matrix can be computed by the Nyström method (which is used for the numerical solution of eigenproblems), and that the computational complexity of a predictor using this approximation is O(m²n).
Recursive Sampling for the Nyström Method
We give the first algorithm for kernel Nyström approximation that runs in linear time in the number of training points and is provably accurate for all kernel matrices, without dependence on regularity or incoherence conditions.
Kernel Methods for Surrogate Modeling
This chapter deals with kernel methods as a special class of techniques for surrogate modeling. These methods are meshless and do not require or depend on a grid, and are hence less prone to the curse of dimensionality, even for high-dimensional problems.
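A gridless kernel surrogate of the kind discussed above can be sketched as an RBF interpolant fitted at scattered sample points; the function name, kernel choice, and regularization parameter are illustrative assumptions, not the chapter's specific method.

```python
import numpy as np

def kernel_surrogate(X, y, gamma=1.0, reg=1e-8):
    """Fit a meshless RBF surrogate s(x) = sum_i alpha_i * k(x, x_i) at scattered points."""
    K = np.exp(-gamma * np.sum((X[:, None] - X[None]) ** 2, axis=-1))
    alpha = np.linalg.solve(K + reg * np.eye(len(X)), y)  # small reg for stability

    def predict(Xq):
        Kq = np.exp(-gamma * np.sum((Xq[:, None] - X[None]) ** 2, axis=-1))
        return Kq @ alpha

    return predict

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(40, 2))          # scattered samples, no grid needed
y = np.sin(np.pi * X[:, 0]) * X[:, 1]          # stand-in for an expensive model
surrogate = kernel_surrogate(X, y, gamma=2.0)
```

Because only pairwise distances enter the kernel matrix, the construction is identical in any dimension, which is the meshless property the chapter emphasizes.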
Kernel-Based Nonlinear Blind Source Separation
We propose kTDSEP, a kernel-based algorithm for nonlinear blind source separation (BSS). It combines complementary research fields: kernel feature spaces and BSS using temporal information.
Nonlinear Component Analysis as a Kernel Eigenvalue Problem
A new method for performing a nonlinear form of principal component analysis by the use of integral operator kernel functions is proposed and experimental results on polynomial feature extraction for pattern recognition are presented.
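The eigenvalue problem at the heart of this method can be sketched in a few lines, assuming an RBF kernel: center the Gram matrix in feature space and project onto its leading eigenvectors. Names and parameters are illustrative.

```python
import numpy as np

def kernel_pca(X, n_components=2, gamma=1.0):
    """Nonlinear PCA via the eigendecomposition of the centered Gram matrix."""
    n = len(X)
    K = np.exp(-gamma * np.sum((X[:, None] - X[None]) ** 2, axis=-1))
    J = np.eye(n) - np.ones((n, n)) / n      # centering in feature space
    Kc = J @ K @ J
    vals, vecs = np.linalg.eigh(Kc)          # ascending eigenvalues
    vals, vecs = vals[::-1], vecs[:, ::-1]   # reorder to descending
    # Coordinates of the training points along the leading nonlinear components.
    return vecs[:, :n_components] * np.sqrt(np.abs(vals[:n_components]))

rng = np.random.default_rng(0)
X = rng.normal(size=(30, 3))
proj = kernel_pca(X, n_components=2, gamma=0.5)
```

The integral operator never appears explicitly: its spectrum is estimated entirely from the n x n kernel matrix, which is what makes the nonlinear feature extraction tractable.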
A kernel-based method for data-driven Koopman spectral analysis
A data-driven, kernel-based method for approximating the leading Koopman eigenvalues, eigenfunctions, and modes in problems with high-dimensional state spaces is presented, using a set of scalar observables that are defined implicitly by the feature map associated with a user-defined kernel function.
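One common formulation of this kernelized EDMD idea can be sketched as follows: from snapshot pairs (x_i, y_i), build the Gram matrix G on the current states and the cross Gram matrix A with the successor states, and take the eigenvalues of G⁺A as Koopman eigenvalue estimates. This is a sketch under those assumptions, not the paper's exact construction; names are illustrative.

```python
import numpy as np

def kernel_edmd_eigs(X, Y, kernel):
    """Koopman eigenvalue estimates from snapshot pairs via eigenvalues of G^+ A."""
    G = kernel(X, X)   # G_ij = k(x_i, x_j), Gram matrix on current states
    A = kernel(Y, X)   # A_ij = k(y_i, x_j), cross Gram with successor states
    return np.linalg.eigvals(np.linalg.pinv(G) @ A)

linear = lambda U, V: U @ V.T   # linear kernel -> dictionary of linear observables

rng = np.random.default_rng(0)
A_dyn = np.diag([0.9, 0.5])     # toy linear dynamics x_{t+1} = A_dyn x_t
X = rng.normal(size=(20, 2))
Y = X @ A_dyn.T                  # successor snapshots
eigs = kernel_edmd_eigs(X, Y, linear)
```

With a linear kernel the implicit observables are the linear functions, so on this toy system the nonzero eigenvalues of G⁺A recover the spectrum of the dynamics matrix.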
Multidimensional Approximation of Nonlinear Dynamical Systems
The method multidimensional approximation of nonlinear dynamical systems (MANDy) is proposed, which combines data-driven methods with tensor network decompositions; the efficiency of the introduced approach is illustrated with the aid of several high-dimensional nonlinear dynamical systems.