Kernels for Vector-Valued Functions: a Review

@article{alvarez2012KernelsFV,
  title={Kernels for Vector-Valued Functions: a Review},
  author={Mauricio A. {\'A}lvarez and Lorenzo Rosasco and Neil D. Lawrence},
  journal={Found. Trends Mach. Learn.},
  year={2012},
  volume={4},
  number={3},
  pages={195--266}
}
Kernel methods are among the most popular techniques in machine learning. From a regularization perspective they play a central role in regularization theory as they provide a natural choice for the hypothesis space and the regularization functional through the notion of reproducing kernel Hilbert spaces. From a probabilistic perspective they are the key in the context of Gaussian processes, where the kernel function is known as the covariance function. Traditionally, kernel methods have been…
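For concreteness, the sketch below (a minimal illustration, not code from the review) builds the separable matrix-valued kernel K(x, x') = k(x, x') B that the review surveys under the name intrinsic coregionalization model: a scalar kernel k couples inputs while a positive semi-definite matrix B couples the outputs. All function names are our own.

import numpy as np

def gaussian_kernel(X1, X2, lengthscale=1.0):
    # Scalar Gaussian (RBF) kernel matrix between two sets of inputs.
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale**2)

def separable_mo_kernel(X1, X2, B, lengthscale=1.0):
    # Matrix-valued kernel K(x, x') = k(x, x') * B for D outputs,
    # returned as the (n1*D, n2*D) block matrix via a Kronecker product
    # (the intrinsic coregionalization model).
    return np.kron(gaussian_kernel(X1, X2, lengthscale), B)

# Toy usage: 5 inputs, 2 correlated outputs.
rng = np.random.default_rng(0)
X = rng.normal(size=(5, 1))
B = np.array([[1.0, 0.8], [0.8, 1.0]])  # PSD matrix coupling the outputs
K = separable_mo_kernel(X, X, B)
print(K.shape)  # (10, 10)
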
Citations

Multi-task Learning in vector-valued reproducing kernel Banach spaces with the ℓ1 norm
This work constructs a class of vector-valued reproducing kernel Banach spaces with the ℓ1 norm and identifies multi-task admissible kernels so that the constructed spaces have desirable properties, including the crucial linear representer theorem.
Online Learning with Multiple Operator-valued Kernels
Describes two online algorithms for learning a vector-valued function f while taking the output structure into account: one extends the standard kernel-based online learning algorithm NORMA from the scalar-valued to the operator-valued setting, and the other addresses the limitation of pre-defining the output structure in ONORMA by sequentially learning a linear combination of operator-valued kernels.
Random Fourier Features For Operator-Valued Kernels
Proposes a general principle for operator-valued random Fourier feature construction that relies on a generalization of Bochner's theorem for translation-invariant operator-valued Mercer kernels, and proves uniform convergence of the kernel approximation for bounded and unbounded operator-valued random Fourier features.
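For orientation, Bochner's theorem says a continuous translation-invariant scalar kernel is the Fourier transform of a non-negative measure, which yields the classical random Fourier feature approximation k(x - x') ≈ z(x)ᵀz(x'). The sketch below shows that scalar construction and, as our illustrative simplification of the operator-valued case, its extension to a separable kernel K = k·B; the paper's construction is more general.

import numpy as np

def rff_features(X, D=500, lengthscale=1.0, rng=None):
    # Random Fourier features z(x) with z(x).z(x') approximating the
    # Gaussian kernel exp(-||x - x'||^2 / (2 l^2)) (Bochner's theorem):
    # frequencies W are drawn from the kernel's spectral density N(0, I/l^2).
    rng = rng or np.random.default_rng(0)
    d = X.shape[1]
    W = rng.normal(scale=1.0 / lengthscale, size=(d, D))
    b = rng.uniform(0.0, 2 * np.pi, size=D)
    return np.sqrt(2.0 / D) * np.cos(X @ W + b)

rng = np.random.default_rng(1)
X = rng.normal(size=(4, 3))
Z = rff_features(X, D=5000, rng=rng)
approx = Z @ Z.T                     # approximates the scalar Gaussian kernel
# Separable operator-valued extension: K(x, x') = k(x, x') * B is
# approximated by the block matrix kron(Z @ Z.T, B).
B = np.array([[1.0, 0.5], [0.5, 1.0]])
K_approx = np.kron(approx, B)
print(K_approx.shape)  # (8, 8)
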
Large-scale operator-valued kernel regression
This thesis proposes and studies scalable methods for regression with operator-valued kernels, and develops a general framework for the approximation of shift-invariant Mercer kernels on locally compact Abelian groups.
Kernel Mean Embedding of Distributions: A Review and Beyond
A comprehensive review of existing work and recent advances in the Hilbert space embedding of distributions, together with a discussion of the most challenging issues and open problems that could lead to new research directions.
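As background for this entry, the kernel mean embedding represents a distribution P by μ_P = E[k(x, ·)] for x ~ P, and the maximum mean discrepancy (MMD) between two samples is the RKHS distance between their empirical embeddings. A minimal sketch under these standard definitions (names ours):

import numpy as np

def rbf(X, Y, gamma=1.0):
    # Gaussian kernel matrix k(x, y) = exp(-gamma * ||x - y||^2).
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def mmd2(X, Y, gamma=1.0):
    # Biased empirical squared MMD between samples X ~ P and Y ~ Q:
    # ||mu_P - mu_Q||^2 = mean k(X,X) - 2 mean k(X,Y) + mean k(Y,Y).
    return rbf(X, X, gamma).mean() - 2 * rbf(X, Y, gamma).mean() + rbf(Y, Y, gamma).mean()

rng = np.random.default_rng(0)
X = rng.normal(0.0, 1.0, size=(200, 2))
Y = rng.normal(0.5, 1.0, size=(200, 2))
print(mmd2(X, Y))  # noticeably larger than for two samples from the same P
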
Operator-valued Kernels for Learning from Functional Response Data
In this paper we consider the problems of supervised classification and regression in the case where attributes and labels are functions: each data point is represented by a set of functions, and the label is…
Learning with Operator-valued Kernels in Reproducing Kernel Krein Spaces
This work considers operator-valued kernels that are not necessarily positive definite and proposes an iterative Operator-based Minimum Residual (OpMINRES) algorithm for solving the loss stabilization problem.
Online Learning with Operator-valued Kernels
Describes an online algorithm, OLOK, that extends the standard kernel-based online learning algorithm NORMA from the scalar-valued to the operator-valued setting, and reports a cumulative error bound that holds for both classification and regression.
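NORMA performs stochastic gradient descent in the RKHS: at each step the current kernel expansion is shrunk by (1 − ηλ) and a new kernel section centered at the incoming point is appended. Below is a scalar-output sketch of that update with squared loss; the operator-valued extensions described above would replace the scalar kernel with an operator-valued one, and the names and step sizes here are illustrative rather than the papers' own.

import numpy as np

def gauss(x, y, ls=1.0):
    return np.exp(-0.5 * np.sum((x - y) ** 2) / ls**2)

def norma_sgd(stream, eta=0.1, lam=0.01):
    # NORMA-style online update with squared loss:
    # f <- (1 - eta*lam) f - eta * (f(x_t) - y_t) k(x_t, .)
    centers, coefs = [], []
    for x, y in stream:
        f_x = sum(c * gauss(cx, x) for cx, c in zip(centers, coefs))
        coefs = [(1 - eta * lam) * c for c in coefs]  # regularization shrinkage
        centers.append(x)
        coefs.append(-eta * (f_x - y))                # new expansion coefficient
    return centers, coefs

rng = np.random.default_rng(0)
data = [(x, np.sin(x.sum())) for x in rng.normal(size=(100, 2))]
centers, coefs = norma_sgd(iter(data))
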
On the Dualization of Operator-Valued Kernel Machines
This work investigates how to use the duality principle to handle families of loss functions as yet unexplored within vector-valued RKHSs (vv-RKHSs).

References

Showing 1–10 of 138 references.
Kernels for Multi-task Learning
This paper provides a foundation for multi-task learning based on reproducing kernel Hilbert spaces of vector-valued functions, using classes of matrix-valued kernels that are linear and of the dot-product or translation-invariant type.
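To illustrate the matrix-valued kernels of dot-product type mentioned here, the sketch below runs multi-task regularized least squares with K(x, x') = (xᵀx') B: by the representer theorem the estimator is a combination of kernel sections whose coefficients solve a single block linear system. This is our toy construction, not code from the paper.

import numpy as np

def mt_linear_kernel(X1, X2, B):
    # Matrix-valued kernel of dot-product type: K(x, x') = (x . x') B.
    return np.kron(X1 @ X2.T, B)

def mt_krr_fit(X, Y, B, lam=1e-2):
    # Multi-task regularized least squares: solve (K + lam I) c = vec(Y).
    n, T = Y.shape
    K = mt_linear_kernel(X, X, B)
    return np.linalg.solve(K + lam * np.eye(n * T), Y.reshape(-1))

def mt_krr_predict(Xtest, X, c, B):
    return (mt_linear_kernel(Xtest, X, B) @ c).reshape(len(Xtest), -1)

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
W = rng.normal(size=(3, 2))                 # two related linear tasks
Y = X @ W + 0.1 * rng.normal(size=(50, 2))
B = np.eye(2) + 0.5                         # task-similarity matrix (PSD)
c = mt_krr_fit(X, Y, B)
print(mt_krr_predict(X[:5], X, c, B).shape)  # (5, 2)
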
Universal Multi-Task Kernels
The primary goal here is to derive conditions which ensure that the kernel K is universal, which means that on every compact subset of the input space, every continuous function with values in Y can be uniformly approximated by sections of the kernel.
On Learning Vector-Valued Functions
This letter studies learning in a Hilbert space of vector-valued functions, derives the form of the minimal-norm interpolant to a finite set of data, and applies it to study some regularization functionals that are important in learning theory.
Characterizing the Function Space for Bayesian Kernel Models
A coherent Bayesian kernel model based on an integral operator, defined as the convolution of a kernel with a signed measure, is studied, creating a function-theoretic foundation for using non-parametric prior specifications in Bayesian modeling, such as Gaussian process and Dirichlet process prior distributions.
Learning Multiple Tasks with Kernel Methods
The experiments show that learning multiple related tasks simultaneously using the proposed approach can significantly outperform standard single-task learning, particularly when there are many related tasks but few data points per task.
Multi-output learning via spectral filtering
A finite-sample bound on the excess risk of the obtained estimator makes it possible to prove consistency for both regression and multi-category classification, and some promising results of the proposed algorithms are presented.
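Spectral filtering defines the estimator by applying a filter function g_λ to the eigenvalues of the kernel matrix, c = g_λ(K)Y; Tikhonov regularization corresponds to g_λ(σ) = 1/(σ + λ), with truncated SVD and Landweber iteration as other admissible filters. A minimal scalar-kernel sketch of the Tikhonov case (our simplification of the multi-output setting):

import numpy as np

def spectral_filter_fit(K, Y, lam=1e-2):
    # Spectral filtering: c = g_lam(K) Y with Tikhonov filter
    # g_lam(sigma) = 1 / (sigma + lam), applied via eigendecomposition.
    sig, U = np.linalg.eigh(K)              # K symmetric PSD
    g = 1.0 / (np.maximum(sig, 0.0) + lam)  # clip round-off negatives
    return U @ (g[:, None] * (U.T @ Y))

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(80, 1))
Y = np.sin(3 * X) + 0.05 * rng.normal(size=(80, 1))
K = np.exp(-0.5 * (X - X.T) ** 2 / 0.2**2)
c = spectral_filter_fit(K, Y)
Y_hat = K @ c                               # fitted values at training inputs
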
Some Properties of Regularized Kernel Methods
A quantitative version of the representer theorem is proved that holds for both regression and classification, for both differentiable and non-differentiable loss functions, and for arbitrary offset terms.
Kernel Methods for Pattern Analysis
The lectures introduce the kernel methods approach to pattern analysis through the particular example of support vector machines for classification, and argue that, ignoring the technical requirement of positive semi-definiteness, kernel design is not an unnatural task for a practitioner.
Clustered Multi-Task Learning: A Convex Formulation
A new spectral norm is designed that encodes the a priori assumption that tasks are clustered into groups, which are unknown beforehand, and that tasks within a group have similar weight vectors, resulting in a new convex optimization formulation for multi-task learning.
Gaussian Processes for Machine Learning
The treatment is comprehensive and self-contained, targeted at researchers and students in machine learning and applied statistics, and deals with the supervised learning problem for both regression and classification.
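In the Gaussian process view the kernel is the covariance function, and with Gaussian noise the posterior at test inputs is available in closed form: mean K_*(K + σ²I)⁻¹y and covariance K_** − K_*(K + σ²I)⁻¹K_*ᵀ. A minimal single-output sketch of these standard equations (names ours):

import numpy as np

def rbf(X1, X2, ls=0.3):
    return np.exp(-0.5 * (X1 - X2.T) ** 2 / ls**2)

def gp_posterior(Xtr, ytr, Xte, noise=0.05, ls=0.3):
    # GP regression posterior mean/covariance with Gaussian likelihood.
    K = rbf(Xtr, Xtr, ls) + noise**2 * np.eye(len(Xtr))
    Ks = rbf(Xte, Xtr, ls)
    mean = Ks @ np.linalg.solve(K, ytr)
    cov = rbf(Xte, Xte, ls) - Ks @ np.linalg.solve(K, Ks.T)
    return mean, cov

rng = np.random.default_rng(0)
Xtr = rng.uniform(-1, 1, size=(30, 1))
ytr = np.sin(4 * Xtr[:, 0]) + 0.05 * rng.normal(size=30)
Xte = np.linspace(-1, 1, 5)[:, None]
mean, cov = gp_posterior(Xtr, ytr, Xte)
print(mean.shape, np.diag(cov))  # predictive mean and variances
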