
Among other things, we prove that multiquadric surface interpolation is always solvable, thereby settling a conjecture of R. Franke.
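The solvability result can be seen in action with a small numerical sketch: interpolating scattered data with multiquadric basis functions reduces to a linear system whose matrix, by the theorem above, is nonsingular whenever the nodes are distinct. The node set, shape parameter `c`, and target values below are illustrative assumptions, not part of the paper.

```python
import numpy as np

def multiquadric_interpolate(nodes, values, c=1.0):
    """Coefficients a with sum_j a_j * sqrt(|x - x_j|^2 + c^2) matching the data."""
    d2 = np.sum((nodes[:, None, :] - nodes[None, :, :]) ** 2, axis=-1)
    A = np.sqrt(d2 + c ** 2)              # multiquadric interpolation matrix
    return np.linalg.solve(A, values), A  # nonsingular for distinct nodes

rng = np.random.default_rng(0)
nodes = rng.uniform(-1.0, 1.0, size=(12, 2))    # distinct scattered nodes
values = np.sin(nodes[:, 0]) + nodes[:, 1] ** 2
coeffs, A = multiquadric_interpolate(nodes, values)
residual = np.max(np.abs(A @ coeffs - values))  # exact fit up to round-off
```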

- Theodoros Evgeniou, Charles A. Micchelli, Massimiliano Pontil
- Journal of Machine Learning Research
- 2005

We study the problem of learning many related tasks simultaneously using kernel methods and regularization. The standard single-task kernel methods, such as support vector machines and regularization networks, are extended to the case of multi-task learning. Our analysis shows that the problem of estimating many task functions with regularization can be… (More)
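A hedged sketch of the kind of multi-task extension described above: a scalar base kernel is combined with a simple task-coupling term, giving a multi-task kernel of the form K((x, s), (x', t)) = k(x, x') · (δ_st + μ). The RBF base kernel, the coupling strength μ, and the toy data are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def rbf(X, gamma=1.0):
    """Gaussian (RBF) Gram matrix of the inputs."""
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    return np.exp(-gamma * d2)

def multitask_krr(X, y, tasks, lam=0.1, mu=0.5):
    """Kernel ridge regression with a task-coupled multi-task kernel."""
    same = (tasks[:, None] == tasks[None, :]).astype(float)
    K = rbf(X) * (same + mu)                 # Schur product of PSD matrices: PSD
    alpha = np.linalg.solve(K + lam * np.eye(len(y)), y)
    return alpha, K

rng = np.random.default_rng(1)
X = rng.normal(size=(20, 3))
tasks = rng.integers(0, 2, size=20)          # two related tasks
y = X[:, 0] + 0.1 * tasks                    # shared structure, small task offset
alpha, K = multitask_krr(X, y, tasks)
```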

- Charles A. Micchelli, Massimiliano Pontil
- Neural Computation
- 2005

In this letter, we provide a study of learning in a Hilbert space of vector-valued functions. We motivate the need for extending learning theory of scalar-valued functions by practical considerations and establish some basic results for learning vector-valued functions that should prove useful in applications. Specifically, we allow an output space Y to be a… (More)

- Charles A. Micchelli, Massimiliano Pontil
- Journal of Machine Learning Research
- 2005

We study the problem of finding an optimal kernel from a prescribed convex set of kernels K for learning a real-valued function by regularization. We establish for a wide variety of regularization functionals that this leads to a convex optimization problem and, for square loss regularization, we characterize the solution of this problem. We show that,… (More)
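An illustrative sketch of the convexity at work here (an assumed setup, not the paper's algorithm): for square loss, the regularization value Q(K) = yᵀ(K + λI)⁻¹y is a convex function of the Gram matrix K, so minimizing it over a convex set of kernels is a convex problem. Below we scan convex combinations t·K1 + (1−t)·K2 of two Gaussian kernels of different widths; the data and kernel widths are assumptions.

```python
import numpy as np

def reg_value(K, y, lam=0.1):
    """Square-loss regularization functional evaluated at a Gram matrix K."""
    return y @ np.linalg.solve(K + lam * np.eye(len(y)), y)

rng = np.random.default_rng(2)
X = rng.normal(size=(15, 2))
y = np.sin(X[:, 0])
d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
K1 = np.exp(-0.5 * d2)                       # wide Gaussian kernel
K2 = np.exp(-5.0 * d2)                       # narrow Gaussian kernel

ts = np.linspace(0.0, 1.0, 21)
vals = [reg_value(t * K1 + (1 - t) * K2, y) for t in ts]
best_t = ts[int(np.argmin(vals))]            # optimum of a convex 1-D slice
```

Since Q is convex along the segment, the scanned values form a convex sequence, which is easy to verify numerically.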

We study the problem of learning a kernel which minimizes a regularization error functional such as that used in regularization networks or support vector machines. We consider this problem when the kernel is in the convex hull of basic kernels, for example, Gaussian kernels which are continuously parameterized by a compact set. We show that there always… (More)

Learning the common structure shared by a set of supervised tasks is an important practical and theoretical problem. Knowledge of this structure may lead to better generalization performance on the tasks and may also facilitate learning new tasks. We propose a framework for solving this problem, which is based on regularization with spectral functions of… (More)
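One concrete instance of regularizing with a spectral function of the task-weight matrix is the trace norm (the sum of singular values), which favors tasks sharing a low-dimensional structure. The proximal-gradient solver, step size, and rank-one toy data below are illustrative assumptions, not the framework proposed in the paper.

```python
import numpy as np

def svd_soft_threshold(W, tau):
    """Proximal operator of the trace norm: soft-threshold the singular values."""
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def multitask_trace_norm(X, Y, lam=0.1, lr=0.01, steps=300):
    """Minimize (1/n)||X W - Y||_F^2 + lam ||W||_* by proximal gradient."""
    n, d = X.shape
    W = np.zeros((d, Y.shape[1]))
    for _ in range(steps):
        grad = 2.0 * X.T @ (X @ W - Y) / n   # gradient of the squared loss
        W = svd_soft_threshold(W - lr * grad, lr * lam)
    return W

rng = np.random.default_rng(4)
u = rng.normal(size=(5, 1))                  # common structure shared by tasks
X = rng.normal(size=(40, 5))
Y = X @ (u @ rng.normal(size=(1, 3)))        # three tasks, rank-one weights
W = multitask_trace_norm(X, Y)
rank_eff = int(np.sum(np.linalg.svd(W, compute_uv=False) > 1e-3))
```

Because the tasks truly share a one-dimensional structure, the trace-norm penalty recovers a (numerically) rank-one weight matrix.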

We address the problem of learning a kernel for a given supervised learning task. Our approach consists in searching within the convex hull of a prescribed set of basic kernels for one which minimizes a convex regularization functional. A unique feature of this approach compared to others in the literature is that the number of basic kernels can be… (More)

- Charles A. Micchelli, Massimiliano Pontil
- NIPS
- 2004

This paper provides a foundation for multi-task learning using reproducing kernel Hilbert spaces of vector-valued functions. In this setting, the kernel is a matrix-valued function. Some explicit examples will be described which go beyond our earlier results in [7]. In particular, we characterize classes of matrix-valued kernels which are linear and are of… (More)
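A minimal sketch of one common class of matrix-valued kernels: the separable form K(x, x') = k(x, x') · B, where a positive semidefinite matrix B couples the components of the vector-valued output. The Gaussian base kernel and the particular B below are illustrative assumptions, not the classes characterized in the paper.

```python
import numpy as np

def separable_gram(X, B, gamma=1.0):
    """Block Gram matrix of the matrix-valued kernel k(x, x') * B."""
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    k = np.exp(-gamma * d2)                  # scalar base kernel Gram matrix
    return np.kron(k, B)                     # Kronecker product of PSD factors

rng = np.random.default_rng(3)
X = rng.normal(size=(8, 2))
B = np.array([[2.0, 0.5],
              [0.5, 1.0]])                   # PSD output-coupling matrix
G = separable_gram(X, B)
eigmin = np.min(np.linalg.eigvalsh(G))       # Kron of PSD matrices is PSD
```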

We study basic questions of wavelet decompositions associated with multiresolution analysis. A rather complete analysis of multiresolution associated with the solution of a refinement equation is presented. The notion of extensibility of a finite set of Laurent polynomials is shown to be central in the construction of wavelets by decomposition of spaces.… (More)
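As a toy illustration of a refinement equation, the object at the heart of the multiresolution analysis above: the box (Haar scaling) function satisfies φ(x) = φ(2x) + φ(2x − 1), and the identity can be checked exactly on a grid. The grid and the choice of the box function are illustrative; the paper treats general refinement equations.

```python
import numpy as np

def box(x):
    """Box (Haar scaling) function: 1 on [0, 1), 0 elsewhere."""
    return ((0.0 <= x) & (x < 1.0)).astype(float)

# Two-scale (refinement) relation phi(x) = phi(2x) + phi(2x - 1):
# the half-open supports [0, 1/2) and [1/2, 1) tile [0, 1) exactly.
x = np.linspace(-1.0, 2.0, 601)
lhs = box(x)
rhs = box(2 * x) + box(2 * x - 1)
```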

We investigate linear independence of integer translates of a finite number of compactly supported functions in two cases. In the first case there are no restrictions on the coefficients that may occur in dependence relations. In the second case the coefficient sequences are restricted to be in some ℓp space (1 ≤ p ≤ ∞) and we are interested in bounding… (More)