It is shown how to choose the smoothing parameter when a smoothing periodic spline of degree 2m-1 is used to reconstruct a smooth periodic curve from noisy ordinate data. The noise is assumed "white", and the true curve is assumed to be in the Sobolev space W_2^{(2m)} of periodic functions with absolutely continuous ν-th derivative, ν = 0, 1, ..., 2m-1, and(More)
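As a concrete illustration of choosing a smoothing parameter by generalized cross validation, the sketch below substitutes a discrete second-difference (Whittaker-type) smoother for the periodic smoothing spline; the data, the penalty, and all names are illustrative assumptions, not the paper's construction:

```python
import numpy as np

def whittaker_smoother_matrix(n, lam):
    # Second-difference penalty smoother: a discrete analogue of a
    # cubic smoothing spline. f_hat = A(lam) @ y with A = (I + lam D'D)^-1.
    D = np.diff(np.eye(n), n=2, axis=0)          # (n-2) x n second-difference operator
    return np.linalg.inv(np.eye(n) + lam * D.T @ D)

def gcv_score(y, lam):
    # GCV(lam) = (1/n)||(I - A)y||^2 / ((1/n) tr(I - A))^2
    n = len(y)
    A = whittaker_smoother_matrix(n, lam)
    resid = y - A @ y
    return (resid @ resid / n) / (1.0 - np.trace(A) / n) ** 2

# noisy samples of a smooth periodic curve
rng = np.random.default_rng(0)
t = np.linspace(0, 2 * np.pi, 100, endpoint=False)
y = np.sin(t) + 0.2 * rng.standard_normal(len(t))

# pick the smoothing parameter minimizing GCV over a grid
lams = np.logspace(-2, 4, 30)
best_lam = min(lams, key=lambda lam: gcv_score(y, lam))
f_hat = whittaker_smoother_matrix(len(y), best_lam) @ y
```

The GCV-selected estimate should recover the underlying curve more accurately than the raw noisy ordinates.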
Two-category support vector machines (SVM) have been very popular in the machine learning community for classification problems. Solving multicategory problems by a series of binary classifiers is quite common in the SVM paradigm; however, this approach may fail under various circumstances. We propose the multicategory support vector machine (MSVM), which(More)
The Support Vector Machine (SVM) has shown great performance in practice as a classification methodology. Oftentimes multicategory problems have been treated as a series of binary problems in the SVM paradigm. Even though the SVM implements the optimal classification rule asymptotically in the binary case, solutions to a series of binary problems may not be(More)
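The common one-vs-rest reduction that both abstracts refer to can be sketched in a few lines of numpy; the subgradient-descent linear SVM, the toy data, and all names below are illustrative assumptions, not the MSVM of the papers:

```python
import numpy as np

def train_binary_svm(X, y, lam=0.01, lr=0.1, epochs=300):
    """Linear SVM fit by full-batch subgradient descent on the
    regularized hinge loss; y must be coded in {-1, +1}."""
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(epochs):
        active = y * (X @ w + b) < 1              # margin violators
        w -= lr * (lam * w - (y[active, None] * X[active]).sum(axis=0) / n)
        b -= lr * (-y[active].sum() / n)
    return w, b

def one_vs_rest_fit(X, labels):
    classes = np.unique(labels)
    models = [train_binary_svm(X, np.where(labels == c, 1.0, -1.0))
              for c in classes]
    return classes, models

def one_vs_rest_predict(X, classes, models):
    scores = np.column_stack([X @ w + b for w, b in models])
    return classes[scores.argmax(axis=1)]         # ties broken by class order

# toy 3-class problem with well-separated clusters
X = np.array([[0.0, 0.0], [0.2, 0.1], [5.0, 5.0], [5.1, 4.9], [0.0, 5.0], [0.1, 5.2]])
labels = np.array([0, 0, 1, 1, 2, 2])
classes, models = one_vs_rest_fit(X, labels)
pred = one_vs_rest_predict(X, classes, models)
```

Note the failure mode the papers point at: the argmax over independently trained binary machines has no joint optimality guarantee, which is what motivates a genuinely multicategory formulation.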
This chapter is an expanded version of a talk presented in the NIPS 97 Workshop on Support Vector Machines. It consists of three parts: (1) A brief review of some old but relevant results on constrained optimization in Reproducing Kernel Hilbert Spaces (RKHS), and a review of the relationship between zero-mean Gaussian processes and RKHS. Application of(More)
In this paper, we propose a Generalized Approximate Cross Validation (GACV) function for estimating the smoothing parameter in the penalized log likelihood regression problem with non-Gaussian data. This GACV is obtained by, first, obtaining an approximation to the leaving-out-one function based on the negative log likelihood, and then, in a step(More)
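GACV is built from an approximation to the leaving-out-one function; in the Gaussian case with a linear smoother, leave-one-out can be computed exactly without refitting via the hat-matrix diagonal. The sketch below shows that classical identity for a ridge-type smoother (the smoother, data, and names are illustrative assumptions, not the GACV construction):

```python
import numpy as np

def ridge_hat_matrix(X, lam):
    """Hat matrix A(lam) of a ridge-type linear smoother: yhat = A @ y."""
    n = X.shape[0]
    return X @ np.linalg.solve(X.T @ X + n * lam * np.eye(X.shape[1]), X.T)

def loo_shortcut(X, y, lam):
    """Exact leave-one-out residuals without refitting:
    e_(-i) = (y_i - yhat_i) / (1 - A_ii)."""
    A = ridge_hat_matrix(X, lam)
    resid = y - A @ y
    return resid / (1.0 - np.diag(A))

def loo_brute_force(X, y, lam):
    """Refit n times with the same (fixed) penalty matrix, for comparison."""
    n = len(y)
    P = n * lam * np.eye(X.shape[1])
    out = np.empty(n)
    for i in range(n):
        mask = np.arange(n) != i
        beta = np.linalg.solve(X[mask].T @ X[mask] + P, X[mask].T @ y[mask])
        out[i] = y[i] - X[i] @ beta
    return out

rng = np.random.default_rng(0)
X = rng.standard_normal((20, 3))
y = X @ np.array([1.0, -1.0, 0.5]) + 0.1 * rng.standard_normal(20)
e_fast = loo_shortcut(X, y, lam=0.1)
e_slow = loo_brute_force(X, y, lam=0.1)   # identical up to rounding
```

With non-Gaussian likelihoods no such exact shortcut exists, which is why the paper derives an approximation instead.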
An adaptive spline method for smoothing is proposed which combines features from both regression spline and smoothing spline approaches. One of its advantages is the ability to vary the amount of smoothing in response to the inhomogeneous "curvature" of true functions at different locations. This method can be applied to many multivariate function estimation(More)
The majority of classification algorithms are developed for the standard situation in which it is assumed that the examples in the training set come from the same distribution as that of the target population, and that the costs of misclassification into different classes are the same. However, these assumptions are often violated in real world settings. For(More)
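When misclassification costs differ, the optimal decision threshold moves away from 1/2. This is a standard decision-theory sketch with hypothetical costs, not the paper's specific method:

```python
def bayes_label(p1, cost_fp, cost_fn):
    """Cost-sensitive Bayes rule for two classes: predict 1 exactly when the
    expected cost of predicting 1 is lower, i.e. when
    P(1|x) * cost_fn > (1 - P(1|x)) * cost_fp.
    p1:      estimated P(class 1 | x)
    cost_fp: cost of predicting 1 when the truth is 0
    cost_fn: cost of predicting 0 when the truth is 1"""
    threshold = cost_fp / (cost_fp + cost_fn)
    return int(p1 > threshold)

equal_costs = bayes_label(0.3, cost_fp=1.0, cost_fn=1.0)   # threshold 0.5 -> predict 0
costly_miss = bayes_label(0.3, cost_fp=1.0, cost_fn=9.0)   # threshold 0.1 -> predict 1
```

The same point x can receive different labels under different cost structures, which is exactly why cost-blind algorithms mislead in nonstandard settings.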
Reproducing kernel Hilbert space (RKHS) methods provide a unified context for solving a wide variety of statistical modelling and function estimation problems. We consider two such problems: We are given a training set {y_i, t_i, i = 1, ..., n}, where y_i is the response for the ith subject, and t_i is a vector of attributes for this subject. The value of(More)
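A minimal RKHS regression sketch, assuming a Gaussian kernel and synthetic one-dimensional attributes (the kernel choice, tuning values, and names are illustrative, not from the paper):

```python
import numpy as np

def gaussian_kernel(A, B, sigma):
    """Gaussian RBF kernel matrix K[i, j] = exp(-|a_i - b_j|^2 / (2 sigma^2))."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def fit_kernel_ridge(T, y, lam, sigma):
    """Minimize (1/n) sum_i (y_i - f(t_i))^2 + lam ||f||^2_RKHS.
    By the representer theorem the minimizer is f(t) = sum_i c_i K(t, t_i),
    with coefficients solving (K + n lam I) c = y."""
    n = len(y)
    K = gaussian_kernel(T, T, sigma)
    return np.linalg.solve(K + n * lam * np.eye(n), y)

def predict(T_train, c, T_new, sigma):
    return gaussian_kernel(T_new, T_train, sigma) @ c

T = np.linspace(0.0, 1.0, 40)[:, None]     # attribute vectors t_i (here 1-D)
y = np.sin(2 * np.pi * T[:, 0])            # responses y_i
c = fit_kernel_ridge(T, y, lam=1e-3, sigma=0.2)
y_hat = predict(T, c, T, sigma=0.2)
```

The finite linear system is what makes the infinite-dimensional variational problem computable, which is the unifying device behind RKHS methods.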
Let y_i, i = 1, ..., n be independent observations with the density of y_i of the form h(y_i, f_i) = exp[y_i f_i - b(f_i) + c(y_i)], where b and c are given functions, b is twice continuously differentiable, and b'' is bounded away from 0. Let f_i = f(t(i)), where t = (t_1, ..., t_d) ∈ T^(1) ⊗ ... ⊗ T^(d) = T, the T^(α) are measurable spaces of rather general form, and f is an(More)
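The Bernoulli family is a concrete instance of the stated density form, with b(f) = log(1 + e^f) and f the log odds; note b''(f) = p(1-p) is bounded away from 0 only on bounded f-intervals. The sketch below checks that the negative log likelihood in this canonical form reproduces the usual logistic cross-entropy (an illustrative check, not the paper's estimator):

```python
import numpy as np

def neg_log_lik(y, f, b):
    """-log h(y, f) summed over observations, dropping the c(y) term:
    for h(y, f) = exp[y f - b(f) + c(y)] this is sum_i [b(f_i) - y_i f_i]."""
    return np.sum(b(f) - y * f)

# Bernoulli responses: b(f) = log(1 + e^f)
b_bernoulli = lambda f: np.log1p(np.exp(f))

f = np.array([-1.0, 0.5, 2.0])   # canonical parameters (log odds)
y = np.array([0.0, 1.0, 1.0])

# sanity check: this reproduces the usual logistic cross-entropy
p = 1.0 / (1.0 + np.exp(-f))
cross_entropy = -np.sum(y * np.log(p) + (1.0 - y) * np.log(1.0 - p))
```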
The (modified) Newton method is adapted to optimize generalized cross validation (GCV) and generalized maximum likelihood (GML) scores with multiple smoothing parameters. The main concerns in solving the optimization problem are the speed and the reliability of the algorithm, as well as the invariance of the algorithm under transformations under which the(More)
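A much-simplified sketch of the idea: a safeguarded Newton-type iteration on GCV as a function of log(lambda), here with one smoothing parameter, finite-difference derivatives, and step-halving; the paper's algorithm handles multiple parameters with analytic derivatives and stronger safeguards:

```python
import numpy as np

def gcv(X, y, loglam):
    """GCV score of a ridge-type smoother as a function of log(lambda)."""
    n = len(y)
    A = X @ np.linalg.solve(X.T @ X + n * np.exp(loglam) * np.eye(X.shape[1]), X.T)
    r = y - A @ y
    return (r @ r / n) / (1.0 - np.trace(A) / n) ** 2

def newton_min_gcv(X, y, loglam=0.0, h=1e-4, iters=30):
    """Newton iteration on GCV(log lambda) with finite-difference first and
    second derivatives; steps are halved until the score improves, so the
    score never increases."""
    best = gcv(X, y, loglam)
    for _ in range(iters):
        gp, gm = gcv(X, y, loglam + h), gcv(X, y, loglam - h)
        d1 = (gp - gm) / (2 * h)
        d2 = (gp - 2 * best + gm) / h ** 2
        step = d1 / d2 if d2 > 0 else float(np.sign(d1))  # fall back if locally concave
        step = float(np.clip(step, -2.0, 2.0))
        while abs(step) > 1e-10:
            if gcv(X, y, loglam - step) < best:
                loglam -= step
                best = gcv(X, y, loglam)
                break
            step /= 2.0
        else:
            break          # no improving step found: treat as converged
    return loglam, best

rng = np.random.default_rng(1)
X = rng.standard_normal((50, 5))
y = X @ np.arange(1.0, 6.0) + rng.standard_normal(50)
loglam_hat, score = newton_min_gcv(X, y)
```

Working on the log scale also gives the kind of transformation invariance the abstract mentions: a rescaling of lambda is just a shift of the iterate.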