It is shown how to choose the smoothing parameter when a smoothing periodic spline of degree 2m−1 is used to reconstruct a smooth periodic curve from noisy ordinate data. The noise is assumed "white", and the true curve is assumed to be in the Sobolev space W_2^(m) of periodic functions with absolutely continuous ν-th derivative, ν = 0, 1, …, 2m−1, and …
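The smoothing-parameter choice this abstract refers to can be sketched with generalized cross-validation (GCV), the criterion associated with this line of work. This is a minimal illustration, not the paper's method verbatim: the periodic second-difference penalty, the sine test curve, and the noise level are all stand-ins for the degree-2m−1 spline setting.

```python
import numpy as np

def gcv_score(y, lam, D):
    """GCV score for the linear smoother f_hat = (I + lam * D.T @ D)^{-1} y."""
    n = len(y)
    A = np.linalg.inv(np.eye(n) + lam * D.T @ D)   # influence ("hat") matrix
    resid = y - A @ y
    denom = (np.trace(np.eye(n) - A) / n) ** 2
    return (resid @ resid / n) / denom

def periodic_D2(n):
    """Second-difference penalty that wraps around, matching the periodic setting."""
    D = np.zeros((n, n))
    for i in range(n):
        D[i, i] = -2.0
        D[i, (i - 1) % n] = 1.0
        D[i, (i + 1) % n] = 1.0
    return D

rng = np.random.default_rng(0)
n = 64
t = np.linspace(0, 2 * np.pi, n, endpoint=False)
y = np.sin(t) + 0.3 * rng.standard_normal(n)       # noisy periodic ordinate data

D = periodic_D2(n)
lams = np.logspace(-4, 4, 81)
scores = [gcv_score(y, lam, D) for lam in lams]
lam_hat = lams[int(np.argmin(scores))]             # GCV choice of smoothing parameter
```

Too small a λ drives the trace term to zero and inflates the score; too large a λ leaves the residual of a heavily smoothed fit, so the minimizer sits in between.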
This is a revision of TR 1064, September 2002. Some initial expository material has been removed at the request of a referee. There are some new results presented concerning the relations between the Bayes rule and other proposed SVM methods for the multicategory case. See especially the paragraph containing Eq. (9) on p. 8. Abstract: Two-category Support …
The Support Vector Machine (SVM) has shown great performance in practice as a classification methodology. Oftentimes multicategory problems have been treated as a series of binary problems in the SVM paradigm. Even though the SVM implements the optimal classification rule asymptotically in the binary case, solutions to a series of binary problems may not be …
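The "series of binary problems" reduction the abstract criticizes is the one-vs-rest scheme. A minimal sketch follows, with the caveat that everything concrete here is illustrative: linear classifiers trained on the hinge loss by plain subgradient descent stand in for full SVMs, and the three-blob dataset is a toy.

```python
import numpy as np

def train_binary_hinge(X, y_pm, lam=0.01, lr=0.05, epochs=200):
    """Linear SVM-style classifier: minimize L2-penalized hinge loss by subgradient descent."""
    n, d = X.shape
    w = np.zeros(d)
    b = 0.0
    for _ in range(epochs):
        margins = y_pm * (X @ w + b)
        active = margins < 1                       # points violating the margin
        gw = lam * w - (y_pm[active, None] * X[active]).sum(axis=0) / n
        gb = -y_pm[active].sum() / n
        w -= lr * gw
        b -= lr * gb
    return w, b

def one_vs_rest_fit(X, y, n_classes):
    """Reduce a multicategory problem to n_classes binary (+1 vs -1) problems."""
    return [train_binary_hinge(X, np.where(y == k, 1.0, -1.0)) for k in range(n_classes)]

def one_vs_rest_predict(models, X):
    scores = np.column_stack([X @ w + b for w, b in models])
    return scores.argmax(axis=1)                   # highest decision value wins

# Three well-separated Gaussian blobs as a toy multicategory problem.
rng = np.random.default_rng(1)
centers = np.array([[0, 0], [4, 0], [0, 4]])
X = np.vstack([c + 0.4 * rng.standard_normal((40, 2)) for c in centers])
y = np.repeat(np.arange(3), 40)

models = one_vs_rest_fit(X, y, 3)
acc = (one_vs_rest_predict(models, X) == y).mean()
```

The argmax over binary decision values is exactly the step the abstract questions: nothing forces the k binary solutions to combine into the multicategory Bayes rule.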
In this paper, we propose a Generalized Approximate Cross Validation (GACV) function for estimating the smoothing parameter in the penalized log likelihood regression problem with non-Gaussian data. This GACV is derived by first obtaining an approximation to the leaving-out-one function based on the negative log likelihood, and then, in a step …
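The quantity GACV approximates is the exact leaving-out-one negative log likelihood, which can be computed by brute force on a small problem. The sketch below does exactly that for a ridge-penalized logistic (Bernoulli) model; the Newton fit, the simulated data, and the λ grid are illustrative choices, not the paper's construction.

```python
import numpy as np

def fit_logistic_ridge(X, y, lam, iters=30):
    """Penalized log likelihood fit for Bernoulli data (logistic link), by Newton's method."""
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(iters):
        p = 1 / (1 + np.exp(-(X @ w)))
        g = X.T @ (p - y) / n + lam * w            # gradient of penalized negative log lik.
        H = X.T @ (X * (p * (1 - p))[:, None]) / n + lam * np.eye(d)
        w -= np.linalg.solve(H, g)
    return w

def loo_nll(X, y, lam):
    """Exact leaving-out-one negative log likelihood that GACV approximates."""
    n = len(y)
    total = 0.0
    for i in range(n):
        keep = np.arange(n) != i
        w = fit_logistic_ridge(X[keep], y[keep], lam)
        p_i = 1 / (1 + np.exp(-(X[i] @ w)))
        total -= y[i] * np.log(p_i) + (1 - y[i]) * np.log(1 - p_i)
    return total / n

rng = np.random.default_rng(2)
n, d = 60, 3
X = rng.standard_normal((n, d))
w_true = np.array([1.5, -1.0, 0.5])
y = (rng.random(n) < 1 / (1 + np.exp(-(X @ w_true)))).astype(float)

lams = [1e-3, 1e-2, 1e-1, 1.0]
scores = [loo_nll(X, y, lam) for lam in lams]
lam_hat = lams[int(np.argmin(scores))]
```

The n refits above are what make exact leave-one-out expensive for non-Gaussian likelihoods, and what the GACV approximation is designed to avoid.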
This chapter is an expanded version of a talk presented in the NIPS 97 Workshop on Support Vector Machines. It consists of three parts: (1) A brief review of some old but relevant results on constrained optimization in Reproducing Kernel Hilbert Spaces (RKHS), and a review of the relationship between zero-mean Gaussian processes and RKHS. Application of …
The majority of classification algorithms are developed for the standard situation, in which it is assumed that the examples in the training set come from the same distribution as that of the target population, and that the costs of misclassification into the different classes are the same. However, these assumptions are often violated in real-world settings. For …
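When misclassification costs differ, the optimal decision rule shifts away from the usual 0.5 probability threshold. A minimal sketch, with hypothetical posterior probabilities and costs chosen purely for illustration:

```python
import numpy as np

# Bayes rule under unequal misclassification costs: with cost c_fp for a false
# positive and c_fn for a false negative, predict class 1 exactly when
# p(y=1|x) > c_fp / (c_fp + c_fn).  Equal costs recover the familiar 0.5 threshold.
def cost_sensitive_predict(p1, c_fp, c_fn):
    return (p1 > c_fp / (c_fp + c_fn)).astype(int)

p1 = np.array([0.10, 0.30, 0.55, 0.90])            # hypothetical posterior probabilities

equal = cost_sensitive_predict(p1, 1.0, 1.0)       # threshold 0.5
costly_fn = cost_sensitive_predict(p1, 1.0, 4.0)   # missing a positive is 4x worse: threshold 0.2
```

With a false negative four times as costly as a false positive, the example at probability 0.30 flips from "negative" to "positive".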
  • Grace Wahba, 2002
Reproducing kernel Hilbert space (RKHS) methods provide a unified context for solving a wide variety of statistical modelling and function estimation problems. We consider two such problems: We are given a training set {yi, ti, i = 1, …, n}, where yi is the response for the ith subject, and ti is a vector of attributes for this subject. The value of …
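The core mechanism behind RKHS function estimation with a training set {yi, ti} is the representer theorem: the penalized least-squares minimizer lies in the span of the kernel sections at the data points, so fitting reduces to a finite linear system. A minimal sketch, assuming a Gaussian kernel and a toy regression problem:

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    """Gaussian (RBF) reproducing kernel evaluated between two sets of points."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

# Representer theorem: the minimizer of
#   (1/n) sum_i (y_i - f(t_i))^2 + lam * ||f||^2_RKHS
# has the form f(t) = sum_i c_i K(t, t_i), with c solving (K + n*lam*I) c = y.
rng = np.random.default_rng(3)
n = 50
T = rng.uniform(-3, 3, (n, 1))                     # attribute vectors t_i
y = np.sin(T[:, 0]) + 0.1 * rng.standard_normal(n) # responses y_i

lam = 1e-2
K = rbf_kernel(T, T)
c = np.linalg.solve(K + n * lam * np.eye(n), y)
f_hat = K @ c                                      # fitted values at the training points
```

The infinite-dimensional variational problem thus collapses to one n-by-n solve, which is what makes RKHS methods a practical unified framework.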
The (modified) Newton method is adapted to optimize generalized cross validation (GCV) and generalized maximum likelihood (GML) scores with multiple smoothing parameters. The main concerns in solving the optimization problem are the speed and the reliability of the algorithm, as well as the invariance of the algorithm under transformations under which the …
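A one-parameter caricature of this idea: run a safeguarded Newton iteration on the GCV score as a function of log λ, falling back to a descent step when the local curvature is not positive. The penalty matrix, the finite-difference derivatives, and the test signal are illustrative simplifications of the multiple-smoothing-parameter algorithms the abstract describes.

```python
import numpy as np

def gcv(loglam, y, P):
    """GCV score of the linear smoother (I + lam*P)^{-1} as a function of log(lam)."""
    n = len(y)
    A = np.linalg.inv(np.eye(n) + np.exp(loglam) * P)
    r = y - A @ y
    return (r @ r / n) / (np.trace(np.eye(n) - A) / n) ** 2

def newton_min(f, x0, h=1e-4, iters=50):
    """Safeguarded (modified) Newton on a 1-D score, via finite-difference derivatives."""
    x = x0
    for _ in range(iters):
        g = (f(x + h) - f(x - h)) / (2 * h)
        H = (f(x + h) - 2 * f(x) + f(x - h)) / h ** 2
        step = -g / H if H > 0 else -np.sign(g)    # descent fallback if not locally convex
        step = float(np.clip(step, -2.0, 2.0))     # trust-region-style step limit
        while abs(step) > 1e-12 and not (f(x + step) < f(x)):
            step /= 2                              # backtrack to guarantee decrease
        x += step
        if abs(step) < 1e-10:
            break
    return x

rng = np.random.default_rng(4)
n = 40
t = np.arange(n)
y = np.sin(2 * np.pi * t / n) + 0.3 * rng.standard_normal(n)

D = np.diff(np.eye(n), 2, axis=0)                  # second-difference penalty matrix
P = D.T @ D
loglam_hat = newton_min(lambda v: gcv(v, y, P), x0=0.0)
```

Working in log λ is the natural parameterization here: it keeps λ positive and makes the score far better conditioned for Newton steps, which is part of the invariance concern the abstract raises.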
We propose the randomized Generalized Approximate Cross Validation (ranGACV) method for choosing multiple smoothing parameters in penalized likelihood estimates for Bernoulli data. The method is intended for application with penalized likelihood smoothing spline ANOVA models. In addition we propose a class of approximate numerical methods for solving the …
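The "randomized" ingredient in methods of this kind is a stochastic trace estimator, which replaces an explicit influence matrix with a handful of matrix-vector products. A minimal sketch using Hutchinson's estimator on a generic symmetric matrix; the matrix and probe count are illustrative, not tied to the smoothing spline ANOVA setting.

```python
import numpy as np

# Hutchinson's randomized trace estimator: for symmetric A and a random probe
# eps with i.i.d. +/-1 entries, E[eps^T A eps] = tr(A).  Only matrix-vector
# products are needed, so the influence matrix never has to be formed.
def randomized_trace(matvec, n, n_probes=100, rng=None):
    rng = rng or np.random.default_rng()
    total = 0.0
    for _ in range(n_probes):
        eps = rng.choice([-1.0, 1.0], size=n)      # Rademacher probe vector
        total += eps @ matvec(eps)
    return total / n_probes

rng = np.random.default_rng(5)
n = 30
B = rng.standard_normal((n, n))
A = B @ B.T / n                                    # symmetric positive semidefinite test matrix

est = randomized_trace(lambda v: A @ v, n, n_probes=500, rng=rng)
exact = np.trace(A)
```

Averaging over a few hundred probes brings the estimate within a few percent of the exact trace, at a cost that scales with matrix-vector products rather than with forming the matrix.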