We present an efficient generalization of the sparse pseudo-input Gaussian process (SPGP) model developed by Snelson and Ghahramani [1], applying it to binary classification problems. By taking advantage of the SPGP prior covariance structure, we derive a numerically stable algorithm with O(NM²) training complexity, asymptotically the same as related…
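The O(NM²) cost in the abstract above comes from replacing the full N×N covariance with a low-rank form built from M pseudo-inputs. A minimal sketch of that structure, assuming an RBF kernel and toy problem sizes (neither is specified by the abstract):

```python
# Sketch of the low-rank ("pseudo-input") covariance structure behind the
# O(N M^2) training cost of sparse GP methods: with M inducing points the
# N x N covariance is approximated through two thin matrices.
# The RBF kernel and the sizes N, M are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
N, M = 500, 20                         # data points, pseudo-inputs (M << N)
X = rng.standard_normal((N, 1))        # training inputs
Xm = rng.standard_normal((M, 1))       # pseudo-inputs

def rbf(A, B, ell=1.0):
    # Squared-exponential kernel: k(a, b) = exp(-|a - b|^2 / (2 ell^2))
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ell**2)

Knm = rbf(X, Xm)                       # N x M cross-covariance
Kmm = rbf(Xm, Xm) + 1e-6 * np.eye(M)   # M x M, jittered for stability

# The approximation Q = Knm Kmm^{-1} Knm^T is never formed as an N x N
# matrix: the Cholesky solve and products below cost O(N M^2), not O(N^3).
V = np.linalg.solve(np.linalg.cholesky(Kmm), Knm.T)  # M x N
Q_diag = (V ** 2).sum(axis=0)          # diagonal of Q, length N
print(Q_diag.shape)
```

Keeping only thin factors like `V` is what makes the training complexity linear in N for fixed M.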
Dense surface models can be used to analyze 3D facial morphology by establishing a correspondence of thousands of points across each 3D face image. The models provide dramatic visualizations of 3D face-shape variation with potential for training physicians to recognize the key components of particular syndromes. We demonstrate their use to visualize and…
We show that the support vector machine (SVM) classification algorithm, a recent development from the machine learning community, proves its potential for structure-activity relationship analysis. In a benchmark test, the SVM is compared to several machine learning techniques currently used in the field. The classification task involves predicting the…
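For orientation, the maximum-margin classifier named above can be sketched from scratch as a linear SVM trained by stochastic subgradient descent (Pegasos-style). The toy data, hyperparameters, and the omission of a bias term are all illustrative assumptions, not details from the paper:

```python
# Minimal linear SVM via stochastic subgradient descent on the hinge loss
# (Pegasos-style). Toy separable data; hyperparameters are assumptions.
import random

random.seed(0)
# Label +1 if x0 + x1 > 0, else -1 (linearly separable through the origin).
data = [((x0, x1), 1 if x0 + x1 > 0 else -1)
        for x0, x1 in [(1.0, 1.2), (0.8, 0.5), (2.0, -0.5),
                       (-1.0, -1.1), (-0.7, -0.4), (-2.0, 0.5)]]

lam = 0.01            # L2 regularization strength
w = [0.0, 0.0]        # weight vector (no bias term, for brevity)

for t in range(1, 2001):
    x, y = random.choice(data)
    eta = 1.0 / (lam * t)                 # Pegasos step size schedule
    margin = y * (w[0] * x[0] + w[1] * x[1])
    if margin < 1:
        # Subgradient of hinge loss plus regularizer: shrink, then push.
        w = [(1 - eta * lam) * wi + eta * y * xi for wi, xi in zip(w, x)]
    else:
        w = [(1 - eta * lam) * wi for wi in w]

predict = lambda x: 1 if w[0] * x[0] + w[1] * x[1] > 0 else -1
errors = sum(predict(x) != y for x, y in data)
print(errors)
```

On this easy separable set the trained weights classify every training point correctly; real structure-activity benchmarks would of course use a kernelized SVM and held-out evaluation.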
The ability of connectionist networks to generalize is often cited as one of their most important properties. We analyze the generalization ability of the class of generalized single-layer networks (GSLNs), which includes Volterra networks, radial basis function networks, regularization networks, and the modified Kanerva model, using techniques based on the…
This article addresses the question of whether some recent Vapnik-Chervonenkis (VC) dimension-based bounds on sample complexity can be regarded as a practical design tool. Specifically, we are interested in bounds on the sample complexity for the problem of training a pattern classifier such that we can expect it to perform valid generalization. Early…
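For orientation, the sample-complexity bounds discussed above typically take the following well-known form (the PAC sufficient sample size of Blumer et al. for a hypothesis class of VC dimension d; quoted here as a representative example, not from the abstract itself):

```latex
m \;\ge\; \max\!\left(\, \frac{4}{\epsilon}\,\log_2\frac{2}{\delta},\;\;
                          \frac{8d}{\epsilon}\,\log_2\frac{13}{\epsilon} \,\right)
```

Here ε is the permitted error, δ the permitted failure probability, and m the number of training examples; the question raised in the article is whether constants and rates of this kind are tight enough to guide practical design.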
This paper concerns the use of real-valued functions for binary classification problems. Previous work in this area has concentrated on using as an error estimate the 'resubstitution' error (that is, the empirical error of a classifier on the training sample) or its derivatives. However, in practice, cross-validation and related techniques are more popular.
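The contrast between the resubstitution error and cross-validation is easy to see on a toy example; the data and the 1-nearest-neighbour classifier here are illustrative assumptions, chosen because they make resubstitution maximally optimistic:

```python
# Resubstitution error vs. leave-one-out cross-validation for a
# 1-nearest-neighbour classifier on toy 1-D data (illustrative only).
points = [(-2.0, 0), (-1.5, 0), (-1.0, 0), (1.0, 1), (1.5, 1), (2.0, 1), (0.1, 0)]

def nn_predict(train, x):
    # Label of the nearest training point to x.
    return min(train, key=lambda p: abs(p[0] - x))[1]

# Resubstitution: classify each training point using the full training set.
# With 1-NN every point is its own nearest neighbour, so the estimate is 0.
resub_err = sum(nn_predict(points, x) != y for x, y in points) / len(points)

# Leave-one-out cross-validation: hold each point out before classifying it.
loo_err = sum(
    nn_predict(points[:i] + points[i + 1:], x) != y
    for i, (x, y) in enumerate(points)
) / len(points)

print(resub_err, loo_err)
```

Resubstitution reports zero error regardless of how the classifier actually generalizes, while leave-one-out flags the ambiguous point near the class boundary, which is why cross-validation is the more trusted estimate in practice.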
The application of statistical physics to the study of the learning curves of feedforward connectionist networks has, to date, been concerned mostly with networks that do not include hidden layers. Recent work has extended the theory to networks such as committee machines and parity machines; however, these are not networks that are often used in practice…
When designing a pattern classifier it is often the case that we have available a supervised learning technique and a collection of training data, and we would like to gain some idea of what the error probability of our classifier will be after training. A popular way of approaching this problem in practice is to use some form of error estimate, one of the…