The paper introduces some generalizations of Vapnik's method of structural risk minimisation (SRM). As well as making explicit some of the details on SRM, it provides a result that allows one to…

We say a function t in a set H of {0, 1}-valued functions defined on a set X is specified by S ⊆ X if the only function in H which agrees with t on S is t itself. The specification number of t is the…
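The definition above can be illustrated with a brute-force check. This is a minimal sketch, not code from the paper; the threshold class H and the chosen points are hypothetical examples for illustration:

```python
def specifies(H, t, S):
    """Return True if S specifies t within H, i.e. the only function
    in H agreeing with t on every point of S is t itself.
    Functions are represented as dicts mapping points of X to {0, 1}."""
    agreeing = [h for h in H if all(h[x] == t[x] for x in S)]
    return agreeing == [t]

# Hypothetical example: X = {0, 1, 2}, H = threshold functions x >= k
X = [0, 1, 2]
H = [{x: int(x >= k) for x in X} for k in range(4)]  # thresholds k = 0..3
t = H[2]  # t(x) = 1 iff x >= 2

print(specifies(H, t, [1, 2]))  # True: points 1 and 2 pin t down uniquely
print(specifies(H, t, [1]))     # False: thresholds 2 and 3 both agree on {1}
```

Under this setup the specification number of t is 2: no single point distinguishes t from every other threshold function, but the pair {1, 2} does.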

Abstract. The Vapnik-Chervonenkis dimension has proven to be of great use in the theoretical study of generalization in artificial neural networks. The "probably approximately correct"…
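For small finite classes, the Vapnik-Chervonenkis dimension mentioned above can be computed directly from its definition (the largest set of points on which the class realises every possible labelling). This brute-force sketch is an illustration with a hypothetical threshold class, not an algorithm from the paper:

```python
from itertools import combinations

def shatters(H, points):
    """H is a list of {0,1}-valued functions (dicts over X). `points`
    is shattered if every labelling of it is realised by some h in H."""
    labellings = {tuple(h[x] for x in points) for h in H}
    return len(labellings) == 2 ** len(points)

def vc_dimension(H, X):
    """Largest d such that some d-subset of X is shattered (brute force)."""
    d = 0
    for k in range(1, len(X) + 1):
        if any(shatters(H, c) for c in combinations(X, k)):
            d = k
    return d

# Hypothetical example: threshold functions on X = {0, 1, 2, 3}
X = [0, 1, 2, 3]
H = [{x: int(x >= k) for x in X} for k in range(5)]
print(vc_dimension(H, X))  # 1: singletons are shattered, but no pair
```

Thresholds can label a pair (a, b) with a < b as (1, 1), (0, 1), or (0, 0), but never (1, 0), so no two-point set is shattered and the VC dimension here is 1.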

Some recent work [7, 14, 15] in computational learning theory has discussed learning in situations where the teacher is helpful, and can choose to present carefully chosen sequences of labelled…

A proof that a concept is learnable provided the Vapnik-Chervonenkis dimension is finite is given. The proof is more explicit than previous proofs and introduces two new parameters which allow bounds…

We propose a way of measuring the similarity of a Boolean vector to a given set of Boolean vectors, motivated in part by certain data mining or machine learning problems. We relate the similarity…

This paper surveys certain developments in the use of probabilistic techniques for the modelling of generalization in machine learning. Building on 'uniform convergence' results in probability…

A difference exists between parametric and nonparametric statistical tests. Parametric tests are only valid if the data satisfy certain assumptions. If these assumptions hold, they will, however, typically…