
- John Shawe-Taylor, Peter L. Bartlett, Robert C. Williamson, Martin Anthony
- IEEE Trans. Information Theory
- 1998

The paper introduces some generalizations of Vapnik's method of structural risk minimisation (SRM). As well as making explicit some of the details of SRM, it provides a result that allows one to trade off errors on the training sample against improved generalization performance. It then considers the more general case when the hierarchy of classes is… (More)

- Martin Anthony, John Shawe-Taylor
- Discrete Applied Mathematics
- 1993

A new proof of a result due to Vapnik is given. Its implications for the theory of PAC learnability are discussed, with particular reference to the learnability of functions taking values in a countable set. An application to the theory of artificial neural networks is then given.

Some recent work [7, 14, 15] in computational learning theory has discussed learning in situations where the teacher is helpful, and can choose to present carefully chosen sequences of labelled examples to the learner. We say a function *t* in a set *H* of functions (a hypothesis space) defined on a set *X* is… (More)

- Martin Anthony, Graham R. Brightwell, Colin Cooper
- Discrete Mathematics
- 1995

In this paper we investigate a parameter defined for any graph, known as the Vapnik-Chervonenkis dimension (or VC dimension). For any vertex x of a graph G, the closed neighbourhood N(x) of x is the set of all vertices of G adjacent to x, together with x. We say that a set D of vertices of G is shattered if every subset R of D can be realised as R = D ∩ N… (More)
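The definition in this abstract — shattering sets of vertices by closed neighbourhoods — can be checked by brute force on small graphs. The following sketch is illustrative only (the helper names and example graphs are not from the paper):

```python
from itertools import combinations

def closed_neighbourhoods(adj):
    # adj: dict mapping each vertex to its set of neighbours.
    # N(x) is the set of neighbours of x together with x itself.
    return [frozenset(nbrs | {v}) for v, nbrs in adj.items()]

def shattered(D, family):
    # D is shattered if every subset R of D arises as D ∩ N for some N.
    D = frozenset(D)
    traces = {D & N for N in family}
    return len(traces) == 2 ** len(D)

def vc_dimension(adj):
    # Largest size of a shattered vertex set (0 if even no singleton works).
    family = closed_neighbourhoods(adj)
    verts = list(adj)
    best = 0
    for k in range(1, len(verts) + 1):
        if any(shattered(c, family) for c in combinations(verts, k)):
            best = k
    return best

# Path 0-1-2: {0} is shattered (traces {0} and ∅), but no pair is.
p3 = {0: {1}, 1: {0, 2}, 2: {1}}
# Triangle K3: every closed neighbourhood is the full vertex set,
# so no subset of any singleton other than itself can be realised.
k3 = {0: {1, 2}, 1: {0, 2}, 2: {0, 1}}
```

On these examples the path has VC dimension 1 and the triangle has VC dimension 0, since shattering requires realising the empty trace as well.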

- Martin Anthony, Sean B. Holden
- Complex Systems
- 1994

The Vapnik-Chervonenkis dimension has proven to be of great use in the theoretical study of generalization in artificial neural networks. The "probably approximately correct" learning framework is described and the importance of the Vapnik-Chervonenkis dimension is illustrated. We then investigate the Vapnik-Chervonenkis dimension of… (More)

- Martin Anthony
- 2002

This paper surveys certain developments in the use of probabilistic techniques for the modelling of generalization in machine learning. Building on 'uniform convergence' results in probability theory, a number of approaches to the problem of quantifying generalization have been developed in recent years. Initially these models addressed binary… (More)

The paper introduces a framework for studying structural risk minimisation. The model views structural risk minimisation in a PAC context. It then considers the more general case when the hierarchy of classes is chosen in response to the data. This theoretically explains the impressive performance of the maximal margin hyperplane algorithm of Vapnik. It may… (More)

- Martin Anthony, Graham R. Brightwell, John Shawe-Taylor
- Discrete Applied Mathematics
- 1995

We say a function t in a set H of {0, 1}-valued functions defined on a set X is specified by S ⊆ X if the only function in H which agrees with t on S is t itself. The specification number of t is the least cardinality of such an S. For a general finite class of functions, we show that the specification number of any function in the class is at least equal… (More)
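The notion of a specification number defined in this abstract can be computed exhaustively for small finite classes. A minimal sketch, with a hypothetical hypothesis class (not one from the paper) chosen to show that specification numbers can differ sharply within one class:

```python
from itertools import combinations

def specifies(t, S, H):
    # S specifies t in H if t is the only function in H agreeing with t on S.
    return all(h == t or any(h[x] != t[x] for x in S) for h in H)

def specification_number(t, H, X):
    # Least cardinality of a subset S of X that specifies t.
    for k in range(len(X) + 1):
        for S in combinations(X, k):
            if specifies(t, S, H):
                return k

# Example class on X = {0, 1, 2}: the all-zero function plus the
# three "point indicator" functions (1 at exactly one point).
X = [0, 1, 2]
zero = {x: 0 for x in X}
points = [{x: int(x == v) for x in X} for v in X]
H = [zero] + points
```

Here each point indicator is pinned down by its single 1-valued point, while the all-zero function must be observed everywhere, since each indicator agrees with it off one point.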

- Martin Anthony
- Discrete Applied Mathematics
- 1995

Linear threshold functions (for real and Boolean inputs) have received much attention, for they are the component parts of many artificial neural networks. Linear threshold functions are exactly those functions such that the positive and negative examples are separated by a hyperplane. One extension of this notion is to allow separators to be surfaces whose… (More)
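The separation-by-hyperplane picture in this abstract, and the extension to non-linear separating surfaces, can be illustrated concretely. A small sketch (the examples are standard illustrations, not taken from the paper): AND is computable by a linear threshold function, while XOR is not linearly separable but is separated by a degree-2 polynomial surface:

```python
def linear_threshold(w, theta):
    # f(x) = 1 iff w · x >= theta, i.e. x lies on the positive side
    # of the hyperplane w · x = theta.
    return lambda x: 1 if sum(wi * xi for wi, xi in zip(w, x)) >= theta else 0

# Boolean AND and OR are linearly separable.
AND = linear_threshold((1, 1), 2)
OR = linear_threshold((1, 1), 1)

# XOR has no separating hyperplane, but the quadratic surface
# x1 + x2 - 2*x1*x2 = 1 separates its positive and negative examples.
XOR = lambda x: 1 if x[0] + x[1] - 2 * x[0] * x[1] >= 1 else 0
```

Replacing the linear form `w · x` with a polynomial in the inputs is one way to realise the abstract's "separators as surfaces" extension.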