
Highly Cited

2014

Machine learning is one of the fastest growing areas of computer science, with far-reaching applications. The aim of this…

2008

For the purposes of this lecture, we restrict ourselves to the binary supervised batch learning setting. We assume that we have…

Highly Cited

2004

We present a mechanism to train support vector machines (SVMs) with a hybrid kernel and minimal Vapnik-Chervonenkis (VC…

Highly Cited

2002

Preface: These notes have served as a basis for a course in Pisa in Spring 1999. A parallel course on the construction of o…

Highly Cited

1999

From the Publisher: This monograph provides a thorough and coherent introduction to the mathematical properties of feedforward…

Highly Cited

1999

This thesis examines so-called folding neural networks as a mechanism for machine learning. Folding networks form a…

Highly Cited

1998

1. Some elementary results 2. Semialgebraic sets 3. Cell decomposition 4. Definable invariants: Dimension and Euler…

1998

Bayesian algorithms for neural networks are known to produce classifiers which are very resistant to overfitting. It is often…

Highly Cited

1995

Let V ⊆ {0,1}^n have Vapnik-Chervonenkis dimension d. Let M(k/n, V) denote the cardinality of the largest W ⊆…

Highly Cited

1992

Abstract: This paper deals with single-hidden-layer feedforward nets, studying various aspects of classification power and…