Angus Macintyre

We introduce a new method for proving explicit upper bounds on the VC Dimension of general functional basis networks, and prove as an application, for the first time, that the VC Dimension of analog neural networks with the sigmoidal activation function σ(y) = 1/(1 + e^{-y}) is bounded by a quadratic polynomial O((lm)^2) in both the number l of programmable parameters …
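As an illustration only (not part of the abstract), here is a minimal Python sketch of the logistic sigmoid activation and a bound of the stated quadratic shape O((lm)^2); the function names, the constant C, and the reading of m as a second network-size measure are placeholders, not the quantities established in the paper.

import math

def sigmoid(y: float) -> float:
    # Standard logistic sigmoid: sigma(y) = 1 / (1 + e^(-y)).
    return 1.0 / (1.0 + math.exp(-y))

def vc_dim_upper_bound(l: int, m: int, C: float = 1.0) -> float:
    # Illustrative bound of the form C * (l * m)^2, where l counts the
    # programmable parameters and m is assumed to be a second size measure
    # of the network (e.g., its number of nodes); C is a placeholder constant.
    return C * (l * m) ** 2

# Example: a network with l = 20 programmable parameters and m = 5 nodes.
print(vc_dim_upper_bound(20, 5))  # 10000.0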
Using relativizations of results of Goncharov and Peretyat'kin on decidable homogeneous models, we prove that if M is S-saturated for some Scott set S, and F is an enumeration of S, then M has a presentation recursive in F. Applying this result we are able to classify degrees coding (i) the reducts of models of PA to addition or multiplication, (ii) …
Techniques from differential topology are used to give polynomial bounds for the VC-dimension of sigmoidal neural networks. The bounds are quadratic in w, the dimension of the space of weights. Similar results are obtained for a wide class of Pfaffian activation functions. The obstruction (in differential topology) to improving the bound to an optimal bound O(…)
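For orientation (not part of the abstract), the bound described has the shape shown below, where c is a placeholder constant; the sigmoid is the prototypical Pfaffian activation because it satisfies a polynomial first-order differential equation:

\mathrm{VCdim}(\mathcal{N}) \le c\,w^{2}, \qquad \sigma'(y) = \sigma(y)\bigl(1 - \sigma(y)\bigr).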