Héctor J. Sussmann

We show that, for feedforward nets with a single hidden layer, a single output node, and the transfer function tanh, the net is uniquely determined by its input-output map, up to an obvious finite group of symmetries (permutations of the hidden nodes, and changing the sign of all the weights associated with a particular hidden node), provided that the net …
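The two symmetries named in this abstract are easy to verify numerically. The sketch below (plain NumPy, with arbitrarily chosen weights, not taken from the paper) checks that permuting the hidden nodes, or flipping the sign of every weight attached to one hidden node, leaves the input-output map of a tanh network unchanged, since tanh is odd:

```python
import numpy as np

rng = np.random.default_rng(0)

def net(x, W, b, v):
    # Single hidden layer, single output: y = sum_i v_i * tanh(w_i . x + b_i)
    return np.tanh(x @ W.T + b) @ v

# A small net with 3 hidden nodes and 2 inputs (weights chosen arbitrarily).
W = rng.normal(size=(3, 2))
b = rng.normal(size=3)
v = rng.normal(size=3)

x = rng.normal(size=(5, 2))  # a batch of test inputs
y = net(x, W, b, v)

# Symmetry 1: permute the hidden nodes.
p = [2, 0, 1]
y_perm = net(x, W[p], b[p], v[p])

# Symmetry 2: flip every weight attached to hidden node 0.
# Since tanh is odd, (-v0) * tanh(-w0.x - b0) = v0 * tanh(w0.x + b0).
W2, b2, v2 = W.copy(), b.copy(), v.copy()
W2[0], b2[0], v2[0] = -W2[0], -b2[0], -v2[0]
y_flip = net(x, W2, b2, v2)

assert np.allclose(y, y_perm) and np.allclose(y, y_flip)
```

The uniqueness theorem says these are, under the paper's hypotheses, the only ways two such nets can share an input-output map.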
In the theory of finite dimensional linear systems, it is well known that every input-output map that can be realized by one such system can also be realized by a system which is “minimal”, i.e. both controllable and observable. Moreover, the minimal realization of a given map is unique up to isomorphism. It is shown here that similar results hold for the …
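For the linear case, "controllable and observable" is checked by the Kalman rank conditions. A minimal sketch in NumPy (the matrices are illustrative, not drawn from the paper) compares a 2-state minimal realization with a padded 3-state realization of the same map:

```python
import numpy as np

def ctrb(A, B):
    # Kalman controllability matrix [B, AB, ..., A^(n-1) B].
    n = A.shape[0]
    cols = [B]
    for _ in range(n - 1):
        cols.append(A @ cols[-1])
    return np.hstack(cols)

def is_minimal(A, B, C):
    n = A.shape[0]
    controllable = np.linalg.matrix_rank(ctrb(A, B)) == n
    observable = np.linalg.matrix_rank(ctrb(A.T, C.T)) == n  # by duality
    return controllable and observable

# A 2-state realization that is both controllable and observable ...
A = np.array([[0.0, 1.0], [-2.0, -3.0]])
B = np.array([[0.0], [1.0]])
C = np.array([[1.0, 0.0]])
print(is_minimal(A, B, C))  # True

# ... and a 3-state realization of the same input-output map whose
# extra mode is unreachable (so the realization is not minimal).
A3 = np.block([[A, np.zeros((2, 1))], [np.zeros((1, 2)), np.array([[-5.0]])]])
B3 = np.vstack([B, [[0.0]]])
C3 = np.hstack([C, [[0.0]]])
print(is_minimal(A3, B3, C3))  # False
```

The abstract's point is that an analogue of this minimality-and-uniqueness picture carries over beyond the linear finite-dimensional setting.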
We illustrate the use of the techniques of modern geometric optimal control theory by studying the shortest paths for a model of a car that can move forwards and backwards. This problem was discussed in recent work by Reeds and Shepp who showed, by special methods, (a) that shortest path motion could always be achieved by means of trajectories of a special …
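The car model underlying the Reeds-Shepp problem can be simulated directly from its kinematics. A hedged sketch (simple Euler integration; the segment sequence below is purely illustrative and is not claimed to be an optimal path):

```python
import math

def integrate(pose, u1, u2, T, dt=1e-3):
    # Kinematic car: x' = u1*cos(th), y' = u1*sin(th), th' = u1*u2,
    # with u1 in {-1, +1} (reverse/forward) and |u2| <= 1 (steering).
    x, y, th = pose
    for _ in range(int(round(T / dt))):
        x += u1 * math.cos(th) * dt
        y += u1 * math.sin(th) * dt
        th += u1 * u2 * dt
    return (x, y, th)

# A Reeds-Shepp-style concatenation of arcs and a reversal:
# forward left arc, reverse straight, forward right arc.
pose = (0.0, 0.0, 0.0)
for u1, u2, T in [(1, 1, math.pi / 2), (-1, 0, 1.0), (1, -1, math.pi / 2)]:
    pose = integrate(pose, u1, u2, T)
print(pose)
```

Reeds and Shepp's result (a) says that paths built from finitely many such arc and straight-line segments, with reversals, suffice for shortest-path motion.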
We propose a definition of “regular synthesis,” more general than those suggested by other authors such as Boltyanskii and Brunovský, and an even more general notion of “regular presynthesis.” We give a complete proof of the corresponding sufficiency theorem, a slightly weaker version of which had been stated in an earlier article, with only a rough outline …
A universal input is an input u with the property that, whenever two states give rise to a different output for some input, then they give rise to a different output for u. For an observable system, u is universal if the initial state can be reconstructed from the knowledge of the output for u. It is shown that, for continuous-time analytic systems, analytic …
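For linear systems the phenomenon is easy to see concretely: the output difference between two trajectories driven by the same input depends only on the difference of initial states, not on the input itself, so for an observable linear system every input is universal. An illustrative NumPy sketch (the discrete-time system here is an arbitrary assumption, chosen to be observable):

```python
import numpy as np

A = np.array([[0.0, 1.0], [-2.0, -3.0]])
B = np.array([[0.0], [1.0]])
C = np.array([[1.0, 0.0]])

def outputs(x0, u_seq):
    # Run the discrete-time system x+ = A x + B u, y = C x.
    x, ys = x0.copy(), []
    for u in u_seq:
        ys.append((C @ x).item())
        x = A @ x + (B * u).ravel()
    return np.array(ys)

rng = np.random.default_rng(1)
x1, x2 = np.array([1.0, 0.0]), np.array([0.0, 1.0])
u = rng.normal(size=4)  # an arbitrary input sequence

y1, y2 = outputs(x1, u), outputs(x2, u)
# The difference y1 - y2 equals C A^k (x1 - x2), independent of u,
# so this arbitrary input already separates the two states.
assert not np.allclose(y1, y2)
assert np.allclose(y1 - y2, outputs(x1, np.zeros(4)) - outputs(x2, np.zeros(4)))
```

The abstract's result concerns the much harder nonlinear analytic case, where universality of an input is a genuine restriction.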
We give an example of a neural net without hidden layers and with a sigmoid transfer function, together with a training set of binary vectors, for which the sum of the squared errors, regarded as a function of the weights, has a local minimum which is not a global minimum. The example consists of a set of 125 training instances, with four weights and a …
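The paper's specific 125-instance counterexample is not reproduced here, but the objects involved are simple to set up. A sketch of the squared-error function for a no-hidden-layer sigmoid net on binary data, with plain gradient descent (the data, targets, and step size are all illustrative assumptions, not the paper's construction):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sse(w, X, t):
    # Sum of squared errors of a net with no hidden layer: y = sigmoid(X w).
    return np.sum((sigmoid(X @ w) - t) ** 2)

def grad(w, X, t):
    # Gradient of sse: 2 X^T [(y - t) * y * (1 - y)].
    y = sigmoid(X @ w)
    return 2.0 * X.T @ ((y - t) * y * (1.0 - y))

rng = np.random.default_rng(0)
X = rng.integers(0, 2, size=(20, 4)).astype(float)  # binary training vectors
t = rng.integers(0, 2, size=20).astype(float)       # binary targets

w = rng.normal(size=4)
e0 = sse(w, X, t)
for _ in range(500):            # plain gradient descent
    w -= 0.05 * grad(w, X, t)
assert sse(w, X, t) < e0        # descent lowers the error from this start
```

Gradient descent on this surface converges to some critical point; the paper's contribution is exhibiting a training set for which such a point can be a genuinely non-global local minimum, even without hidden layers.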
This is the first of two papers devoted to recent ideas on the theory of generalized differentials with good open mapping properties. Here we will discuss “generalized differentiation theories” (abbr. GDTs), with special emphasis on the series of developments initiated by Jack Warga’s pioneering work on derivate containers. In the second paper, we will …