Many connectionist approaches to musical expectancy and music composition let the question of “What next?” overshadow the equally important question of “When next?”. One cannot escape the latter…

This paper explores the effect of initial weight selection on feed-forward networks learning simple functions with the back-propagation technique. We first demonstrate, through the use of Monte Carlo…
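The kind of Monte Carlo experiment over initial weights described here can be sketched as follows. This is an illustrative setup only: the XOR task, the 2-2-1 sigmoid architecture, the learning rate, the weight-scale values, and the convergence threshold are all assumptions of this sketch, not the paper's actual protocol.

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def forward(W1, W2, x1, x2):
    inp = (x1, x2, 1.0)                    # constant 1.0 serves as the bias input
    h = [sigmoid(sum(w * i for w, i in zip(row, inp))) for row in W1]
    y = sigmoid(sum(w * a for w, a in zip(W2, h + [1.0])))
    return inp, h, y

def train_xor(seed, scale, epochs=3000, lr=0.5):
    """Plain back-propagation on XOR with a 2-2-1 sigmoid net.
    `scale` sets the range of the uniform random initial weights,
    the variable a Monte Carlo study of initialization sweeps over."""
    rng = random.Random(seed)
    W1 = [[rng.uniform(-scale, scale) for _ in range(3)] for _ in range(2)]
    W2 = [rng.uniform(-scale, scale) for _ in range(3)]
    data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]
    for _ in range(epochs):
        for (x1, x2), t in data:
            inp, h, y = forward(W1, W2, x1, x2)
            dy = (y - t) * y * (1 - y)                    # output-unit delta
            dh = [dy * W2[j] * h[j] * (1 - h[j]) for j in range(2)]
            for j, a in enumerate(h + [1.0]):
                W2[j] -= lr * dy * a
            for j in range(2):
                for k in range(3):
                    W1[j][k] -= lr * dh[j] * inp[k]
    return sum((forward(W1, W2, x1, x2)[2] - t) ** 2
               for (x1, x2), t in data) / 4.0

# Monte Carlo: fraction of random initializations that converge, per scale
for scale in (0.1, 1.0, 4.0):
    ok = sum(train_xor(seed, scale) < 0.01 for seed in range(10))
    print(f"init scale {scale}: {ok}/10 runs converged")
```

Runs that end with a high mean squared error are stuck in local minima or symmetric configurations; tallying them across many seeds per initial-weight scale is the essence of such a Monte Carlo study.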

Several recurrent networks have been proposed as representations for the task of formal language learning. After training a recurrent network to recognize a formal language or predict the next symbol of…
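The next-symbol prediction task can be made concrete with a minimal simple-recurrent (Elman-style) network; gradients are truncated at the context layer, as in Elman's original scheme. Everything here is a toy assumption of this sketch: real experiments in this literature use richer languages (e.g., Reber-style grammars), whereas this network only learns the trivially alternating language (ab)*.

```python
import math
import random

SYMS = "ab"
ONE_HOT = {"a": (1.0, 0.0), "b": (0.0, 1.0)}

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

class SRN:
    """Minimal Elman-style simple recurrent network for next-symbol
    prediction: input + previous hidden state (context) feed the hidden
    layer; gradients do not flow back through the recurrent link."""
    def __init__(self, seed, nh=3):
        rng = random.Random(seed)
        self.nh = nh
        ni = len(SYMS) + nh + 1                  # input + context + bias
        self.W1 = [[rng.uniform(-0.5, 0.5) for _ in range(ni)]
                   for _ in range(nh)]
        self.W2 = [[rng.uniform(-0.5, 0.5) for _ in range(nh + 1)]
                   for _ in range(len(SYMS))]
        self.ctx = [0.5] * nh

    def step(self, sym):
        inp = list(ONE_HOT[sym]) + self.ctx + [1.0]
        h = [sigmoid(sum(w * x for w, x in zip(row, inp))) for row in self.W1]
        out = [sigmoid(sum(w * a for w, a in zip(row, h + [1.0])))
               for row in self.W2]
        self.ctx = h                              # context = last hidden state
        return inp, h, out

    def train(self, text, epochs=200, lr=1.0):
        for _ in range(epochs):
            self.ctx = [0.5] * self.nh
            for cur, nxt in zip(text, text[1:]):  # predict each next symbol
                inp, h, out = self.step(cur)
                tgt = ONE_HOT[nxt]
                dout = [(out[i] - tgt[i]) * out[i] * (1 - out[i])
                        for i in range(len(SYMS))]
                dh = [sum(dout[i] * self.W2[i][j] for i in range(len(SYMS)))
                      * h[j] * (1 - h[j]) for j in range(self.nh)]
                for i in range(len(SYMS)):
                    for j, a in enumerate(h + [1.0]):
                        self.W2[i][j] -= lr * dout[i] * a
                for j in range(self.nh):
                    for k, x in enumerate(inp):
                        self.W1[j][k] -= lr * dh[j] * x

net = SRN(seed=0)
net.train("ab" * 50)
net.ctx = [0.5] * net.nh
_, _, out = net.step("a")
print("score for next='b' given 'a':", round(out[1], 2))  # typically near 1
```

Inspecting the trained hidden states (e.g., clustering them) is the usual route from such a network to an extracted finite-state description of the learned language.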

This paper addresses the problem of algorithm discovery, via evolutionary search, in the context of matrix multiplication. The traditional multiplication algorithm requires O(n³) multiplications for…
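The O(n³) baseline, and the kind of cheaper bilinear scheme such a search hunts for, can be shown concretely. Strassen's classical 2×2 construction (7 multiplications instead of 8) stands in here for a discovered algorithm; it is not claimed to be the paper's result.

```python
def matmul(A, B):
    """Standard O(n^3) matrix multiplication; also returns the number of
    scalar multiplications performed (n^3 for square n x n inputs)."""
    n, m, p = len(A), len(B), len(B[0])
    C = [[0] * p for _ in range(n)]
    mults = 0
    for i in range(n):
        for k in range(m):
            for j in range(p):
                C[i][j] += A[i][k] * B[k][j]
                mults += 1
    return C, mults

def strassen2x2(A, B):
    """Strassen's 2x2 scheme: 7 multiplications instead of 8, the kind of
    bilinear identity an evolutionary search over algorithms aims to find."""
    (a, b), (c, d) = A
    (e, f), (g, h) = B
    m1 = (a + d) * (e + h)
    m2 = (c + d) * e
    m3 = a * (f - h)
    m4 = d * (g - e)
    m5 = (a + b) * h
    m6 = (c - a) * (e + f)
    m7 = (b - d) * (g + h)
    return [[m1 + m4 - m5 + m7, m3 + m5],
            [m2 + m4, m1 - m2 + m3 + m6]]

A, B = [[1, 2], [3, 4]], [[5, 6], [7, 8]]
C, mults = matmul(A, B)
print(C, "with", mults, "multiplications")        # 8 multiplications
print(strassen2x2(A, B), "with 7 multiplications")
```

Applied recursively to block matrices, the 7-multiplication identity drops the exponent from 3 to log₂7 ≈ 2.807, which is why saving even one multiplication at the 2×2 level matters.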

The author proves that the learning problem in connectionist networks is NP-complete, i.e., no polynomial-time algorithm exists that will correctly modify the connection weights of a neural network…

From the many possible perspectives in which an agent may be viewed, behavior-based AI selects observable actions as a particularly useful level of description. Yet behavior is clearly not structure,…

Hybrid systems possess continuous dynamics defined within regions of state spaces and discrete transitions among the regions. Many practical control verification and synthesis tasks can be reduced to…
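The combination of continuous flow within a region and discrete jumps between regions can be illustrated with the textbook bouncing-ball hybrid system; this example and all of its parameters are generic illustrations, not taken from the paper.

```python
def simulate_bounce(h0=10.0, e=0.5, g=9.81, dt=0.001, v_stop=0.5, t_end=5.0):
    """Hybrid-system sketch: continuous free-fall flow inside one region
    (ball above ground) plus a discrete transition (impact) whenever the
    guard h <= 0 is crossed.  `v_stop` cuts off the Zeno tail of ever
    smaller bounces, a standard simulation workaround."""
    h, v, t = h0, 0.0, 0.0
    bounces = 0
    while t < t_end:
        v -= g * dt              # continuous flow: dv/dt = -g
        h += v * dt              #                  dh/dt = v
        if h <= 0.0 and v < 0.0:
            if -v < v_stop:      # impact too weak: stop instead of Zeno-looping
                break
            h, v = 0.0, -e * v   # discrete jump: velocity reversed, damped by e
            bounces += 1
        t += dt
    return bounces

print("bounces before cutoff:", simulate_bounce())
```

Verification questions for such systems ("can the state ever enter a bad region?") must reason about both the differential equations inside each region and the reset maps on the transitions, which is what makes the reductions mentioned above valuable.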

In back-propagation (Rumelhart et al., 1985), connection weights are used both to compute node activations and to propagate error gradients back to the hidden units. Grossberg (1987) has argued that the dual use of the…
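The dual use at issue can be seen directly in the back-propagation equations: the same weight matrix carries activation forward and error backward. The network size, inputs, and target below are arbitrary choices for illustration.

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# The SAME vector W2 appears twice below: forward, to compute the output
# activation, and backward, to assign error to the hidden units -- the
# dual use of weights that Grossberg criticized as biologically implausible.
rng = random.Random(0)
W1 = [[rng.uniform(-1, 1) for _ in range(2)] for _ in range(3)]  # 3 hidden, 2 inputs
W2 = [rng.uniform(-1, 1) for _ in range(3)]                      # 1 output unit

x = (0.3, 0.7)
h = [sigmoid(sum(w * xi for w, xi in zip(row, x))) for row in W1]  # forward use of W1
y = sigmoid(sum(w2 * hj for w2, hj in zip(W2, h)))                 # forward use of W2

t = 1.0
dy = (y - t) * y * (1 - y)
# Backward use of W2: the hidden-unit error gradients travel back through
# the very same weights that carried the activations forward.
dh = [dy * W2[j] * h[j] * (1 - h[j]) for j in range(3)]
print("hidden deltas:", [round(d, 4) for d in dh])
```

A separate-pathway scheme would compute `dh` through a second, independently maintained set of backward weights rather than reusing `W2`.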