
- A C C Coolen
- 2005

It is shown how the generating functional method of De Dominicis can be used to solve the dynamics of the original version of the minority game (MG), in which agents observe real as opposed to fake market histories. Here one again finds exact closed equations for correlation and response functions, but now these are defined in terms of two connected…

Present neural models of classical conditioning all suffer from the same shortcoming: local representation of information (therefore, very precise neural prewiring is necessary). As an alternative we develop two neural models of classical conditioning which rely on distributed representations of information. Both models are of the Hopfield type. In the…
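To make the Hopfield-type setup concrete, here is a minimal sketch of distributed storage: patterns are stored collectively in Hebbian couplings rather than in dedicated units, and a corrupted cue is cleaned up by the network dynamics. All sizes, the seed, and the update rule are illustrative assumptions, not the authors' conditioning model:

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 100, 5                                  # neurons, stored patterns
patterns = rng.choice([-1, 1], size=(P, N))    # random binary patterns

# Hebbian couplings: every pattern is spread over all N^2 weights
J = (patterns.T @ patterns) / N
np.fill_diagonal(J, 0)                         # no self-interaction

def recall(state, sweeps=10):
    """Asynchronous deterministic dynamics: s_i <- sign(sum_j J_ij s_j)."""
    s = state.copy()
    for _ in range(sweeps):
        for i in rng.permutation(N):
            s[i] = 1 if J[i] @ s >= 0 else -1
    return s

# Corrupt 10% of pattern 0, then let the dynamics restore it
noisy = patterns[0].copy()
noisy[rng.choice(N, size=10, replace=False)] *= -1
overlap = recall(noisy) @ patterns[0] / N      # 1.0 means perfect recall
```

Because the information is distributed, flipping a few units degrades recall gracefully instead of destroying a specific memory.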

A simple model of coupled dynamics of fast neurons and slow interactions, modelling self-organization in recurrent neural networks, leads naturally to an effective statistical mechanics characterized by a partition function which is an average over a replicated system. This is reminiscent of the replica trick used to study spin-glasses, but with the…

We generate new mathematical tools with which to quantify the macroscopic topological structure of large directed networks. This is achieved via a statistical mechanical analysis of constrained maximum entropy ensembles of directed random graphs with prescribed joint distributions for in- and out-degrees and prescribed degree–degree correlation functions. We…
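The basic objects here are directed graphs constrained to given in- and out-degree sequences. The sketch below samples such a graph by random stub matching — a simple stand-in for (not an implementation of) the constrained maximum-entropy ensembles analysed in the paper; the degree sequences are made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
out_deg = [2, 1, 1, 2]          # prescribed out-degrees of nodes 0..3
in_deg  = [1, 2, 2, 1]          # prescribed in-degrees; sums must match
assert sum(out_deg) == sum(in_deg)

# One "stub" per unit of degree; pairing a random permutation of
# in-stubs with the out-stubs yields a directed edge list.
out_stubs = [i for i, d in enumerate(out_deg) for _ in range(d)]
in_stubs  = [j for j, d in enumerate(in_deg) for _ in range(d)]
rng.shuffle(in_stubs)
edges = list(zip(out_stubs, in_stubs))   # may contain self-loops/multi-edges

# Verify the prescribed degrees are realized exactly
A = np.zeros((4, 4), dtype=int)
for i, j in edges:
    A[i, j] += 1
row_deg, col_deg = A.sum(axis=1), A.sum(axis=0)
```

Naive stub matching can produce self-loops and multiple edges; the maximum-entropy ensembles in the paper impose the degree and correlation constraints in a controlled way instead.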

We study the dynamics of a simple message-passing decoder for LDGM channel coding by using the generating functional analysis (GFA). The decoder addressed here is one of the simplest examples, which is characterized by a sparse random graph with many short loops. The GFA allows us to study the dynamics of iterative systems in an exact way in the large…

We describe the application of tools from statistical mechanics to analyse the dynamics of various classes of supervised learning rules in perceptrons. The character of this paper is mostly that of a cross between a biased non-encyclopedic review and lecture notes: we try to present a coherent and self-contained picture of the basics of this field, to…

We derive analytical expressions for the connections of large perceptrons, by studying the fixed points of the perceptron learning rule. If the training set consists of all possible input vectors, we can calculate (for large systems) the connections as a series expansion in the system size. The leading term in this expansion turns out to be either the Hebb…
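The paper's series expansion is a large-system result; the toy sketch below just runs the standard perceptron learning rule on the full input set {-1,1}^N for a tiny system until it reaches a fixed point. Note that the very first sweep (starting from J = 0) accumulates exactly the Hebb connections, matching the leading term mentioned above. The teacher, sizes, and seed are illustrative assumptions:

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(3)
N = 4
B = rng.standard_normal(N)                      # teacher weights
X = np.array(list(product([-1, 1], repeat=N)))  # all 2^N input vectors
y = np.sign(X @ B)                              # teacher labels

J = np.zeros(N)                                 # student connections
for _ in range(500):                            # sweep until fixed point
    updated = False
    for x, t in zip(X, y):
        if np.sign(J @ x) != t:                 # misclassified input
            J += t * x / N                      # perceptron update
            updated = True
    if not updated:                             # fixed point reached:
        break                                   # all inputs classified

accuracy = np.mean(np.sign(X @ J) == y)
```

At a fixed point no input triggers an update, so the final J classifies the entire training set correctly; the paper characterizes these fixed points analytically for large N.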

- H C Rae, P Sollich, A C C Coolen
- 1999

We solve the dynamics of on-line Hebbian learning in large perceptrons exactly, for the regime where the size of the training set scales linearly with the number of inputs. We consider both noiseless and noisy teachers. Our calculation cannot be extended to non-Hebbian rules, but the solution provides a convenient and welcome benchmark with which to test…
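A minimal simulation of the on-line Hebbian rule with a noiseless teacher, of the kind such exact solutions can be benchmarked against: the student updates J ← J + (η/N)·σ·x with σ = sign(B·x) the teacher's label, and its alignment with the teacher grows with training. Parameter values and the seed are illustrative, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(2)
N, steps, eta = 200, 4000, 1.0
B = rng.standard_normal(N)
B /= np.linalg.norm(B)           # noiseless teacher vector
J = np.zeros(N)                  # student starts from scratch

for _ in range(steps):
    x = rng.standard_normal(N)   # fresh random input (on-line regime)
    sigma = np.sign(B @ x)       # teacher's label
    J += (eta / N) * sigma * x   # Hebbian update, no error signal used

# Normalized teacher-student overlap; approaches 1 as training proceeds
rho = (J @ B) / (np.linalg.norm(J) * np.linalg.norm(B))
```

The rule never consults the student's own output, which is what makes the dynamics exactly solvable — and also why the calculation does not carry over to error-driven (non-Hebbian) rules.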