Anthony C. C. Coolen

Randomizing networks using a naive "accept-all" edge-swap algorithm is generally biased. Building on recent results for nondirected graphs, we construct an ergodic detailed balance Markov chain with nontrivial acceptance probabilities for directed graphs, which converges to a strictly uniform measure and is based on edge swaps that conserve all in- and out-degrees …
There is currently great interest in determining physical parameters, e.g. fluorescence lifetime, of individual molecules that inform on environmental conditions, whilst avoiding the artefacts of ensemble averaging. Protein interactions, molecular dynamics and sub-species can all be studied. In a burst integrated fluorescence lifetime (BIFL) experiment, …
A simple model of coupled dynamics of fast neurons and slow interactions, modelling self-organization in recurrent neural networks, leads naturally to an effective statistical mechanics characterized by a partition function which is an average over a replicated system. This is reminiscent of the replica trick used to study spin-glasses, but with the …
We generate new mathematical tools with which to quantify the macroscopic topological structure of large directed networks. This is achieved via a statistical mechanical analysis of constrained maximum entropy ensembles of directed random graphs with prescribed joint distributions for in- and out-degrees and prescribed degree–degree correlation functions. We …
We study the dynamics of a simple message-passing decoder for LDGM channel coding by using the generating functional analysis (GFA). The decoder addressed here is one of the simplest examples, which is characterized by a sparse random graph with many short loops. The GFA allows us to study the dynamics of iterative systems in an exact way in the large …
We present exact analytical equilibrium solutions for a class of recurrent neural network models, with both sequential and parallel neuronal dynamics, in which there is a tunable competition between nearest-neighbour and long-range synaptic interactions. This competition is found to induce novel coexistence phenomena as well as discontinuous transitions …
We solve the dynamics of on-line Hebbian learning in perceptrons exactly, for the regime where the size of the training set scales linearly with the number of inputs. We consider both noiseless and noisy teachers. Our calculation cannot be extended to non-Hebbian rules, but the solution provides a nice benchmark to test more general and advanced theories …
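The setting above can be illustrated with a minimal numerical sketch, assuming a noiseless teacher perceptron and a fixed training set of p = αN binary examples (the regime where the training-set size scales linearly with the number of inputs). All names and parameter values here are illustrative; this is a toy simulation, not the paper's exact analytical solution:

```python
import numpy as np

def online_hebbian(n_inputs=100, alpha=2.0, eta=0.1, seed=0):
    """Toy on-line Hebbian learning from a noiseless teacher perceptron.

    A fixed training set of p = alpha * n_inputs examples is stored,
    so the training-set size scales linearly with the input dimension.
    Each step draws one stored example at random and applies the
    Hebbian update w += (eta / N) * label * input.
    Returns the normalized overlap between student and teacher weights.
    """
    rng = np.random.default_rng(seed)
    N = n_inputs
    p = int(alpha * N)
    teacher = rng.standard_normal(N)                # teacher weight vector
    xs = rng.choice([-1.0, 1.0], size=(p, N))       # fixed training examples
    labels = np.sign(xs @ teacher)                  # noiseless teacher outputs
    w = np.zeros(N)
    for _ in range(20 * p):                         # many sweeps over the set
        i = rng.integers(p)                         # random stored example
        w += (eta / N) * labels[i] * xs[i]          # Hebbian update
    return (w @ teacher) / (np.linalg.norm(w) * np.linalg.norm(teacher))
```

Because the Hebbian update only ever adds label-weighted inputs, the student weight vector converges in direction to the Hebbian sum over the stored examples; the overlap with the teacher is limited by the finite training set, which is what makes this restricted regime a useful benchmark case.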
We describe the application of tools from statistical mechanics to analyse the dynamics of various classes of supervised learning rules in perceptrons. The character of this paper is mostly that of a cross between a biased non-encyclopedic review and lecture notes: we try to present a coherent and self-contained picture of the basics of this field, to …