Richard E. L. Metzler

Several scenarios of interacting neural networks, trained either identically or competitively, are solved analytically. In the case of identical training, each perceptron receives the output of its neighbor. The symmetry of the stationary state as well as the sensitivity to the training algorithm used are investigated. Two competitive …
Complex bit sequences generated by a perceptron that learns the opposite of its own prediction are studied, and the success of a student perceptron trained on this sequence is calculated. A system of interacting perceptrons with a directed flow of information is solved analytically. A symmetry-breaking phase transition is found with increasing learning …
A perceptron that "learns" the opposite of its own output is used to generate a time series. We analyze properties of the weight vector and the generated sequence, such as the cycle length and the probability distribution of generated sequences. A remarkable suppression of the autocorrelation function is explained, and connections to the Bernasconi model …
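The "confused" generator described above can be sketched in a few lines: a perceptron predicts the next bit from a window of its own recent outputs, then updates its weights *against* that prediction (an anti-Hebbian step). The function and parameter names below are illustrative assumptions, not taken from the papers; this is a minimal sketch of the general mechanism, not the authors' exact setup.

```python
import numpy as np

def antipredictable_sequence(n_bits, window=10, eta=1.0, seed=0):
    """Sketch of a bit generator driven by a perceptron that learns
    the opposite of its own prediction (names are illustrative)."""
    rng = np.random.default_rng(seed)
    w = rng.standard_normal(window)            # perceptron weights
    x = rng.choice([-1.0, 1.0], size=window)   # window of past bits
    bits = []
    for _ in range(n_bits):
        s = 1 if w @ x >= 0 else -1            # perceptron's prediction
        # anti-Hebbian update: push the weights away from the prediction
        w -= (eta / window) * s * x
        bits.append(s)
        x = np.roll(x, 1)                      # shift the window and
        x[0] = s                               # feed the output back in
    return bits
```

Because each update moves the weights against the emitted bit, the generator tends to avoid repeating its own patterns, which is the intuition behind the suppressed autocorrelation function mentioned above.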
In a recent publication [Phys. Rev. E 63, 047201 (2001)], Ausloos and Ivanova report power-law probability distributions, fractal properties, and antipersistent long-range correlations in the southern oscillation index. As a comparison with artificial short-range correlated data shows, most of these findings are possibly due to misleading interpretation of …