Edge of chaos and prediction of computational performance for neural circuit models

@article{Legenstein2007EdgeOC,
  title={Edge of chaos and prediction of computational performance for neural circuit models},
  author={R. Legenstein and W. Maass},
  journal={Neural Networks},
  year={2007},
  volume={20},
  number={3},
  pages={323--334}
}
  • R. Legenstein, W. Maass
  • Published 2007
  • Computer Science, Medicine
  • Neural networks : the official journal of the International Neural Network Society
We analyze in this article the significance of the edge of chaos for real-time computations in neural microcircuit models consisting of spiking neurons and dynamic synapses. We find that the edge of chaos predicts quite well those values of circuit parameters that yield maximal computational performance. But obviously it makes no prediction of their computational performance for other parameter values. Therefore, we propose a new method for predicting the computational performance of neural…
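The edge-of-chaos criterion referred to in the abstract is typically probed numerically by checking whether a tiny perturbation of the network state grows or decays over time. The sketch below is not the authors' spiking microcircuit with dynamic synapses; it is a minimal discrete-time tanh rate network with a single gain parameter, using a renormalized-perturbation (Benettin-style) estimate of the leading exponent. Function names, the gain sweep, and all parameter values are illustrative assumptions.

import numpy as np

def lyapunov_estimate(gain, n=200, steps=500, eps=1e-8, seed=0):
    """Average log growth rate of an infinitesimal state perturbation.
    Negative values indicate ordered dynamics, positive values chaotic dynamics."""
    rng = np.random.default_rng(seed)
    W = gain * rng.standard_normal((n, n)) / np.sqrt(n)   # random recurrent weights
    x = rng.standard_normal(n)
    d = rng.standard_normal(n)
    x_pert = x + eps * d / np.linalg.norm(d)
    log_growth = 0.0
    for _ in range(steps):
        u = 0.1 * rng.standard_normal(n)                  # external input, shared by both copies
        x = np.tanh(W @ x + u)
        x_pert = np.tanh(W @ x_pert + u)
        diff = x_pert - x
        dist = np.linalg.norm(diff)
        log_growth += np.log(dist / eps)
        # Rescale the perturbation back to size eps so it stays infinitesimal.
        x_pert = x + eps * diff / dist
    return log_growth / steps

for gain in [0.5, 1.0, 1.5, 2.0]:
    print(f"gain {gain:.1f}: perturbation growth exponent {lyapunov_estimate(gain):+.3f}")

For a random tanh network of this kind, small gains should yield a negative exponent (ordered regime) and larger gains a positive one (chaotic regime), with the transition, the "edge of chaos", in between; the exact crossover value here depends on the illustrative scalings chosen above.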
On Computational Power and the Order-Chaos Phase Transition in Reservoir Computing
TLDR
Analyses based, among other measures, on the Lyapunov exponent reveal that the phase transition between ordered and chaotic network behavior in binary circuits qualitatively differs from the one in analog circuits, which explains the observed decrease in computational performance of binary circuits with high node in-degree.
Biological modelling of a computational spiking neural network with neuronal avalanches
  • Xiumin Li, Qing Chen, Fangzheng Xue
  • Computer Science, Medicine
  • Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences
  • 2017
TLDR
The network is found to show the best computational performance when it operates in critical dynamic states, and the active-neuron-dominant structure refined by synaptic learning can remarkably enhance the robustness of the critical state and further improve computational accuracy.
Quantitative Analysis of Dynamical Complexity in Cultured Neuronal Network Models for Reservoir Computing Applications
TLDR
Modular networks of integrate-and-fire neurons are constructed and the effect of modular structure and excitatory-inhibitory neuron ratio on network dynamics is investigated, revealing a fundamental aspect of reservoir performance in brain networks and contributing to the design of bio-inspired reservoir computing systems.
Liquid computing of spiking neural network with multi-clustered and active-neuron-dominant structure
Abstract: Liquid computing is an effective approach to intelligent computation with neural networks, especially spiking neural networks. If the liquid network is embedded with a proper structure it…
Analysis of the dynamics of a nonlinear neuron model
TLDR
The analytical investigation strongly indicates that the dynamics of the NDS model allow diverse dynamic behaviors such as unstable periodic orbits (UPOs), which can be stabilized and controlled and then used in information-processing tasks.
Criticality predicts maximum irregularity in recurrent networks of excitatory nodes
TLDR
It is shown that irregular spiking naturally emerges in a recurrent network operating at criticality, and new hallmarks of criticality at the single-unit level are proposed that could be applicable to any network of excitable nodes.
Emergence of complex computational structures from chaotic neural networks through reward-modulated Hebbian learning.
TLDR
The results suggest that reward-modulated synaptic plasticity can not only optimize the network parameters for specific computational tasks, but also initiate a functional rewiring that re-programs the microcircuit, thereby generating diverse computational functions in different generic cortical microcircuits.
Robust transformations of firing patterns for neural networks
TLDR
A persistent paradigm of structural transitions that such networks undergo as the overall connectivity strength is varied over its biologically meaningful range is revealed, suggesting that not only the non-coincidence of criticality but also this persistent pattern of structural change as a function of overall connectivity strength could be generic features of a large class of biological neural networks.
Monitor-Based Spiking Recurrent Network for the Representation of Complex Dynamic Patterns
TLDR
A novel spiking system, the Monitor-based Spiking Recurrent Network (MbSRN), is derived in this paper to learn and represent patterns; it provides a computational framework for memorizing targets using a simple dynamic model that maintains biological plasticity.
Connectivity, Dynamics, and Memory in Reservoir Computing with Binary and Analog Neurons
TLDR
Investigating the influence of network connectivity (parameterized by the neuron in-degree) on a family of network models that interpolates between analog and binary networks reveals that the phase transition between ordered and chaotic network behavior of binary circuits qualitatively differs from the one in analog circuits, leading to the decreased computational performance observed in densely connected binary circuits.
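To make the binary-versus-analog comparison in the summary above concrete, the following sketch builds recurrent networks in which every unit receives exactly K inputs and measures how far two trajectories driven by input streams differing in a single bit drift apart, once with tanh (analog) units and once with a crude sign-unit stand-in for binary units. The construction, weight scaling, and separation measure are illustrative assumptions, not the setup of the cited paper.

import numpy as np

def fixed_indegree_weights(n, k, sigma, rng):
    """Random weight matrix in which every neuron receives exactly k inputs."""
    W = np.zeros((n, n))
    for i in range(n):
        sources = rng.choice(n, size=k, replace=False)
        W[i, sources] = rng.normal(0.0, sigma, size=k)
    return W

def separation(n=300, k=8, sigma=0.5, steps=100, binary=False, seed=1):
    """Mean late-time distance between trajectories whose input streams differ in one bit."""
    rng = np.random.default_rng(seed)
    W = fixed_indegree_weights(n, k, sigma, rng)
    w_in = rng.normal(0.0, 1.0, size=n)
    u = rng.integers(0, 2, size=steps).astype(float)
    u_pert = u.copy()
    u_pert[0] = 1.0 - u_pert[0]                      # flip a single input bit at t = 0
    f = np.sign if binary else np.tanh               # crude binary vs analog unit
    x_a = np.zeros(n)
    x_b = np.zeros(n)
    dists = []
    for t in range(steps):
        x_a = f(W @ x_a + w_in * u[t])
        x_b = f(W @ x_b + w_in * u_pert[t])
        dists.append(np.linalg.norm(x_a - x_b) / np.sqrt(n))
    return np.mean(dists[steps // 2:])               # average over the second half

for k in [2, 4, 8, 16, 32]:
    print(f"in-degree {k:2d}: analog {separation(k=k):.3f}   binary {separation(k=k, binary=True):.3f}")

In the ordered regime the single-bit difference is forgotten and the separation stays near zero; in the chaotic regime it saturates at a large value. Sweeping the in-degree (or the weight scale sigma) traces out the transition for each unit type.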

References

SHOWING 1-10 OF 29 REFERENCES
Methods for Estimating the Computational Power and Generalization Capability of Neural Microcircuits
What makes a neural microcircuit computationally powerful? Or more precisely, which measurable quantities could explain why one microcircuit C is better suited for a particular family of…
On the computational power of circuits of spiking neurons
TLDR
This article begins a rigorous mathematical analysis of the real-time computing capabilities of a new generation of models for neural computation, liquid state machines, which can be implemented with, and in fact benefit from, diverse computational units.
A statistical analysis of information-processing properties of lamina-specific cortical microcircuit models.
TLDR
It is concluded that computer simulations of detailed lamina-specific cortical microcircuit models provide new insight into computational consequences of anatomical and physiological data.
Real-Time Computation at the Edge of Chaos in Recurrent Neural Networks
TLDR
It is shown that only near the critical boundary can recurrent networks of threshold gates perform complex computations on time series, which strongly supports the conjecture that dynamical systems capable of complex computational tasks should operate near the edge of chaos.
Fading memory and kernel properties of generic cortical microcircuit models
TLDR
This article proposes to analyze circuits of spiking neurons in terms of their roles as analog fading memory and non-linear kernels, rather than as implementations of specific computational operations and algorithms.
Real-Time Computing Without Stable States: A New Framework for Neural Computation Based on Perturbations
TLDR
A new computational model for real-time computing on time-varying input is presented that provides an alternative to paradigms based on Turing machines or attractor neural networks; it is based on principles of high-dimensional dynamical systems in combination with statistical learning theory and can be implemented on generic evolved or found recurrent circuitry.
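A hedged sketch of the liquid-state-machine idea summarized above, with a simple discrete-time tanh reservoir standing in for the paper's spiking microcircuit: the recurrent "liquid" is left untrained, and only a linear readout is fit (here by ridge regression) to reproduce a delayed copy of the input stream. Network size, scalings, the delay task, and the regularization constant are illustrative choices, not values from the paper.

import numpy as np

rng = np.random.default_rng(0)
n_res, T, delay = 200, 2000, 5

# Fixed random reservoir and input weights (the "liquid"); nothing inside is trained.
W = 1.1 * rng.standard_normal((n_res, n_res)) / np.sqrt(n_res)
w_in = rng.standard_normal(n_res)

u = rng.uniform(-1, 1, size=T)                 # random input stream
states = np.zeros((T, n_res))
x = np.zeros(n_res)
for t in range(T):
    x = np.tanh(W @ x + w_in * u[t])           # reservoir update
    states[t] = x

# Target: the input delayed by `delay` steps (a simple fading-memory task).
y = np.roll(u, delay)

# Fit the linear readout on the first half, evaluate on the second half.
split = T // 2
X_tr, Y_tr = states[delay:split], y[delay:split]
X_te, Y_te = states[split:], y[split:]
lam = 1e-4                                     # ridge regularization
W_out = np.linalg.solve(X_tr.T @ X_tr + lam * np.eye(n_res), X_tr.T @ Y_tr)
mse = np.mean((X_te @ W_out - Y_te) ** 2)
print(f"readout MSE on the {delay}-step delay task: {mse:.4f}")

The readout is the only trained component; how well it recovers the delayed input depends on the reservoir's fading memory, which is exactly the property that degrades deep in the chaotic regime.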
What makes a dynamical system computationally powerful?
We review methods for estimating the computational capability of a complex dynamical system. The main examples that we discuss are models for cortical neural microcircuits with varying degrees of…
Signal buffering in random networks of spiking neurons: microscopic versus macroscopic phenomena.
  • J. Mayor, W. Gerstner
  • Physics, Biology
  • Physical Review E: Statistical, Nonlinear, and Soft Matter Physics
  • 2005
TLDR
The signal buffering properties in simulated networks are studied as a function of the networks' state, characterized by both the Lyapunov exponent of the microscopic dynamics and the macroscopic activity derived from mean-field theory.
Computation at the edge of chaos: phase transitions and emergent computation
TLDR
It is argued that there is a fundamental connection between computation and phase transitions, especially second-order or “critical” transitions, and some of the implications for our understanding of nature, should such a connection be borne out, are discussed.
Revisiting the Edge of Chaos: Evolving Cellular Automata to Perform Computations
TLDR
An experiment similar to one performed by Packard (1988), in which a genetic algorithm is used to evolve cellular automata to perform a particular computational task, demonstrates how symmetry breaking can impede the evolution toward higher computational capability.
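For readers unfamiliar with this setup, the object evolved in such experiments is a one-dimensional binary cellular automaton whose rule table is scored on the density classification task (decide whether the initial configuration contains more 1s than 0s). The sketch below runs an arbitrary random radius-3 rule, not an evolved rule from the cited study; the lattice size of 149 cells follows the setup commonly reported for this task, while the step count and scoring details are simplified illustrations.

import numpy as np

def step(config, rule_table, r):
    """One synchronous update of a 1-D binary CA with periodic boundaries."""
    n = len(config)
    new = np.empty(n, dtype=int)
    for i in range(n):
        index = 0
        for j in range(-r, r + 1):          # read the width-(2r+1) neighborhood as a binary index
            index = (index << 1) | int(config[(i + j) % n])
        new[i] = rule_table[index]
    return new

def density_classification(rule_table, r=3, n=149, steps=300, seed=0):
    """Run the CA from a random start and check whether it settles to all-1s when
    1s are the initial majority (all-0s otherwise), the scoring used for candidate rules."""
    rng = np.random.default_rng(seed)
    config = rng.integers(0, 2, size=n)
    majority = int(config.sum() * 2 > n)
    for _ in range(steps):
        config = step(config, rule_table, r)
    if np.all(config == 1):
        answer = 1
    elif np.all(config == 0):
        answer = 0
    else:
        answer = None                        # no uniform fixed point reached: counted as a failure
    return majority, answer

rng = np.random.default_rng(42)
random_rule = rng.integers(0, 2, size=2 ** 7)   # a radius-3 rule table has 2^7 = 128 entries
print(density_classification(random_rule))

A genetic algorithm in these studies searches over such rule tables, scoring each one by the fraction of random initial configurations it classifies correctly; a random rule like the one above almost never succeeds.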