SORN: A Self-Organizing Recurrent Neural Network

@article{Lazar2009SORNAS,
  title={SORN: A Self-Organizing Recurrent Neural Network},
  author={Andreea Lazar and Gordon Pipa and Jochen Triesch},
  journal={Frontiers in Computational Neuroscience},
  year={2009},
  volume={3}
}
Understanding the dynamics of recurrent neural networks is crucial for explaining how the brain processes information. In the neocortex, a range of plasticity mechanisms shapes recurrent networks into effective information-processing circuits that learn appropriate representations for time-varying sensory stimuli. However, it has been difficult to mimic these abilities in artificial neural network models. Here we introduce SORN, a self-organizing recurrent network. It combines …
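The self-organization sketched in the abstract can be illustrated with a minimal toy loop combining three interacting plasticity rules: STDP, synaptic normalization, and intrinsic plasticity. This is an assumption-laden sketch on binary excitatory units (no inhibitory pool, no external input, arbitrary parameter values), not the paper's exact model.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 200                      # excitatory units (no inhibitory pool here)
mu_ip = 0.1                  # target firing rate for intrinsic plasticity
eta_stdp, eta_ip = 1e-3, 1e-3

# Sparse, non-negative recurrent weights without self-connections
W = rng.random((N, N)) * (rng.random((N, N)) < 0.05)
np.fill_diagonal(W, 0.0)
T = rng.random(N) * 0.5      # per-unit firing thresholds
x = (rng.random(N) < mu_ip).astype(float)

for _ in range(1000):
    x_new = (W @ x - T > 0).astype(float)
    # STDP: potentiate j -> i when j fired at t-1 and i fires at t,
    # depress the reverse ordering; only existing synapses change
    W += eta_stdp * (np.outer(x_new, x) - np.outer(x, x_new)) * (W > 0)
    np.clip(W, 0.0, 1.0, out=W)
    # Synaptic normalization: each unit's incoming weights sum to one
    rows = W.sum(axis=1, keepdims=True)
    W /= np.where(rows > 0, rows, 1.0)
    # Intrinsic plasticity: nudge thresholds toward the target rate
    T += eta_ip * (x_new - mu_ip)
    x = x_new
```

The three rules act on different variables (weights, weight budgets, thresholds), which is what lets them operate simultaneously without one trivially undoing another.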
sorn: A Python package for Self Organizing Recurrent Neural Network
The self-organizing recurrent neural (SORN) network is a class of neuro-inspired artificial networks. This class of networks has been shown to mimic the ability of neocortical circuits to learn and …
Emerging Bayesian Priors in a Self-Organizing Recurrent Network
The role of local plasticity rules in learning statistical priors in a self-organizing recurrent neural network (SORN) is explored, and a novel connection between low-level learning mechanisms and high-level concepts of statistical inference is suggested.
Reservoir computing with self-organizing neural oscillators
Reservoir computing is a powerful computational framework that is particularly successful in time-series prediction tasks. It utilises a brain-inspired recurrent neural network and allows …
Nonlinear Dynamics Analysis of a Self-Organizing Recurrent Neural Network: Chaos Waning
The network dynamics, characterized by an estimate of the maximum Lyapunov exponent, become less chaotic during self-organization, developing into a regime where only a few perturbations are amplified, and possibly only after a substantial delay, a phenomenon the authors propose to call deferred chaos.
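The Lyapunov-exponent estimate mentioned in this summary is commonly approximated in binary networks by tracking the Hamming distance between a trajectory and a minimally perturbed copy of it. The random threshold network below is a hypothetical stand-in for the trained SORN; the size and weight scale are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)
N, steps = 500, 50
g = 1.0                          # weight scale; larger g favors chaos
W = rng.normal(0.0, g / np.sqrt(N), size=(N, N))

def step(state):
    """One synchronous update of a binary threshold network."""
    return (W @ state > 0).astype(float)

x = (rng.random(N) < 0.5).astype(float)
y = x.copy()
y[0] = 1.0 - y[0]                # flip a single unit: minimal perturbation

dist = []
for _ in range(steps):
    x, y = step(x), step(y)
    dist.append(float(np.abs(x - y).sum()))   # Hamming distance
# The average growth (or decay) rate of log(dist) over the early steps
# approximates the maximal Lyapunov exponent: sustained growth indicates
# chaotic dynamics, decay toward zero indicates ordered dynamics.
```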
Computational modeling of spiking neural network with learning rules from STDP and intrinsic plasticity
This study shows that an LSM with STDP+IP performs better than an LSM with a random SNN or an SNN obtained by STDP alone, and gives insight into the optimization of computational models of spiking neural networks with neural plasticity.
Memory formation and recall in recurrent spiking neural networks
By combining multiple forms of plasticity with distinct roles, a recurrently connected spiking network model self-organizes to distinguish and extract multiple overlapping external stimuli; the acquired network structures remain stable over hours while plasticity remains active.
Self-organization of microcircuits in networks of neurons with plastic synapses
This work provides a consistent theoretical framework for studying how spiking activity in recurrent networks interacts with synaptic plasticity to determine network structure, and explores how the form of the plasticity rule drives the evolution of microcircuits in cortical networks.
Self-Organization of Microcircuits in Networks of Spiking Neurons with Plastic Synapses
This work provides a consistent theoretical framework for studying how spiking activity in recurrent networks interacts with synaptic plasticity to determine network structure, and explores how the form of the plasticity rule drives the evolution of microcircuits in cortical networks.
Temporal sequence recognition in a self-organizing recurrent network
Preliminary results show that the SORN model is able to classify temporal sequences of symbols well using these encoding methods, and that the network's advantage over a static network in a classification task is retained.
Optimal Information Representation and Criticality in an Adaptive Sensory Recurrent Neuronal Network
Several mechanisms by which the pattern of interactions can be driven into this supercritical regime are discussed and related to various neurological and neuropsychiatric phenomena.

References

Showing 1–10 of 41 references
Self-organization using synaptic plasticity
This work shows how a network of spiking neurons is able to self-organize towards a critical state for which the range of possible inter-spike intervals (dynamic range) is maximized.
Fading memory and time series prediction in recurrent networks with different forms of plasticity
It is demonstrated that the combination of STDP and IP shapes the network structure and dynamics in ways that allow the discovery of patterns in input time series and leads to good performance in time-series prediction.
Online reservoir adaptation by intrinsic plasticity for backpropagation-decorrelation and echo state learning
  • J. Steil
  • Computer Science, Medicine
  • Neural Networks
  • 2007
It is shown experimentally that a biologically motivated learning rule based on neural intrinsic plasticity can drive the neurons' output activities to approximate exponential distributions and implement sparse codes in the reservoir.
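The intrinsic-plasticity rule this line of work builds on can be sketched as Triesch's gradient rule for a single sigmoid neuron: gain `a` and bias `b` are adapted so the output distribution approaches an exponential with mean `mu`. The Gaussian input and the specific parameter values below are assumptions made for this illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
eta, mu = 0.01, 0.2          # IP learning rate and target mean activity
a, b = 1.0, 0.0              # gain and bias of one sigmoid neuron

for _ in range(50_000):
    x = rng.normal()                          # synthetic net input
    y = 1.0 / (1.0 + np.exp(-(a * x + b)))
    # Stochastic gradient descent on the KL divergence between the
    # output distribution and an exponential distribution with mean mu
    db = eta * (1.0 - (2.0 + 1.0 / mu) * y + (y * y) / mu)
    da = eta / a + x * db
    a, b = a + da, b + db
```

An exponential target keeps mean activity low while allowing occasional large responses, which is one route to the sparse codes mentioned in the summary above.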
Real-Time Computing Without Stable States: A New Framework for Neural Computation Based on Perturbations
A new computational model for real-time computing on time-varying input that provides an alternative to paradigms based on Turing machines or attractor neural networks; it is based on principles of high-dimensional dynamical systems combined with statistical learning theory and can be implemented on generic evolved or found recurrent circuitry.
Improving reservoirs using intrinsic plasticity
It is clearly demonstrated that IP is able to make reservoir computing more robust: the internal dynamics can autonomously tune themselves, irrespective of initial weights or input scaling, to the dynamic regime that is optimal for a given task.
Computational significance of transient dynamics in cortical networks
It is argued that there are many situations in which the transient neural behaviour, while hopping between different attractor states or moving along 'attractor ruins', carries most of the computational and/or behavioural significance, rather than the attractor states eventually reached.
Neural networks and physical systems with emergent collective computational abilities.
  • J. Hopfield
  • Computer Science, Medicine
  • Proceedings of the National Academy of Sciences of the United States of America
  • 1982
A model of a system having a large number of simple equivalent components, based on aspects of neurobiology but readily adapted to integrated circuits, produces a content-addressable memory which correctly yields an entire memory from any subpart of sufficient size.
The other side of the engram: experience-driven changes in neuronal intrinsic excitability
The evidence for persistent changes in intrinsic neuronal excitability, which the authors call intrinsic plasticity, produced by training in behaving animals and by artificial patterns of activation in brain slices and neuronal cultures, is considered.
Real-Time Computation at the Edge of Chaos in Recurrent Neural Networks
It is shown that only near the critical boundary can recurrent networks of threshold gates perform complex computations on time series, which strongly supports conjectures that dynamical systems capable of doing complex computational tasks should operate near the edge of chaos.
Activity-dependent scaling of quantal amplitude in neocortical neurons
A new form of synaptic plasticity is described that increases or decreases the strength of all of a neuron's synaptic inputs as a function of activity, and may help to ensure that firing rates do not become saturated during developmental changes in the number and strength of synaptic inputs.