Guided Self-Organization of Input-Driven Recurrent Neural Networks

@article{Obst2013GuidedSO,
  title={Guided Self-Organization of Input-Driven Recurrent Neural Networks},
  author={Oliver Obst and Joschka Boedecker},
  journal={ArXiv},
  year={2013},
  volume={abs/1309.1524}
}
To understand the world around us, our brains solve a variety of tasks. One of the crucial functions of a brain is to make predictions of what will happen next, or in the near future. This ability helps us to anticipate upcoming events and to plan our reactions to them in advance. To make these predictions, past information needs to be stored, transformed, or otherwise used. How exactly the brain achieves this information processing is far from clear and under heavy investigation. To guide this…

Opening the Black Box: Low-Dimensional Dynamics in High-Dimensional Recurrent Neural Networks

The hypothesis that fixed points, both stable and unstable, and the linearized dynamics around them, can reveal crucial aspects of how RNNs implement their computations is explored.
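That fixed-point view lends itself to a compact numerical recipe. Below is a minimal sketch, assuming a simple discrete-time rate network x_{t+1} = tanh(W x_t + b) rather than the paper's own model (W, b, and all constants are illustrative placeholders): candidate fixed points are found by minimizing q(x) = 0.5 * ||tanh(W x + b) - x||^2 from many initial states, and the linearized dynamics at each solution are read off from the Jacobian's eigenvalues.

    import numpy as np
    from scipy.optimize import minimize

    def fixed_point_and_spectrum(W, b, x0):
        # Minimize the "speed" q(x) = 0.5 * ||F(x) - x||^2 of the map F(x) = tanh(W x + b);
        # a (near-)zero minimum is a numerical fixed point.
        def q(x):
            return 0.5 * np.sum((np.tanh(W @ x + b) - x) ** 2)
        x_star = minimize(q, x0, method="L-BFGS-B").x
        # Linearized dynamics around the fixed point: Jacobian of F at x_star.
        # For this discrete map the point is stable if all eigenvalues lie inside
        # the unit circle; unstable directions reveal saddle structure.
        J = np.diag(1.0 - np.tanh(W @ x_star + b) ** 2) @ W
        return x_star, np.linalg.eigvals(J)

    # Usage: search from many random initial conditions to collect the network's
    # stable and unstable fixed points.
    rng = np.random.default_rng(0)
    n = 50
    W = rng.standard_normal((n, n)) / np.sqrt(n)
    b = np.zeros(n)
    points = [fixed_point_and_spectrum(W, b, rng.standard_normal(n)) for _ in range(5)]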

Determination of the Edge of Criticality in Echo State Networks Through Fisher Information Maximization

This paper takes advantage of a recently developed nonparametric estimator of the Fisher information matrix and provides a method to determine the critical region of echo state networks (ESNs), a particular class of recurrent networks.

Grand Challenges for Computational Intelligence

The expansive research field of computational intelligence combines various nature-inspired computational methodologies and draws on rigorous quantitative approaches from computer science, mathematics, physics, and the life sciences; some of its research topics are traditional to computational intelligence, while others are newly emerging grand challenges for the field.

Critical echo state network dynamics by means of Fisher information maximization

This paper shows how to identify optimal ESN hyperparameters by relying only on the Fisher information matrix (FIM) estimated from the activations of hidden neurons, and adopts a recently proposed non-parametric FIM estimator.
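Both Fisher-information papers operate on activations collected from a standard echo state network while a hyperparameter such as the spectral radius is swept. The sketch below reproduces only that outer loop, assuming the usual leaky-tanh ESN update; the nonparametric FIM estimator itself is not reproduced here, and all constants are illustrative.

    import numpy as np

    rng = np.random.default_rng(1)

    def make_reservoir(n=100, spectral_radius=0.95, density=0.1):
        # Sparse random recurrent weights, rescaled to the requested spectral radius.
        W = rng.standard_normal((n, n)) * (rng.random((n, n)) < density)
        W *= spectral_radius / np.max(np.abs(np.linalg.eigvals(W)))
        return W

    def run_esn(W, w_in, u, leak=1.0):
        # Standard leaky-integrator ESN update:
        #   x_{t+1} = (1 - a) x_t + a * tanh(W x_t + w_in u_t)
        x = np.zeros(W.shape[0])
        states = []
        for u_t in u:
            x = (1.0 - leak) * x + leak * np.tanh(W @ x + w_in * u_t)
            states.append(x.copy())
        return np.asarray(states)

    # Sweep the spectral radius and collect activations; the cited method would
    # select the value maximizing a nonparametric Fisher information estimate
    # computed from these states (estimator omitted here).
    u = rng.uniform(-1.0, 1.0, 2000)
    for rho in np.linspace(0.5, 1.5, 11):
        W = make_reservoir(spectral_radius=rho)
        w_in = rng.uniform(-0.5, 0.5, W.shape[0])
        X = run_esn(W, w_in, u)
        # ... feed X into the FIM estimator and record the score for rho ...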

Dynamical Systems as Temporal Feature Spaces

  • P. Tiňo
  • Computer Science
    J. Mach. Learn. Res.
  • 2020
A framework for the rigorous analysis of feature representations imposed by dynamic kernels is developed, and it is demonstrated that, for the dynamic kernel associated with a cycle reservoir topology, the kernel richness undergoes a phase transition close to the edge of stability.
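The "dynamic kernel" view treats the reservoir's final state as a feature map over whole input sequences. A minimal sketch, assuming a linear cycle reservoir (a scaled cyclic shift, the topology the phase-transition result refers to); the weight values and input scaling are illustrative:

    import numpy as np

    def cycle_reservoir(n=50, r=0.99):
        # Cycle reservoir: unit i feeds unit i+1 (mod n) with weight r;
        # r -> 1 approaches the edge of stability.
        W = np.zeros((n, n))
        for i in range(n):
            W[(i + 1) % n, i] = r
        return W

    def feature_map(W, w_in, u):
        # Drive the linear reservoir with the whole sequence u; the final state
        # is the feature representation phi(u).
        x = np.zeros(W.shape[0])
        for u_t in u:
            x = W @ x + w_in * u_t
        return x

    def dynamic_kernel(W, w_in, u1, u2):
        # Kernel between two sequences = inner product of their feature maps.
        return feature_map(W, w_in, u1) @ feature_map(W, w_in, u2)

The richness of the representation can then be probed, for example, through the spectrum of the kernel (Gram) matrix computed over a collection of input sequences.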

Input-Anticipating Critical Reservoirs Show Power Law Forgetting of Unexpected Input Events

This letter investigates under which circumstances echo state networks can show power-law forgetting, meaning that traces of earlier events can be found in the reservoir for very long time spans.
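One rough way to visualize such forgetting (an illustration, not the letter's actual analysis): drive two copies of the same reservoir with identical input except for one extra "unexpected" pulse, and track how the state difference decays with the time elapsed since the pulse. Power-law forgetting appears as an approximately straight line on a log-log plot, rather than the straight line on a log-linear plot that exponential forgetting would give. All constants below are illustrative.

    import numpy as np

    rng = np.random.default_rng(2)

    def forgetting_curve(n=100, spectral_radius=1.0, T=3000, t_event=200):
        # Reservoir close to criticality (spectral radius ~ 1).
        W = rng.standard_normal((n, n))
        W *= spectral_radius / np.max(np.abs(np.linalg.eigvals(W)))
        w_in = rng.uniform(-0.5, 0.5, n)
        u = rng.uniform(-0.5, 0.5, T)
        x_a = np.zeros(n)   # sees the plain input stream
        x_b = np.zeros(n)   # sees the same stream plus one extra pulse at t_event
        dist = []
        for t in range(T):
            x_a = np.tanh(W @ x_a + w_in * u[t])
            x_b = np.tanh(W @ x_b + w_in * (u[t] + (1.0 if t == t_event else 0.0)))
            dist.append(np.linalg.norm(x_a - x_b))
        return np.asarray(dist[t_event:])   # trace of the event, step by step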

Asymptotic Fisher Memory of Randomized Linear Symmetric Echo State Networks

We study asymptotic properties of Fisher memory of linear Echo State Networks with randomized symmetric state space coupling. In particular, two reservoir constructions are considered: (1) …

References

Showing 1-10 of 64 references.

SORN: A Self-Organizing Recurrent Neural Network

This work introduces SORN, a self-organizing recurrent network that combines three distinct forms of local plasticity to learn spatio-temporal patterns in its input while maintaining its dynamics in a healthy regime suitable for learning.
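As a heavily simplified sketch of how the three local plasticity rules interact in a SORN-like binary network (spike-timing-dependent plasticity, synaptic normalization, and intrinsic plasticity on the thresholds); the inhibitory population of the original model is omitted and all constants and the input drive are illustrative:

    import numpy as np

    rng = np.random.default_rng(3)

    N, eta_stdp, eta_ip, h_ip = 200, 0.004, 0.01, 0.1   # h_ip: target firing rate

    W = rng.random((N, N)) * (rng.random((N, N)) < 0.05)   # sparse excitatory weights
    np.fill_diagonal(W, 0.0)
    T_thr = rng.uniform(0.0, 0.5, N)                        # unit thresholds
    x = (rng.random(N) < h_ip).astype(float)                # binary unit states

    def sorn_step(x_prev, u):
        global W, T_thr
        x_new = ((W @ x_prev + u - T_thr) > 0).astype(float)   # threshold units
        # 1) STDP: potentiate pre-before-post pairs, depress post-before-pre pairs.
        W += eta_stdp * (np.outer(x_new, x_prev) - np.outer(x_prev, x_new))
        np.clip(W, 0.0, None, out=W)
        # 2) Synaptic normalization: each unit's incoming excitatory weights sum to 1.
        W /= np.maximum(W.sum(axis=1, keepdims=True), 1e-12)
        # 3) Intrinsic plasticity: move thresholds so firing rates approach h_ip.
        T_thr += eta_ip * (x_new - h_ip)
        return x_new

    for _ in range(1000):
        drive = 0.5 * (rng.random(N) < 0.05)   # illustrative random input drive
        x = sorn_step(x, drive)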

Improving reservoirs using intrinsic plasticity

Initialization and self‐organized optimization of recurrent neural network connectivity

A general network initialization method using permutation matrices is studied and a new unsupervised learning rule based on intrinsic plasticity (IP) is derived to improve network performance in a self‐organized way.
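A minimal sketch of the two ingredients this entry refers to, under simplifying assumptions: a reservoir initialized from a scaled random permutation matrix, and the intrinsic-plasticity update for tanh units with a Gaussian target output distribution as it appears in the IP-for-reservoirs literature. The learning rate, target mean and variance, and input scaling are illustrative, and readout training is omitted.

    import numpy as np

    rng = np.random.default_rng(4)

    def permutation_reservoir(n=100, spectral_radius=0.95):
        # Each unit feeds exactly one other unit, so the reservoir is a set of
        # long cycles; the spectral radius equals the scaling factor.
        P = np.eye(n)[rng.permutation(n)]
        return spectral_radius * P

    def ip_step(a, b, x_net, eta=1e-4, mu=0.0, sigma=0.2):
        # Intrinsic plasticity for y = tanh(a * x_net + b) with Gaussian target
        # N(mu, sigma^2) on the outputs; gain a and bias b adapted per unit.
        y = np.tanh(a * x_net + b)
        db = -eta * (-mu / sigma**2 + (y / sigma**2) * (2 * sigma**2 + 1 - y**2 + mu * y))
        da = eta / a + db * x_net
        return a + da, b + db

    W = permutation_reservoir()
    w_in = rng.uniform(-0.5, 0.5, W.shape[0])
    a = np.ones(W.shape[0])
    b = np.zeros(W.shape[0])
    x = np.zeros(W.shape[0])
    for u_t in rng.uniform(-1.0, 1.0, 5000):
        x_net = W @ x + w_in * u_t       # net input to each unit
        a, b = ip_step(a, b, x_net)      # self-organized adaptation of gains/biases
        x = np.tanh(a * x_net + b)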

Opening the Black Box: Low-Dimensional Dynamics in High-Dimensional Recurrent Neural Networks

The hypothesis that fixed points, both stable and unstable, and the linearized dynamics around them, can reveal crucial aspects of how RNNs implement their computations is explored.

On active information storage in input-driven systems

Using the proposed input-corrected information storage, the aim is to better quantify system behaviour. This will be important for heavily input-driven systems such as artificial neural networks, in order to abstract from specific benchmarks, and for brain networks, where intervention is difficult and individual components cannot be tested in isolation or with arbitrary input data.
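For context, the standard (uncorrected) active information storage of a single time series is the mutual information between a unit's length-k past and its next value; the paper's contribution is an input-corrected variant, which is not reproduced here. A minimal plug-in (histogram) estimator of the standard quantity, purely illustrative:

    import numpy as np

    def active_information_storage(x, k=2, bins=8):
        # A = I(x_t^{(k)} ; x_{t+1}): how much of the next value is predictable
        # from the unit's own length-k history (plug-in estimate, log base 2).
        d = np.digitize(x, np.histogram_bin_edges(x, bins))
        past = np.stack([d[i:len(d) - k + i] for i in range(k)], axis=1)  # rows end at t
        nxt = d[k:]                                                        # x_{t+1}
        def H(arr):
            counts = np.unique(arr, axis=0, return_counts=True)[1]
            p = counts / counts.sum()
            return -np.sum(p * np.log2(p))
        joint = np.concatenate([past, nxt[:, None]], axis=1)
        return H(past) + H(nxt[:, None]) - H(joint)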

Artificial Neural Networks: Biological Inspirations - ICANN 2005, 15th International Conference, Warsaw, Poland, September 11-15, 2005, Proceedings, Part I

The Computational Model to Simulate the Progress of Perceiving Patterns in Neuron Population and the Development of Cognitive Powers in Embodied Systems is presented.

Information Processing Capacity of Dynamical Systems

The theory combines concepts from machine learning (reservoir computing), system modeling, stochastic processes, and functional analysis to define the computational capacity of a dynamical system.
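The core quantity in that framework is the capacity of the system for a given target function of the input history: the fraction of the target's variance that an optimal linear readout of the state can reproduce. A minimal sketch of that single-target computation (the orthogonal family of Legendre-polynomial targets that the full measure sums over is not constructed here):

    import numpy as np

    def capacity(states, target):
        # C[z] = 1 - min_w ||z - X w||^2 / ||z||^2, i.e. the normalized variance of
        # the target captured by the best linear readout of the (centred) states.
        X = states - states.mean(axis=0)
        z = target - target.mean()
        w, *_ = np.linalg.lstsq(X, z, rcond=None)
        z_hat = X @ w
        return float(z_hat @ z_hat) / float(z @ z)

    # Summing this capacity over an orthogonal set of target functions of the
    # input history gives the total information processing capacity, which the
    # paper shows cannot exceed the number of linearly independent state variables.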

Memory traces in dynamical systems

The Fisher Memory Curve is introduced as a measure of the signal-to-noise ratio (SNR) embedded in the dynamical state relative to the input SNR, and the generality of the theory is illustrated by showing that memory in fluid systems can be sustained by transient non-normal amplification due to convective instability or the onset of turbulence.
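For a linear network x_{t+1} = W x_t + v s_t + noise (unit-variance Gaussian noise, spectral radius of W below one), the Fisher Memory Curve can be computed in closed form; a small sketch, with W and v as placeholders rather than any particular network from the paper:

    import numpy as np
    from scipy.linalg import solve_discrete_lyapunov

    def fisher_memory_curve(W, v, k_max=50):
        # J(k) = (W^k v)^T C^{-1} (W^k v), where C = sum_{j>=0} W^j (W^j)^T is the
        # stationary noise covariance; J(k) is the SNR the current state carries
        # about an input pulse that entered the network k time steps ago.
        n = W.shape[0]
        C = solve_discrete_lyapunov(W, np.eye(n))   # solves C = W C W^T + I
        C_inv = np.linalg.inv(C)
        J, Wk_v = [], v.astype(float).copy()
        for _ in range(k_max):
            J.append(float(Wk_v @ C_inv @ Wk_v))
            Wk_v = W @ Wk_v
        return np.asarray(J)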

Improving Recurrent Neural Network Performance Using Transfer Entropy

This work presents an approach to improve the hidden layer of recurrent neural networks, guided by the learning goal of the system, and shows that this reservoir adaptation improves the performance of offline echo state learning and Recursive Least Squares Online Learning.
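Transfer entropy between two scalar time series (with history length one) can be estimated with a simple plug-in histogram estimator. The paper's adaptation procedure, which uses such estimates between reservoir units and the desired output to guide changes in the hidden layer, is not reproduced here, and the binning below is illustrative:

    import numpy as np

    def transfer_entropy(source, dest, bins=8):
        # TE(source -> dest) = I(dest_{t+1} ; source_t | dest_t), log base 2,
        # estimated from joint histograms of the discretized series.
        s = np.digitize(source, np.histogram_bin_edges(source, bins))
        d = np.digitize(dest, np.histogram_bin_edges(dest, bins))
        x_next, x_prev, y_prev = d[1:], d[:-1], s[:-1]
        def H(*cols):
            # Joint entropy of the given discrete columns.
            counts = np.unique(np.stack(cols, axis=1), axis=0, return_counts=True)[1]
            p = counts / counts.sum()
            return -np.sum(p * np.log2(p))
        # Conditional mutual information expressed through joint entropies:
        return H(x_next, x_prev) + H(x_prev, y_prev) - H(x_prev) - H(x_next, x_prev, y_prev)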

Reservoir computing approaches to recurrent neural network training

...