Topological constraints and robustness in liquid state machines

@article{Hazan2012TopologicalCA,
  title={Topological constraints and robustness in liquid state machines},
  author={Hananel Hazan and L. Manevitz},
  journal={Expert Syst. Appl.},
  year={2012},
  volume={39},
  pages={1597-1606}
}
Temporal pattern recognition via temporal networks of temporal neurons
We show that real-valued continuous functions can be recognized reliably, with good generalization ability, using an adapted version of the Liquid State Machine (LSM) that receives direct …
Synchrony-Based State Representation for Classification by Liquid State Machines
  • Nicolas Pajot, M. Boukadoum
  • Computer Science
    2021 IEEE 20th International Conference on Cognitive Informatics & Cognitive Computing (ICCI*CC)
  • 2021
TLDR
This work proposes a model of liquid state representation that builds feature vectors from temporal information in the spike trains, using spike synchrony instead of rate, and shows that such a model outperforms a rate-only model in distinguishing spike-train pairs, regardless of the frequency chosen to sample the liquid state or the noise level.
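As a rough illustration of the rate-versus-synchrony distinction drawn above (a minimal sketch, not the authors' feature construction; the window length and coincidence tolerance are invented parameters), the snippet below builds both kinds of feature vector from raw spike times:

```python
import numpy as np

def rate_features(spike_trains, t_window):
    """Rate code: spike count per neuron, divided by the window length."""
    return np.array([len(s) / t_window for s in spike_trains])

def synchrony_features(spike_trains, tol=0.002):
    """Toy synchrony code: for each neuron pair, count spikes in one
    train that have a partner in the other within +/- tol seconds."""
    n = len(spike_trains)
    feats = []
    for i in range(n):
        for j in range(i + 1, n):
            a = np.asarray(spike_trains[i])
            b = np.asarray(spike_trains[j])
            if a.size == 0 or b.size == 0:
                feats.append(0)
                continue
            d = np.abs(a[:, None] - b[None, :])   # all pairwise spike gaps
            feats.append(int((d.min(axis=1) <= tol).sum()))
    return np.array(feats)

# two toy spike-train pairs (seconds): identical rates, different timing
trains_sync  = [[0.010, 0.050, 0.090], [0.011, 0.051, 0.091]]
trains_async = [[0.010, 0.050, 0.090], [0.030, 0.070, 0.110]]

print(rate_features(trains_sync, 0.12), rate_features(trains_async, 0.12))  # equal
print(synchrony_features(trains_sync), synchrony_features(trains_async))    # differ
```

The two pairs are indistinguishable by rate but not by synchrony, which is exactly the regime where the paper reports the synchrony-based representation winning.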
Bio-Inspired Evolutionary Model of Spiking Neural Networks in Ionic Liquid Space
TLDR
This paper studies the effect of topological evolution on the proposed model's performance on several classification problems; classification results, measured by separation and accuracy values, show that the proposed ionic liquid outperforms the original LSM.
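For context, "separation" is a standard liquid-quality score: how far apart the liquid states evoked by different input classes end up. A minimal sketch of one common variant (mean distance between per-class centroids of the state vectors; the paper's exact definition may differ):

```python
import numpy as np

def separation(states, labels):
    """Mean pairwise Euclidean distance between the per-class
    centroids of liquid state vectors."""
    classes = np.unique(labels)
    centroids = [states[labels == c].mean(axis=0) for c in classes]
    dists = [np.linalg.norm(ci - cj)
             for i, ci in enumerate(centroids)
             for cj in centroids[i + 1:]]
    return float(np.mean(dists))

rng = np.random.default_rng(0)
# fake liquid states: 2 classes, 50 samples each, 100 liquid neurons
states = np.vstack([rng.normal(0.0, 1.0, (50, 100)),
                    rng.normal(0.5, 1.0, (50, 100))])
labels = np.array([0] * 50 + [1] * 50)
print(f"separation = {separation(states, labels):.3f}")
```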
Computational Efficiency of a Modular Reservoir Network for Image Recognition
TLDR
This paper presents a large-scale bio-inspired LSM with modular topology whose specifically designed input synapses fit the activation of the real cortex and perform the Hough transform, a feature-extraction algorithm used in digital image processing, at no additional cost.
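For reference, the Hough transform mentioned above maps each edge pixel (x, y) to the sinusoid rho = x cos(theta) + y sin(theta) in parameter space, so collinear pixels vote for a common (rho, theta) peak. A compact NumPy version of the line-detecting accumulator (purely illustrative; the paper realizes the transform with designed input synapses, not explicit code):

```python
import numpy as np

def hough_lines(binary_img, n_theta=180):
    """Vote in (rho, theta) space for every edge pixel."""
    h, w = binary_img.shape
    diag = int(np.ceil(np.hypot(h, w)))
    thetas = np.linspace(0, np.pi, n_theta, endpoint=False)
    acc = np.zeros((2 * diag, n_theta), dtype=int)
    ys, xs = np.nonzero(binary_img)
    for x, y in zip(xs, ys):
        rhos = np.round(x * np.cos(thetas) + y * np.sin(thetas)).astype(int)
        acc[rhos + diag, np.arange(n_theta)] += 1   # one vote per theta
    return acc, thetas

# a diagonal line of pixels yields a single strong accumulator peak
img = np.eye(32, dtype=bool)
acc, thetas = hough_lines(img)
rho_i, th_i = np.unravel_index(acc.argmax(), acc.shape)
print(f"peak: {acc.max()} votes at theta = {np.degrees(thetas[th_i]):.0f} deg")
```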
Neurorobotic simulations on the degradation of multiple column liquid state machines
TLDR
The results show that both approaches, modular and monolithic, behaved similarly; however, the modular design was better at withstanding the decimation of neurons when the damage was concentrated in a single column.
Towards Classifying Human Phonemes without Encodings via Spatiotemporal Liquid State Machines: Extended Abstract
TLDR
Unlike most other methods, this mechanism requires no encoding of the signal and no conversion of time into space; instead it uses the Liquid State Machine paradigm, an abstraction of natural cortical arrangements.
...
...

References

Showing 1-10 of 36 references
The Liquid State Machine is not Robust to Problems in Its Components but Topological Constraints Can Restore Robustness
TLDR
It is shown that the LSM as normally defined cannot serve as a natural model of brain function, and that specifying certain kinds of topological constraints, which have been argued to be biologically plausible, can restore robustness in this sense to LSMs.
Stability and Topology in Reservoir Computing
TLDR
This work shows that enforcing topological constraints on the reservoir, in particular a small-world topology, makes the model fault tolerant; this implies that "natural" computational systems must have specific topologies and that uniform random connectivity is not appropriate.
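A minimal sketch of the contrast drawn above, assuming networkx is available: a Watts-Strogatz small-world reservoir versus a uniform random graph with the same number of edges. Only the graph construction and the small-world signature (high clustering, short paths) are shown; the fault-injection experiment itself is not reproduced here.

```python
import networkx as nx
import numpy as np

n, k, p = 200, 8, 0.1          # neurons, ring neighbours, rewiring probability

# small-world reservoir: ring lattice with a few rewired shortcuts
g_sw = nx.watts_strogatz_graph(n, k, p, seed=1)
# uniform random reservoir with a matched edge count
g_rand = nx.gnm_random_graph(n, g_sw.number_of_edges(), seed=1)

for name, g in [("small-world", g_sw), ("uniform random", g_rand)]:
    print(name,
          "clustering=%.3f" % nx.average_clustering(g),
          "avg path=%.2f" % nx.average_shortest_path_length(g))

def weights(g, rho=0.9, seed=0):
    """Turn a graph into a reservoir weight matrix with spectral radius rho."""
    rng = np.random.default_rng(seed)
    w = nx.to_numpy_array(g) * rng.normal(size=(n, n))
    return w * (rho / np.max(np.abs(np.linalg.eigvals(w))))

W = weights(g_sw)
print("spectral radius:", round(np.max(np.abs(np.linalg.eigvals(W))), 2))
```

Small-world graphs keep the local clustering of a lattice while adding short global paths, which is the structural property the paper links to graceful degradation.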
Real-Time Computing Without Stable States: A New Framework for Neural Computation Based on Perturbations
TLDR
A new computational model for real-time computing on time-varying input that provides an alternative to paradigms based on Turing machines or attractor neural networks; it is based on principles of high-dimensional dynamical systems combined with statistical learning theory and can be implemented on generic evolved or found recurrent circuitry.
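The recipe behind this framework is: drive a fixed random recurrent circuit with the input stream, then train only a simple readout on the circuit's transient states. A toy echo-state-style rendition (rate units instead of spiking neurons, ridge-regression readout; a sketch of the idea, not the paper's model):

```python
import numpy as np

rng = np.random.default_rng(0)
n_res, T = 100, 500

# fixed random reservoir, scaled to spectral radius 0.9
W = rng.normal(size=(n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))
W_in = rng.normal(size=n_res)

u = rng.uniform(-1, 1, T)        # input stream
target = np.roll(u, 3)           # task: recall the input 3 steps back

# run the reservoir and keep every transient state
x, X = np.zeros(n_res), np.zeros((T, n_res))
for t in range(T):
    x = np.tanh(W @ x + W_in * u[t])
    X[t] = x

# train only the linear readout (ridge regression)
ridge = 1e-6
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ target)
print("train MSE: %.4f" % np.mean((X @ W_out - target) ** 2))
```

All the task-specific learning sits in the readout; the recurrent circuit is never trained, which is what lets the same "liquid" serve many readouts at once.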
Temporal integration in recurrent microcircuits
TLDR
This article surveys the primary models that have been proposed for temporal integration in cortical microcircuits, a prerequisite for adaptive real-time responses to temporally integrated information.
Computational models for generic cortical microcircuits
TLDR
A computational model is proposed that could explain these potentially universal computational capabilities without requiring task-dependent construction of neural circuits; it is based on principles of high-dimensional dynamical systems combined with statistical learning theory and can be implemented on generic evolved or found recurrent circuitry.
Modeling the process of rate selection in neuronal activity.
We present the elements of a mathematical computational model that reflects the experimental finding that the time-scale of a neuron is not fixed, but rather varies with the history of its stimulus.
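As a toy reading of that finding (my own illustrative dynamics, not the paper's equations): a leaky integrator whose membrane time constant stretches as a slow running average of its stimulus history builds up, so the neuron's effective time scale depends on what it has recently seen.

```python
import numpy as np

def adaptive_leaky_neuron(inputs, dt=1.0, tau0=5.0, tau_max=50.0, gain=40.0):
    """Leaky integrator whose time constant tau tracks a slow running
    average of the recent input drive (purely illustrative)."""
    v, history = 0.0, 0.0
    vs, taus = [], []
    for i in inputs:
        history += dt / 20.0 * (i - history)       # slow stimulus average
        tau = min(tau0 + gain * history, tau_max)  # history-dependent time scale
        v += dt / tau * (i - v)                    # leaky integration
        vs.append(v)
        taus.append(tau)
    return np.array(vs), np.array(taus)

# step input: tau starts at 5 and stretches as the stimulus history builds up
stim = np.concatenate([np.zeros(50), np.ones(200)])
v, tau = adaptive_leaky_neuron(stim)
print(f"tau before step: {tau[40]:.1f}, after sustained input: {tau[-1]:.1f}")
```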
Pattern Recognition in a Bucket
This paper demonstrates that the waves produced on the surface of water can be used as the medium for a “Liquid State Machine” that pre-processes inputs, thus allowing a simple perceptron to solve the …
On the computational power of circuits of spiking neurons
Reservoir computing approaches to recurrent neural network training
The tempotron: a neuron that learns spike timing–based decisions
TLDR
This work proposes a new, biologically plausible supervised synaptic learning rule that enables neurons to efficiently learn a broad range of decision rules, even when information is embedded in the spatiotemporal structure of spike patterns rather than in mean firing rates.
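The tempotron's decision rule is compact enough to sketch: the neuron fires if its peak postsynaptic potential within the trial crosses threshold, and on an error each synapse is nudged in proportion to its PSP contribution at the time of that peak. A minimal version following the structure of Gutig and Sompolinsky's model (time constants and learning rate here are illustrative):

```python
import numpy as np

TAU, TAU_S, THETA, LR = 15.0, 3.75, 1.0, 0.01
T = np.arange(0.0, 100.0, 0.1)                # trial time grid (ms)

# analytic normalization so the PSP kernel peaks at 1
T_PEAK = (TAU * TAU_S / (TAU - TAU_S)) * np.log(TAU / TAU_S)
V0 = 1.0 / (np.exp(-T_PEAK / TAU) - np.exp(-T_PEAK / TAU_S))

def kernel(t):
    """Double-exponential PSP kernel, zero before the spike."""
    k = V0 * (np.exp(-t / TAU) - np.exp(-t / TAU_S))
    return np.where(t >= 0, k, 0.0)

def potential(w, spikes):
    """Membrane trace: weighted sum of PSP kernels at the input spike times."""
    return sum(w[i] * kernel(T - t_sp)
               for i, train in enumerate(spikes) for t_sp in train)

def tempotron_step(w, spikes, label):
    """Fire iff the peak potential exceeds THETA; on error, update each
    synapse by its PSP contribution at the peak time."""
    v = potential(w, spikes)
    t_max = T[np.argmax(v)]
    fired = bool(v.max() > THETA)
    if fired != label:
        sign = 1.0 if label else -1.0
        for i, train in enumerate(spikes):
            w[i] += sign * LR * sum(kernel(t_max - t_sp) for t_sp in train)
    return w, fired

# two input synapses and one spatiotemporal pattern that should fire
w = np.full(2, 0.05)
spikes, label = [[10.0, 30.0], [12.0]], True
for _ in range(200):
    w, fired = tempotron_step(w, spikes, label)
print("fires after training:", fired, "weights:", np.round(w, 3))
```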
...
...