Unsupervised Learning with Self-Organizing Spiking Neural Networks

@article{Hazan2018UnsupervisedLW,
  title={Unsupervised Learning with Self-Organizing Spiking Neural Networks},
  author={Hananel Hazan and Daniel J. Saunders and Darpan T. Sanghavi and Hava T. Siegelmann and Robert Thijs Kozma},
  journal={2018 International Joint Conference on Neural Networks (IJCNN)},
  year={2018},
  pages={1-6}
}
We present a system comprising a hybridization of self-organizing map (SOM) properties with spiking neural networks (SNNs) that retains many of the features of SOMs. We develop and test various inhibition strategies, such as inhibition that grows with inter-neuron distance and two distinct levels of inhibition. The quality of the unsupervised learning algorithm is evaluated using examples with known labels.
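The "inhibition growing with inter-neuron distance" strategy mentioned in the abstract can be sketched as a lattice of neurons whose mutual inhibitory weights scale with their Euclidean distance, up to a cap. This is an illustrative sketch only; the parameter names (`side`, `c_inh`, `w_max`) and values are assumptions, not taken from the paper.

```python
import numpy as np

def distance_inhibition(side: int, c_inh: float = 0.1, w_max: float = 17.5) -> np.ndarray:
    """Return an (n, n) inhibitory weight matrix for a side x side lattice.

    Hypothetical parameters: c_inh scales inhibition per unit distance,
    w_max caps the inhibitory strength between distant neurons.
    """
    # 2D coordinates of each neuron on the lattice.
    coords = np.array([(i, j) for i in range(side) for j in range(side)], dtype=float)
    # Pairwise Euclidean distances between lattice positions.
    diff = coords[:, None, :] - coords[None, :, :]
    dist = np.sqrt((diff ** 2).sum(axis=-1))
    w = np.minimum(c_inh * dist, w_max)  # inhibition grows with distance, capped
    np.fill_diagonal(w, 0.0)             # no self-inhibition
    return w

W = distance_inhibition(4)
print(W.shape)  # (16, 16)
```

Nearby neurons inhibit each other weakly, so they can learn similar features, while distant neurons compete strongly — the SOM-like topographic effect the paper exploits.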


Lattice map spiking neural networks (LM-SNNs) for clustering and classifying image data
TLDR
Spiking neural networks with a lattice architecture are introduced in this work, combining several desirable properties of SNNs and self-organized maps, including a population-level confidence rating, and an n-gram inspired method.
Self-organizing neurons: toward brain-inspired unsupervised learning
TLDR
Kohonen-based Self-Organizing Maps are applied for unsupervised learning without labels, and original extensions are explored, such as the Dynamic SOM, which enables continuous learning, and the Pruning Cellular SOM, which includes synaptic pruning in neuromorphic circuits.
STDP Learning of Image Patches with Convolutional Spiking Neural Networks
TLDR
A class of convolutional spiking neural networks is introduced, trained to detect image features with an unsupervised, competitive learning mechanism, and the time and memory requirements of learning with and operating such networks are analyzed.
Fast Convergence of Competitive Spiking Neural Networks with Sample-Based Weight Initialization
TLDR
It is shown that the number of samples the CSNN needs to converge can be reduced significantly by a proposed new weight initialization, which uses input samples as initial values for the connection weights.
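The sample-based weight initialization summarized above can be sketched as follows: instead of random initial weights, each output neuron's afferent weight vector is seeded with a (rescaled) training sample. The data, shapes, and `scale` parameter here are illustrative assumptions, not the paper's exact setup.

```python
import numpy as np

rng = np.random.default_rng(0)
samples = rng.random((100, 784))  # stand-in for flattened 28x28 input images

def sample_init(samples: np.ndarray, n_neurons: int, scale: float = 0.3) -> np.ndarray:
    """Initialize an (n_inputs, n_neurons) weight matrix from input samples.

    Each column (one neuron's incoming weights) is a randomly chosen
    training sample, rescaled so its peak weight equals `scale`.
    """
    idx = rng.choice(len(samples), size=n_neurons, replace=False)
    w = samples[idx].T.copy()                   # one sample per output neuron
    w *= scale / w.max(axis=0, keepdims=True)   # normalize each column's peak
    return w

W = sample_init(samples, n_neurons=25)
print(W.shape)  # (784, 25)
```

Because each neuron starts near an actual input pattern, competitive learning begins with already-plausible receptive fields, which is why convergence requires fewer samples.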
Spiking Neural Predictive Coding for Continual Learning from Data Streams
TLDR
The proposed Spiking Neural Coding Network is competitive in terms of classification performance, can conduct online semi-supervised learning, naturally experiences less forgetting when learning from a sequence of tasks, and is more computationally economical and biologically-plausible than popular artificial neural networks.
Spiking Inception Module for Multi-layer Unsupervised Spiking Neural Networks
TLDR
The proposed Spiking Inception (Sp-Inception) module is trained through STDP-based competitive learning and outperforms the baseline modules on learning capability, learning efficiency, and robustness, and reaches state-of-the-art results on the MNIST dataset among the existing unsupervised SNNs.
CRBA: A Competitive Rate-Based Algorithm Based on Competitive Spiking Neural Networks
TLDR
It is shown that the weights and firing thresholds learned by CRBA can be used to initialize the CSNN's parameters, resulting in much more efficient operation.
Research on learning mechanism designing for equilibrated bipolar spiking neural networks
TLDR
Inspired by the ancient Chinese "Yin and Yang" theory, an ensemble-learning-optimized supervised learning method is designed and tailored for this SNN structure, and results show that it achieves reasonable accuracy with a much more compact structure and much sparser synaptic connections.
An analysis of learning performance changes in spiking neural networks (SNNs)
TLDR
This paper builds a neural network using spiking neural networks (SNNs), a next-generation approach in artificial intelligence research, and analyzes how the number of neurons in the network affects its performance.

References

Showing 1-10 of 18 references
Unsupervised learning of digit recognition using spike-timing-dependent plasticity
TLDR
A SNN for digit recognition which is based on mechanisms with increased biological plausibility, i.e., conductance-based instead of current-based synapses, spike-timing-dependent plasticity with time-dependent weight change, lateral inhibition, and an adaptive spiking threshold is presented.
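The spike-timing-dependent plasticity (STDP) referenced above can be sketched with a generic pair-based rule: a presynaptic spike shortly before a postsynaptic spike potentiates the synapse, while the reverse order depresses it. This is a textbook-style sketch, not the exact conductance-based rule of the cited work; the time constants and learning rates are illustrative.

```python
import numpy as np

def stdp_dw(dt: float, a_plus: float = 0.01, a_minus: float = 0.012,
            tau_plus: float = 20.0, tau_minus: float = 20.0) -> float:
    """Weight change for a spike pair separated by dt = t_post - t_pre (ms).

    Illustrative pair-based STDP: exponentially decaying potentiation for
    pre-before-post (dt >= 0), exponentially decaying depression otherwise.
    """
    if dt >= 0:
        return a_plus * np.exp(-dt / tau_plus)    # pre before post: potentiate
    return -a_minus * np.exp(dt / tau_minus)      # post before pre: depress

print(stdp_dw(5.0))   # small positive weight change
print(stdp_dw(-5.0))  # small negative weight change
```

The narrow, asymmetric time window — strong changes only for closely spaced spike pairs — is what lets lateral inhibition and an adaptive threshold drive neurons to specialize on distinct digit classes.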
Identifying repeating motifs in the activation of synchronized bursts in cultured neuronal networks
CARLsim 3: A user-friendly and highly optimized library for the creation of neurobiologically detailed spiking neural networks
TLDR
CARLsim 3, a user-friendly, GPU-accelerated SNN library written in C/C++ that is capable of simulating biologically detailed neural models, is developed to allow the user to easily analyze simulation data, explore synaptic plasticity rules, and automate parameter tuning.
Spiking Neuron Models: Single Neurons, Populations, Plasticity
TLDR
Single-neuron and two-dimensional spiking neuron models are compared, together with models of synaptic plasticity and their suitability for population-level modeling.
Tradeoffs and Constraints on Neural Representation in Networks of Cortical Neurons
TLDR
This work compared the efficacy of different kinds of “neural codes” to represent both spatial and temporal input features in in vitro networks of rat cortical neurons, indicating the inherent redundancy in neural population activity.
Rapid Neural Coding in the Retina with Relative Spike Latencies
TLDR
It is reported that certain retinal ganglion cells encode the spatial structure of a briefly presented image in the relative timing of their first spikes, which allows the retina to rapidly and reliably transmit new spatial information with the very first spikes emitted by a neural population.
The Brian Simulator
TLDR
“Brian” is a simulator for spiking neural networks that uses vector-based computation to allow for efficient simulations, and is particularly useful for neuroscientific modelling at the systems level, and for teaching computational neuroscience.
Synaptic Modifications in Cultured Hippocampal Neurons: Dependence on Spike Timing, Synaptic Strength, and Postsynaptic Cell Type
TLDR
The results underscore the importance of precise spike timing, synaptic strength, and postsynaptic cell type in the activity-induced modification of central synapses and suggest that Hebb’s rule may need to incorporate a quantitative consideration of spike timing that reflects the narrow and asymmetric window for the induction of synaptic modification.
Models of Orientation and Ocular Dominance Columns in the Visual Cortex: A Critical Comparison
TLDR
Ten of the most prominent models of cortical map formation and structure are critically evaluated and compared with the most recent experimental findings from macaque striate cortex and several models produce orientation map patterns that are not consistent with the experimental data from macaques.