Lattice map spiking neural networks (LM-SNNs) for clustering and classifying image data

@article{hazan-lmsnn,
  title={Lattice map spiking neural networks (LM-SNNs) for clustering and classifying image data},
  author={Hananel Hazan and Daniel J. Saunders and Darpan T. Sanghavi and Hava T. Siegelmann and Robert Thijs Kozma},
  journal={Annals of Mathematics and Artificial Intelligence},
  pages={1--24}
}
Spiking neural networks (SNNs) with a lattice architecture are introduced in this work, combining several desirable properties of SNNs and self-organizing maps (SOMs). Networks are trained with biologically motivated, unsupervised learning rules to obtain a self-organized grid of filters via cooperative and competitive excitatory-inhibitory interactions. Several inhibition strategies are developed and tested, such as (i) incrementally increasing the inhibition level over the course of network training…
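The unsupervised training described above rests on spike-timing-dependent plasticity (STDP). As a rough illustration only (this is not the authors' implementation; the function name, trace variables, and all parameter values are assumptions for this sketch), a trace-based, pair-wise STDP weight update can be written as:

```python
import numpy as np

def stdp_update(w, pre_trace, post_trace, pre_spikes, post_spikes,
                lr_pre=0.01, lr_post=0.01, w_max=1.0):
    """One pair-based STDP step on a weight matrix of shape (n_pre, n_post).

    Potentiate when a postsynaptic spike follows recent presynaptic activity
    (recorded in pre_trace); depress when a presynaptic spike follows recent
    postsynaptic activity (recorded in post_trace)."""
    w = w + lr_post * np.outer(pre_trace, post_spikes)   # potentiation
    w = w - lr_pre * np.outer(pre_spikes, post_trace)    # depression
    return np.clip(w, 0.0, w_max)                        # keep weights bounded

rng = np.random.default_rng(0)
w = rng.uniform(0.0, 0.5, size=(4, 3))
pre_trace = np.array([0.8, 0.1, 0.0, 0.5])    # decayed record of recent pre spikes
post_trace = np.array([0.0, 0.6, 0.2])        # decayed record of recent post spikes
pre_spikes = np.array([1.0, 0.0, 0.0, 1.0])   # spikes emitted this timestep
post_spikes = np.array([0.0, 1.0, 0.0])
w = stdp_update(w, pre_trace, post_trace, pre_spikes, post_spikes)
```

In a full simulation the traces would decay exponentially between spikes, and the lateral excitatory-inhibitory interactions of the lattice would determine which neurons spike and therefore which columns of `w` are updated.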

A Spiking Neural Architecture for Vector Quantization and Clustering
A novel spike-timing-dependent plasticity (STDP) rule able to efficiently learn first-spike latency codes is developed and implemented in a two-layer SNN architecture of leaky integrate-and-fire (LIF) neurons.
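For context on first-spike latency codes: in a LIF neuron driven by a constant current, stronger inputs reach threshold earlier, so stimulus intensity can be read out from the timing of the first spike. A minimal sketch under those assumptions (the function and parameter values are illustrative, not taken from the paper):

```python
import numpy as np

def first_spike_time(input_current, v_thresh=1.0, tau=20.0, dt=1.0, t_max=100.0):
    """Simulate a leaky integrate-and-fire neuron under constant current and
    return the time of its first spike, or None if it never reaches threshold."""
    v = 0.0
    for step in range(int(t_max / dt)):
        v += dt * (-v / tau + input_current)  # leaky integration
        if v >= v_thresh:
            return step * dt                  # first-spike latency
    return None

# Stronger input crosses threshold sooner: intensity is encoded as latency.
t_strong = first_spike_time(0.5)
t_weak = first_spike_time(0.1)
```

An STDP rule tuned to such codes can then learn to respond to the earliest-arriving (most informative) input spikes.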
FSpiNN: An Optimization Framework for Memory- and Energy-Efficient Spiking Neural Networks
FSpiNN is an optimization framework for obtaining memory- and energy-efficient SNNs for training and inference processing, with unsupervised learning capability while maintaining accuracy, by reducing the computational requirements of neuronal and STDP operations, improving the accuracy of STDP-based learning, compressing the SNN through fixed-point quantization, and incorporating the memory and energy requirements in the optimization process.
Q-SpiNN: A Framework for Quantizing Spiking Neural Networks
The Q-SpiNN is proposed, a novel quantization framework for memory-efficient SNNs that employs quantization for different SNN parameters based on their significance to the accuracy, and develops an algorithm that quantifies the benefit of the memory-accuracy trade-off obtained by the candidates, and selects the Pareto-optimal one.
A Spiking Neural Network Based Auto-encoder for Anomaly Detection in Streaming Data
It is shown that SNNs are well suited for detecting anomalous character sequences, that they can learn rapidly, and that many optimizations to the SNN architecture and training can improve anomaly detection (AD) performance.
Unsupervised Features Extracted using Winner-Take-All Mechanism Lead to Robust Image Classification
  • Devdhar Patel, R. Kozma
  • Computer Science
  • 2020 International Joint Conference on Neural Networks (IJCNN)
  • 2020
It is demonstrated that features extracted in an unsupervised manner using the biologically inspired Hebbian learning rule in a winner-take-all setting perform competitively with backpropagation (BP) on the image classification task.
Cognition and Neurocomputation
On the Self-Repair Role of Astrocytes in STDP Enabled Unsupervised SNNs
The degree of self-repair that can be enabled in such networks with varying degrees of faults, ranging from 50% to 90%, is characterized, and the proposal is evaluated on the MNIST and Fashion-MNIST datasets.


Unsupervised Learning with Self-Organizing Spiking Neural Networks
A hybridization of self-organizing map properties with spiking neural networks that retains many of the features of SOMs is presented; with an optimal choice of parameters, this approach produces improvements over state-of-the-art spiking neural networks.
A Spiking Self-Organizing Map Combining STDP, Oscillations, and Continuous Learning
A network of integrate-and-fire neurons is presented that incorporates solutions to each of these issues through the neuron model and network structure, thereby representing a significant step toward further understanding of the self-organizational properties of the brain.
Spiking Self-organizing Maps for Classification Problem
Embedded spiking neurons for Kohonen's Self-Organizing Map (SOM) learning are proposed to improve its learning process, and results on a cancer dataset show that the tested model produces feasible classification accuracy with low quantization error.
Self-Organization in Networks of Spiking Neurons
Traditionally, artificial neural network design was based on the average firing rate model of a biological neuron. In this paper we briefly review approaches based on single action potentials…
STDP-based spiking deep convolutional neural networks for object recognition
Previous studies have shown that spike-timing-dependent plasticity (STDP) can be used in spiking neural networks (SNN) to extract visual features of low or intermediate complexity in an unsupervised manner…
Computation with Spikes in a Winner-Take-All Network
This work extends previous theoretical results showing that a WTA recurrent network receiving regular spike inputs can select the correct winner within one interspike interval, and uses a simplified Markov model of the spiking network to examine analytically the ability of a spike-based WTA network to discriminate the statistics of inputs ranging from stationary regular to nonstationary Poisson events.
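The core mechanism studied here, selection of a winner within one interspike interval, can be caricatured as a race to threshold: units integrate their inputs, and the first to spike suppresses all competitors via recurrent inhibition. A minimal sketch (names and parameters are assumptions for illustration, not the paper's model):

```python
import numpy as np

def spiking_wta(inputs, v_thresh=1.0, dt=1.0, t_max=200.0):
    """Race-to-threshold winner-take-all: non-leaky integrate-and-fire units
    accumulate their inputs; the first unit to reach threshold spikes and,
    via global inhibition, resets the whole field. Returns the winner's index
    (or None if no unit ever fires)."""
    v = np.zeros_like(inputs, dtype=float)
    for _ in range(int(t_max / dt)):
        v += dt * inputs                 # perfect integration of the drive
        if np.any(v >= v_thresh):
            winner = int(np.argmax(v))   # strongest-driven unit crosses first
            v[:] = 0.0                   # global inhibition silences the rest
            return winner
    return None

winner = spiking_wta(np.array([0.02, 0.05, 0.03]))  # unit 1 integrates fastest
```

With noisy (e.g. Poisson) rather than regular inputs, the race becomes stochastic, which is exactly the discrimination regime the Markov analysis above addresses.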
Fast-classifying, high-accuracy spiking deep networks through weight and threshold balancing
The method for converting an ANN into an SNN enables low-latency classification with high accuracies already after the first output spike, and compared with previous SNN approaches it yields improved performance without increased training time.
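Weight/threshold balancing for ANN-to-SNN conversion is commonly done by rescaling each layer so that no unit would need a firing rate beyond what a spiking neuron can deliver. A minimal sketch of data-based rescaling in that spirit (not the paper's exact procedure; the function name and the toy numbers are assumptions):

```python
import numpy as np

def normalize_layers(weights, activations):
    """Data-based weight rescaling for ANN-to-SNN conversion: divide each
    layer's weights by that layer's maximum observed ReLU activation, and
    re-multiply by the previous layer's factor so the scaling composes
    correctly through the network."""
    scaled = []
    prev_factor = 1.0
    for w, a in zip(weights, activations):
        factor = float(np.max(a))              # largest activation on held-out data
        scaled.append(w * prev_factor / factor)
        prev_factor = factor
    return scaled

# Toy example: two all-ones weight matrices with recorded layer activations.
layers = normalize_layers(
    [np.ones((2, 2)), np.ones((2, 2))],
    [np.array([2.0, 4.0]), np.array([8.0])],
)
```

After rescaling, the converted SNN's firing rates can approximate the ANN's activations without saturating, which is what permits accurate classification from the earliest output spikes.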
Conversion of Continuous-Valued Deep Networks to Efficient Event-Driven Networks for Image Classification
This paper shows conversion of popular CNN architectures, including VGG-16 and Inception-v3, into SNNs that produce the best results reported to date on MNIST, CIFAR-10 and the challenging ImageNet dataset.
Training Deep Spiking Neural Networks Using Backpropagation
A novel technique is introduced, which treats the membrane potentials of spiking neurons as differentiable signals, where discontinuities at spike times are considered as noise; this enables an error backpropagation mechanism for deep SNNs that follows the same principles as in conventional deep networks, but works directly on spike signals and membrane potentials.
Unsupervised learning of digit recognition using spike-timing-dependent plasticity
A SNN for digit recognition which is based on mechanisms with increased biological plausibility, i.e., conductance-based instead of current-based synapses, spike-timing-dependent plasticity with time-dependent weight change, lateral inhibition, and an adaptive spiking threshold is presented.