BindsNET: A Machine Learning-Oriented Spiking Neural Networks Library in Python

@article{Hazan2018BindsNETAM,
  title={BindsNET: A Machine Learning-Oriented Spiking Neural Networks Library in Python},
  author={Hananel Hazan and Daniel J. Saunders and Hassaan Khan and Devdhar Patel and Darpan T. Sanghavi and Hava T. Siegelmann and Robert Thijs Kozma},
  journal={Frontiers in Neuroinformatics},
  year={2018},
  volume={12}
}
The development of spiking neural network simulation software is a critical component enabling the modeling of neural systems and the development of biologically inspired algorithms. BindsNET is built on the PyTorch deep learning library, facilitating the implementation of spiking neural networks on fast CPU and GPU computational platforms. Moreover, the BindsNET framework can be adjusted to utilize other existing computing and hardware backends, e.g., TensorFlow and SpiNNaker.
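The tensor-based simulation style the abstract describes can be sketched with a generic discrete-time leaky integrate-and-fire (LIF) update in NumPy. This is an illustrative sketch, not BindsNET's actual API; all names and parameter values here are assumptions.

```python
import numpy as np

def lif_step(v, i_in, decay=0.95, v_thresh=1.0, v_reset=0.0):
    """One discrete-time leaky integrate-and-fire update over a vector of neurons."""
    v = decay * v + i_in                  # leak toward rest and integrate input current
    spikes = v >= v_thresh                # boolean spike vector for this time step
    v = np.where(spikes, v_reset, v)      # reset the neurons that fired
    return v, spikes

rng = np.random.default_rng(0)
v = np.zeros(100)                         # membrane potentials of 100 neurons
total_spikes = 0
for _ in range(250):                      # drive with Bernoulli(0.1) input spikes
    v, s = lif_step(v, 0.5 * rng.binomial(1, 0.1, size=100))
    total_spikes += int(s.sum())
```

Because the neuron state lives entirely in array operations, the same update runs unchanged on GPU tensors, which is the property a PyTorch-backed simulator inherits for free.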
Exploring the Connection Between Binary and Spiking Neural Networks
TLDR
It is shown that training spiking neural networks in the extreme quantization regime yields near-full-precision accuracy on large-scale datasets such as CIFAR-100 and ImageNet.
Spiking Neural Networks and Their Applications: A Review
TLDR
A comprehensive review of theories of biological neurons studied in neuroscience is given, and existing spiking neural network applications in the computer vision and robotics domains are covered.
Direct Training for Spiking Neural Networks: Faster, Larger, Better
TLDR
This work proposes a neuron normalization technique to adjust neural selectivity, develops a direct learning algorithm for deep SNNs, and presents a PyTorch-based implementation method for training large-scale SNNs.
Neuromorphic Processing and Sensing: Evolutionary Progression of AI to Spiking
TLDR
The theoretical workings of spike-based neuromorphic technologies are explained, the state of the art in hardware processors, software platforms, and neuromorphic sensing devices is overviewed, and a progression path is paved for current machine learning specialists to update their skill set.
Minibatch Processing in Spiking Neural Networks
TLDR
To the authors' knowledge, this is the first general-purpose implementation of minibatch processing in a spiking neural network simulator; it works with arbitrary neuron and synapse models and shows the effectiveness of large batch sizes in two SNN application domains.
BioLCNet: Reward-modulated Locally Connected Spiking Neural Networks
TLDR
This work proposes a reward-modulated locally connected spiking neural network, BioLCNet, for visual learning tasks and assesses the robustness of the rewarding mechanism to varying target responses in a classical conditioning experiment.
SPAIC: A Spike-based Artificial Intelligence Computing Framework
TLDR
A Python-based spiking neural network (SNN) simulation and training framework, SPAIC, is presented, which aims to support brain-inspired model and algorithm research by integrating features from both deep learning and neuroscience.
PymoNNto: A Flexible Modular Toolbox for Designing Brain-Inspired Neural Networks
TLDR
The Python Modular Neural Network Toolbox (PymoNNto) provides a versatile and adaptable Python-based framework for developing and investigating brain-inspired neural networks, and comes with convenient high-level behaviour modules that allow differential-equation-based implementations similar to Brian2.
Learning from Sparse and Delayed Rewards with a Multilayer Spiking Neural Network
TLDR
The proposed architecture has four distinct layers and addresses the limitation of previous models in terms of scalability with input dimensions, and outperforms Q-learning on a task with six-dimensional observation space.

References

Showing 1-10 of 73 references
Brian2GeNN: a system for accelerating a large variety of spiking neural networks with graphics hardware
TLDR
A new software package, Brian2GeNN, is introduced that connects the two systems so that users can make use of GeNN GPU acceleration when developing their models in Brian, without requiring any technical knowledge about GPUs, C++ or GeNN.
Spiking Deep Networks with LIF Neurons
TLDR
This work demonstrates that biologically plausible spiking LIF neurons can be integrated into deep networks and perform as well as other spiking models (e.g., integrate-and-fire), and it provides new methods for training deep networks to run on neuromorphic hardware.
Gradient Descent for Spiking Neural Networks
TLDR
A gradient descent method for optimizing spiking network models, based on a differentiable formulation of spiking networks and an exact gradient calculation, offers a general-purpose supervised learning algorithm for spiking neural networks, advancing further investigation of spike-based computation.
CARLsim 3: A user-friendly and highly optimized library for the creation of neurobiologically detailed spiking neural networks
TLDR
CARLsim 3, a user-friendly, GPU-accelerated SNN library written in C/C++ that is capable of simulating biologically detailed neural models, is developed to allow the user to easily analyze simulation data, explore synaptic plasticity rules, and automate parameter tuning.
NeMo: A Platform for Neural Modelling of Spiking Neurons Using GPUs
TLDR
NeMo is presented, a platform for real-time spiking neural networks simulations which achieves high performance through the use of highly parallel commodity hardware in the form of graphics processing units (GPUs).
Deep Spiking Networks
TLDR
It is shown that the spiking Multi-Layer Perceptron behaves identically, during both prediction and training, to a conventional deep network of rectified-linear units, in the limiting case where the network is run for a long time.
Training Deep Spiking Neural Networks Using Backpropagation
TLDR
A novel technique is introduced that treats the membrane potentials of spiking neurons as differentiable signals, with discontinuities at spike times considered as noise. This enables an error backpropagation mechanism for deep SNNs that follows the same principles as in conventional deep networks, but works directly on spike signals and membrane potentials.
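The idea of making the spike nonlinearity differentiable is often realized as a surrogate gradient: the forward pass keeps the hard threshold, while the backward pass substitutes a smooth stand-in derivative. The sketch below shows that general technique, not this paper's exact formulation; the sigmoid surrogate, names, and the slope parameter are assumptions.

```python
import numpy as np

def spike_forward(v, v_thresh=1.0):
    # forward pass: hard threshold, spike iff the membrane potential crosses it
    return (v >= v_thresh).astype(float)

def spike_backward(v, v_thresh=1.0, slope=5.0):
    # backward pass: the derivative of a sigmoid centered at the threshold
    # stands in for the undefined derivative of the step function
    s = 1.0 / (1.0 + np.exp(-slope * (v - v_thresh)))
    return slope * s * (1.0 - s)
```

The surrogate derivative is largest exactly at the threshold and decays away from it, so gradient signal flows mainly through neurons that were close to firing.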
ANNarchy: a code generation approach to neural simulations on parallel hardware
TLDR
The ANNarchy (Artificial Neural Networks architect) neural simulator is presented, which makes it easy to define and simulate rate-coded and spiking networks, as well as combinations of both, and is compared to existing solutions.
Networks of spiking neurons: the third generation of neural network models
  • W. Maass
  • Biology, Computer Science
  • 1997
Fast-classifying, high-accuracy spiking deep networks through weight and threshold balancing
TLDR
The method for converting an ANN into an SNN enables low-latency classification with high accuracies already after the first output spike, and compared with previous SNN approaches it yields improved performance without increased training time.
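One common realization of weight and threshold balancing is model-based weight normalization: rescale each layer's weights by the peak activation seen on calibration data, so that rate-coded spiking neurons with a fixed threshold of 1 neither saturate nor stay silent. The helper below is an illustrative sketch under those assumptions (bias-free ReLU layers), not the paper's exact procedure.

```python
import numpy as np

def balance_weights(weights, calib_inputs):
    """Rescale each ReLU layer's weights by its peak activation on
    calibration data; after balancing, every layer's activations (and hence
    the converted neurons' input drive) are bounded by the threshold of 1."""
    balanced, a, prev_scale = [], calib_inputs, 1.0
    for W in weights:
        a = np.maximum(a @ W, 0.0)               # original-network activations
        scale = a.max() if a.max() > 0 else 1.0  # this layer's peak activation
        balanced.append(W * prev_scale / scale)  # undo previous scaling, apply new
        prev_scale = scale
    return balanced
```

Because the scaling passes linearly through the ReLU, relative activations are preserved while every layer's maximum is pinned at the spiking threshold.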