Visual explanations from spiking neural networks using inter-spike intervals
@article{Kim2021VisualEF,
  title   = {Visual explanations from spiking neural networks using inter-spike intervals},
  author  = {Youngeun Kim and Priyadarshini Panda},
  journal = {Scientific Reports},
  year    = {2021},
  volume  = {11}
}
By emulating biological features of the brain, Spiking Neural Networks (SNNs) offer an energy-efficient alternative to conventional deep learning. To make SNNs ubiquitous, a ‘visual explanation’ technique for analysing and explaining the internal spike behavior of such temporal deep SNNs is crucial. Explaining SNNs visually makes the network more transparent, giving the end-user a tool to understand how SNNs make temporal predictions and why they make a certain decision. In this paper, we…
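To make the inter-spike-interval idea concrete, here is a minimal sketch of how such a heatmap could be computed, assuming a binary spike tensor of shape (T, C, H, W) and a hypothetical decay rate `gamma`; this is one plausible reading for illustration, not the paper's exact formulation.

```python
# Minimal sketch of an inter-spike-interval-based heatmap for one SNN layer.
# Assumptions (not from the paper): `spikes` is a binary (T, C, H, W) array,
# and `gamma` is a hypothetical decay rate controlling how fast old spikes fade.
import numpy as np

def isi_heatmap(spikes: np.ndarray, t: int, gamma: float = 0.5) -> np.ndarray:
    """Score each spatial location at timestep t: previous spikes are weighted
    by exponential decay, so short inter-spike intervals (frequent recent
    firing) contribute more to the explanation."""
    contribution = np.zeros(spikes.shape[1:])           # (C, H, W)
    for t_prev in range(t):
        contribution += spikes[t_prev] * np.exp(-gamma * (t - t_prev))
    return contribution.sum(axis=0)                     # collapse channels -> (H, W)
```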
17 Citations
Attention Spiking Neural Networks
- Computer Science · IEEE Transactions on Pattern Analysis and Machine Intelligence
- 2023
This work highlights the potential of SNNs as a general backbone supporting various applications in SNN research, with a good balance between effectiveness and efficiency.
Backpropagation with biologically plausible spatiotemporal adjustment for training deep spiking neural networks
- Computer Science · Patterns
- 2022
A Novel Explainable Out-of-Distribution Detection Approach for Spiking Neural Networks
- Computer Science · ArXiv
- 2022
This work presents a novel OoD detector that identifies whether test examples fed to a Spiking Neural Network belong to the distribution of the data on which it was trained; characterizing the internal activations of the network's hidden layers as spike-count patterns lays a basis for determining when the activations induced by a test instance are atypical.
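As a hedged sketch of what a spike-count-based check might look like (the signature construction and the z-score threshold below are assumptions for illustration, not necessarily the paper's detector):

```python
import numpy as np

def spike_count_signature(layer_spikes):
    """Concatenate per-neuron spike counts across hidden layers.
    layer_spikes: list of binary arrays, each shaped (T, num_neurons)."""
    return np.concatenate([s.sum(axis=0) for s in layer_spikes])

def looks_out_of_distribution(signature, train_mean, train_std, z_threshold=3.0):
    """Flag a test instance whose spike-count pattern deviates strongly
    from statistics collected on the training set (assumed z-score rule)."""
    z = np.abs(signature - train_mean) / (train_std + 1e-8)
    return float(z.mean()) > z_threshold
```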
Neural Architecture Search for Spiking Neural Networks
- Computer Science · ECCV
- 2022
This paper introduces a novel Neural Architecture Search (NAS) approach for finding, without training, SNN architectures that can represent diverse spike activation patterns across different data samples, and shows that the resulting SNASNet achieves state-of-the-art performance with significantly fewer timesteps.
Temporal Efficient Training of Spiking Neural Network via Gradient Re-weighting
- Computer Science · ArXiv
- 2022
The temporal efficient training (TET) approach is introduced to compensate for the loss of momentum in gradient descent with surrogate gradients (SG), so that the training process converges into flatter minima with better generalizability.
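The core of the idea can be sketched as a per-timestep loss (omitting any additional regularization term the paper may use; tensor shapes here are assumptions):

```python
import torch
import torch.nn.functional as F

def tet_loss(logits_per_step: torch.Tensor, target: torch.Tensor) -> torch.Tensor:
    """Apply cross-entropy at every timestep and average, instead of
    computing a single loss on the time-averaged output.
    logits_per_step: (T, batch, num_classes); target: (batch,) class indices."""
    T = logits_per_step.shape[0]
    losses = [F.cross_entropy(logits_per_step[t], target) for t in range(T)]
    return torch.stack(losses).mean()
```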
Decoding EEG With Spiking Neural Networks on Neuromorphic Hardware
- Computer Science
- 2022
An SNN architecture is proposed with an input encoding and network design that exploit the priors of spatial and temporal dependencies in the EEG signal, demonstrating that SNNs can accurately and reliably decode EEG while availing of the computational advantages offered by neuromorphic computing, and paving the way for employing neuromorphic methods in portable BCI systems.
Exploring Temporal Information Dynamics in Spiking Neural Networks
- Computer Science · ArXiv
- 2022
It is observed that temporal information concentration is crucial to building a robust SNN but has little effect on classification accuracy, and a loss function is designed to change the trend of temporal information.
Beyond classification: directly training spiking neural networks for semantic segmentation
- Computer Science · Neuromorphic Computing and Engineering
- 2022
This paper investigates SNN applications beyond classification, presenting semantic segmentation networks configured with spiking neurons and showing that SNNs can be more robust and energy-efficient than their ANN counterparts in this domain.
Heterogeneous Ensemble-Based Spike-Driven Few-Shot Online Learning
- Computer Science · Frontiers in Neuroscience
- 2022
The proposed HESFOL model uses entropy theory to establish a gradient-based few-shot learning scheme in a recurrent SNN architecture, emphasizing the application of modern entropy-based machine learning methods in state-of-the-art spike-driven learning algorithms.
Exploring Lottery Ticket Hypothesis in Spiking Neural Networks
- Computer Science · ECCV
- 2022
The proposed Early-Time (ET) ticket can be seamlessly combined with common pruning techniques for finding winning tickets, such as Iterative Magnitude Pruning (IMP) and Early-Bird (EB) tickets, and results show that the proposed ET ticket reduces search time by up to 38% compared to the IMP or EB methods.
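For context, a compressed sketch of the generic Iterative Magnitude Pruning loop that such tickets are searched with (this is the standard lottery-ticket recipe, not the paper's ET variant; `train_fn` is a hypothetical callback assumed to apply the masks during training):

```python
import copy
import torch

def iterative_magnitude_pruning(model, train_fn, rounds=3, prune_frac=0.2):
    """Alternate training and magnitude pruning, rewinding surviving weights
    to their initial values after each round (the lottery-ticket recipe)."""
    init_state = copy.deepcopy(model.state_dict())
    masks = {n: torch.ones_like(p) for n, p in model.named_parameters()}
    for _ in range(rounds):
        train_fn(model, masks)                        # train with masks applied
        for name, param in model.named_parameters():
            alive = param.abs()[masks[name] == 1]
            threshold = torch.quantile(alive, prune_frac)
            masks[name][param.abs() < threshold] = 0  # prune smallest weights
        model.load_state_dict(init_state)             # rewind to initialization
    return masks
```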
References
Showing 1–10 of 73 references
Rectified Linear Postsynaptic Potential Function for Backpropagation in Deep Spiking Neural Networks
- Computer Science · IEEE Transactions on Neural Networks and Learning Systems
- 2022
The contribution of spike-timing dynamics to information encoding, synaptic plasticity, and decision making is investigated, providing a new perspective on the design of future deep SNNs and neuromorphic hardware systems.
Training Deep Spiking Neural Networks Using Backpropagation
- Computer Science · Front. Neurosci.
- 2016
A novel technique is introduced that treats the membrane potentials of spiking neurons as differentiable signals, with the discontinuities at spike times considered as noise; this enables an error-backpropagation mechanism for deep SNNs that follows the same principles as in conventional deep networks but works directly on spike signals and membrane potentials.
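The now-standard pattern this line of work enables looks roughly like the following; the fast-sigmoid surrogate below is an illustrative choice, not necessarily the derivative used in this paper.

```python
import torch

class SpikeFunction(torch.autograd.Function):
    """Heaviside step in the forward pass; in the backward pass the
    discontinuity is replaced by a smooth surrogate derivative."""
    @staticmethod
    def forward(ctx, v_minus_threshold):
        ctx.save_for_backward(v_minus_threshold)
        return (v_minus_threshold > 0).float()

    @staticmethod
    def backward(ctx, grad_output):
        (v,) = ctx.saved_tensors
        surrogate = 1.0 / (1.0 + v.abs()) ** 2   # fast-sigmoid-style derivative
        return grad_output * surrogate

spike = SpikeFunction.apply  # usage: s = spike(membrane_potential - threshold)
```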
Temporal Coding in Spiking Neural Networks with Alpha Synaptic Function
- Computer Science · ICASSP 2020 - 2020 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
- 2020
This work proposes a spiking neural network model that encodes information in the relative timing of individual neuron spikes and performs classification using the first output neuron to spike, and it successfully trains the network on a temporally encoded MNIST dataset.
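Roughly, the two ingredients can be sketched as follows (the kernel normalization and function names are illustrative assumptions):

```python
import numpy as np

def alpha_psp(t: float, t_spike: float, tau: float = 1.0) -> float:
    """Alpha-shaped postsynaptic potential triggered at t_spike:
    rises, peaks at t_spike + tau, then decays smoothly."""
    s = max(t - t_spike, 0.0)
    return (s / tau) * np.exp(1.0 - s / tau)

def classify_first_to_spike(output_spike_times: np.ndarray) -> int:
    """Temporal decoding: the predicted class is the index of the
    output neuron that fires earliest."""
    return int(np.argmin(output_spike_times))
```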
Spatio-Temporal Backpropagation for Training High-Performance Spiking Neural Networks
- Computer Science · Front. Neurosci.
- 2018
A spatio-temporal backpropagation (STBP) algorithm for training high-performance SNNs is proposed, which combines the layer-by-layer spatial domain (SD) and the timing-dependent temporal domain (TD) and requires no additional complicated techniques.
Enabling Spike-Based Backpropagation for Training Deep Neural Network Architectures
- Computer Science · Frontiers in Neuroscience
- 2020
This work proposes an approximate derivative method that accounts for the leaky behavior of LIF neurons, enabling deep convolutional SNNs to be trained directly (with input spike events) using spike-based backpropagation, and it analyzes sparse event-based computations to demonstrate the efficacy of the proposed SNN training method for inference in the spiking domain.
RMP-SNN: Residual Membrane Potential Neuron for Enabling Deeper High-Accuracy and Low-Latency Spiking Neural Network
- Computer Science · 2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)
- 2020
It is found that performance degradation in the converted SNN stems from using a "hard reset" spiking neuron that is driven to a fixed reset potential once its membrane potential exceeds the firing threshold, leading to information loss during SNN inference.
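The contrast between the two reset modes can be sketched with a toy LIF update (the leak and threshold values are assumed; the point is that the soft/residual variant keeps the charge above threshold):

```python
import numpy as np

def lif_step_hard_reset(v, x, v_th=1.0, v_reset=0.0, leak=0.9):
    """Hard reset: after a spike the membrane is forced to v_reset,
    so any accumulated charge above threshold is discarded."""
    v = leak * v + x
    spike = (v >= v_th).astype(float)
    v = np.where(spike > 0, v_reset, v)
    return spike, v

def lif_step_soft_reset(v, x, v_th=1.0, leak=0.9):
    """Soft (residual) reset: subtract the threshold instead,
    preserving the residual membrane potential."""
    v = leak * v + x
    spike = (v >= v_th).astype(float)
    v = v - spike * v_th
    return spike, v
```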
Toward Scalable, Efficient, and Accurate Deep Spiking Neural Networks With Backward Residual Connections, Stochastic Softmax, and Hybridization
- Computer Science · Frontiers in Neuroscience
- 2020
Novel algorithmic techniques modify the SNN configuration with backward residual connections, stochastic softmax, and hybrid artificial-and-spiking neuronal activations, improving the learning ability of the training methodologies to yield competitive accuracy while providing large efficiency gains over their artificial counterparts.
Unsupervised learning of digit recognition using spike-timing-dependent plasticity
- Computer Science · Front. Comput. Neurosci.
- 2015
An SNN for digit recognition is presented that is based on mechanisms with increased biological plausibility, i.e., conductance-based instead of current-based synapses, spike-timing-dependent plasticity with time-dependent weight change, lateral inhibition, and an adaptive spiking threshold.
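The core plasticity rule can be sketched as classic pair-based STDP (a simplification: the paper's conductance-based rule with lateral inhibition and adaptive thresholds is richer than this):

```python
import numpy as np

def stdp_weight_change(delta_t: float, a_plus=0.01, a_minus=0.012, tau=20.0) -> float:
    """Pair-based STDP with delta_t = t_post - t_pre (ms).
    Pre-before-post (delta_t > 0) potentiates the synapse;
    post-before-pre depresses it, both decaying with |delta_t|."""
    if delta_t > 0:
        return a_plus * np.exp(-delta_t / tau)
    return -a_minus * np.exp(delta_t / tau)
```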
Recurrent Spiking Neural Network Learning Based on a Competitive Maximization of Neuronal Activity
- Computer Science · Front. Neuroinform.
- 2018
The proposed algorithm, referred to as "Family-Engaged Execution and Learning of Induced Neuron Groups" (FEELING), rests on a basic principle believed to be practically applicable to the construction of much more complicated and diverse task-solving SNNs.