Robustness from structure: Inference with hierarchical spiking networks on analog neuromorphic hardware

@article{Petrovici2017RobustnessFS,
  title={Robustness from structure: Inference with hierarchical spiking networks on analog neuromorphic hardware},
  author={Mihai A. Petrovici and Anna Schroeder and Oliver Julien Breitwieser and Andreas Gr{\"u}bl and Johannes Schemmel and Karlheinz Meier},
  journal={2017 International Joint Conference on Neural Networks (IJCNN)},
  year={2017},
  pages={2209--2216}
}
How spiking networks are able to perform probabilistic inference is an intriguing question, not only for understanding information processing in the brain, but also for transferring these computational principles to neuromorphic silicon circuits. A number of computationally powerful spiking network models have been proposed, but most of them have only been tested, under ideal conditions, in software simulations. Any implementation in an analog, physical system, be it in vivo or in silico, will… 
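The abstract refers to spiking networks performing probabilistic inference by sampling. As a rough illustration of the abstract idea behind such models (not the paper's analog LIF implementation), the sketch below runs Gibbs-style stochastic updates of binary "neurons" so that their joint activity samples from a Boltzmann distribution; the weights and biases are purely illustrative.

```python
import numpy as np

# Illustrative sketch of abstract neural sampling: binary units z_k are
# updated stochastically with probability sigmoid(u_k), where u_k plays the
# role of a membrane potential. The stationary distribution is the Boltzmann
# distribution p(z) ~ exp(b.z + z.W.z / 2). Parameters are made up, not
# taken from the paper.

rng = np.random.default_rng(0)

W = np.array([[0.0, 1.2], [1.2, 0.0]])  # symmetric coupling, zero diagonal
b = np.array([-0.5, -0.5])              # biases

def sample(W, b, n_steps=50000, rng=rng):
    """Sequentially update each unit; return the empirical state distribution."""
    z = rng.integers(0, 2, size=b.shape[0]).astype(float)
    counts = np.zeros(2 ** len(b))
    for _ in range(n_steps):
        for k in range(len(b)):
            u = b[k] + W[k] @ z  # "membrane potential" of unit k
            z[k] = float(rng.random() < 1.0 / (1.0 + np.exp(-u)))
        counts[int(z[0]) * 2 + int(z[1])] += 1
    return counts / counts.sum()

p_sampled = sample(W, b)
print(p_sampled)  # empirical probabilities of the four joint states
```

With the positive coupling chosen here, the jointly active state (1,1) ends up most probable; distortions of `W` and `b`, as introduced by an analog substrate, would deform this sampled distribution, which is the robustness question the paper addresses.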

Citations

Accelerated Physical Emulation of Bayesian Inference in Spiking Neural Networks
TLDR
This work presents a spiking network model that performs Bayesian inference through sampling on the BrainScaleS neuromorphic platform, where it is used for generative and discriminative computations on visual data and implicitly demonstrates its robustness to various substrate-specific distortive effects.
Large-Scale Neuromorphic Spiking Array Processors: A Quest to Mimic the Brain
TLDR
Some of the most significant neuromorphic spiking emulators are described, the different architectures and approaches used by them are compared, their advantages and drawbacks are illustrated, and the capabilities that each can deliver to neural modelers are highlighted.

References

Showing 1-10 of 33 references
Neuromorphic hardware in the loop: Training a deep spiking network on the BrainScaleS wafer-scale system
TLDR
This paper demonstrates how iterative training of a hardware-emulated network can compensate for anomalies induced by the analog substrate, and shows that deep spiking networks emulated on analog neuromorphic devices can attain good computational performance despite the inherent variations of the analog substrate.
Artificial Cognitive Systems: From VLSI Networks of Spiking Neurons to Neuromorphic Cognition
TLDR
It is shown how VLSI networks of spiking neurons with spike-based plasticity mechanisms and soft winner-take-all architectures represent important building blocks useful for implementing artificial neural systems able to exhibit basic cognitive abilities.
Neural Dynamics as Sampling: A Model for Stochastic Computation in Recurrent Networks of Spiking Neurons
TLDR
A neural network model is proposed and it is shown by a rigorous theoretical analysis that its neural activity implements MCMC sampling of a given distribution, both for the case of discrete and continuous time.
Stochastic inference with spiking neurons in the high-conductance state
TLDR
It is shown how an ensemble of leaky integrate-and-fire neurons with conductance-based synapses embedded in a spiking environment can attain the correct firing statistics for sampling from a well-defined target distribution and establishes a rigorous link between deterministic neuron models and functional stochastic dynamics on the network level.
Demonstrating Hybrid Learning in a Flexible Neuromorphic Hardware System
TLDR
To allow flexibility in the implementable learning mechanisms while retaining the efficiency associated with neuromorphic implementations, a general-purpose processor with full-custom analog elements is presented, providing a platform for flexible and efficient learning in neuroscientific research and technological applications.
Compensating Inhomogeneities of Neuromorphic VLSI Devices Via Short-Term Synaptic Plasticity
TLDR
By applying a cortically inspired self-adjusting network architecture, it is shown that the activity of generic spiking neural networks emulated on a neuromorphic hardware system can be kept within a biologically realistic firing regime and gain a remarkable robustness against transistor-level variations.
Neuromorphic Silicon Neuron Circuits
TLDR
This paper reviews the most common building blocks and techniques used to implement these circuits, and gives an overview of a wide range of neuromorphic silicon neurons implementing different computational models, from biophysically realistic, conductance-based Hodgkin-Huxley models to two-dimensional generalized adaptive integrate-and-fire models.
Spiking neural networks as superior generative and discriminative models
TLDR
It is shown how short-term plasticity enables LIF networks to travel efficiently through the energy landscape and thereby attain a generative performance that far surpasses the one achievable by conventional Gibbs sampling.
A million spiking-neuron integrated circuit with a scalable communication network and interface
TLDR
Inspired by the brain’s structure, an efficient, scalable, and flexible non–von Neumann architecture is developed that leverages contemporary silicon technology and is well suited to many applications that use complex neural networks in real time, for example, multiobject detection and classification.
The high-conductance state enables neural sampling in networks of LIF neurons
TLDR
This work not only provides a normative framework for Bayesian inference in cortex, but also demonstrates powerful applications of low-power, accelerated neuromorphic systems to relevant machine learning tasks.