Event based visual attention with dynamic neural field on FPGA

Abstract

Dynamic Field Theory (DFT) is an established framework for neuro-modeling and neuro-inspired computing, well suited to challenging perception- and motion-related tasks. However, the computational requirements, distributed storage, and bandwidth needs of DFT models make them difficult to implement for real-world environments. In this paper, we present a digital hardware implementation of an event-based dynamic neural field for object tracking and attention. To make the computation less complex and more hardware-friendly, the weights and the neuron model of the Dynamic Neural Field (DNF) are optimized under a spiking-based computation approach. In a proof-of-concept prototype, we show how the resulting Spiking DNF (SDNF) core can be interfaced with a Dynamic Vision Sensor (DVS) silicon retina and integrated into a more complex architecture able to perform selective attention.
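
For context, a minimal sketch of the field dynamics, assuming the standard Amari formulation on which DFT models are commonly based (the symbols below are the conventional ones, not necessarily those used in the paper):

    \tau \frac{\partial u(x,t)}{\partial t} = -u(x,t) + h + s(x,t) + \int w(x - x')\, f(u(x',t))\, dx'

Here u(x,t) is the field activation, h a negative resting level, s(x,t) the external input (in this setting, events from the DVS), w a lateral interaction kernel with local excitation and surround inhibition, and f a sigmoidal output nonlinearity. In a spiking-based realization such as the SDNF, the continuous output and integral would presumably be replaced by spike emission and simplified, hardware-friendly weights, as the abstract describes.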

DOI: 10.1145/2967413.2967443

Cite this paper

@inproceedings{Vangel2016EventBV,
  title     = {Event based visual attention with dynamic neural field on FPGA},
  author    = {Beno{\^i}t Chappet de Vangel and C{\'e}sar Torres-Huitzil and Bernard Girau},
  booktitle = {ICDSC},
  year      = {2016}
}