Retinomorphic Event-Based Vision Sensors: Bioinspired Cameras With Spiking Output
@article{Posch2014RetinomorphicEV,
  title={Retinomorphic Event-Based Vision Sensors: Bioinspired Cameras With Spiking Output},
  author={Christoph Posch and Teresa Serrano-Gotarredona and Bernab{\'e} Linares-Barranco and Tobi Delbr{\"u}ck},
  journal={Proceedings of the IEEE},
  year={2014},
  volume={102},
  pages={1470-1484}
}
State-of-the-art image sensors suffer from significant limitations imposed by their very principle of operation. These sensors acquire the visual information as a series of “snapshot” images, recorded at discrete points in time. Visual information gets time quantized at a predetermined frame rate which has no relation to the dynamics present in the scene. Furthermore, each recorded frame conveys the information from all pixels, regardless of whether this information, or a part of it, has…
210 Citations
Event-Based Vision: A Survey
- Computer Science · IEEE Transactions on Pattern Analysis and Machine Intelligence
- 2022
This paper provides a comprehensive overview of the emerging field of event-based vision, with a focus on the applications and the algorithms developed to unlock the outstanding properties of event cameras.
Spike Coding for Dynamic Vision Sensor in Intelligent Driving
- Computer Science · IEEE Internet of Things Journal
- 2019
A cube-based spike coding framework for DVS is introduced to exploit the spatial and temporal characteristics of spikes for compression; it achieves an average compression ratio of 2.6536 against the raw spike data, much higher than conventional lossless coding algorithms.
Neuromorphic Stereo Vision: A Survey of Bio-Inspired Sensors and Algorithms
- Computer Science · Front. Neurorobot.
- 2019
This work investigates sensors and algorithms for event-based stereo vision leading to more biologically plausible robots and focuses mainly on binocular stereo vision.
Ultrafast machine vision with 2D material neural network image sensors
- Computer Science · Nature
- 2020
It is demonstrated that an image sensor can itself constitute an artificial neural network (ANN) that simultaneously senses and processes optical images without latency, and can be trained to classify and encode images with high throughput.
Spatiotemporal features for asynchronous event-based data
- Computer Science · Front. Neurosci.
- 2015
A novel computational architecture for learning and encoding spatiotemporal features is introduced, based on a set of predictive recurrent reservoir networks competing via winner-take-all selection; the architecture learns distinct, task-specific dynamic visual features and can predict their trajectories over time.
An Extended Modular Processing Pipeline for Event-Based Vision in Automatic Visual Inspection
- Computer Science · Sensors
- 2021
This paper evaluates current state-of-the-art processing algorithms for Dynamic Vision Sensors in automatic visual inspection and proposes an algorithmic approach for identifying ideal time windows within an event stream for object classification.
Spike Coding for Dynamic Vision Sensors
- Computer Science · 2018 Data Compression Conference
- 2018
This paper first analyzes the spike firing mechanism and the redundancies of the spike data generated by DVS, and then introduces an efficient cube-based coding framework that achieves an impressive coding performance.
Bio-Inspired Stereo Vision Calibration for Dynamic Vision Sensors
- Computer Science · IEEE Access
- 2019
The result is a novel calibration technique for a neuromorphic stereo vision system implemented on specialized hardware (an FPGA, Field-Programmable Gate Array), which achieves reduced latencies for stand-alone systems and works in real time.
Event-Based Line Fitting and Segment Detection Using a Neuromorphic Visual Sensor
- Computer Science · IEEE Transactions on Neural Networks and Learning Systems
- 2019
An event-based, luminance-free algorithm for line and segment detection from the output of asynchronous event-based neuromorphic retinas is presented; it can be envisioned for high-speed applications, such as vision-based robotic navigation.
References
Showing 1–10 of 74 references
Fast Vision Through Frameless Event-Based Sensing and Convolutional Processing: Application to Texture Recognition
- Computer Science · IEEE Transactions on Neural Networks
- 2010
A new event-based processing architecture is developed that emulates with AER hardware Manjunath's frame-based feature recognition software algorithm, and its performance is analyzed using a custom-made event-based behavioral simulator.
Mapping from Frame-Driven to Frame-Free Event-Driven Vision Systems by Low-Rate Rate Coding and Coincidence Processing – Application to Feedforward ConvNets
- Computer Science · IEEE Transactions on Pattern Analysis and Machine Intelligence
- 2013
This paper presents a methodology for mapping from a properly trained neural network in a conventional frame-driven representation to an event-driven representation by studying event-driven convolutional neural networks (ConvNets) trained to recognize rotating human silhouettes or high-speed poker card symbols.
A 64×64 AER logarithmic temporal derivative silicon retina
- Engineering · Research in Microelectronics and Electronics, 2005 PhD
- 2005
Real time artificial vision is traditionally limited to the frame rate. In many scenarios most frames contain information redundant both within and across frames. Here we report on the development of…
The silicon retina.
- Computer Science · Scientific American
- 1991
A neuromorphic vision system built from analog VLSI (very-large-scale integration) neuromorphic chips (silicon retinas and 'simple-cell' chips) together with field-programmable gate array (FPGA) circuits.
A 128×128 120 dB 15 µs Latency Asynchronous Temporal Contrast Vision Sensor
- Computer Science
- 2006
This silicon retina provides an attractive combination of characteristics for low-latency dynamic vision under uncontrolled illumination with low post-processing requirements by providing high pixel bandwidth, wide dynamic range, and precisely timed sparse digital output.
Sensitivity and uniformity of a 0.18µm CMOS temporal contrast pixel array
- Engineering · 2011 IEEE International Symposium on Circuits and Systems (ISCAS)
- 2011
The results are shown to agree well with theoretical predictions, validating the proposed test method, which acquires and evaluates these three performance parameters simultaneously.
A Spatial Contrast Retina With On-Chip Calibration for Neuromorphic Spike-Based AER Vision Systems
- Engineering · IEEE Transactions on Circuits and Systems I: Regular Papers
- 2007
A 32×32-pixel contrast retina microchip that provides its output as an address event representation (AER) stream is presented; it shows a reduction in mismatch standard deviation from 57% to 6.6% (indoor light).
Real-time classification and sensor fusion with a spiking deep belief network
- Computer Science · Front. Neurosci.
- 2013
This paper proposes a method based on the Siegert approximation for Integrate-and-Fire neurons to map an offline-trained DBN onto an efficient event-driven spiking neural network suitable for hardware implementation and shows that the system can be biased to select the correct digit from otherwise ambiguous input.
A 3.6 µs Latency Asynchronous Frame-Free Event-Driven Dynamic-Vision-Sensor
- Mathematics · IEEE Journal of Solid-State Circuits
- 2011
The ability of the sensor to capture very fast moving objects, rotating at 10 K revolutions per second, has been verified experimentally, and a compact preamplification stage has been introduced that improves the minimum detectable contrast over previous designs.