Retinomorphic Event-Based Vision Sensors: Bioinspired Cameras With Spiking Output

@article{Posch2014RetinomorphicEV,
  title={Retinomorphic Event-Based Vision Sensors: Bioinspired Cameras With Spiking Output},
  author={Christoph Posch and Teresa Serrano-Gotarredona and Bernab{\'e} Linares-Barranco and Tobi Delbr{\"u}ck},
  journal={Proceedings of the IEEE},
  year={2014},
  volume={102},
  pages={1470--1484}
}
State-of-the-art image sensors suffer from significant limitations imposed by their very principle of operation. These sensors acquire the visual information as a series of “snapshot” images, recorded at discrete points in time. Visual information gets time quantized at a predetermined frame rate which has no relation to the dynamics present in the scene. Furthermore, each recorded frame conveys the information from all pixels, regardless of whether this information, or a part of it, has… 
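The contrast between frame-based and event-based acquisition described above can be made concrete with a small model: a temporal-contrast (DVS-style) pixel emits an ON or OFF event whenever the log intensity changes by a fixed threshold since the last event, so a static scene produces no output at all. The sketch below is illustrative only — the function name and threshold value are assumptions, not any specific sensor's circuit model.

```python
import math

def dvs_events(samples, theta=0.15):
    """Illustrative temporal-contrast pixel model (not a real sensor spec).

    samples: list of (timestamp, intensity) pairs for one pixel.
    Emits an (timestamp, polarity) event each time the log intensity
    moves by at least `theta` from the reference level; the reference
    is then stepped by `theta`, so a large change yields several events.
    """
    events = []
    ref = math.log(samples[0][1])          # reference log intensity
    for t, intensity in samples[1:]:
        delta = math.log(intensity) - ref
        while abs(delta) >= theta:         # one event per threshold crossing
            pol = 1 if delta > 0 else -1   # ON (+1) or OFF (-1) polarity
            events.append((t, pol))
            ref += pol * theta             # update reference level
            delta = math.log(intensity) - ref
    return events
```

A doubling of brightness (log change ≈ 0.693) crosses a 0.15 threshold four times and so yields four ON events, while a constant input yields none — capturing why the output data rate tracks scene dynamics rather than a fixed frame clock.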
Event-Based Vision: A Survey
TLDR
This paper provides a comprehensive overview of the emerging field of event-based vision, with a focus on the applications and the algorithms developed to unlock the outstanding properties of event cameras.
Spike Coding for Dynamic Vision Sensor in Intelligent Driving
TLDR
A cube-based spike coding framework for DVS is introduced to exploit the spatial and temporal characteristics of spikes for compression; it achieves an average compression ratio of 2.6536 against the raw spike data, much higher than the results of conventional lossless coding algorithms.
Neuromorphic Stereo Vision: A Survey of Bio-Inspired Sensors and Algorithms
TLDR
This work investigates sensors and algorithms for event-based stereo vision leading to more biologically plausible robots and focuses mainly on binocular stereo vision.
Ultrafast machine vision with 2D material neural network image sensors
TLDR
It is demonstrated that an image sensor can itself constitute an artificial neural network (ANN) that simultaneously senses and processes optical images without latency, and can be trained to classify and encode images with high throughput.
Spatiotemporal features for asynchronous event-based data
TLDR
A novel computational architecture for learning and encoding spatiotemporal features is introduced, based on a set of predictive recurrent reservoir networks competing via winner-take-all selection; it learns distinct, task-specific dynamic visual features and can predict their trajectories over time.
An Extended Modular Processing Pipeline for Event-Based Vision in Automatic Visual Inspection
TLDR
This paper evaluates current state-of-the-art processing algorithms for automatic visual inspection with Dynamic Vision Sensors and proposes an algorithmic approach for identifying ideal time windows within an event stream for object classification.
Spike Coding for Dynamic Vision Sensors
TLDR
This paper firstly analyzes the spike firing mechanism and the redundancies of the spike data generated from DVS, and then introduces an efficient cube-based coding framework, which achieves an impressive coding performance.
Bio-Inspired Stereo Vision Calibration for Dynamic Vision Sensors
TLDR
The result is a novel calibration technique for a neuromorphic stereo vision system, implemented on specialized hardware (FPGA - Field-Programmable Gate Array), which achieves reduced latency for stand-alone systems and operates in real time.
Event-Based Line Fitting and Segment Detection Using a Neuromorphic Visual Sensor
TLDR
An event-based luminance-free algorithm for line and segment detection from the output of asynchronous event-based neuromorphic retinas that can be envisioned for high-speed applications, such as vision-based robotic navigation.

References

Showing 1–10 of 74 references
Fast Vision Through Frameless Event-Based Sensing and Convolutional Processing: Application to Texture Recognition
TLDR
A new event-based processing architecture is developed that emulates with AER hardware Manjunath's frame-based feature recognition software algorithm, and its performance is analyzed using a custom-made event-based behavioral simulator.
Mapping from Frame-Driven to Frame-Free Event-Driven Vision Systems by Low-Rate Rate Coding and Coincidence Processing--Application to Feedforward ConvNets
TLDR
This paper presents a methodology for mapping from a properly trained neural network in a conventional frame-driven representation to an event-driven representation, by studying event-driven convolutional neural networks (ConvNets) trained to recognize rotating human silhouettes or high-speed poker card symbols.
A 64×64 AER Logarithmic Temporal Derivative Silicon Retina
Real-time artificial vision is traditionally limited to the frame rate. In many scenarios most frames contain information redundant both within and across frames. Here we report on the development of
The silicon retina.
TLDR
A neuromorphic vision system built from analog VLSI (very-large-scale integration) neuromorphic chips and field-programmable gate array (FPGA) circuits, comprising silicon retinas, ‘simple-cell’ chips and FPGA circuits.
A 128×128 120 dB 15 µs Latency Asynchronous Temporal Contrast Vision Sensor
TLDR
This silicon retina provides an attractive combination of characteristics for low-latency dynamic vision under uncontrolled illumination with low post-processing requirements by providing high pixel bandwidth, wide dynamic range, and precisely timed sparse digital output.
Sensitivity and uniformity of a 0.18µm CMOS temporal contrast pixel array
  • C. Posch, D. Matolin
  • Engineering
    2011 IEEE International Symposium of Circuits and Systems (ISCAS)
  • 2011
TLDR
The results are shown to agree well with predictions from theoretical considerations, validating the proposed test method, which acquires and evaluates the three performance parameters simultaneously.
A Spatial Contrast Retina With On-Chip Calibration for Neuromorphic Spike-Based AER Vision Systems
TLDR
A 32 × 32 pixel contrast retina microchip that provides its output as an address event representation (AER) stream is presented, showing a reduction in mismatch standard deviation from 57% to 6.6% (indoor light).
Real-time classification and sensor fusion with a spiking deep belief network
TLDR
This paper proposes a method based on the Siegert approximation for Integrate-and-Fire neurons to map an offline-trained DBN onto an efficient event-driven spiking neural network suitable for hardware implementation and shows that the system can be biased to select the correct digit from otherwise ambiguous input.
A 3.6 µs Latency Asynchronous Frame-Free Event-Driven Dynamic-Vision-Sensor
TLDR
The ability of the sensor to capture very fast moving objects, rotating at 10 K revolutions per second, has been verified experimentally, and a compact preamplification stage has been introduced that improves the minimum detectable contrast over previous designs.