Block-matching optical flow for dynamic vision sensors: Algorithm and FPGA implementation

@inproceedings{Liu2017BlockmatchingOF,
  title={Block-matching optical flow for dynamic vision sensors: Algorithm and FPGA implementation},
  author={Min Liu and Tobi Delbr{\"u}ck},
  booktitle={2017 IEEE International Symposium on Circuits and Systems (ISCAS)},
  year={2017},
  pages={1-4}
}
  • Published 6 January 2017
Rapid and low power computation of optical flow (OF) is potentially useful in robotics. The dynamic vision sensor (DVS) event camera produces quick and sparse output, and has high dynamic range, but conventional OF algorithms are frame-based and cannot be directly used with event-based cameras. Previous DVS OF methods do not work well with dense textured input and are not designed for implementation in logic circuits. This paper proposes a new block-matching based DVS OF algorithm which is inspired… 
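The block-matching scheme the abstract outlines can be sketched in plain Python: accumulate events into 2D time slices, then find the displacement that minimizes the sum of absolute differences (SAD) between a block in the current slice and candidate blocks in the previous one. This is only an illustrative sketch (count-image slices, no border handling, names invented here), not the paper's FPGA design:

```python
import numpy as np

def make_time_slice(events, shape, t_start, t_end):
    """Accumulate DVS events (x, y, t) with t in [t_start, t_end) into a 2D count image."""
    slice_img = np.zeros(shape, dtype=np.int32)
    for x, y, t in events:
        if t_start <= t < t_end:
            slice_img[y, x] += 1
    return slice_img

def block_match_flow(prev_slice, curr_slice, x, y, block=5, search=3):
    """Displacement at (x, y): exhaustively search prev_slice for the block
    with minimum sum of absolute differences (SAD) against the block
    centered at (x, y) in curr_slice."""
    r = block // 2
    ref = curr_slice[y - r:y + r + 1, x - r:x + r + 1].astype(np.int64)
    best_sad, best = None, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            cand = prev_slice[y + dy - r:y + dy + r + 1,
                              x + dx - r:x + dx + r + 1].astype(np.int64)
            sad = np.abs(ref - cand).sum()
            if best_sad is None or sad < best_sad:
                best_sad, best = sad, (dx, dy)
    # The best match sat at (x+dx, y+dy) in the previous slice, so the
    # feature moved by (-dx, -dy) between slices.
    return -best[0], -best[1]
```

Dividing the returned displacement by the inter-slice interval gives flow in pixels per second.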

Citations

Adaptive Time-Slice Block-Matching Optical Flow Algorithm for Dynamic Vision Sensors
TLDR
This paper proposes an event-driven OF algorithm called adaptive block-matching optical flow (ABMOF), which uses time slices of accumulated DVS events; both ABMOF and Lucas-Kanade (LK) are implemented using the adapted slices.
ABMOF: A Novel Optical Flow Algorithm for Dynamic Vision Sensors
TLDR
New adaptive time-slice rotation methods are introduced that ensure the generated slices have sufficient features for matching, including a feedback mechanism that keeps the average slice displacement within the block search range.
Event-based Plane-fitting Optical Flow for Dynamic Vision Sensors in FPGA
TLDR
A modification and FPGA implementation of a well-known "plane-fitting" approach to event-based optical flow estimation for dynamic vision sensors is presented; the FPGA implementation is shown to perform similarly to the previously published full-precision software implementation.
hARMS: A Hardware Acceleration Architecture for Real-Time Event-Based Optical Flow
Event-based vision sensors produce asynchronous event streams with high temporal resolution based on changes in the visual scene. The properties of these sensors allow for accurate and fast…
DAViS Camera Optical Flow
TLDR
This work proposes a novel optical flow method designed specifically for a DAViS camera that leverages the high spatial fidelity of intensity image frames and the high temporal resolution of events generated by DVS, and yields reliable motion vector estimates while overcoming the fast motion and occlusion problems.
Event-based Vision: A Survey
TLDR
This paper provides a comprehensive overview of the emerging field of event-based vision, with a focus on the applications and the algorithms developed to unlock the outstanding properties of event cameras.
OpenCL-based FPGA accelerator for disparity map generation with stereoscopic event cameras
TLDR
A stereo matching algorithm is presented to create FPGA accelerators for a well-known vision algorithm using event-based cameras and a performance speedup of more than 8x can be achieved with simple code transformations.
Resource-Efficient and High-Throughput VLSI Design of Global Optical Flow Method for Mobile Systems
TLDR
A multirow-based propagation approach and an efficient VLSI architecture are proposed; the design divides an image into multiple small subimages and achieves higher throughput per watt than state-of-the-art designs of the global optical flow method.
Learning of Dense optical Flow, motion and depth, from Sparse Event Cameras
TLDR
This thesis demonstrates the feasibility of reconstructing dense depth, optical flow and motion information from a neuromorphic imaging device, called Dynamic Vision Sensor, and introduces the Evenly-Cascaded convolutional Network (ECN), a bioinspired multi-level, multi-resolution neural network architecture.
Accuracy and Speed Improvement of Event Camera Motion Estimation Using a Bird’s-Eye View Transformation
TLDR
This work reduces motion estimation to two dimensions by transforming the event data to a bird's-eye view, using a homography computed from the event camera position; this mitigates the non-convexity of the loss function that occurs in conventional methods.
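The bird's-eye transformation amounts to warping event pixel coordinates by a 3×3 homography. A minimal numpy sketch, where the matrix `H` is assumed to be already computed from the known camera pose (the function name is invented here):

```python
import numpy as np

def warp_events(xy, H):
    """Apply a 3x3 homography H to an Nx2 array of event pixel coordinates."""
    pts = np.hstack([xy, np.ones((len(xy), 1))])  # lift to homogeneous coordinates
    out = pts @ H.T                               # projective transform
    return out[:, :2] / out[:, 2:3]               # divide out the projective scale
```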

References

SHOWING 1-10 OF 20 REFERENCES
On-board real-time optic-flow for miniature event-based vision sensors
  • J. Conradt
  • 2015 IEEE International Conference on Robotics and Biomimetics (ROBIO)
  • 2015
TLDR
A miniaturized stand-alone embedded system that utilizes a novel neuro-biologically inspired event-based vision sensor (DVS) to extract optic flow on-board in real-time with minimal computing requirements is presented.
Block Matching Algorithms For Motion Estimation
TLDR
Seven different types of block-matching algorithms used for motion estimation in video compression are implemented and compared, ranging from the basic Exhaustive Search to recent fast adaptive algorithms like Adaptive Rood Pattern Search.
Evaluation of Event-Based Algorithms for Optical Flow with Ground-Truth from Inertial Measurement Sensor
TLDR
Nine optical flow algorithms that locally measure the flow normal to edges are compared for accuracy and computation cost, and a new source of ground truth is introduced: gyro data from the inertial measurement unit integrated with the DAVIS camera provides ground truth against which algorithms that measure optical flow from motion cues are compared.
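The gyro-based ground truth relies on a simple pinhole relation: under pure camera rotation, image speed near the principal point is roughly the focal length (in pixels) times the angular rate. A hedged sketch of that relation (sign conventions and the image-center approximation are assumptions here, not the paper's exact formulation):

```python
def flow_from_gyro(omega_xyz, focal_px):
    """Approximate optical flow (px/s) at the image center induced by pure
    camera rotation, small-angle pinhole model.
    omega_xyz: angular rates (rad/s) about camera x (pitch), y (yaw), z (roll)."""
    wx, wy, wz = omega_xyz
    vx = focal_px * wy  # yaw sweeps the scene horizontally
    vy = focal_px * wx  # pitch sweeps it vertically (signs are convention-dependent)
    return vx, vy       # roll induces no flow at the principal point
```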
Asynchronous frameless event-based optical flow
A 128×128 120 dB 15 µs Latency Asynchronous Temporal Contrast Vision Sensor
TLDR
This silicon retina provides an attractive combination of characteristics for low-latency dynamic vision under uncontrolled illumination with low post-processing requirements by providing high pixel bandwidth, wide dynamic range, and precisely timed sparse digital output.
Bio-inspired Motion Estimation with Event-Driven Sensors
TLDR
This work presents a simple method for locating regions of high-frequency texture, and a novel phase-based method for event sensors that estimates more accurately these regions, and evaluate and compare the results with other state-of-the-art techniques.
Lucas-Kanade 20 Years On: A Unifying Framework
TLDR
An overview of image alignment is presented, describing most of the algorithms and their extensions in a consistent framework and concentrating on the inverse compositional algorithm, an efficient algorithm that was recently proposed.
Block Matching Algorithms for Motion Estimation
TLDR
The key technique for improving the algorithm efficiency, analyses the correlative algorithms and compares their properties are revealed, and the future trend of motion estimation research is discussed.
A sum of absolute differences implementation in FPGA hardware
TLDR
A new hardware unit that performs a 16×1 SAD operation is implemented in field-programmable gate arrays (FPGAs), which provide increased flexibility, sufficient performance, and faster design times.
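As a software reference model of that primitive: a 16×1 SAD simply accumulates absolute pixel differences over one 16-pixel row, which the hardware unit evaluates in parallel. A sketch (illustrative only, not the paper's RTL):

```python
import numpy as np

def sad_16x1(row_a, row_b):
    """Sum of absolute differences over two 16-pixel rows (software model of
    a 16x1 hardware SAD primitive)."""
    a = np.asarray(row_a, dtype=np.int32)
    b = np.asarray(row_b, dtype=np.int32)
    assert a.shape == b.shape == (16,)  # the unit is fixed at 16 pixels wide
    return int(np.abs(a - b).sum())
```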