Event-Based Moving Object Detection and Tracking

@article{Mitrokhin2018EventBasedMO,
  title={Event-Based Moving Object Detection and Tracking},
  author={Anton Mitrokhin and Cornelia Ferm{\"u}ller and Chethan Parameshwara and Yiannis Aloimonos},
  journal={2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)},
  year={2018},
  pages={1-9}
}
Event-based vision sensors, such as the Dynamic Vision Sensor (DVS), are ideally suited for real-time motion analysis. The readings of such sensors offer high temporal resolution, superior sensitivity to light, and low latency. These properties make it possible to estimate motion efficiently and reliably even in challenging scenarios, but the advantages come at a price: modern event-based vision sensors have extremely low resolution, produce a…
Asynchronous Tracking-by-Detection on Adaptive Time Surfaces for Event-based Object Tracking
TLDR
An Adaptive Time-Surface with Linear Time Decay (ATSLTD) event-to-frame conversion algorithm that asynchronously warps the spatio-temporal information of retinal events into a sequence of ATSLTD frames with clear object contours, enabling accurate and efficient object tracking.
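As a concrete illustration of the time-surface idea, the sketch below shows a minimal linear-time-decay event-to-frame conversion in Python. It is not the ATSLTD algorithm itself: the function name, the decay window, and the sensor resolution are assumptions made for illustration.

import numpy as np

def linear_decay_time_surface(events, t_now, window, shape=(180, 240)):
    """Accumulate events into a frame whose pixel intensity decays
    linearly with event age (generic sketch, not the paper's ATSLTD).
    events: iterable of (x, y, t, polarity) tuples, t in seconds."""
    surface = np.zeros(shape, dtype=np.float32)
    for x, y, t, _ in events:
        age = t_now - t
        if 0.0 <= age <= window:
            # Newer events contribute more; the weight falls linearly to 0.
            surface[y, x] = max(surface[y, x], 1.0 - age / window)
    return surface

Recent events then appear bright and old events fade, so object contours stand out in each frame without committing to a fixed frame rate.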
High-Speed Object Tracking with Dynamic Vision Sensor
TLDR
This work introduces a novel event coherence detection algorithm for high-speed object tracking, which can accurately and efficiently track small, fast-moving objects.
Moving Object Detection for Event-based vision using Graph Spectral Clustering
TLDR
An unsupervised Graph Spectral Clustering technique for Moving Object Detection in Event-based data (GSCEventMOD) is presented, and it is shown how the optimum number of moving objects can be automatically determined.
e-TLD: Event-Based Framework for Dynamic Object Tracking
This paper presents a long-term object tracking framework with a moving event camera under general tracking conditions. A first of its kind for these revolutionary cameras, the tracking framework…
Event-based Vision: A Survey
TLDR
This paper provides a comprehensive overview of the emerging field of event-based vision, with a focus on the applications and the algorithms developed to unlock the outstanding properties of event cameras.
Retinal Slip Estimation and Object Tracking with an Active Event Camera
TLDR
A retinal slip estimation algorithm and a novel tracking strategy for active event-based cameras that enables estimates of the normal component of the optical flow to be updated as spikes come in, eschewing the accumulation of events within a fixed temporal window.
FAST-Dynamic-Vision: Detection and Tracking Dynamic Objects with Event and Depth Sensing
TLDR
This paper presents a complete perception system for fast-moving dynamic objects, including ego-motion compensation, object detection, and trajectory prediction with low latency and high precision, and proposes an optimization-based approach that asynchronously fuses event and depth cameras for trajectory prediction.
Event-Based Motion Segmentation by Motion Compensation
TLDR
This work presents the first per-event segmentation method for splitting a scene into independently moving objects, and shows the first quantitative evaluation of a segmentation algorithm for event cameras, yielding around 90% accuracy at 4 pixels relative displacement.
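The motion-compensation idea behind this family of methods can be sketched as follows: warp each event to a common reference time under a candidate motion, and score the candidate by how sharp the resulting image of warped events is. The Python sketch below uses a simple 2-D pixel velocity and image variance as the sharpness measure; these choices and all names are illustrative assumptions, not the paper's exact formulation.

import numpy as np

def contrast_after_warp(events, velocity, t_ref, shape=(180, 240)):
    """Warp events to t_ref under a candidate 2-D velocity (px/s) and
    return the variance of the accumulated image; a sharper image means
    the velocity better explains the events (illustrative sketch)."""
    img = np.zeros(shape, dtype=np.float32)
    vx, vy = velocity
    for x, y, t, _ in events:
        dt = t - t_ref
        xw = int(round(x - vx * dt))
        yw = int(round(y - vy * dt))
        if 0 <= xw < shape[1] and 0 <= yw < shape[0]:
            img[yw, xw] += 1.0   # accumulate the warped event
    return float(img.var())      # higher variance = sharper, better fit

Per-event segmentation then amounts to assigning each event to whichever motion model best sharpens the image it contributes to.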
An Asynchronous Real-Time Corner Extraction and Tracking Algorithm for Event Camera
TLDR
This paper proposes an asynchronous real-time corner extraction and tracking algorithm for an event camera that achieves excellent corner detection and tracking performance and can process more than 4.5 million events per second, showing promising potential in real-time computer vision applications.

References

Showing 1-10 of 29 references
Feature detection and tracking with the dynamic and active-pixel vision sensor (DAVIS)
TLDR
This work presents the first algorithm to detect and track visual features using both the frames and the event data provided by the DAVIS, a novel vision sensor which combines a standard camera and an asynchronous event-based sensor in the same pixel array.
Independent motion detection with event-driven cameras
TLDR
The method detects and tracks corners in the event stream and learns the statistics of their motion as a function of the robot's joint velocities when no independently moving objects are present, and is robust to changes in speed of both the head and the target.
Event-Based Visual Inertial Odometry
TLDR
This paper presents the first algorithm to fuse a purely event-based tracking algorithm with an inertial measurement unit, to provide accurate metric tracking of a camera's full 6-DoF pose.
Real-Time Clustering and Multi-Target Tracking Using Event-Based Sensors
TLDR
This work presents a real-time clustering technique that takes advantage of the unique properties of event-based vision sensors and redefines the well-known mean-shift clustering method using asynchronous events instead of conventional frames.
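As a rough illustration of how mean-shift can be driven by individual events rather than frames, the Python sketch below moves the nearest cluster centre slightly toward each incoming event. The bandwidth, update rate, and cluster-spawning rule are assumptions for illustration, not the parameters of the cited method.

import numpy as np

class EventMeanShiftTracker:
    """Per-event mean-shift-style clustering: each event pulls the nearest
    centre toward it, so clusters follow moving objects asynchronously
    (illustrative sketch, not the cited implementation)."""

    def __init__(self, bandwidth=15.0, rate=0.05):
        self.bandwidth = bandwidth   # max pixel distance to assign an event
        self.rate = rate             # how strongly one event moves a centre
        self.centers = []            # list of np.array([x, y]) cluster centres

    def update(self, x, y):
        p = np.array([x, y], dtype=np.float32)
        if self.centers:
            dists = [np.linalg.norm(c - p) for c in self.centers]
            i = int(np.argmin(dists))
            if dists[i] < self.bandwidth:
                # Shift the nearest centre a little toward the new event.
                self.centers[i] += self.rate * (p - self.centers[i])
                return i
        # No centre is close enough: start a new cluster at this event.
        self.centers.append(p)
        return len(self.centers) - 1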
Contour Motion Estimation for Asynchronous Event-Driven Cameras
TLDR
Algorithms are presented for the estimation of accurate contour motion from local spatio-temporal information for two camera models: the dynamic vision sensor (DVS), which asynchronously records temporal changes of the luminance, and a family of new sensors which combine DVS data with intensity signals.
Spatiotemporal multiple persons tracking using Dynamic Vision Sensor
TLDR
An algorithm for spatiotemporal tracking suitable for the Dynamic Vision Sensor is introduced, and the problem of tracking multiple persons under heavy occlusion is addressed.
Real-time panoramic tracking for event cameras
TLDR
This work proposes a direct camera tracking formulation, similar to state-of-the-art in visual odometry, and shows that the minimal information needed for simultaneous tracking and mapping is the spatial position of events, without using the appearance of the imaged scene point.
Embedded Vision System for Real-Time Object Tracking using an Asynchronous Transient Vision Sensor
M. Litzenberger, C. Posch, +4 authors H. Garn. 2006 IEEE 12th Digital Signal Processing Workshop & 4th IEEE Signal Processing Education Workshop, 2006.
TLDR
An algorithm for object tracking with 1 millisecond timestamp resolution of the AER data stream is presented, and the potential of the proposed algorithm for people tracking is shown.
Simultaneous Mosaicing and Tracking with an Event Camera
TLDR
This work shows for the first time that an event stream, with no additional sensing, can be used to track accurate camera rotation while building a persistent and high quality mosaic of a scene which is super-resolution accurate and has high dynamic range.
Asynchronous Event-Based Multikernel Algorithm for High-Speed Visual Features Tracking
This paper presents a number of new methods for visual tracking using the output of an event-based asynchronous neuromorphic dynamic vision sensor. It allows the tracking of multiple visual features…