The Multivehicle Stereo Event Camera Dataset: An Event Camera Dataset for 3D Perception

@article{Zhu2018TheMS,
  title={The Multivehicle Stereo Event Camera Dataset: An Event Camera Dataset for 3D Perception},
  author={A. Z. Zhu and Dinesh Thakur and Tolga {\"O}zaslan and Bernd Pfrommer and Vijay R. Kumar and Kostas Daniilidis},
  journal={IEEE Robotics and Automation Letters},
  year={2018},
  volume={3},
  pages={2032-2039}
}
Event-based cameras are a new passive sensing modality with a number of benefits over traditional cameras, including extremely low latency, asynchronous data acquisition, high dynamic range, and very low power consumption. There has been significant recent interest in algorithms that use events to perform a variety of three-dimensional perception tasks, such as feature tracking, visual odometry, and stereo depth estimation. However, the field currently lacks the wealth of…
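As background for the papers below: an event stream is commonly represented as a list of (timestamp, x, y, polarity) tuples. The following is a hypothetical minimal sketch of that representation and of accumulating events into a 2D "event image"; the names and the helper function are illustrative, not part of the dataset's API:

```python
from collections import namedtuple

# One event: timestamp in seconds, pixel coordinates, and polarity
# (+1 for a brightness increase, -1 for a decrease).
Event = namedtuple("Event", ["t", "x", "y", "polarity"])

def accumulate_events(events, width, height):
    """Sum event polarities per pixel into a simple 2D 'event image'."""
    image = [[0] * width for _ in range(height)]
    for e in events:
        image[e.y][e.x] += e.polarity
    return image

events = [Event(0.001, 2, 1, +1), Event(0.002, 2, 1, +1), Event(0.003, 0, 0, -1)]
img = accumulate_events(events, width=4, height=3)
# Two positive events at (x=2, y=1) sum to 2; one negative at (0, 0) gives -1.
```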
Event-based Vision: A Survey
TLDR
This paper provides a comprehensive overview of the emerging field of event-based vision, with a focus on the applications and the algorithms developed to unlock the outstanding properties of event cameras.
Event cameras are bio-inspired sensors that differ from conventional frame cameras: instead of capturing images at a fixed rate, they asynchronously measure per-pixel brightness changes, and output a…
DSEC: A Stereo Event Camera Dataset for Driving Scenarios
TLDR
This work presents the first high resolution, large scale stereo dataset with event cameras, DSEC, which contains 53 sequences collected by driving in a variety of illumination conditions and provides ground truth disparity for the development and evaluation of event-based stereo algorithms.
TUM-VIE: The TUM Stereo Visual-Inertial Event Dataset
Event cameras are bio-inspired vision sensors which measure per-pixel brightness changes. They offer numerous benefits over traditional, frame-based cameras, including low latency, high dynamic range…
Feature-based Event Stereo Visual Odometry
TLDR
This paper proposes a novel stereo visual odometry method for event cameras based on feature detection and matching with careful feature management, with pose estimation performed by reprojection-error minimization; it shows on-par performance on the MVSEC sequences.
Semi-Dense 3D Reconstruction with a Stereo Event Camera
TLDR
The proposed method consists of the optimization of an energy function designed to exploit small-baseline spatio-temporal consistency of events triggered across both stereo image planes to improve the density of the reconstruction and to reduce the uncertainty of the estimation.
Stereo dense depth tracking based on optical flow using frames and events
TLDR
This work proposes to estimate dense disparity from standard frames at the point of their availability, predict the disparity using odometry information, and track the disparity asynchronously using optical flow of events between the standard frames.
CED: Color Event Camera Dataset
TLDR
This work presents and releases the first Color Event Camera Dataset (CED), containing 50 minutes of footage with both color frames and events, and presents an extension of the event camera simulator ESIM that enables simulation of color events.
ESIM: an Open Event Camera Simulator
TLDR
This work presents the first event camera simulator that can generate large amounts of reliable event data, built on a theoretically sound, adaptive rendering scheme that samples frames only when necessary, and releases an open-source implementation of the simulator.
Video to Events: Bringing Modern Computer Vision Closer to Event Cameras
TLDR
This paper presents a method that addresses these needs by converting any existing video dataset recorded with conventional cameras to synthetic event data, which unlocks the use of a virtually unlimited number of existing video datasets for training networks designed for real event data.
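The video-to-events idea can be sketched, under assumptions, as thresholding per-pixel log-intensity changes between consecutive frames, which is the principle event simulators typically use. The contrast threshold and function names here are hypothetical, not the paper's actual implementation:

```python
import math

CONTRAST_THRESHOLD = 0.2  # assumed contrast sensitivity, typical of event simulators

def frames_to_events(frames, timestamps, threshold=CONTRAST_THRESHOLD):
    """Generate synthetic events by thresholding per-pixel log-intensity
    changes between consecutive frames. frames: list of 2D lists of
    intensities in (0, 1]; timestamps: per-frame times in seconds."""
    eps = 1e-6  # avoid log(0) for dark pixels
    ref = [[math.log(v + eps) for v in row] for row in frames[0]]
    events = []
    for frame, t in zip(frames[1:], timestamps[1:]):
        for y, row in enumerate(frame):
            for x, v in enumerate(row):
                logv = math.log(v + eps)
                # Emit events while the change exceeds the threshold,
                # stepping the per-pixel reference level each time.
                while abs(logv - ref[y][x]) >= threshold:
                    pol = 1 if logv > ref[y][x] else -1
                    events.append((t, x, y, pol))
                    ref[y][x] += pol * threshold
    return events

# A single pixel brightening from 0.5 to 1.0 crosses the 0.2 log-threshold
# three times (log 2 ≈ 0.693), so three positive events are emitted.
evs = frames_to_events([[[0.5]], [[1.0]]], [0.0, 0.1])
```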

References

SHOWING 1-10 OF 39 REFERENCES
Continuous-Time Visual-Inertial Odometry for Event Cameras
TLDR
This paper is the first work on visual-inertial fusion with event cameras using a continuous-time framework and shows that the method provides improved accuracy over a state-of-the-art visual odometry method for event cameras.
The event-camera dataset and simulator: Event-based data for pose estimation, visual odometry, and SLAM
TLDR
This paper presents and releases a collection of datasets captured with a DAVIS in a variety of synthetic and real environments, which it hopes will motivate research on new algorithms for high-speed and high-dynamic-range robotics and computer-vision applications.
Event-Based Stereo Matching Approaches for Frameless Address Event Stereo Data
TLDR
This work used two silicon retina cameras as a stereo sensor setup for 3D reconstruction of the observed scene, as already known from conventional cameras, and developed an area-based, an event-image-based, and a time-based approach for stereo matching.
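As an illustration of the time-based matching idea, a hypothetical sketch (not the paper's implementation) pairs each left-camera event with the right-camera event closest in time on roughly the same row, assuming rectified cameras so that epipolar lines are horizontal; the parameter names and thresholds are assumptions:

```python
def match_events_time_based(left_events, right_events, max_dt=0.001, max_row_diff=1):
    """Pair each left event (t, x, y, polarity) with the right event closest
    in time that has the same polarity and lies on (nearly) the same row."""
    matches = []
    for le in left_events:
        best, best_dt = None, max_dt
        for re in right_events:
            if re[3] != le[3] or abs(re[2] - le[2]) > max_row_diff:
                continue  # require same polarity and (nearly) same row
            dt = abs(re[0] - le[0])
            if dt <= best_dt:
                best, best_dt = re, dt
        if best is not None:
            disparity = le[1] - best[1]  # x_left - x_right
            matches.append((le, best, disparity))
    return matches

left = [(0.0, 10, 5, 1)]
right = [(0.0002, 7, 5, 1), (0.01, 8, 5, 1)]
# Only the first right event falls within max_dt; disparity = 10 - 7 = 3.
pairs = match_events_time_based(left, right)
```

In practice the cooperative and event-driven methods cited here refine this naive nearest-in-time pairing with spatiotemporal context to resolve ambiguous matches.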
Real-time Visual-Inertial Odometry for Event Cameras using Keyframe-based Nonlinear Optimization
TLDR
This paper presents a novel, accurate, tightly-coupled visual-inertial odometry pipeline for event cameras that leverages their outstanding properties to estimate the camera ego-motion in challenging conditions, such as high-speed motion or high-dynamic-range scenes.
Event-Based Visual Inertial Odometry
TLDR
This paper presents the first algorithm to fuse a purely event-based tracking algorithm with an inertial measurement unit, to provide accurate metric tracking of a camera's full 6-DoF pose.
Cooperative and asynchronous stereo vision for dynamic vision sensors
TLDR
This work proposes to model event-driven stereo matching with a cooperative network that serves as spatiotemporal context for the disparity calculation of each incoming event, and shows that the proposed approach is well adapted to DVS data and can be successfully used for disparity calculation.
EVO: A Geometric Approach to Event-Based 6-DOF Parallel Tracking and Mapping in Real Time
TLDR
EVO, an event-based visual odometry algorithm that successfully leverages the outstanding properties of event cameras to track fast camera motions while recovering a semi-dense three-dimensional map of the environment, makes significant progress in simultaneous localization and mapping.
Real-Time 3D Reconstruction and 6-DoF Tracking with an Event Camera
TLDR
To the best of the authors' knowledge, this is the first algorithm provably able to track a general 6-DoF motion, reconstruct arbitrary structure including its intensity, and reconstruct grayscale video, relying exclusively on event camera data.
Low-latency visual odometry using event-based feature tracks
TLDR
This paper presents a low-latency visual odometry algorithm for the DAVIS sensor using event-based feature tracks that tightly interleaves robust pose optimization and probabilistic mapping and shows that the method successfully tracks the 6-DOF motion of the sensor in natural scenes.
DDD17: End-To-End DAVIS Driving Dataset
TLDR
This work performs a preliminary end-to-end learning study using a convolutional neural network trained to predict the instantaneous steering angle from DVS and APS visual data.