Comparing Representations in Tracking for Event Camera-based SLAM

@inproceedings{jiao2021comparing,
  title={Comparing Representations in Tracking for Event Camera-based SLAM},
  author={Jianhao Jiao and Huaiyang Huang and Liang Li and Zhijian He and Yilong Zhu and Ming Liu},
  booktitle={2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW)},
  year={2021}
}
This paper investigates two typical image-type representations for event camera-based tracking: the time surface (TS) and the event map (EM). Building on the original TS-based tracker, we exploit the two representations' complementary strengths to develop an enhanced version. The proposed tracker includes a general strategy that evaluates the degeneracy of the optimization problem online and then switches to the proper representation. Both TS and EM are motion- and scene-dependent, and thus it is important to…
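As background, the two representations compared in the abstract can be sketched as follows. This is an illustrative sketch, not the paper's implementation: the event tuple layout `(t, x, y, polarity)`, the decay constant `tau`, and the function names are assumptions.

```python
import numpy as np

def time_surface(events, shape, t_ref, tau=0.03):
    """Time surface (TS): per-pixel timestamp of the most recent event,
    exponentially decayed relative to a reference time t_ref."""
    sae = np.full(shape, -np.inf)          # "surface of active events"
    for t, x, y, polarity in events:       # events assumed sorted by timestamp
        sae[y, x] = t
    return np.exp(-(t_ref - sae) / tau)    # pixels with no events decay to 0

def event_map(events, shape):
    """Event map (EM): binary mask of pixels that fired within the window."""
    em = np.zeros(shape, dtype=bool)
    for t, x, y, polarity in events:
        em[y, x] = True
    return em
```

Under these assumptions, TS encodes recent motion history as a smooth gradient (useful for gradient-based alignment), while EM keeps only the fired-pixel geometry; both depend on the motion and the scene, which is what motivates switching between them.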
1 Citation


Dynamic Event Camera Calibration
This work presents the first dynamic event camera calibration algorithm, which calibrates directly from events captured during relative motion between camera and calibration pattern, and leverages existing calibration tools before optimizing all parameters through a multi-segment continuous-time formulation.

References

EVO: A Geometric Approach to Event-Based 6-DOF Parallel Tracking and Mapping in Real Time
EVO, an event-based visual odometry algorithm, successfully leverages the outstanding properties of event cameras to track fast camera motions while recovering a semidense three-dimensional map of the environment, making significant progress in simultaneous localization and mapping.
Event-Based Visual Inertial Odometry
This paper presents the first algorithm to fuse a purely event-based tracking algorithm with an inertial measurement unit, providing accurate metric tracking of a camera's full 6-DOF pose.
Real-time Visual-Inertial Odometry for Event Cameras using Keyframe-based Nonlinear Optimization
A novel, accurate tightly-coupled visual-inertial odometry pipeline for event cameras that leverages their outstanding properties to estimate the camera ego-motion in challenging conditions, such as high-speed motion or high dynamic range scenes.
Event-based, Direct Camera Tracking from a Photometric 3D Map using Nonlinear Optimization
This work presents a method to track the 6-DOF pose of an event camera in a known environment, which is assumed to be described by a photometric 3D map built via classic dense 3D reconstruction algorithms.
Globally-Optimal Event Camera Motion Estimation
The present paper looks at fronto-parallel motion estimation of an event camera, derives a globally optimal solution to this generally non-convex problem, and removes the dependency on a good initial guess.
Real-Time 3D Reconstruction and 6-DoF Tracking with an Event Camera
To the best of our knowledge, this is the first algorithm provably able to track a general 6-DoF motion while reconstructing arbitrary structure, including its intensity, and recovering grayscale video, relying exclusively on event camera data.
Event-based feature tracking with probabilistic data association
A novel soft data association modeled with probabilities is introduced, enabling a temporal integration window that varies per feature and is sized inversely proportional to the length of the optical flow.
A Unifying Contrast Maximization Framework for Event Cameras, with Applications to Motion, Depth, and Optical Flow Estimation
The main idea of the framework is to find the point trajectories on the image plane that are best aligned with the event data by maximizing an objective function: the contrast of an image of warped events.
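The contrast-maximization objective described above can be illustrated with a minimal sketch for a purely translational 2D flow. The flow parameterization `theta = (vx, vy)`, the nearest-pixel accumulation, and the grid search over candidates are simplifications assumed here, not the paper's method:

```python
import numpy as np

def contrast(theta, events, shape):
    """Variance (contrast) of the image of warped events under a
    candidate 2D translational flow theta = (vx, vy) in pixels/second."""
    vx, vy = theta
    img = np.zeros(shape)
    t0 = events[0][0]                       # warp every event back to t0
    for t, x, y, polarity in events:
        xw = int(round(x - vx * (t - t0)))
        yw = int(round(y - vy * (t - t0)))
        if 0 <= yw < shape[0] and 0 <= xw < shape[1]:
            img[yw, xw] += 1                # nearest-pixel accumulation
    return img.var()

def best_flow(events, shape, candidates):
    """Pick the candidate flow whose warped-event image is sharpest."""
    return max(candidates, key=lambda th: contrast(th, events, shape))
```

Events generated by an edge moving at the true velocity collapse onto a single pixel when warped with the correct flow, which maximizes the image variance; any other candidate smears the events and lowers the contrast.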
Event-Based, 6-DOF Camera Tracking from Photometric Depth Maps
This paper tackles the problem of accurate, low-latency tracking of an event camera from an existing photometric depth map built via classic dense reconstruction pipelines, and tracks the 6-DOF pose of the event camera upon the arrival of each event, thus virtually eliminating latency.
Ultimate SLAM? Combining Events, Images, and IMU for Robust Visual SLAM in HDR and High-Speed Scenarios
The first state estimation pipeline that leverages the complementary advantages of events, standard frames, and inertial measurements by fusing them in a tightly coupled manner is presented, leading to an accuracy improvement of 130% over event-only pipelines and 85% over standard-frames-only visual-inertial systems, while still being computationally tractable.