Continuous-time Intensity Estimation Using Event Cameras

@article{Scheerlinck2018ContinuoustimeIE,
  title={Continuous-time Intensity Estimation Using Event Cameras},
  author={Cedric Scheerlinck and Nick Barnes and Robert E. Mahony},
  journal={ArXiv},
  year={2018},
  volume={abs/1811.00386}
}
Event cameras provide asynchronous, data-driven measurements of local temporal contrast over a large dynamic range with extremely high temporal resolution. Conventional cameras capture low-frequency reference intensity information. These two sensor modalities provide complementary information. We propose a computationally efficient, asynchronous filter that continuously fuses image frames and events into a single high-temporal-resolution, high-dynamic-range image state. In absence of… 
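The abstract describes an asynchronous filter that continuously fuses low-frequency frame intensity with high-frequency event data. A minimal per-pixel sketch of that idea is below, assuming a simple complementary-filter form in which the estimate decays exponentially toward the latest frame value between events; the gain `alpha` and contrast step `c` are illustrative values, not the paper's.

```python
import math

class ComplementaryFilter:
    """Per-pixel continuous-time estimate of log intensity,
    fusing frame measurements (low frequency) with events (high frequency)."""

    def __init__(self, alpha=2.0, contrast=0.1):
        self.alpha = alpha    # crossover gain in 1/s (assumed value)
        self.c = contrast     # per-event log-intensity step (assumed value)
        self.L_hat = 0.0      # current log-intensity estimate
        self.L_frame = 0.0    # latest frame measurement
        self.t = 0.0          # time of last update

    def _decay_to(self, t):
        # Between events, decay the estimate exponentially toward the frame value.
        dt = t - self.t
        self.L_hat = self.L_frame + (self.L_hat - self.L_frame) * math.exp(-self.alpha * dt)
        self.t = t

    def on_frame(self, t, log_intensity):
        self._decay_to(t)
        self.L_frame = log_intensity

    def on_event(self, t, polarity):
        # polarity is +1 or -1; each event shifts the estimate by one contrast step.
        self._decay_to(t)
        self.L_hat += self.c * polarity

    def read(self, t):
        self._decay_to(t)
        return self.L_hat
```

With no events, the estimate converges to the frame value; events inject high-temporal-resolution corrections between frames, which is the complementary behavior the abstract refers to.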
Learning to Super Resolve Intensity Images From Events
TLDR
This work proposes an end-to-end network that reconstructs high-resolution, high-dynamic-range (HDR) images directly from the event stream. Evaluation on both simulated and real-world sequences verifies that it captures fine scene details and outperforms combinations of state-of-the-art event-to-image algorithms with state-of-the-art super-resolution schemes.
Robust Motion Compensation for Event Cameras With Smooth Constraint
TLDR
This work directly aligns curved event trajectories using time-varying motion parameters and maximizes the image energy under a smoothness constraint on those parameters, so that the algorithm performs accurately and robustly on event cameras.
Robust Intensity Image Reconstruction Based On Event Cameras
TLDR
This paper proposes a variational model with a spatial-smoothness regularization term to recover clean image frames, at any frame rate, from blurry and noisy camera images and events, and presents experimental results demonstrating that the proposed algorithm is superior to other reconstruction algorithms.
An Asynchronous Kalman Filter for Hybrid Event Cameras
TLDR
The proposed algorithm includes a frame augmentation pre-processing step that deblurs and temporally interpolates frame data using events and outperforms state-of-the-art methods in both absolute intensity error and image similarity indexes.
Event Camera Calibration of Per-pixel Biased Contrast Threshold
TLDR
A new event camera model and two calibration approaches which cover event-only cameras and hybrid image-event cameras are proposed and an efficient online method to calibrate event cameras that adapts to time-varying event rates is proposed.
Fast Image Reconstruction with an Event Camera
TLDR
A novel neural network architecture for video reconstruction from events that is smaller (38k vs. 10M parameters) and faster than state-of-the-art with minimal impact to performance is proposed.
High Speed and High Dynamic Range Video with an Event Camera
TLDR
This work proposes a novel recurrent network to reconstruct videos from a stream of events, and trains it on a large amount of simulated event data, and shows that off-the-shelf computer vision algorithms can be applied to the reconstructions and that this strategy consistently outperforms algorithms that were specifically designed for event data.
Intensity-Image Reconstruction for Event Cameras Using Convolutional Neural Network
TLDR
"Event frames" are recovered from event streams via an attenuation method and fed into a U-Net to generate intensity images. Comparison against target images on both simulated and real data shows that the model reconstructs intensity images from event cameras very well.
High Frame Rate Video Reconstruction Based on an Event Camera
TLDR
This work proposes a simple yet effective approach to reconstruct high-quality, high-frame-rate sharp videos by associating event data with a latent sharp image, and provides a new, more efficient solver to minimize the proposed energy model.
Event-based Vision: A Survey
TLDR
This paper provides a comprehensive overview of the emerging field of event-based vision, with a focus on the applications and the algorithms developed to unlock the outstanding properties of event cameras.

References

Showing 1-10 of 37 references
Asynchronous, Photometric Feature Tracking using Events and Frames
TLDR
This is the first principled method that uses raw intensity measurements directly, based on a generative event model within a maximum-likelihood framework, which produces feature tracks that are both more accurate and longer than the state of the art, across a wide variety of scenes.
Simultaneous Optical Flow and Intensity Estimation from an Event Camera
TLDR
This work proposes, to the best of the authors' knowledge, the first algorithm to simultaneously recover the motion field and brightness image while the camera undergoes a generic motion through any scene, within a sliding-window time interval.
High-speed video generation with an event camera
TLDR
This work demonstrates how to use the event camera to generate high-speed videos of 2D motion augmented with foreground and background images taken from a conventional camera, and builds a parametric model of affine motion to create image sequences.
Direct face detection and video reconstruction from event cameras
TLDR
This work proposes and develops a patch-based model for the event streams acquired from event cameras, and demonstrates the first direct face detection from event streams, highlighting the potential of these event-based cameras for power-efficient vision applications.
Ultimate SLAM? Combining Events, Images, and IMU for Robust Visual SLAM in HDR and High-Speed Scenarios
TLDR
The first state estimation pipeline that leverages the complementary advantages of events, standard frames, and inertial measurements by fusing in a tightly coupled manner is presented, leading to an accuracy improvement of 130% over event-only pipelines, and 85% over standard-frames-only visual-inertial systems, while still being computationally tractable.
Simultaneous Mosaicing and Tracking with an Event Camera
TLDR
This work shows for the first time that an event stream, with no additional sensing, can be used to track accurate camera rotation while building a persistent and high quality mosaic of a scene which is super-resolution accurate and has high dynamic range.
Hybrid, Frame and Event based Visual Inertial Odometry for Robust, Autonomous Navigation of Quadrotors
TLDR
This paper presents the first state estimation pipeline that leverages the complementary advantages of these two sensors by fusing in a tightly-coupled manner events, standard frames, and inertial measurements, and demonstrates the first autonomous quadrotor flight using an event camera for state estimation.
Semi-Dense 3D Reconstruction with a Stereo Event Camera
TLDR
The proposed method consists of the optimization of an energy function designed to exploit small-baseline spatio-temporal consistency of events triggered across both stereo image planes to improve the density of the reconstruction and to reduce the uncertainty of the estimation.
Real-Time 3D Reconstruction and 6-DoF Tracking with an Event Camera
TLDR
To the best of the authors' knowledge, this is the first algorithm provably able to track a general 6-DoF motion while reconstructing arbitrary structure, including its intensity, and reconstructing grayscale video relying exclusively on event camera data.
The event-camera dataset and simulator: Event-based data for pose estimation, visual odometry, and SLAM
TLDR
This paper presents and releases a collection of datasets captured with a DAVIS in a variety of synthetic and real environments, which it hopes will motivate research on new algorithms for high-speed and high-dynamic-range robotics and computer-vision applications.