Asynchronous Spatial Image Convolutions for Event Cameras

@article{Scheerlinck2019AsynchronousSI,
  title={Asynchronous Spatial Image Convolutions for Event Cameras},
  author={Cedric Scheerlinck and Nick Barnes and Robert E. Mahony},
  journal={IEEE Robotics and Automation Letters},
  year={2019},
  volume={4},
  pages={816--822}
}
Spatial convolution is arguably the most fundamental of two-dimensional image processing operations. Conventional spatial image convolution can only be applied to a conventional image, that is, an array of pixel values (or similar image representation) that are associated with a single instant in time. Event cameras have serial, asynchronous output with no natural notion of an image frame, and each event arrives with a different timestamp. In this letter, we propose a method to compute the… 
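Because convolution is linear, a per-event update can maintain a convolved image without ever forming a frame: when one pixel's log-intensity changes by Δ, the convolved output changes by Δ times the kernel stamped around that pixel. The following is a minimal NumPy sketch of that idea only — the image size, kernel, and event format are illustrative assumptions, not the paper's exact formulation:

```python
import numpy as np

H, W = 8, 8
K = np.array([[1., 2., 1.],
              [2., 4., 2.],
              [1., 2., 1.]]) / 16.0          # symmetric smoothing kernel
r = K.shape[0] // 2

def correlate(L, K):
    """Dense cross-correlation with zero padding (frame-based reference)."""
    out = np.zeros_like(L)
    for y in range(H):
        for x in range(W):
            for dy in range(-r, r + 1):
                for dx in range(-r, r + 1):
                    yy, xx = y + dy, x + dx
                    if 0 <= yy < H and 0 <= xx < W:
                        out[y, x] += K[dy + r, dx + r] * L[yy, xx]
    return out

def apply_event(C, y, x, delta, K):
    """Per-event update: a change of `delta` at pixel (y, x) perturbs the
    output by delta * K in a (2r+1)^2 neighbourhood -- O(|K|) work per
    event instead of a full O(H*W*|K|) recomputation."""
    for dy in range(-r, r + 1):
        for dx in range(-r, r + 1):
            yy, xx = y - dy, x - dx
            if 0 <= yy < H and 0 <= xx < W:
                C[yy, xx] += delta * K[dy + r, dx + r]

# events as (row, col, signed contrast step); L is the latent log-intensity
L = np.zeros((H, W))
C = np.zeros((H, W))                         # running convolved image
for y, x, d in [(2, 3, 0.1), (5, 5, -0.1), (0, 0, 0.1), (7, 4, 0.1)]:
    L[y, x] += d
    apply_event(C, y, x, d, K)
```

After the loop, `C` matches `correlate(L, K)` exactly; each event cost only a kernel-sized stamp, which is what makes the asynchronous, frame-free formulation attractive.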

Citations

Event-based Vision: A Survey
TLDR
This paper provides a comprehensive overview of the emerging field of event-based vision, with a focus on the applications and the algorithms developed to unlock the outstanding properties of event cameras.
Fast Image Reconstruction with an Event Camera
TLDR
A novel neural network architecture for video reconstruction from events is proposed that is smaller (38k vs. 10M parameters) and faster than the state of the art, with minimal impact on performance.
Bringing a Blurry Frame Alive at High Frame-Rate With an Event Camera
TLDR
This paper proposes a simple and effective approach, the Event-based Double Integral (EDI) model, to reconstruct a high frame-rate, sharp video from a single blurry frame and its event data, based on solving a simple non-convex optimization problem in a single scalar variable.
CED: Color Event Camera Dataset
TLDR
This work presents and releases the first Color Event Camera Dataset (CED), containing 50 minutes of footage with both color frames and events, and presents an extension of the event camera simulator ESIM that enables simulation of color events.
High Frame Rate Video Reconstruction Based on an Event Camera
TLDR
This work proposes a simple yet effective approach to reconstruct high-quality, high-frame-rate sharp videos by associating event data with a latent sharp image, and provides a new, more efficient solver to minimize the proposed energy model.
An Asynchronous Kalman Filter for Hybrid Event Cameras
TLDR
The proposed algorithm includes a frame augmentation pre-processing step that deblurs and temporally interpolates frame data using events and outperforms state-of-the-art methods in both absolute intensity error and image similarity indexes.
luvHarris: A Practical Corner Detector for Event-cameras
TLDR
This paper presents a corner-detection method, dubbed look-up event-Harris (luvHarris), that employs the Harris algorithm for high accuracy while achieving improved event throughput, and discusses the design considerations and the validity of the proposed approach for event cameras.
Event Camera Calibration of Per-pixel Biased Contrast Threshold
TLDR
A new event camera model and two calibration approaches, covering event-only cameras and hybrid image-event cameras, are proposed, along with an efficient online calibration method that adapts to time-varying event rates.
Asynchronous event feature generation and tracking based on gradient descriptor for event cameras
TLDR
The experimental results show that the proposed method improves tracking accuracy and real-time performance compared with the state-of-the-art asynchronous event-corner tracker, with no compromise in feature-tracking lifetime.

References

Showing 1-10 of 40 references
Simultaneous Optical Flow and Intensity Estimation from an Event Camera
TLDR
This work proposes, to the best of our knowledge, the first algorithm to simultaneously recover the motion field and brightness image while the camera undergoes generic motion through any scene, within a sliding-window time interval.
Simultaneous Mosaicing and Tracking with an Event Camera
TLDR
This work shows for the first time that an event stream, with no additional sensing, can be used to track accurate camera rotation while building a persistent and high quality mosaic of a scene which is super-resolution accurate and has high dynamic range.
Real-Time Intensity-Image Reconstruction for Event Cameras Using Manifold Regularisation
TLDR
This work proposes a variational model that accurately captures the behaviour of event cameras, enabling reconstruction of intensity images at arbitrary frame rate in real time, and verifies that solving the variational model on the manifold produces high-quality images without explicitly estimating optical flow.
Asynchronous Event-Based Fourier Analysis
TLDR
A method is presented to compute the FFT of a visual scene, at a high temporal precision of around 1 µs, from the output of an asynchronous event-based camera; it is shown that, for reasonable levels of approximation at equivalent frame rates beyond the millisecond, the method performs faster and more efficiently than conventional image acquisition.
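As a toy illustration of the event-by-event spectral update such methods build on (a sketch under NumPy's standard DFT sign convention, not the paper's algorithm): a change of Δ at pixel (y, x) adds Δ·e^{-2πi(uy/H + vx/W)} to every coefficient F(u, v), so the transform can track the event stream without recomputing a full FFT per event.

```python
import numpy as np

H, W = 4, 4
L = np.zeros((H, W))                 # latent image
F = np.fft.fft2(L)                   # running 2-D DFT (all zeros initially)

# per-axis twiddle factors, precomputed once: ky[u, y] = exp(-2j*pi*u*y/H)
ky = np.exp(-2j * np.pi * np.outer(np.arange(H), np.arange(H)) / H)
kx = np.exp(-2j * np.pi * np.outer(np.arange(W), np.arange(W)) / W)

def apply_event(F, y, x, delta):
    """O(H*W) rank-1 update of the DFT for a change of `delta` at (y, x)."""
    F += delta * np.outer(ky[:, y], kx[:, x])

# events as (row, col, signed intensity step)
for y, x, d in [(1, 2, 0.1), (3, 0, -0.1), (1, 2, 0.1)]:
    L[y, x] += d
    apply_event(F, y, x, d)
```

After the loop, `F` agrees with `np.fft.fft2(L)` to machine precision; each event costs O(HW) instead of the O(HW log HW) of a fresh FFT.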
Fast Event-based Corner Detection
TLDR
This work proposes a method to reduce an event stream to a corner event stream, which is capable of processing millions of events per second on a single core and reduces the event rate by a factor of 10 to 20.
Bringing a Blurry Frame Alive at High Frame-Rate With an Event Camera
TLDR
This paper proposes a simple and effective approach, the Event-based Double Integral (EDI) model, to reconstruct a high frame-rate, sharp video from a single blurry frame and its event data, based on solving a simple non-convex optimization problem in a single scalar variable.
Direct face detection and video reconstruction from event cameras
TLDR
This work proposes and develops a patch-based model for the event streams acquired from event cameras, and demonstrates the first direct face detection from event streams, highlighting the potential of these event-based cameras for power-efficient vision applications.
Continuous-Time Trajectory Estimation for Event-based Vision Sensors
TLDR
This paper addresses ego-motion estimation for an event-based vision sensor using a continuous-time framework to directly integrate the information conveyed by the sensor.
The event-camera dataset and simulator: Event-based data for pose estimation, visual odometry, and SLAM
TLDR
This paper presents and releases a collection of datasets captured with a DAVIS in a variety of synthetic and real environments, which it hopes will motivate research on new algorithms for high-speed and high-dynamic-range robotics and computer-vision applications.
A Unifying Contrast Maximization Framework for Event Cameras, with Applications to Motion, Depth, and Optical Flow Estimation
TLDR
The main idea of the framework is to find the point trajectories on the image plane that are best aligned with the event data by maximizing an objective function: the contrast of an image of warped events.
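The contrast-maximization idea in the entry above can be sketched in a few lines: warp events along a candidate motion, accumulate them into an image, and score the candidate by the image's variance. This is a toy 1-D-velocity grid search with a synthetic event stream and illustrative names — the paper's framework is far more general:

```python
import numpy as np

SIZE = 32

def warped_image(events, v):
    """Accumulate events into an image after undoing candidate motion v = (vx, vy)."""
    img = np.zeros((SIZE, SIZE))
    for x, y, t in events:
        xw = int(round(x - v[0] * t))
        yw = int(round(y - v[1] * t))
        if 0 <= xw < SIZE and 0 <= yw < SIZE:
            img[yw, xw] += 1.0
    return img

def contrast(img):
    """Objective: variance of the image of warped events."""
    return img.var()

# synthetic stream: one scene point translating at (1, 0) pixels per unit time
events = [(5.0 + t, 16.0, float(t)) for t in range(20)]

# grid search over horizontal velocities; the true motion collapses all events
# onto one pixel, giving the sharpest (highest-variance) warped image
candidates = [(vx, 0.0) for vx in np.linspace(0.0, 2.0, 21)]
best = max(candidates, key=lambda v: contrast(warped_image(events, v)))
```

The search recovers the true velocity (1, 0): any wrong candidate smears the events over several pixels and lowers the contrast objective.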