Publications
Dense visual SLAM for RGB-D cameras
TLDR
This paper proposes a dense visual SLAM method for RGB-D cameras that minimizes both the photometric and the depth error over all pixels, together with an entropy-based similarity measure for keyframe selection and loop-closure detection.
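To make the combined objective concrete, below is a minimal NumPy sketch of the kind of per-pixel residual such a method minimizes: a pixel is back-projected with its depth, transformed by the current pose estimate, and projected into the second frame, where both an intensity and a depth residual are formed. The pinhole model, nearest-neighbour lookup, and all names are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def backproject(u, v, z, K):
    """Back-project pixel (u, v) with depth z into a 3D camera-frame point."""
    fx, fy, cx, cy = K[0, 0], K[1, 1], K[0, 2], K[1, 2]
    return np.array([(u - cx) * z / fx, (v - cy) * z / fy, z])

def project(p, K):
    """Project a 3D point into pixel coordinates (pinhole model)."""
    u = K[0, 0] * p[0] / p[2] + K[0, 2]
    v = K[1, 1] * p[1] / p[2] + K[1, 2]
    return u, v

def joint_residual(u, v, I1, Z1, I2, Z2, T, K):
    """Photometric and depth residual of one pixel warped from frame 1 to frame 2.

    T is a 4x4 rigid-body transform (current pose estimate), I* are grayscale
    images, Z* are depth maps. Nearest-neighbour lookup keeps the sketch short;
    a real implementation would interpolate and accumulate over all pixels.
    """
    p1 = backproject(u, v, Z1[v, u], K)
    p2 = (T @ np.append(p1, 1.0))[:3]
    u2, v2 = project(p2, K)
    u2i, v2i = int(round(u2)), int(round(v2))
    r_photo = float(I2[v2i, u2i]) - float(I1[v, u])   # intensity difference
    r_depth = float(Z2[v2i, u2i]) - p2[2]             # depth difference
    return r_photo, r_depth
```

A full system would sum such residuals over all valid pixels, weight them, and minimize over the camera pose with a Gauss-Newton-style solver.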
Robust odometry estimation for RGB-D cameras
TLDR
This work registers two consecutive RGB-D frames directly against each other by minimizing the photometric error with non-linear minimization in a coarse-to-fine scheme, and proposes a robust error function that reduces the influence of large residuals.
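To illustrate the robust-weighting idea, the sketch below computes iteratively reweighted least-squares weights with a Huber function, which down-weights large residuals caused by occlusions, noise, or moving objects. Huber is just one common robust choice and the threshold is an arbitrary assumption, not necessarily the error function proposed in the paper.

```python
import numpy as np

def huber_weights(residuals, delta=0.1):
    """IRLS weights for a Huber cost: quadratic for small residuals,
    linear (down-weighted) for large ones, which limits the influence
    of outliers in the photometric error."""
    r = np.abs(residuals)
    w = np.ones_like(r)
    mask = r > delta
    w[mask] = delta / r[mask]
    return w

# Example: the two large residuals receive weights well below 1 in the
# next weighted least-squares step.
res = np.array([0.02, -0.05, 0.8, -1.5])
print(huber_weights(res))
```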
Real-Time Camera Tracking and 3D Reconstruction Using Signed Distance Functions
TLDR
This paper presents a novel method for real-time camera tracking and 3D reconstruction of static indoor environments using an RGB-D sensor that is more accurate and robust than the iterative closest point (ICP) algorithm used by KinectFusion, and often yields comparable accuracy at much higher speed than feature-based bundle adjustment methods such as RGB-D SLAM.
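The core idea can be sketched as follows: instead of ICP's explicit point correspondences, each measured 3D point is transformed by the current camera pose and looked up in the signed distance function, and the (ideally zero) signed distance itself serves as the residual to minimize. The regular voxel grid, nearest-voxel lookup, and parameters below are illustrative assumptions.

```python
import numpy as np

def sdf_residuals(points_cam, T, sdf, origin, voxel_size):
    """Signed-distance residuals for camera-frame points under pose T.

    sdf:        3D array of signed distances on a regular voxel grid
    origin:     world-frame position of voxel (0, 0, 0)
    voxel_size: edge length of one voxel
    Points that truly lie on the reconstructed surface yield residuals near 0.
    Nearest-voxel lookup is used for brevity; trilinear interpolation is the
    usual choice in practice.
    """
    pts_h = np.c_[points_cam, np.ones(len(points_cam))]        # homogeneous
    pts_w = (T @ pts_h.T).T[:, :3]                             # into world frame
    idx = np.round((pts_w - origin) / voxel_size).astype(int)  # voxel indices
    idx = np.clip(idx, 0, np.array(sdf.shape) - 1)
    return sdf[idx[:, 0], idx[:, 1], idx[:, 2]]
```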
Fast odometry and scene flow from RGB-D cameras based on geometric clustering
TLDR
An efficient solution to jointly estimate the camera motion and a piecewise-rigid scene flow from an RGB-D sequence by performing a two-fold segmentation of the scene, dividing it into geometric clusters that are, in turn, classified as static or moving elements.
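As a toy illustration of the two-fold segmentation, the sketch below first groups the 3D points of a depth image into geometric clusters (k-means here, purely for brevity) and then labels each cluster static or moving depending on how well the estimated camera motion alone explains its observed 3D displacement. The cluster count, threshold, and use of scikit-learn are assumptions for illustration only.

```python
import numpy as np
from sklearn.cluster import KMeans

def segment_and_classify(points, flow_obs, T_cam, k=8, thresh=0.05):
    """Cluster 3D points geometrically, then label clusters static/moving.

    points:   (N, 3) points from the depth image at time t (camera frame)
    flow_obs: (N, 3) observed 3D displacement of each point between frames
    T_cam:    4x4 estimated rigid transform from the first to the second
              camera frame; a static point should move only by the flow
              this transform induces.
    Returns {cluster_id: True if moving, False if static}.
    """
    labels = KMeans(n_clusters=k, n_init=5).fit_predict(points)
    pts_h = np.c_[points, np.ones(len(points))]
    flow_cam = (T_cam @ pts_h.T).T[:, :3] - points      # flow explained by camera
    residual = np.linalg.norm(flow_obs - flow_cam, axis=1)
    return {c: residual[labels == c].mean() > thresh for c in range(k)}
```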
CPA-SLAM: Consistent plane-model alignment for direct RGB-D SLAM
TLDR
A real-time capable RGB-D SLAM system that consistently integrates frame-to-keyframe and frame-to-plane alignment and uses the planes for tracking and global graph optimization in an expectation-maximization framework.
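To give a flavor of the expectation-maximization formulation, the sketch below runs one EM-style iteration over a point cloud and a set of plane hypotheses: the E-step computes soft point-to-plane assignments from distances under a Gaussian noise model, and the M-step refits each plane by weighted PCA. The noise model, parameters, and plane parameterization are illustrative assumptions, not the system's exact formulation.

```python
import numpy as np

def em_plane_step(points, planes, sigma=0.02):
    """One EM-style iteration of soft point-to-plane assignment and refitting.

    points: (N, 3) array; planes: list of (normal n, offset d) with n.x + d = 0.
    """
    # E-step: responsibilities from point-to-plane distances (Gaussian model).
    dists = np.stack([points @ n + d for n, d in planes], axis=1)   # (N, P)
    resp = np.exp(-0.5 * (dists / sigma) ** 2)
    resp /= resp.sum(axis=1, keepdims=True) + 1e-12

    # M-step: refit each plane by weighted PCA of its softly assigned points.
    new_planes = []
    for j in range(len(planes)):
        w = resp[:, j:j + 1]
        centroid = (w * points).sum(0) / w.sum()
        scatter = ((w * (points - centroid)).T @ (points - centroid)) / w.sum()
        n = np.linalg.eigh(scatter)[1][:, 0]     # smallest-eigenvalue direction
        new_planes.append((n, -n @ centroid))
    return resp, new_planes
```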
Large-Scale Multi-resolution Surface Reconstruction from RGB-D Sequences
TLDR
This work proposes a method to generate highly detailed, textured 3D models of large environments from RGB-D sequences that can reconstruct, store, and continuously update a colored 3D model of an entire corridor of nine rooms at high levels of detail in real time on a single GPU with 2.5 GB of memory.
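One way such memory figures become feasible is by storing surface data sparsely and at multiple resolutions instead of in a single dense grid. The sketch below keeps a signed-distance value per voxel in a hash map keyed by (resolution level, voxel index), with the voxel size doubling at each coarser level; it is a simplified stand-in, under assumed parameters, for whatever hierarchical data structure the paper actually uses.

```python
import numpy as np
from collections import defaultdict

BASE_VOXEL = 0.005   # finest voxel size in metres (illustrative value)

# Sparse multi-resolution storage: voxels are keyed by (level, ix, iy, iz),
# so distant or smooth geometry can stay coarse while nearby detail stays fine.
tsdf = defaultdict(lambda: (0.0, 0.0))   # key -> (fused sdf, accumulated weight)

def integrate_point(p_world, sdf_value, level):
    """Fuse one signed-distance observation at the chosen resolution level."""
    size = BASE_VOXEL * (2 ** level)
    key = (level, *np.floor(p_world / size).astype(int))
    d, w = tsdf[key]
    tsdf[key] = ((d * w + sdf_value) / (w + 1.0), w + 1.0)  # running weighted mean

integrate_point(np.array([1.20, 0.35, 2.80]), sdf_value=0.004, level=0)  # fine
integrate_point(np.array([6.00, 1.00, 9.50]), sdf_value=0.010, level=3)  # coarse
```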
Dense Continuous-Time Tracking and Mapping with Rolling Shutter RGB-D Cameras
TLDR
This work parametrizes the camera trajectory using continuous B-splines and optimizes the trajectory through dense, direct image alignment, demonstrating superior tracking and reconstruction quality compared to approaches with discrete-time or global-shutter assumptions.
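As a simplified illustration of the continuous-time idea, the sketch below evaluates a uniform cubic B-spline over control points at per-row capture times, so every scanline of a rolling-shutter frame receives its own pose sample. Only the camera position is interpolated here and all timing values are made up; the paper's formulation operates on full six-degree-of-freedom poses.

```python
import numpy as np

def cubic_bspline(ctrl, t):
    """Evaluate a uniform cubic B-spline with control points `ctrl` (K, 3)
    at time t measured in knot units, 0 <= t <= K - 3."""
    i = min(int(t), len(ctrl) - 4)     # index of the first active control point
    u = t - i
    # Standard uniform cubic B-spline basis functions (they sum to 1).
    b = np.array([(1 - u) ** 3,
                  3 * u ** 3 - 6 * u ** 2 + 4,
                  -3 * u ** 3 + 3 * u ** 2 + 3 * u + 1,
                  u ** 3]) / 6.0
    return b @ ctrl[i:i + 4]

# Rolling shutter: each image row is exposed slightly later, so each row of a
# frame gets its own interpolated camera position along the trajectory.
ctrl_positions = np.random.rand(8, 3)    # spline control points (x, y, z)
rows, readout = 480, 0.03                # rows per frame, readout time [s]
frame_time, knot_dt = 0.25, 0.10         # frame start time, knot spacing [s]
for row in (0, 240, 479):
    t = (frame_time + row / rows * readout) / knot_dt
    print(row, cubic_bspline(ctrl_positions, t))
```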
Odometry from RGB-D Cameras for Autonomous Quadrocopters
TLDR
This master’s thesis presents a robust, dense visual odometry method, applicable to the stabilization of quadrocopters, that aligns consecutive images using all image information under the photo-consistency assumption.
Multi-view deep learning for consistent semantic mapping with RGB-D cameras
TLDR
This paper proposes a novel deep neural network approach to predict semantic segmentation from RGB-D sequences and achieves state-of-the-art performance on the NYUDv2 dataset in single-view segmentation as well as multi-view semantic fusion.
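The multi-view fusion step can be illustrated with a small sketch in which per-view class probabilities for corresponding pixels are combined by summing log-probabilities, i.e., multiplying per-view likelihoods under an independence assumption. The warping that associates pixels across views is omitted and the numbers are made up, so this is only one plausible fusion rule, not the paper's network or fusion scheme.

```python
import numpy as np

def fuse_semantics(prob_maps):
    """Fuse per-view class-probability maps for corresponding pixels.

    prob_maps: (V, H, W, C) softmax outputs from V views, already warped so
    that pixel (h, w) refers to the same surface point in every view.
    Returns the fused per-pixel class labels.
    """
    log_p = np.log(np.clip(prob_maps, 1e-8, 1.0))
    return np.argmax(log_p.sum(axis=0), axis=-1)

# Two views, a 1x1 "image", 3 classes: the fused label favours class 1.
views = np.array([[[[0.2, 0.5, 0.3]]],
                  [[[0.1, 0.7, 0.2]]]])
print(fuse_semantics(views))   # -> [[1]]
```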
libfreenect2: Release 0.2
...