Depth-aided robust localization approach for relative navigation using RGB-depth camera and LiDAR sensor

Abstract

This paper describes a robust localization approach for a moving target based on RGB-depth (RGB-D) camera and 2D light detection and ranging (LiDAR) sensor measurements. In the proposed approach, the 3D and 2D position information of a target, measured by the RGB-D camera and the LiDAR sensor respectively, is used to locate the target by combining visual tracking algorithms, the depth information of the structured-light sensor, and a low-level vision-LiDAR fusion algorithm (e.g., extrinsic calibration). For robust localization, a novel approach is proposed that applies Kalman prediction and filtering with intermittent observations, where the validity of each observation is identified from depth image segmentation. The proposed depth-aided localization algorithm maintains robust tracking even when visual tracking with the RGB camera fails. The experimental results are compared against position data from a VICON motion capture system as ground truth and demonstrate the performance superiority and robustness of the proposed approach.
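To illustrate the filtering idea in the abstract, the following Python snippet is a minimal sketch of a linear Kalman filter with intermittent observations: the measurement update is skipped whenever the observation is flagged invalid (e.g., a visual tracking failure detected via depth image segmentation), and only the prediction is propagated. The constant-velocity model, the matrices, and all noise parameters below are illustrative assumptions, not values from the paper.

import numpy as np

def kalman_step(x, P, z, valid, F, H, Q, R):
    # Prediction step: propagate state and covariance through the motion model
    x = F @ x
    P = F @ P @ F.T + Q
    if valid:
        # Update step: correct with the fused camera/LiDAR position measurement z
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ (z - H @ x)
        P = (np.eye(len(x)) - K @ H) @ P
    # If the observation is invalid, the prediction alone carries the estimate
    return x, P

# Illustrative 2D constant-velocity setup (dt and noise levels are assumptions)
dt = 0.1
F = np.array([[1, 0, dt, 0],
              [0, 1, 0, dt],
              [0, 0, 1,  0],
              [0, 0, 0,  1]], dtype=float)
H = np.array([[1, 0, 0, 0],
              [0, 1, 0, 0]], dtype=float)  # observe position only
Q = 0.01 * np.eye(4)
R = 0.05 * np.eye(2)

x = np.zeros(4)  # state: [px, py, vx, vy]
P = np.eye(4)
for z, valid in [(np.array([1.0, 0.5]), True),
                 (np.array([0.0, 0.0]), False),  # dropped observation
                 (np.array([1.2, 0.6]), True)]:
    x, P = kalman_step(x, P, z, valid, F, H, Q, R)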

DOI: 10.1109/ICCAIS.2014.7020538


Cite this paper

@inproceedings{Song2014DepthaidedRL,
  title={Depth-aided robust localization approach for relative navigation using RGB-depth camera and LiDAR sensor},
  author={Ha-Ryong Song and Won-sub Choi and Hae-dong Kim},
  booktitle={2014 International Conference on Control, Automation and Information Sciences (ICCAIS 2014)},
  year={2014},
  pages={105-110}
}