Robust visual localization in changing lighting conditions

Abstract

We present an illumination-robust visual localization algorithm for Astrobee, a free-flying robot designed to autonomously navigate on the International Space Station (ISS). Astrobee localizes with a monocular camera and a pre-built sparse map composed of natural visual features. Astrobee must perform tasks not only during the day, but also at night when the ISS lights are dimmed. However, localization performance degrades when the observed lighting conditions differ from those under which the sparse map was built. We investigate and quantify the effect of lighting variations on visual feature-based localization systems, and find that maps built in darker conditions remain effective in bright conditions, but the reverse does not hold. We extend Astrobee's localization algorithm to make it more robust to the changing lighting conditions on the ISS by automatically recognizing the current illumination level and selecting an appropriate map and camera exposure time. We extensively evaluate the proposed algorithm through experiments on Astrobee.
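The map- and exposure-selection step described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the intensity thresholds, map names, and exposure values are assumed placeholders, and the illumination level is estimated here simply from the mean pixel intensity of the current camera frame.

```python
import numpy as np

def select_map_and_exposure(image,
                            thresholds=(40, 90),
                            maps=("night_map", "dusk_map", "day_map"),
                            exposures_ms=(30.0, 15.0, 5.0)):
    """Classify the current illumination level from the mean image
    intensity and pick a matching sparse map and exposure time.

    All thresholds, map names, and exposure times are illustrative
    assumptions, not values from the paper.
    """
    mean_intensity = float(np.mean(image))
    # Bin the mean intensity into an illumination level:
    # 0 = dark, 1 = medium, 2 = bright.
    level = int(np.digitize(mean_intensity, thresholds))
    return maps[level], exposures_ms[level]

# Example: a dark grayscale frame selects the night map and a long exposure.
dark_frame = np.full((480, 640), 20, dtype=np.uint8)
print(select_map_and_exposure(dark_frame))  # ('night_map', 30.0)
```

Because maps built in dark conditions also work in bright conditions (per the paper's finding), a practical variant could bias selection toward darker-condition maps when the estimated level is ambiguous.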

DOI: 10.1109/ICRA.2017.7989640


Cite this paper

@article{Kim2017RobustVL,
  title={Robust visual localization in changing lighting conditions},
  author={Pyojin Kim and Brian Coltin and Oleg Alexandrov and H. Jin Kim},
  journal={2017 IEEE International Conference on Robotics and Automation (ICRA)},
  year={2017},
  pages={5447-5452}
}