A State-of-the-Art Review on Mapping and Localization of Mobile Robots Using Omnidirectional Vision Sensors

Luis Payá, Arturo Gil, Óscar Reinoso. J. Sensors.
Nowadays, the field of mobile robotics is experiencing a rapid evolution, and a variety of autonomous vehicles are available to solve different tasks. The advances in computer vision have led to a substantial increase in the use of cameras as the main sensors in mobile robots. They can be used as the only source of information or in combination with other sensors such as odometry or laser rangefinders. Among vision systems, omnidirectional sensors stand out due to the richness of the information they provide…


Adaptive Multimodal Localisation Techniques for Mobile Robots in Unstructured Environments : A Review

This paper reviews recent efforts to develop onboard navigation systems that can transition seamlessly between outdoor and indoor environments and across different terrains, and provides an analysis of the most common sensor modalities and the factors affecting their uncertainty.

An Evaluation of New Global Appearance Descriptor Techniques for Visual Localization in Mobile Robots under Changing Lighting Conditions

This work focuses on the use of an omnidirectional vision sensor as the only source of information, employing global-appearance descriptors of the visual information to carry out localization and map creation in highly heterogeneous zones.

A Low Cost Ultrasound-based Localisation System for Ground Robotics

The effectiveness of the proposed localization system is demonstrated by integrating it with a mobile ground robot to enable waypoint-based path following.

A Deep Learning Tool to Solve Localization in Mobile Autonomous Robotics

The results show that the proposed deep learning tool is an efficient solution to visual localization tasks, addressed both globally, as an image-retrieval problem, and hierarchically.

Sensors, SLAM and Long-term Autonomy: A Review

This paper attempts to review, discuss, evaluate, and compare these sensors, and assesses their characteristics against factors critical to the long-term autonomy challenge.

Active Perception for Outdoor Localisation with an Omnidirectional Camera

A novel localisation framework based on an omnidirectional camera, targeted at outdoor urban environments, that relies on high-level persistent semantic features of the environment and offers the opportunity to carry out localisation on a prebuilt map, which is significantly more resource-efficient and robust.

Research studio for testing control algorithms of mobile robots

The article presents the Laboratory of Intelligent Mobile Robots, equipped with the latest solutions: rich sensor sets, a ground control station with Matlab-Simulink software, an OptiTrack object-tracking system, and the necessary infrastructure for communication and security.

A robust method for 2D occupancy map building for indoor robot navigation

An efficient method is proposed to build a robust occupancy grid useful for robot navigation tasks, providing occupancy maps with high performance in terms of processing time.
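As background to the entry above, occupancy grid maps are commonly maintained as per-cell log-odds that are incremented or decremented as sensor observations arrive. The following is a minimal, hedged sketch of that general log-odds update (a textbook illustration, not the specific method proposed in the cited paper; the increment values are arbitrary assumptions):

```python
import numpy as np

# Log-odds occupancy grid: each cell stores log(p_occ / p_free).
grid = np.zeros((10, 10))          # 0.0 == unknown (p = 0.5)
L_OCC, L_FREE = 0.85, -0.4         # assumed update increments per observation

def update_cell(grid, r, c, hit):
    """Fold one sensor observation of cell (r, c) into the grid."""
    grid[r, c] += L_OCC if hit else L_FREE

def probability(grid):
    """Convert log-odds back to occupancy probabilities."""
    return 1.0 - 1.0 / (1.0 + np.exp(grid))

# Simulate repeated observations: cell (2, 3) seen occupied, (5, 5) seen free.
for _ in range(5):
    update_cell(grid, 2, 3, hit=True)
    update_cell(grid, 5, 5, hit=False)

p = probability(grid)
print(round(p[2, 3], 2), round(p[5, 5], 2))  # → 0.99 0.12
```

The log-odds form makes each update a single addition per cell, which is one reason occupancy maps can achieve the fast processing times the entry mentions.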

Using Omnidirectional Vision to Create a Model of the Environment: A Comparative Evaluation of Global-Appearance Descriptors

This work carries out a comparative evaluation of four global-appearance techniques in map building tasks, using omnidirectional visual information as the only source of data from the environment.

Estimation of Visual Maps with a Robot Network Equipped with Vision Sensors

An algorithm is proposed that uses the measurements obtained by the robots to build a single accurate map of the environment and is based on a Rao-Blackwellized particle filter that estimates the paths of the robots and the position of the visual landmarks.
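The Rao-Blackwellized particle filter mentioned above samples robot paths with particles while estimating landmark positions analytically per particle. As a much-simplified, hedged illustration of the underlying particle-filtering cycle only (a toy 1D localization example, not the authors' multi-robot algorithm; all values are assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

def pf_step(particles, control, measurement, landmark,
            motion_std=0.1, meas_std=0.2):
    """One predict-weight-resample cycle of a 1D particle filter."""
    # Predict: apply the commanded displacement with additive motion noise.
    particles = particles + control + rng.normal(0.0, motion_std, particles.shape)
    # Weight: Gaussian likelihood of the observed range to a known landmark.
    expected = np.abs(landmark - particles)
    w = np.exp(-0.5 * ((measurement - expected) / meas_std) ** 2)
    w /= w.sum()
    # Resample particles in proportion to their weights.
    idx = rng.choice(len(particles), size=len(particles), p=w)
    return particles[idx]

# Toy run: robot starts at 0.0 and moves +0.5 per step; landmark at 5.0.
particles = rng.uniform(-2.0, 2.0, 500)
true_pos = 0.0
for _ in range(10):
    true_pos += 0.5
    z = abs(5.0 - true_pos) + rng.normal(0.0, 0.05)
    particles = pf_step(particles, 0.5, z, 5.0)

print(round(particles.mean(), 1))  # estimate converges near the true position
```

In the full Rao-Blackwellized form, each particle would additionally carry its own analytic landmark estimates (e.g. small Kalman filters), which is what makes the factorization efficient.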

Image-based memory for robot navigation using properties of omnidirectional images

SLAM in Indoor Environments using Omni-directional Vertical and Horizontal Line Features

This paper presents a new SLAM method that uses vertical lines extracted from an omni-directional camera image and horizontal lines from the range sensor data, which reduces the effects of illumination changes and partial occlusion.

Appearance-based approach to hybrid metric-topological simultaneous localisation and mapping

A unified framework to carry out the simultaneous localisation and mapping of a mobile robot combining metric and topological techniques is presented, showing that reasonable performance in both time and accuracy can be obtained in an indoor environment when the involved parameters are properly tuned.

Performance of Global-Appearance Descriptors in Map Building and Localization Using Omnidirectional Vision

This paper makes an exhaustive comparison among several global-appearance descriptors to solve the mapping and localization problem, making use of several image sets captured in indoor environments under realistic working conditions.

Position Estimation and Local Mapping Using Omnidirectional Images and Global Appearance Descriptors

This work presents some methods to create local maps and to estimate the position of a mobile robot using the global appearance of omnidirectional images, and shows the effectiveness and robustness of both methods.
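A minimal sketch of the global-appearance idea common to the entries above: describe each map image with a single holistic vector and localize a query by nearest-neighbour search in descriptor space. The descriptor below (a coarse, L2-normalized grid of mean intensities over synthetic toy images) is a hypothetical stand-in, not one of the descriptors evaluated in these papers:

```python
import numpy as np

def global_descriptor(image, grid=4):
    """Holistic descriptor: mean intensity over a coarse grid, L2-normalized."""
    h, w = image.shape
    cells = image[: h - h % grid, : w - w % grid]
    cells = cells.reshape(grid, h // grid, grid, w // grid).mean(axis=(1, 3))
    d = cells.ravel().astype(float)
    return d / (np.linalg.norm(d) + 1e-12)

def localize(query, map_descriptors):
    """Return the index of the map image whose descriptor is closest."""
    q = global_descriptor(query)
    dists = [np.linalg.norm(q - d) for d in map_descriptors]
    return int(np.argmin(dists))

# Toy map: three synthetic 'panoramic' images with distinct appearance.
rng = np.random.default_rng(1)
map_images = [rng.uniform(0, 255, (32, 64)) for _ in range(3)]
descriptors = [global_descriptor(im) for im in map_images]

# A query captured 'at' map position 1: the same scene plus mild noise.
query = map_images[1] + rng.normal(0, 5, (32, 64))
print(localize(query, descriptors))  # → 1
```

Because the whole image is collapsed into one vector, no feature extraction or matching is needed, which is the appeal of global-appearance methods in the works listed here.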

Vision-based robot navigation and map building using active laser projection

This paper presents a vision-based approach to mobile robot navigation and map building using laser projection and has been successfully validated by navigating a custom mobile robot equipped with the laser-vision system in an indoor laboratory environment.

Vision-based navigation and environmental representations with an omnidirectional camera

A method for the vision-based navigation of a mobile robot in indoor environments using a single omnidirectional (catadioptric) camera is proposed, which significantly simplifies the solution to navigation problems by eliminating perspective effects.

SURF features for efficient robot localization with omnidirectional images

The use of a recently developed feature, SURF, is proposed to improve the performance of appearance-based localization methods that perform image retrieval in large data sets; the results show SURF to offer the best compromise between efficiency and accuracy.