A 3D Omnidirectional Sensor For Mobile Robot Applications

@inproceedings{Boutteau2010A3O,
  title={A 3D Omnidirectional Sensor For Mobile Robot Applications},
  author={R{\'e}mi Boutteau and Xavier Savatier and Jean-Yves Ertaud and B{\'e}lahc{\`e}ne Mazari},
  year={2010}
}
In most of the missions a mobile robot has to carry out (intervention in hostile environments, preparation of military operations, mapping, etc.), two main tasks have to be completed: navigation and 3D environment perception. Vision-based solutions have therefore been widely used in autonomous robotics because they provide a large amount of information useful for detection, tracking, pattern recognition and scene understanding. Nevertheless, the main limitations of this kind of system are the… 

A Vision-Based System for Robot Localization in Large Industrial Environments

TLDR
A vision-based system for localizing mobile robots in large industrial environments is presented, using fisheye cameras for a large field of view, together with the associated algorithms and several calibration methods, all evaluated against a ground truth obtained with a motion capture system.

From local visual homing towards navigation of autonomous cleaning robots

TLDR
These navigation methods are the first application of omnidirectional vision, dense topo-metric maps, and local visual homing to the control of cleaning robots, and they provide parsimonious yet robust and accurate partial ego-motion estimation from visual information.

Fast omni-image unwarping using pano-mapping pointers array

TLDR
A computationally efficient alternative referred to as the pano-mapping pointers array (PMPA) is proposed, in which each entry points to a specific place in the image buffer depending on the interpolation desired (nearest-neighbour or bilinear).
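
As a rough illustration of the lookup-table idea behind such unwarping (a hedged sketch only: the simple radial geometry, the function names, and the use of (row, column) maps rather than raw buffer pointers are assumptions made here, not the paper's actual implementation), the whole mapping can be precomputed once and each frame unwarped by a simple gather:

    import numpy as np

    def build_pano_map(out_h, out_w, cx, cy, r_min, r_max):
        """Precompute, for every pixel of the unwarped panorama, the (row, col)
        source coordinates in the omnidirectional image. A plain radial model is
        assumed here; a calibrated pano-mapping table would be used in practice."""
        thetas = 2.0 * np.pi * np.arange(out_w) / out_w               # one column per azimuth
        radii = r_min + (r_max - r_min) * np.arange(out_h) / out_h    # one row per radius
        rr, tt = np.meshgrid(radii, thetas, indexing="ij")
        src_rows = cy + rr * np.sin(tt)
        src_cols = cx + rr * np.cos(tt)
        return src_rows, src_cols

    def unwarp_nearest(omni_img, src_rows, src_cols):
        """Nearest-neighbour unwarping: each output pixel is a single lookup
        through the precomputed map."""
        r = np.clip(np.rint(src_rows).astype(int), 0, omni_img.shape[0] - 1)
        c = np.clip(np.rint(src_cols).astype(int), 0, omni_img.shape[1] - 1)
        return omni_img[r, c]

Because the map is built once per sensor geometry, unwarping a new frame costs one lookup per output pixel; a bilinear variant would store four source positions and their weights per entry instead of one.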

A direct approach for face detection on omnidirectional images

TLDR
A direct approach to face detection on catadioptric images, requiring no geometrical transformation, is presented, together with a new method for synthesizing a large database of omnidirectional images.

A new approach for face detection with omnidirectional sensors

TLDR
The proposed approach matches the speed and robustness of existing face detection algorithms for conventional cameras, and the detector could be trained on the synthesized omnidirectional image database.

Intelligent Transportation Scheme for Autonomous Vehicle in Smart Campus

TLDR
A first design approach for an integrated system is presented, along with an analysis of the integration techniques and protocols needed to develop a smart vehicle within a state-of-the-art communication environment, a Smart Campus.

A framework for face detection on Central Catadioptric Systems

TLDR
An adaptation of the well-known Viola-Jones face detector to central catadioptric systems is presented, and it is demonstrated that spherical projection gives better results than cylindrical projection.

References

Introduction à la vision panoramique catadioptrique (Introduction to catadioptric panoramic vision)

TLDR
Omnidirectional vision is introduced, and a brief outline of existing geometrical models and calibration methods is given in order to ease the reading of the articles presented in this journal.

Robust scene reconstruction from an omnidirectional vision system

TLDR
An efficient multi-baseline stereo algorithm for panoramic image data is presented that derives a parameterization of epipolar curves in terms of inverse depth, thus bypassing the need to perform explicit stereoscopic triangulation.

A Flexible Technique for Accurate Omnidirectional Camera Calibration and Structure from Motion

TLDR
Compared with classical techniques, which rely on a specific parametric model of the omnidirectional camera, the proposed procedure is independent of the sensor, easy to use, and flexible.

A Theory of Single-Viewpoint Catadioptric Image Formation

TLDR
This paper derives the complete class of single-lens single-mirror catadioptric sensors that have a single viewpoint, and describes all of the solutions in detail, including the degenerate ones, with reference to many of the catadioptric systems that have been proposed in the literature.

Calibration of omnidirectional stereo for mobile robots

  • Y. Negishi, J. Miura, Y. Shirai
  • Physics
    2004 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)
  • 2004
TLDR
The system uses a pair of vertically aligned catadioptric omnidirectional cameras, each composed of a perspective camera and a hyperboloidal mirror so as to provide a single projection point; experimental results show the effectiveness of the proposed calibration method.

Single View Point Omnidirectional Camera Calibration from Planar Grids

  • Christopher Mei, P. Rives
  • Computer Science
    Proceedings 2007 IEEE International Conference on Robotics and Automation
  • 2007
TLDR
This paper presents a flexible approach for calibrating omnidirectional single-viewpoint sensors from planar grids, based on an exact theoretical projection function to which well-identified parameters are added to model real-world errors.
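
For context, the central projection model that such single-viewpoint calibration builds on can be sketched as follows (a minimal Python sketch assuming the unified sphere model with mirror parameter xi and no distortion terms; the parameter names are chosen here for illustration):

    import numpy as np

    def unified_projection(X, xi, gamma1, gamma2, u0, v0):
        """Project a 3D point X (in the mirror frame) with the unified central
        model: project onto the unit sphere, shift the centre of projection by
        xi along the optical axis, then apply a pinhole with generalised focal
        lengths (gamma1, gamma2) and principal point (u0, v0)."""
        Xs = X / np.linalg.norm(X)        # point on the unit sphere
        x = Xs[0] / (Xs[2] + xi)          # normalised image coordinates
        y = Xs[1] / (Xs[2] + xi)
        return np.array([gamma1 * x + u0, gamma2 * y + v0])

Calibration from planar grids then amounts to estimating xi, the generalised focal lengths, the principal point, and additional distortion parameters so that projected grid corners match the detected ones.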

The Epipolar Geometry Toolbox : multiple view geometry and visual servoing for MATLAB

The Epipolar Geometry Toolbox (EGT) was realized to provide a MATLAB user with an extensible framework for the creation and visualization of multi-camera scenarios and the manipulation of the visual…

Simultaneous localization and mapping: part I

This paper describes the simultaneous localization and mapping (SLAM) problem and the essential methods for solving it, and summarizes key implementations and demonstrations of the…