Extrinsic Calibration of a 3D Laser Scanner and an Omnidirectional Camera

  • Gaurav Pandey, James R. McBride, Silvio Savarese, Ryan M. Eustice
  • IFAC Proceedings Volumes
We propose an approach for the extrinsic calibration of a 3D laser scanner with an omnidirectional camera system. The technique is similar to the calibration of a 2D laser range finder and a single camera proposed by Zhang (2004), but is extended to the case of a 3D laser scanner and an omnidirectional camera system.
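The quantity estimated by all of the extrinsic calibration methods in this list is a rigid-body transform between the two sensor frames. A minimal sketch of how such a transform is applied, with made-up values for R and t:

```python
import numpy as np

# Illustrative extrinsics: rotation R and translation t mapping
# laser-frame points into the camera frame (values are made up).
R = np.eye(3)                        # rotation (identity for simplicity)
t = np.array([0.1, 0.0, -0.05])      # translation in metres

p_laser = np.array([2.0, 1.0, 0.5])  # a point measured by the scanner
p_cam = R @ p_laser + t              # the same point in the camera frame
```

Once (R, t) is known, every range measurement can be expressed in the camera frame and projected into the image.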

Automatic extrinsic calibration between a camera and a 3D Lidar using 3D point and plane correspondences

An automated method to obtain the extrinsic calibration parameters between a camera and a 3D lidar with as low as 16 beams using a checkerboard as a reference and a genetic algorithm to address the highly non-linear state space is proposed.
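The paper optimises a highly non-linear cost over the extrinsic parameters with a genetic algorithm. A common ingredient in such checkerboard-based formulations is a point-to-plane residual; the sketch below (function name and setup are hypothetical, not from the paper) shows the kind of cost being minimised:

```python
import numpy as np

def point_to_plane_cost(points_cam, n, d):
    """Sum of squared distances from lidar points (already transformed
    into the camera frame by a candidate extrinsic) to the checkerboard
    plane n . p = d estimated from the camera image."""
    return float(np.sum((points_cam @ n - d) ** 2))

# Points lying exactly on the plane z = 1 give zero cost.
pts = np.array([[0.0, 0.0, 1.0], [1.0, 2.0, 1.0], [-3.0, 0.5, 1.0]])
cost = point_to_plane_cost(pts, np.array([0.0, 0.0, 1.0]), 1.0)
```

A search procedure (genetic or otherwise) would evaluate this cost for many candidate transforms and keep the best one.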

A new algorithm for the extrinsic calibration of a 2D LIDAR and a camera

A new extrinsic calibration algorithm for a camera and a 2D LIght Detection And Ranging sensor (LIDAR) that gives a new minimal solution, which can be used as the hypothesis generator in the RANdom SAmple Consensus (RANSAC) algorithm.

Automatic Extrinsic Calibration of a Camera and a 3D LiDAR Using Line and Plane Correspondences

  • Lipu Zhou, Zimo Li, M. Kaess
  • Computer Science
    2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)
  • 2018
This paper presents an algorithm to estimate the similarity transformation between the LiDAR and the camera for the applications where only the correspondences between laser points and pixels are concerned, and proves that parallel planar targets with parallel boundaries provide the same constraints in the algorithm.

Sensor Fusion of a 2D Laser Scanner and a Thermal Camera

A special triangular calibration target, covering all six degrees of freedom and being visible for both sensors, was developed and allows for assigning every laser measurement within the field of view a corresponding thermal pixel.

Calibration between Color Camera and 3D LIDAR Instruments with a Polygonal Planar Board

This paper is interested in calibrating a low resolution 3D LIDAR with a relatively small number of vertical sensors, and employs a new methodology for the calibration board, which exploits 2D-3D correspondences.

Extrinsic Calibration of a 3D-LIDAR and a Camera

This work presents an extrinsic parameter estimation algorithm between a 3D LIDAR and a projective camera using a marker-less planar target, by exploiting planar surface point-to-plane and planar …

Photometric laser scanner to camera calibration for low resolution sensors

This work presents a reconstruction-free and vision-based extrinsic calibration algorithm with distinct features, which was developed with special focus on low-density laser scanners and high measurement noise.

Extrinsic calibration of a camera-LIDAR multi sensor system using a planar chessboard

  • Eung-Su Kim, Soon-Yong Park
  • Environmental Science
    2019 Eleventh International Conference on Ubiquitous and Future Networks (ICUFN)
  • 2019
A simple mutual rotation-and-translation estimation method for a multi-sensor system containing six omnidirectional RGB cameras and a common 3D Light Detection and Ranging sensor using a planar chessboard pattern is proposed.

An efficient algorithm for extrinsic calibration between a 3D laser range finder and a stereo camera for surveillance

A new and efficient method to perform the extrinsic calibration between a 3D LRF and a stereo camera with the aid of inertial data from an Inertial Measurement Unit (IMU), which decreases the number of points needed for a robust calibration.

Extrinsic self calibration of a camera and a 3D laser range finder from natural scenes

A new approach for the extrinsic calibration of a camera with a 3D laser range finder, that can be done on the fly and brings 3D computer vision systems out of the laboratory and into practical use.

Extrinsic calibration of a camera and laser range finder using a new calibration structure of a plane with a triangular hole

A calibration structure that has a triangular hole on its plane is proposed for the extrinsic calibration of a camera and laser range finder, and the extrinsic transformation is found using a conventional 3D-3D transformation computing algorithm.

Extrinsic Auto-calibration of a Camera and Laser Range Finder

The mathematical constraints for auto-calibration techniques based upon both discrete and differential motions are introduced, along with simulated experimental results and results from an implementation on a B21r mobile robot from iRobot Corporation.

Extrinsic calibration of a camera and laser range finder (improves camera calibration)

  • Qilong Zhang, Robert Pless
  • Physics
    2004 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) (IEEE Cat. No.04CH37566)
  • 2004
A direct solution is given that minimizes an algebraic error from this constraint, and subsequent nonlinear refinement minimizes a re-projection error, which is the first published calibration tool for this problem.

Fast Extrinsic Calibration of a Laser Rangefinder to a Camera

The usage of the Laser-Camera Calibration Toolbox (LCCT), a MATLAB-based graphical user interface that accompanies this document and facilitates the calibration procedure, is described.

Calibration between a central catadioptric camera and a laser range finder for robotic applications

  • Christopher Mei, P. Rives
  • Physics
    Proceedings 2006 IEEE International Conference on Robotics and Automation, 2006. ICRA 2006.
  • 2006
This paper presents several methods for estimating the relative position of a central catadioptric camera (including perspective cameras) and a laser range finder in order to obtain depth information.

A Flexible New Technique for Camera Calibration

A flexible technique to easily calibrate a camera that only requires the camera to observe a planar pattern shown at a few (at least two) different orientations is proposed and advances 3D computer vision one more step from laboratory environments to real world use.
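The key observation behind Zhang's method is that a planar pattern at Z = 0 reduces the full 3x4 projection to a 3x3 homography, and each observed orientation of the pattern constrains the intrinsics. A sketch with made-up intrinsics K and pose (R, t):

```python
import numpy as np

# Illustrative intrinsics and pose (values are made up).
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])
theta = 0.1
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])
t = np.array([0.2, 0.1, 2.0])

# For pattern points at Z = 0, the projection K [R | t] collapses to the
# homography H = K [r1 r2 t], which each pattern view constrains.
H = K @ np.column_stack((R[:, 0], R[:, 1], t))

X = np.array([0.3, 0.4, 1.0])        # (X, Y, 1) on the pattern plane
x_h = H @ X
u, v = x_h[0] / x_h[2], x_h[1] / x_h[2]

# The same point through the full projection matrix agrees exactly.
P = K @ np.column_stack((R, t))
x_full = P @ np.array([0.3, 0.4, 0.0, 1.0])
```

Estimating H from point correspondences in several views, and then decomposing the homographies, yields K and the per-view poses.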

Random sample consensus: a paradigm for model fitting with applications to image analysis and automated cartography

New results are derived on the minimum number of landmarks needed to obtain a solution, and algorithms are presented for computing these minimum-landmark solutions in closed form that provide the basis for an automatic system that can solve the Location Determination Problem under difficult viewing conditions.
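The RANSAC paradigm — repeatedly fit a model to a minimal random sample and keep the hypothesis with the most inliers — is what several of the calibration papers above use as a hypothesis generator. A toy sketch on 2D line fitting (the function and data are illustrative, not from the paper):

```python
import random

def ransac_line(points, iters=200, tol=0.1, seed=0):
    """Toy RANSAC: fit y = m*x + b to 2D points with outliers by
    repeatedly sampling a minimal set (two points), fitting the line
    through them, and counting how many points it explains."""
    rng = random.Random(seed)
    best, best_inliers = None, -1
    for _ in range(iters):
        (x1, y1), (x2, y2) = rng.sample(points, 2)
        if x1 == x2:                 # degenerate sample, skip
            continue
        m = (y2 - y1) / (x2 - x1)
        b = y1 - m * x1
        inliers = sum(abs(y - (m * x + b)) < tol for x, y in points)
        if inliers > best_inliers:
            best, best_inliers = (m, b), inliers
    return best, best_inliers

# Ten points on y = 2x + 1, plus two gross outliers.
pts = [(float(x), 2.0 * x + 1.0) for x in range(10)] + [(3.0, 10.0), (7.0, -2.0)]
(m, b), n_inliers = ransac_line(pts)
```

A final least-squares refit over the inlier set would normally follow the consensus step.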

Propagating Covariance in Computer Vision

  • R. Haralick
  • Computer Science
    Theoretical Foundations of Computer Vision
  • 1998
This paper describes how to propagate approximately additive random perturbations through any kind of vision algorithm step in which the appropriate random perturbation model for the estimated …
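The standard first-order rule for propagating additive perturbations through an algorithm step is the Jacobian sandwich; a minimal sketch (function name is mine, not Haralick's):

```python
import numpy as np

def propagate_covariance(J, cov_x):
    """First-order error propagation: for y = f(x) with Cov[x] = cov_x
    and J the Jacobian of f at the estimate, Cov[y] ~= J @ cov_x @ J.T."""
    return J @ cov_x @ J.T

# For a linear map y = A x the first-order rule is exact:
# Cov[y] = A Cov[x] A^T.
A = np.array([[1.0, 1.0],
              [0.0, 2.0]])
cov_y = propagate_covariance(A, np.eye(2))
```

For nonlinear vision steps, J is evaluated at the current estimate and the approximation holds for small perturbations.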