Observability-Aware Self-Calibration of Visual and Inertial Sensors for Ego-Motion Estimation

@article{Schneider2019ObservabilityAwareSO,
  title={Observability-Aware Self-Calibration of Visual and Inertial Sensors for Ego-Motion Estimation},
  author={Thomas Schneider and Mingyang Li and C{\'e}sar Cadena and Juan I. Nieto and Roland Y. Siegwart},
  journal={IEEE Sensors Journal},
  year={2019},
  volume={19},
  pages={3846-3860}
}
External effects such as shocks and temperature variations affect the calibration of visual-inertial sensor systems, so such systems cannot fully rely on factory calibration. Re-calibration performed on short user-collected datasets may yield poor performance, since the observability of certain parameters depends strongly on the motion. In addition, on resource-constrained systems (e.g., mobile phones), full-batch approaches over longer sessions quickly become prohibitively expensive. In…

Online Self-Calibration for Visual-Inertial Navigation Systems: Models, Analysis and Degeneracy

A complete observability analysis for visual-inertial navigation systems (VINS) with full calibration of sensing parameters, including IMU and camera intrinsics and IMU-camera spatial-temporal extrinsic calibration, along with readout time of rolling shutter (RS) cameras (if used).

Observability-Aware Intrinsic and Extrinsic Calibration of LiDAR-IMU Systems

This article proposes a novel LiDAR-IMU calibration method within the continuous-time batch-optimization framework, where the intrinsics of both sensors and the spatial-temporal extrinsics between sensors are calibrated without using calibration infrastructure, such as fiducial tags.

Online IMU Intrinsic Calibration: Is It Necessary?

An observability analysis of visual-inertial navigation systems with four inertial model variants, whose intrinsic parameters encompass one IMU model commonly used for low-cost inertial sensors, theoretically confirms that the IMU intrinsics are observable given fully excited 6-axis motion.

Observability Analysis and Keyframe-Based Filtering for Visual Inertial Odometry With Full Self-Calibration

Simulation and real-data tests validated that it is possible to fully calibrate the camera-IMU system using observations of opportunistic landmarks under diverse motion, and showed that the keyframe-based scheme is an alternative cure for drift at standstill.

Optimization-Based Online Initialization and Calibration of Monocular Visual-Inertial Odometry Considering Spatial-Temporal Constraints

The results show that both the initial states and the spatial-temporal parameters can be well estimated and the method outperforms other contemporary methods used for comparison.

A Versatile Keyframe-Based Structureless Filter for Visual Inertial Odometry

Tests confirm that KSF reliably calibrates sensor parameters when the data contain adequate motion, and consistently estimates motion with accuracy rivaling recent VIO methods.

Observability-aware online multi-lidar extrinsic calibration

The results presented in this paper show that the approach is able to accurately determine the extrinsic calibration for various combinations of sensor setups and provides stopping criteria for ensuring calibration completion.

Observability Analysis of IMU Intrinsic Parameters in Stereo Visual–Inertial Odometry

An analytic observability analysis of self-calibrated VIO, a nonlinear time-varying system, inspects the rank of the observability matrix formed from Lie derivatives of the nonlinear system and reveals that the IMU intrinsic parameters are fully observable when all six axes of the IMU are excited.
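The rank test mentioned above can be illustrated with a minimal sketch. For a linear time-invariant system the Lie-derivative construction reduces to the classical stacked observability matrix O = [C; CA; CA²; …; CAⁿ⁻¹], and the states are observable iff rank(O) = n. The toy system below (a constant bias driving a velocity/position chain, with only position measured) is an illustrative assumption, not taken from any of the papers listed here:

```python
import numpy as np

def observability_matrix(A, C):
    """Stack C, CA, CA^2, ..., CA^{n-1} for a linear system x' = Ax, y = Cx."""
    n = A.shape[0]
    blocks = [C]
    for _ in range(n - 1):
        blocks.append(blocks[-1] @ A)
    return np.vstack(blocks)

# Toy 3-state example: a constant bias drives velocity, which drives position.
A = np.array([[0.0, 1.0, 0.0],   # position' = velocity
              [0.0, 0.0, 1.0],   # velocity' = bias
              [0.0, 0.0, 0.0]])  # bias'     = 0 (constant)
C = np.array([[1.0, 0.0, 0.0]])  # measure position only

O = observability_matrix(A, C)
print(np.linalg.matrix_rank(O))  # rank 3 = full state dimension -> observable
```

For nonlinear time-varying systems such as VIO, the same idea applies with Lie derivatives of the measurement function in place of the powers of A, and the rank then depends on the actual motion (hence the 6-axis excitation requirement).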

Information Driven Self-Calibration for Lidar-Inertial Systems

A novel self-calibration framework for lidar-inertial systems that uses an informative path planner to find an admissible path producing the most accurate calibration of such systems in an unknown environment within a given time budget.

Monocular Visual-Inertial and Robotic-Arm Calibration in a Unifying Framework

A unifying framework for monocular visual-inertial and robotic-arm calibration that achieves consistency, accuracy, and effectiveness; decoupling the rotation and translation estimates reduces coupled errors during the optimization.

References

Showing 1-10 of 35 references.

Visual-inertial self-calibration on informative motion segments

This paper presents a novel approach for resource efficient self-calibration of visual-inertial sensor systems by casting the calibration as a segment-based optimization problem that can be run on a small subset of informative segments, so that the computational burden is limited.
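A generic way to sketch the segment-selection idea above is to score each candidate motion segment by an information criterion and calibrate only on the top-scoring ones. The D-optimality-style score and the helper names below are illustrative assumptions, not the paper's actual implementation:

```python
import numpy as np

def segment_score(J):
    """Score a segment by log det(J^T J), a D-optimality-style proxy for the
    Fisher information its measurement Jacobian J carries about the parameters."""
    sign, logdet = np.linalg.slogdet(J.T @ J)
    return logdet if sign > 0 else -np.inf  # degenerate segments score -inf

def select_segments(jacobians, k):
    """Return the indices of the k most informative segments, in order."""
    scores = [segment_score(J) for J in jacobians]
    top = np.argsort(scores)[::-1][:k]
    return sorted(int(i) for i in top)

rng = np.random.default_rng(0)
# Three toy segments over 3 parameters; the middle one excites the
# parameters far more strongly than the others.
segments = [rng.normal(size=(5, 3)) * s for s in (0.01, 1.0, 0.1)]
print(select_segments(segments, k=1))  # the strongly excited segment wins
```

Running the optimizer only on the selected segments is what bounds the computational cost; the stopping criterion (how many segments are "enough") would in practice depend on the accumulated information rather than a fixed k.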

Online temporal calibration for camera–IMU systems: Theory and algorithms

This work proposes an online approach for estimating the time offset between the visual and inertial sensors, and shows that this approach can be employed in pose-tracking with mapped features, in simultaneous localization and mapping, and in visual–inertial odometry.

Non-Parametric Extrinsic and Intrinsic Calibration of Visual-Inertial Sensor Systems

This paper presents a solution for the extrinsic and intrinsic calibration of visual-inertial sensor systems. Calibration is formulated as a joint state and parameter estimation problem of a

Visual-Inertial Sensor Fusion: Localization, Mapping and Sensor-to-Sensor Self-calibration

This paper describes an algorithm, based on the unscented Kalman filter, for self-calibration of the transform between a camera and an inertial measurement unit (IMU), which demonstrates accurate estimation of both the calibration parameters and the local scene structure.

Self-supervised calibration for robotic systems

This work presents a generic algorithm for self-calibration of robotic systems that utilizes information-theoretic measures to automatically identify and store novel measurement sequences, and outperforms state-of-the-art algorithms in terms of stability, accuracy, and computational efficiency.

High-fidelity sensor modeling and self-calibration in vision-aided inertial navigation

It is demonstrated, in both simulation tests and real-world experiments, that the proposed approach is able to accurately calibrate all the considered parameters in real time, and leads to significantly improved estimation precision compared to existing approaches.

Unified temporal and spatial calibration for multi-sensor systems

A novel framework is presented for jointly estimating the temporal offset between measurements of different sensors and their spatial displacements with respect to each other; enabled by continuous-time batch estimation, it extends previous work by seamlessly incorporating time offsets within the rigorous theoretical framework of maximum-likelihood estimation.

Multi-Sensor SLAM with Online Self-Calibration and Change Detection

We present a solution for constant-time self-calibration and change detection of multiple sensor intrinsic and extrinsic calibration parameters without any prior knowledge of the initial system state.

Constant-time monocular self-calibration

  • Nima Keivan, Gabe Sibley
  • 2014 IEEE International Conference on Robotics and Biomimetics (ROBIO 2014)
  • 2014
An extensible framework for real-time self-calibration of cameras in the simultaneous localization and mapping (SLAM) setting is described, and parameters estimated by the framework are shown to closely match the batch solution as well as offline calibration values, but are computed live in constant time.

Extending kalibr: Calibrating the extrinsics of multiple IMUs and of individual axes

This work derives a method for spatially calibrating multiple IMUs in a single estimator based on the open-source camera/IMU calibration toolbox kalibr and suggests that the extended estimator is capable of precisely determining IMU intrinsics and even of localizing individual accelerometer axes inside a commercial grade IMU to millimeter precision.