Observability-Aware Self-Calibration of Visual and Inertial Sensors for Ego-Motion Estimation

@article{Schneider2019ObservabilityAwareSO,
  title={Observability-Aware Self-Calibration of Visual and Inertial Sensors for Ego-Motion Estimation},
  author={Thomas Schneider and Mingyang Li and C{\'e}sar Cadena and Juan I. Nieto and Roland Y. Siegwart},
  journal={IEEE Sensors Journal},
  year={2019},
  volume={19},
  pages={3846--3860}
}
External effects such as shocks and temperature variations affect the calibration of visual–inertial sensor systems, so such systems cannot fully rely on factory calibrations. Re-calibrations performed on short user-collected datasets may yield poor performance, since the observability of certain parameters depends strongly on the motion. In addition, on resource-constrained systems (e.g., mobile phones), full-batch approaches over longer sessions quickly become prohibitively expensive. In…

Online Self-Calibration for Visual-Inertial Navigation Systems: Models, Analysis and Degeneracy

A complete observability analysis for visual-inertial navigation systems (VINS) with full calibration of sensing parameters, including IMU and camera intrinsics and IMU-camera spatial-temporal extrinsic calibration, along with readout time of rolling shutter (RS) cameras (if used).

Observability-Aware Intrinsic and Extrinsic Calibration of LiDAR-IMU Systems

This paper proposes a novel LiDAR-IMU calibration method within the continuous-time batch-optimization framework, where the intrinsics of both sensors and the spatial-temporal extrinsics between sensors are calibrated without using calibration infrastructure such as tags.

Online IMU Intrinsic Calibration: Is It Necessary?

An observability analysis for visual-inertial navigation systems with four different inertial model variants, whose intrinsic parameters encompass one IMU model commonly used for low-cost inertial sensors, theoretically confirms that the IMU intrinsics are observable given fully-excited 6-axis motion.

Observability Analysis and Keyframe-Based Filtering for Visual Inertial Odometry With Full Self-Calibration

Simulation and real-data tests validated that the camera-IMU system can be fully calibrated using observations of opportunistic landmarks under diverse motion, and showed that the keyframe-based scheme is an effective remedy for drift during standstill.

Optimization-Based Online Initialization and Calibration of Monocular Visual-Inertial Odometry Considering Spatial-Temporal Constraints

The results show that both the initial states and the spatial-temporal parameters can be well estimated and the method outperforms other contemporary methods used for comparison.

A Versatile Keyframe-Based Structureless Filter for Visual Inertial Odometry

Tests confirm that KSF reliably calibrates sensor parameters when the data contain adequate motion, and consistently estimates motion with accuracy rivaling recent VIO methods.

Observability Analysis of IMU Intrinsic Parameters in Stereo Visual–Inertial Odometry

An analytic observability analysis of self-calibrated VIO, modeled as a nonlinear time-varying system, inspects the rank of the observability matrix formed from Lie derivatives of the nonlinear system and reveals that the IMU intrinsic parameters are fully observable when all six axes of the IMU are excited.
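The rank test described above can be illustrated on a toy system (not the paper's full VIO model): a state containing position, velocity, and a constant sensor bias, observed only through position. For a linear system the Lie-derivative observability matrix reduces to the familiar stack [C; CA; CA²], whose rank tells us whether the bias is recoverable:

```python
import numpy as np

# Toy stand-in for the paper's analysis: state x = [position, velocity, bias],
# with only position measured. The bias (an "intrinsic" parameter) is
# observable iff the stacked observability matrix has full rank.
A = np.array([[0., 1.,  0.],   # d(pos)/dt  = vel
              [0., 0., -1.],   # d(vel)/dt  = u - bias (input drops out of A)
              [0., 0.,  0.]])  # the bias is constant
C = np.array([[1., 0., 0.]])   # measurement: position only

# Stack C, CA, CA^2 (linear special case of the Lie-derivative construction)
O = np.vstack([C @ np.linalg.matrix_power(A, k) for k in range(A.shape[0])])
rank = np.linalg.matrix_rank(O)
print(rank)  # 3 -> full rank: position, velocity and the bias are observable
```

With richer models (full IMU intrinsics, camera extrinsics) the same rank inspection is carried out symbolically on the nonlinear Lie-derivative stack, which is where the 6-axis excitation requirement emerges.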

Information Driven Self-Calibration for Lidar-Inertial Systems

A novel self-calibration framework for lidar-inertial systems that uses an informative path planner to find an admissible path producing the most accurate calibration of such systems in an unknown environment within a given time budget.

Monocular Visual-Inertial and Robotic-Arm Calibration in a Unifying Framework

A monocular visual-inertial and robotic-arm calibration method in a unifying framework that achieves consistency, accuracy, and effectiveness; decoupling the estimation of rotation and translation reduces coupled errors during optimization.

CalQNet - Detection of Calibration Quality for Life-Long Stereo Camera Setups

A novel data-driven method to estimate the quality of extrinsic calibration and detect discrepancies between the original calibration and the current system state for stereo camera systems. Two real-world experiments show the framework's ability to predict the divergence of a state-of-the-art stereo visual odometry system following a degraded calibration.

References


Visual-inertial self-calibration on informative motion segments

This paper presents a novel approach for resource efficient self-calibration of visual-inertial sensor systems by casting the calibration as a segment-based optimization problem that can be run on a small subset of informative segments, so that the computational burden is limited.
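A minimal sketch of the segment-selection idea (the scoring metric and segment structure here are illustrative assumptions, not the paper's exact formulation): score each trajectory segment by an information measure on the calibration parameters, then run the calibration only on the top-scoring segments.

```python
import numpy as np

rng = np.random.default_rng(42)

def segment_information(jacobians):
    """log-det of the accumulated J^T J over a segment -- a toy stand-in
    for the Fisher information of the calibration parameters."""
    H = sum(J.T @ J for J in jacobians)
    sign, logdet = np.linalg.slogdet(H)
    return logdet if sign > 0 else -np.inf

# Fake segments: each is a list of measurement Jacobians w.r.t. 3 calibration
# parameters; the scale factor mimics different levels of motion excitation.
segments = [[rng.standard_normal((6, 3)) * s for _ in range(10)]
            for s in (0.1, 1.0, 0.5, 2.0)]

scores = [segment_information(seg) for seg in segments]
best = sorted(range(len(segments)), key=lambda i: scores[i], reverse=True)[:2]
print(best)  # indices of the two most informative (most excited) segments
```

Optimizing over only the selected segments is what keeps the computational burden bounded regardless of session length.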

Drift-correcting self-calibration for visual-inertial SLAM

Constant-time operation is achieved by selecting only a fixed number of informative segments of the trajectory for calibration parameter estimation, giving the added benefit of avoiding early linearization errors by not rolling past measurements into a prior distribution.

Online temporal calibration for camera–IMU systems: Theory and algorithms

This work proposes an online approach for estimating the time offset between the visual and inertial sensors, and shows that this approach can be employed in pose-tracking with mapped features, in simultaneous localization and mapping, and in visual–inertial odometry.
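A common coarse initializer for such a time offset (a hedged sketch, not the paper's estimator, which refines the offset inside the state estimator) is to cross-correlate the gyroscope's angular-rate signal with the rotational rate inferred from the camera and take the lag with maximal correlation:

```python
import numpy as np

dt = 0.005                      # 200 Hz sample period (assumed)
t = np.arange(0.0, 10.0, dt)
true_offset = 0.040             # camera lags the IMU by 40 ms (assumed)

# Synthetic angular-rate magnitude seen by the IMU, and the same motion
# as recovered from the camera, delayed by the unknown time offset.
imu_rate = np.sin(2*np.pi*0.7*t) + 0.5*np.sin(2*np.pi*2.3*t)
cam_rate = np.interp(t - true_offset, t, imu_rate)

# Cross-correlate over all lags and pick the peak.
corr = np.correlate(cam_rate - cam_rate.mean(),
                    imu_rate - imu_rate.mean(), 'full')
lag = np.argmax(corr) - (t.size - 1)
print(lag * dt)                 # ~0.040 s, recovering the offset
```

Such a correlation-based guess is typically followed by joint estimation of the offset with the rest of the state, as the cited works do.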

Non-Parametric Extrinsic and Intrinsic Calibration of Visual-Inertial Sensor Systems

This paper presents a solution for the extrinsic and intrinsic calibration of visual-inertial sensor systems. Calibration is formulated as a joint state and parameter estimation problem.

Visual-Inertial Sensor Fusion: Localization, Mapping and Sensor-to-Sensor Self-calibration

This paper describes an algorithm, based on the unscented Kalman filter, for self-calibration of the transform between a camera and an inertial measurement unit (IMU), which demonstrates accurate estimation of both the calibration parameters and the local scene structure.

Self-supervised calibration for robotic systems

This work presents a generic algorithm for self-calibration of robotic systems that utilizes information-theoretic measures to automatically identify and store novel measurement sequences, and outperforms state-of-the-art algorithms in terms of stability, accuracy, and computational efficiency.

High-fidelity sensor modeling and self-calibration in vision-aided inertial navigation

It is demonstrated, in both simulation tests and real-world experiments, that the proposed approach is able to accurately calibrate all the considered parameters in real time, and leads to significantly improved estimation precision compared to existing approaches.

VINS-Mono: A Robust and Versatile Monocular Visual-Inertial State Estimator

This paper presents VINS-Mono, a robust and versatile monocular visual-inertial state estimator that is applicable to different applications requiring high-accuracy localization, and demonstrates onboard closed-loop autonomous flight on a micro-aerial-vehicle platform.

Keyframe-based visual–inertial odometry using nonlinear optimization

This work formulates a rigorously probabilistic cost function that combines reprojection errors of landmarks with inertial terms, and compares the performance to an implementation of a state-of-the-art stochastic-cloning sliding-window filter.

Unified temporal and spatial calibration for multi-sensor systems

A novel framework for jointly estimating the temporal offset between measurements of different sensors and their spatial displacements with respect to each other, enabled by continuous-time batch estimation; it extends previous work by seamlessly incorporating time offsets within the rigorous theoretical framework of maximum likelihood estimation.