RaD-VIO: Rangefinder-aided Downward Visual-Inertial Odometry
@article{Fu2018RaDVIORD,
  title   = {RaD-VIO: Rangefinder-aided Downward Visual-Inertial Odometry},
  author  = {Bo Fu and Kumar Shaurya Shankar and Nathan Michael},
  journal = {2019 International Conference on Robotics and Automation (ICRA)},
  year    = {2019},
  pages   = {1841-1847}
}
State-of-the-art forward-facing monocular visual-inertial odometry algorithms are often brittle in practice, especially whilst dealing with initialisation and motion in directions that render the state unobservable. In such cases, having a reliable complementary odometry algorithm enables robust and resilient flight. Using the common local planarity assumption, we present a fast, dense, and direct frame-to-frame visual-inertial odometry algorithm for downward-facing cameras that minimises a…
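As a rough illustration of the planar-homography idea the abstract alludes to (the full formulation is truncated above), the sketch below builds the plane-induced homography between two downward-facing frames from a relative rotation, translation, ground-plane normal, and the metric plane distance a rangefinder could supply, then evaluates a dense photometric cost. All symbols, conventions, and function names here are assumptions for illustration, not the authors' implementation.

```python
# Minimal sketch (not the RaD-VIO implementation): dense photometric cost
# under a local planarity assumption, with the plane distance d taken as a
# metric quantity such as a rangefinder reading.
import numpy as np

def plane_homography(K, R, t, n, d):
    """Pixel homography induced by the plane n^T X = d (current camera frame).

    (R, t) express the current camera frame in the previous one,
    X_prev = R @ X_curr + t, so H maps current-frame pixels to
    previous-frame pixels: x_prev ~ H @ x_curr (homogeneous).
    """
    return K @ (R + np.outer(t, n) / d) @ np.linalg.inv(K)

def photometric_cost(I_prev, I_curr, K, R, t, n, d):
    """Mean absolute intensity error after warping I_prev onto I_curr."""
    I_prev = np.asarray(I_prev, dtype=float)
    I_curr = np.asarray(I_curr, dtype=float)
    H = plane_homography(K, R, t, n, d)
    h, w = I_curr.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    x_curr = np.stack([u.ravel(), v.ravel(), np.ones(u.size)])  # 3 x N pixels
    x_prev = H @ x_curr
    x_prev = x_prev[:2] / x_prev[2]                             # dehomogenise
    ui, vi = np.round(x_prev).astype(int)                       # nearest pixel
    valid = (ui >= 0) & (ui < w) & (vi >= 0) & (vi < h)
    residual = (I_curr[v.ravel()[valid], u.ravel()[valid]]
                - I_prev[vi[valid], ui[valid]])
    return np.abs(residual).mean()
```

A frame-to-frame estimator in this spirit would minimise such a cost over the inter-frame motion, with IMU integration supplying a rotation prior and the rangefinder fixing the metric scale of d.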
3 Citations
RP-VIO: Robust Plane-based Visual-Inertial Odometry for Dynamic Environments
- Computer Science · 2021 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)
RP-VIO, a monocular visual-inertial odometry system that leverages the simple geometry of planes for improved robustness and accuracy in challenging dynamic environments, is presented and shown in simulation to improve over a simple dynamic-feature masking approach.
Metric scale and angle estimation in monocular visual odometry with multiple distance sensors
- Computer Science · Digit. Signal Process. · 2021
Unsupervised deep learning based ego motion estimation with a downward facing camera
- Computer Science · The Visual Computer · 2021
This work proposes a deep-learning approach for estimating ego-motion with a downward-looking camera that can be trained completely unsupervised and is not restricted to a specific motion model.
References
Robust direct visual odometry using mutual information
- Computer Science · 2016 IEEE International Symposium on Safety, Security, and Rescue Robotics (SSRR)
This work presents and experimentally validates a mutual-information-based dense rigid-body tracking algorithm that is demonstrably robust to drastic illumination changes, and compares its performance to a canonical sum-of-squared-differences-based Lucas-Kanade tracking formulation.
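As a rough illustration of what a mutual-information alignment objective looks like compared with an SSD cost, the short sketch below computes histogram-based MI between two equally sized image patches; it is an assumption-laden toy, not the cited tracker.

```python
# Toy mutual-information score between two 8-bit image patches. MI depends
# only on the statistical co-occurrence of intensities, which is what makes
# MI-based alignment robust to illumination changes where an SSD cost fails.
import numpy as np

def mutual_information(patch_a, patch_b, bins=32):
    """Histogram-based MI between two equally sized uint8 patches."""
    joint, _, _ = np.histogram2d(patch_a.ravel(), patch_b.ravel(),
                                 bins=bins, range=[[0, 256], [0, 256]])
    p_ab = joint / joint.sum()             # joint intensity distribution
    p_a = p_ab.sum(axis=1, keepdims=True)  # marginal of patch_a
    p_b = p_ab.sum(axis=0, keepdims=True)  # marginal of patch_b
    nz = p_ab > 0                          # avoid log(0)
    return float(np.sum(p_ab[nz] * np.log(p_ab[nz] / (p_a @ p_b)[nz])))
```

A tracker built on this idea would maximise such a score over warp parameters rather than minimising a brightness difference.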
A Benchmark Comparison of Monocular Visual-Inertial Odometry Algorithms for Flying Robots
- Computer Science · 2018 IEEE International Conference on Robotics and Automation (ICRA)
This paper evaluates an array of publicly available VIO pipelines on different hardware configurations, including several single-board computer systems typically found on flying robots, and considers pose estimation accuracy, per-frame processing time, and CPU and memory load while processing the EuRoC datasets.
VINS-Mono: A Robust and Versatile Monocular Visual-Inertial State Estimator
- Computer Science · IEEE Transactions on Robotics · 2018
This paper presents VINS-Mono, a robust and versatile monocular visual-inertial state estimator applicable to applications that require high-accuracy localization, and demonstrates onboard closed-loop autonomous flight on a micro-aerial-vehicle platform.
Nonlinear ego-motion estimation from optical flow for online control of a quadrotor UAV
- Computer Science · Int. J. Robotics Res. · 2015
This paper proposes a robust ego-motion estimation algorithm that recovers the UAV's scaled linear velocity and angular velocity from optical flow by exploiting the so-called continuous homography constraint for planar scenes. The nonlinear scheme is shown to yield considerably superior performance in terms of convergence rate and predictability of the estimation.
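For reference, one common textbook statement of the continuous homography constraint underlying this kind of estimator is sketched below; sign and normalisation conventions vary between papers, so treat this as an assumption rather than the cited formulation.

```latex
% Continuous homography constraint (one common convention; signs may differ
% from the cited paper). Camera angular velocity \omega, linear velocity v,
% planar scene n^\top X = d in the camera frame, normalized image point x
% with optical flow \dot{x}:
\[
  H \;=\; [\omega]_\times \;+\; \frac{1}{d}\, v\, n^{\top},
  \qquad
  [x]_\times \bigl( H\,x - \dot{x} \bigr) \;=\; 0 .
\]
% Each tracked feature contributes one such constraint; stacking several of
% them allows (\omega, v/d, n) to be recovered, with metric scale coming from
% knowledge of d (e.g. altitude).
```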
Real-time onboard visual-inertial state estimation and self-calibration of MAVs in unknown environments
- Computer Science · 2012 IEEE International Conference on Robotics and Automation
This paper proposes a navigation algorithm for MAVs equipped with a single camera and an inertial measurement unit (IMU) that is able to run onboard in real time, and proposes a speed-estimation module that converts the camera into a metric body-speed sensor using IMU data within an EKF framework.
On-board velocity estimation and closed-loop control of a quadrotor UAV based on optical flow
- Computer Science · 2012 IEEE International Conference on Robotics and Automation
To recover ego-motion from optical flow based on the continuous homography constraint, this work devised two variants of the classical continuous 4-point algorithm and provided an extensive experimental evaluation against a known ground truth.
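A linear least-squares sketch of this idea is given below: each feature's flow contributes the constraint from the previous entry, and four or more features determine the continuous homography matrix up to an additive multiple of the identity, an ambiguity the cited algorithm resolves during decomposition. The names and the omission of the decomposition step are assumptions of this sketch, not the paper's algorithm.

```python
# Illustrative least-squares recovery of the continuous homography matrix H
# from normalized image points and their optical flows, using the constraint
# [x]_x (H x - u) = 0. Decomposing H into (omega, v/d, n) and resolving the
# additive-identity ambiguity are not shown.
import numpy as np

def skew(x):
    """Cross-product (skew-symmetric) matrix of a 3-vector."""
    return np.array([[0.0, -x[2], x[1]],
                     [x[2], 0.0, -x[0]],
                     [-x[1], x[0], 0.0]])

def estimate_continuous_homography(points, flows):
    """points: normalized homogeneous points x_i = (x, y, 1).
    flows:  flow vectors u_i = (u, v, 0) of the same points.
    Returns a least-squares H; note [x]_x x = 0 makes H observable only up to
    H + lambda*I, so this is the minimum-norm representative."""
    A_blocks, b_blocks = [], []
    for x, u in zip(points, flows):
        Sx = skew(x)
        # vec identity: [x]_x H x = (x^T kron [x]_x) vec(H), column-stacked vec
        A_blocks.append(np.kron(np.reshape(x, (1, 3)), Sx))  # 3 x 9 block
        b_blocks.append(Sx @ u)                              # 3-vector
    A = np.vstack(A_blocks)
    b = np.concatenate(b_blocks)
    h, *_ = np.linalg.lstsq(A, b, rcond=None)
    return h.reshape(3, 3, order="F")  # undo column-stacked vectorisation
```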
Visual SLAM for Flying Vehicles
- Computer Science · IEEE Transactions on Robotics · 2008
The technique uses visual features and estimates the correspondences between features using a variant of the progressive sample consensus (PROSAC) algorithm to extract spatial constraints between camera poses that can be used to address the simultaneous localization and mapping (SLAM) problem by applying graph methods.
On measuring the accuracy of SLAM algorithms
- Computer Science · Auton. Robots · 2009
A framework for analyzing the results of SLAM approaches is proposed, based on a metric that measures the error of the corrected trajectory and overcomes serious shortcomings of approaches that compute the error in a global reference frame.
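A simplified SE(2) version of the relative-displacement error that this kind of metric is built on is sketched below; the pair selection and weighting in the cited metric differ, and all names here are illustrative.

```python
# Simplified sketch of a relative-displacement trajectory error for 2-D poses
# (x, y, theta): compare estimated and ground-truth relative motions over a
# set of index pairs instead of absolute poses in a global frame.
import numpy as np

def relative(a, b):
    """Pose of b expressed in the frame of pose a (SE(2) inverse-compose)."""
    dx, dy = b[0] - a[0], b[1] - a[1]
    c, s = np.cos(a[2]), np.sin(a[2])
    dth = (b[2] - a[2] + np.pi) % (2.0 * np.pi) - np.pi   # wrap to [-pi, pi)
    return np.array([c * dx + s * dy, -s * dx + c * dy, dth])

def relative_displacement_error(est, gt, pairs):
    """Mean squared translational and rotational error of relative motions."""
    err_t = err_r = 0.0
    for i, j in pairs:
        d_est = relative(est[i], est[j])
        d_gt = relative(gt[i], gt[j])
        e = relative(d_gt, d_est)        # mismatch between the two motions
        err_t += e[0] ** 2 + e[1] ** 2
        err_r += e[2] ** 2
    return err_t / len(pairs), err_r / len(pairs)
```

Comparing relative displacements avoids penalising a single early error over the entire remaining trajectory, which is the shortcoming of global-frame metrics that the cited paper addresses.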
AirSim: High-Fidelity Visual and Physical Simulation for Autonomous Vehicles
- Computer Science · FSR · 2017
A new simulator built on Unreal Engine offers physically and visually realistic real-world simulations for autonomous vehicles and is designed from the ground up to be extensible to accommodate new types of vehicles, hardware platforms, and software protocols.
An open source and open hardware embedded metric optical flow CMOS camera for indoor and outdoor applications
- Computer Science · 2013 IEEE International Conference on Robotics and Automation
An open-source and open-hardware design of an optical-flow sensor, based on a machine-vision CMOS image sensor with very high light sensitivity for indoor and outdoor applications, is presented and demonstrated in flight on a micro air vehicle.