Visual odometry on the Mars exploration rovers - a tool to ensure accurate driving and science imaging

@article{Cheng2006VisualOO,
  title={Visual odometry on the Mars exploration rovers - a tool to ensure accurate driving and science imaging},
  author={Yang Cheng and Mark W. Maimone and Larry H. Matthies},
  journal={IEEE Robotics \& Automation Magazine},
  year={2006},
  volume={13},
  pages={54-62}
}
In this paper, visual odometry is presented as an approach to position estimation that finds features in a stereo image pair and tracks them from one frame to the next. Visual odometry has been a highly effective tool for maintaining vehicle safety while driving near obstacles on slopes, achieving difficult drive approaches in fewer sols, and ensuring accurate science imaging. Although it requires active pointing by human drivers in feature-poor terrain, the improved position knowledge enables more…
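To make the pipeline concrete, the following is a minimal Python/OpenCV sketch of one frame-to-frame step of stereo visual odometry as described in the abstract: detect features in the previous left image, match them into the right image, track them into the next stereo pair, triangulate 3-D points, and estimate the rigid motion between the two point clouds. The function names, the assumption of rectified grayscale images with known 3x4 projection matrices P_l and P_r, and the simple least-squares (Kabsch) motion solver are illustrative stand-ins, not the MER flight implementation, which uses outlier-robust, covariance-weighted estimation.

import cv2
import numpy as np

def triangulate(pts_l, pts_r, P_l, P_r):
    # pts_* are Nx1x2 pixel coordinates as returned by OpenCV; result is Nx3 points.
    X_h = cv2.triangulatePoints(P_l, P_r, pts_l.reshape(-1, 2).T, pts_r.reshape(-1, 2).T)
    return (X_h[:3] / X_h[3]).T

def rigid_motion(A, B):
    # Least-squares R, t such that B ≈ (R @ A.T).T + t (Kabsch alignment).
    ca, cb = A.mean(axis=0), B.mean(axis=0)
    H = (A - ca).T @ (B - cb)
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    return R, cb - R @ ca

def vo_step(prev_l, prev_r, curr_l, curr_r, P_l, P_r):
    # 1. Detect corner features in the previous left image.
    pts = cv2.goodFeaturesToTrack(prev_l, maxCorners=200, qualityLevel=0.01, minDistance=10)
    # 2. Match them into the previous right image and track them into the
    #    current stereo pair, using pyramidal Lucas-Kanade as a simple matcher.
    pr, s1, _ = cv2.calcOpticalFlowPyrLK(prev_l, prev_r, pts, None)
    cl, s2, _ = cv2.calcOpticalFlowPyrLK(prev_l, curr_l, pts, None)
    cr, s3, _ = cv2.calcOpticalFlowPyrLK(curr_l, curr_r, cl, None)
    ok = (s1 & s2 & s3).ravel().astype(bool)
    # 3. Triangulate 3-D points from both stereo pairs.
    X_prev = triangulate(pts[ok], pr[ok], P_l, P_r)
    X_curr = triangulate(cl[ok], cr[ok], P_l, P_r)
    # 4. Rigid transform mapping the previous point cloud onto the current one;
    #    the rover's motion over the step is the inverse of (R, t).
    return rigid_motion(X_prev, X_curr)

A practical system would also reject mistracked features (e.g., with RANSAC) and weight each point by its triangulation uncertainty before solving for the motion; those refinements are omitted here for brevity.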
Experiments on Stereo Visual Odometry in Feature-Less Volcanic Fields
TLDR
A stereo visual odometry system for volcanic fields that lack visual features on the ground is presented, along with several key techniques, including a framework for terrain-adaptive feature detection and a motion estimation method that uses fewer feature points.
An Experimental Comparison of ROS-compatible Stereo Visual SLAM Methods for Planetary Rovers
TLDR
Covers the implementation of most of the available open-source stereo visual SLAM methods on an NVIDIA Jetson TX2 platform, highlighting critical information such as pose estimation accuracy, loop-closure capability, and CPU and memory usage.
Experimental study on using visual odometry for navigation in outdoor GPS-denied environments
  • M. Sharifi, Xiaoqi Chen, C. Pretty
  • Computer Science
    2016 12th IEEE/ASME International Conference on Mechatronic and Embedded Systems and Applications (MESA)
  • 2016
TLDR
This paper presents an experimental study on utilizing visual odometry (VO) for pose estimation of mobile robots in outdoor GPS-denied environments using two different stereo VO algorithms implemented in Robot Operating System (ROS).
Mars Rover Localization based on Feature Matching between Ground and Orbital Imagery
Mars rover localization is usually realized with data from odometers, inertial measurement units, and stereo cameras. Location errors accumulate inevitably during any long-range rover traverse when
Visual odometry based on the Fourier transform using a monocular ground-facing camera
TLDR
The visual odometry approach is based on the Fourier transform, which extracts the translation between regions of consecutive images captured by a ground-facing camera; the method is resistant to wheel slippage and independent of the vehicle's kinematics.
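As a rough illustration of that idea (not the cited paper's implementation), the translation between two overlapping ground-facing images can be estimated by phase correlation: the normalized cross-power spectrum of the two frames has an inverse FFT that peaks at their relative shift. Windowing, sub-pixel peak interpolation, and the conversion from pixels to metric displacement are omitted in this sketch.

import numpy as np

def phase_correlation_shift(img_a, img_b, eps=1e-12):
    # Returns the integer (row, col) shift that approximately aligns img_b to img_a.
    F_a = np.fft.fft2(img_a.astype(float))
    F_b = np.fft.fft2(img_b.astype(float))
    cross_power = F_a * np.conj(F_b)
    cross_power /= np.abs(cross_power) + eps      # keep only the phase
    corr = np.fft.ifft2(cross_power).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Peaks past the halfway point correspond to negative (wrapped) shifts.
    return tuple(p - n if p > n // 2 else p for p, n in zip(peak, corr.shape))

np.roll(img_b, shift, axis=(0, 1)) should then roughly re-register the two frames; OpenCV's cv2.phaseCorrelate implements a sub-pixel version of the same idea.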
Flight Results of Vision-Based Navigation for Autonomous Spacecraft Inspection of Unknown Objects
This paper describes a vision-based relative navigation and control strategy for inspecting an unknown, noncooperative, and possibly spinning object in space using a visual–inertial system that is
STEREO VISUAL SYSTEM FOR AUTONOMOUS AIR VEHICLE NAVIGATION
TLDR
A system to estimate the altitude and motion of an aerial vehicle using a stereo visual system, comparing and discussing the trajectories calculated by visual odometry against the onboard helicopter state estimation.
Visual odometry based on the Fourier-Mellin transform for a rover using a monocular ground-facing camera
TLDR
This paper presents a visual odometry method that estimates the location and orientation of a robotic rover platform based on the Fourier-Mellin transform and phase-only matched filters, which is well suited for environments exhibiting few features and illumination changes.
Combining Stereo Imaging, Inertial and Altitude Sensing Systems for the Quad-Rotor
This chapter is devoted to the design and implementation of a stereo-vision, inertial and altitude sensing system for a quad-rotor. The objective is to enable the vehicle to autonomously perform
A machine vision based autonomous navigation system for Lunar rover: the model and key technique
Purpose: This study aims to find a feasible precise navigation model for the planned Lunar rover. Autonomous navigation is one of the most important missions in the Chinese Lunar exploration

References

SHOWING 1-10 OF 16 REFERENCES
Omnidirectional visual odometry for a planetary rover
TLDR
Two methods of online visual odometry suited for planetary rovers are presented and compared, one based on robust estimation of optical flow and subsequent integration of the flow, and the other on a full structure-from-motion solution.
Path following using visual odometry for a Mars rover in high-slip environments
A system for autonomous operation of Mars rovers in high slip environments has been designed, implemented, and tested. This system is composed of several key technologies that enable the rover to
Rover Self Localization in Planetary-Like Environments
The ability of a rover to localize itself with respect to its environment is a crucial issue for tackling autonomous long-range navigation. In this paper, we first present and classify the various kind
Tradeoffs Between Directed and Autonomous Driving on the Mars Exploration Rovers
TLDR
The strategies adopted for selecting between human-planned directed drives and rover-adaptive Autonomous Navigation and Visual Odometry drives are described.
Mars exploration rover surface operations: driving Opportunity at Meridiani Planum
TLDR
The experience of driving Opportunity through this alien landscape during its first 400 days on Mars is detailed, from the point of view of the other rover planners, the people who tell the rover where to drive and how to use its robotic arm.
Attitude and position estimation on the Mars exploration rovers
TLDR
The techniques used by the rovers to acquire and maintain attitude and position knowledge, the accuracy that is obtainable, and lessons learned after more than one year in operation are described.
Rover navigation using stereo ego-motion
TLDR
A methodology for long-distance rover navigation that achieves both a high level of robustness and a low rate of error growth using robust estimation of ego-motion is described and implemented to run on board a prototype Mars rover.
Mars Exploration Rover engineering cameras
TLDR
NASA's Mars Exploration Rover (MER) Mission will place a total of 20 cameras onto the surface of Mars in early 2004, 14 of which are designated as engineering cameras and will support the operation of the vehicles on the Martian surface.
Error modeling in stereo navigation
TLDR
Simulations show that, compared to scalar error models, the 3D Gaussian reduces the variance in robot position estimates and better distinguishes rotational from translational motion.