A survey on real-time motion estimation techniques for underwater robots
Simultaneous localisation and mapping (SLAM) has been the focus of intensive research over the last decade due to the potential benefits it offers to autonomous mobile robotics. SLAM is concerned with the ability of an autonomous vehicle to navigate through an unexplored environment while incrementally constructing a map of that environment and localising itself within the map. This thesis describes an entirely vision-based, large-area, 6DoF SLAM system developed specifically for real-time deployment on an autonomous underwater vehicle (AUV) equipped with a calibrated stereo system. The SLAM system is based on the extended Kalman filter (EKF) and incorporates a novel approach to landmark description and data association in which landmarks are essentially local submaps, each consisting of a cloud of 3D points and their associated SIFT or SURF descriptors. Furthermore, landmarks are sparsely distributed in the constructed map, which greatly simplifies and accelerates data association and map updates. In addition to performing localisation based on landmark observations, the system performs visual odometry and predicts vehicle motion using a constant-velocity model. For a simulated 87m-long 3D loop trajectory, the mean squared localisation error of the system was 3.16, and the maximum absolute errors in roll, pitch and yaw were 11.6, 24.3 and 24.4 respectively, when the stereo and landmark correspondences contained Gaussian noise with a standard deviation of 0.1 pixels and 10% of the correspondences were outliers. This thesis represents an important contribution to entirely vision-based 6DoF SLAM, as very few such implementations currently exist; the approach taken here achieves comparable results and has the potential to operate in real time.

I have yet to see any problem, however complicated, which, when you looked at it the right way, did not become still more complicated. . . .
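The constant-velocity prediction step mentioned above can be sketched as a standard EKF time update. The state layout, time step and process-noise magnitude below are illustrative assumptions for a minimal translational example, not the exact 6DoF formulation used in the thesis:

```python
import numpy as np

# Minimal sketch of an EKF prediction under a constant-velocity model.
# State x = [px, py, pz, vx, vy, vz]; attitude states are omitted for
# brevity. dt and q are illustrative values, not thesis parameters.

def predict(x, P, dt=0.1, q=0.01):
    """One EKF time update: x' = F x, P' = F P F^T + Q."""
    F = np.eye(6)
    F[:3, 3:] = dt * np.eye(3)   # position advances by velocity * dt
    Q = q * np.eye(6)            # simple isotropic process noise
    x_new = F @ x
    P_new = F @ P @ F.T + Q
    return x_new, P_new

x = np.array([0.0, 0.0, 0.0, 1.0, 0.0, 0.0])  # moving at 1 m/s along x
P = np.eye(6)
x, P = predict(x, P)
```

Between landmark observations the filter would apply this prediction repeatedly, with the visual-odometry and landmark measurements supplying the corresponding EKF update steps.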