The aim of this paper is to present a new INS/GPS nonlinear filtering scheme; its performance on UAV flights is also demonstrated.
This paper proposes a framework for Airborne Cooperative Visual Simultaneous Localization and Mapping (C-VSLAM). The use of cooperative vehicles presents many advantages over a single-vehicle architecture. We present a nonlinear H∞ filtering scheme adapted to multiple Unmanned Aerial Vehicle (UAV) VSLAM based on the extension of a…
Although nonlinear H∞ (NH∞) filters offer good performance without requiring assumptions concerning the characteristics of process and/or measurement noises, they still require additional tuning parameters that remain fixed and that need to be determined through trial and error. To address issues associated with NH∞ filters, a new SINS/GPS sensor fusion…
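For context, one standard discrete-time H∞ filter recursion (generic symbols, following common textbook formulations rather than this paper) makes the fixed tuning explicit: the attenuation level θ below must be chosen in advance, typically by trial and error, which is the limitation the proposed SINS/GPS fusion targets.

```latex
\begin{aligned}
K_k &= P_k\left[I - \theta \bar{S}_k P_k + H_k^{T} R_k^{-1} H_k P_k\right]^{-1} H_k^{T} R_k^{-1},\\
\hat{x}_{k+1} &= F_k \hat{x}_k + F_k K_k\,(y_k - H_k \hat{x}_k),\\
P_{k+1} &= F_k P_k\left[I - \theta \bar{S}_k P_k + H_k^{T} R_k^{-1} H_k P_k\right]^{-1} F_k^{T} + Q_k.
\end{aligned}
```

Letting θ → 0 recovers the standard Kalman filter; larger θ trades optimality for robustness against worst-case disturbances.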
This paper addresses 3D texture mapping in Visual Simultaneous Localization And Mapping (VSLAM) for Unmanned Aerial Vehicle (UAV) applications. A landmark selection strategy based on feature detection methods such as the Scale-Invariant Feature Transform (SIFT) and Speeded-Up Robust Features (SURF) is adopted. The selected features are combined with additionally…
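The abstract does not detail how SIFT/SURF features are matched for landmark selection; a common building block in such pipelines is nearest-neighbour descriptor matching with Lowe's ratio test. The sketch below is a minimal pure-NumPy illustration (the function name and arrays are illustrative, not from the paper):

```python
import numpy as np

def match_descriptors(desc_a, desc_b, ratio=0.75):
    """Nearest-neighbour matching with Lowe's ratio test, as commonly
    applied to SIFT/SURF descriptors. A match (i, j) is kept only if the
    best neighbour is clearly closer than the second-best."""
    matches = []
    for i, d in enumerate(desc_a):
        dists = np.linalg.norm(desc_b - d, axis=1)  # L2 distance to every candidate
        j1, j2 = np.argsort(dists)[:2]              # best and second-best neighbour
        if dists[j1] < ratio * dists[j2]:           # ratio test rejects ambiguous matches
            matches.append((i, j1))
    return matches
```

In practice the same ratio test is applied symmetrically (a-to-b and b-to-a) to further suppress outliers before the features enter the map.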
The work presented in this paper is part of ongoing research on autonomous navigation for Micro Aerial Vehicles (MAVs). Simultaneous Localization and Mapping (SLAM) is crucial for any MAV navigation task. The limited payload of an MAV makes a single camera the best solution to the SLAM problem. In this paper, the Large-Scale Direct SLAM (LSD-SLAM) pose is…
This paper presents an in-depth evaluation of filter algorithms used to estimate 3D position and attitude for a UAV using stereo-vision-based Visual SLAM integrated with feature detection and matching techniques, i.e., SIFT and SURF. The evaluation aimed to investigate the accuracy and robustness of the filters' estimates for vision-based…
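The truncated abstract does not name the filters under evaluation; a typical baseline in such comparisons is the extended Kalman filter. The following minimal measurement-update sketch (generic names, not the paper's implementation) shows the step where vision measurements would correct the pose estimate:

```python
import numpy as np

def ekf_update(x, P, z, h, H, R):
    """One EKF measurement update.
    x: state estimate, P: state covariance, z: measurement,
    h: measurement function h(x), H: its Jacobian at x, R: measurement noise cov."""
    y = z - h(x)                       # innovation (measurement residual)
    S = H @ P @ H.T + R                # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)     # Kalman gain
    x_new = x + K @ y                  # corrected state
    P_new = (np.eye(len(x)) - K @ H) @ P  # corrected covariance
    return x_new, P_new
```

In a stereo VSLAM setting, h(x) would project map landmarks into the camera pair given the current pose, and z would be the matched SIFT/SURF feature locations.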
This paper presents a Decentralized Cooperative Simultaneous Localization and Mapping (DC-SLAM) solution based on a laser telemeter and Covariance Intersection (CI). CI runs on the UGVs that receive features, estimating the position and covariance of shared features before adding them to the global map. With the proposed solution, a group of…
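Covariance Intersection itself has a compact closed form: it fuses two estimates whose cross-correlation is unknown while remaining consistent for any weight ω ∈ [0, 1]. A minimal NumPy sketch (generic names, fixed ω rather than the trace-minimizing choice a real system would use):

```python
import numpy as np

def covariance_intersection(xa, Pa, xb, Pb, omega=0.5):
    """Fuse estimates (xa, Pa) and (xb, Pb) with unknown cross-correlation:
    P^-1 = w*Pa^-1 + (1-w)*Pb^-1,  x = P (w*Pa^-1 xa + (1-w)*Pb^-1 xb).
    Consistent (non-overconfident) for any w in [0, 1]."""
    Ia, Ib = np.linalg.inv(Pa), np.linalg.inv(Pb)
    P = np.linalg.inv(omega * Ia + (1.0 - omega) * Ib)
    x = P @ (omega * Ia @ xa + (1.0 - omega) * Ib @ xb)
    return x, P
```

In practice ω is chosen online, e.g. by minimizing the trace or determinant of the fused P, so each vehicle weights shared features by how informative they are.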