Multi-sensor fusion-based SLAM for underwater robots in large-scale environments
Autonomous Underwater Vehicles (AUVs) are among the best tools for undertaking ocean exploration missions: they make the exploration of the deepest regions possible while avoiding risk to human lives. AUVs generally carry their own power source and have the capacity to determine their actions based on inputs from their own sensors and a pre-defined mission plan. The development of AUVs has offered numerous advantages but has also presented new challenges. One of the most significant is underwater navigation or, more specifically, how to determine the vehicle's position within an underwater environment so it can take the correct navigation actions to accomplish its mission. Two of the most common sensors used for obstacle detection and navigation of underwater robots are active sonar and underwater cameras. Sonar and vision operate on different physical principles and provide completely different types of information. In this project, an effective fusion strategy for these two sensors will be designed. Furthermore, we will propose a modified SLAM algorithm based on the Extended Kalman Filter (EKF), which should improve localization accuracy and data consistency.
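For context, the sketch below illustrates the basic predict/update cycle of a generic EKF localization filter for a 2D vehicle with a range-bearing observation of a known landmark (e.g. a sonar return). It is a minimal, assumed example for illustration only; the motion and measurement models, noise values, and function names are not taken from the project and do not represent the proposed modified SLAM algorithm.

# Minimal EKF localization sketch (illustrative only): a 2D vehicle state
# [x, y, heading] is predicted with a unicycle motion model and corrected
# with a range-bearing measurement to a known landmark.
# All models, noise values, and names here are assumptions for illustration.
import numpy as np

def predict(x, P, v, w, dt, Q):
    """Propagate state and covariance with a unicycle motion model."""
    theta = x[2]
    x_pred = x + np.array([v * np.cos(theta) * dt,
                           v * np.sin(theta) * dt,
                           w * dt])
    # Jacobian of the motion model with respect to the state
    F = np.array([[1, 0, -v * np.sin(theta) * dt],
                  [0, 1,  v * np.cos(theta) * dt],
                  [0, 0, 1]])
    P_pred = F @ P @ F.T + Q
    return x_pred, P_pred

def update(x, P, z, landmark, R):
    """Correct the state with a range-bearing observation of a known landmark."""
    dx, dy = landmark[0] - x[0], landmark[1] - x[1]
    q = dx**2 + dy**2
    z_hat = np.array([np.sqrt(q), np.arctan2(dy, dx) - x[2]])
    # Jacobian of the measurement model with respect to the state
    H = np.array([[-dx / np.sqrt(q), -dy / np.sqrt(q), 0],
                  [ dy / q,          -dx / q,         -1]])
    y = z - z_hat
    y[1] = (y[1] + np.pi) % (2 * np.pi) - np.pi   # wrap bearing residual to [-pi, pi]
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x_new = x + K @ y
    P_new = (np.eye(3) - K @ H) @ P
    return x_new, P_new

# Example: one predict/update cycle with assumed noise parameters
x = np.array([0.0, 0.0, 0.0])            # initial pose [x, y, heading]
P = np.eye(3) * 0.1                      # initial covariance
Q = np.diag([0.05, 0.05, 0.01])          # process noise (assumed)
R = np.diag([0.5, 0.02])                 # measurement noise (assumed)
x, P = predict(x, P, v=1.0, w=0.1, dt=0.5, Q=Q)
x, P = update(x, P, z=np.array([9.5, 0.8]), landmark=np.array([6.0, 7.0]), R=R)
print(x)

In a sensor-fusion setting, the same update step would be applied with separate measurement models and noise covariances for sonar and camera observations, which is one common way such measurements are combined in an EKF framework.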
Last modified: 17 August 2017, 4.00 p.m.