Abstract
Autonomous underwater vehicles (AUVs) are widely used, but guaranteeing their localization accuracy underwater remains a difficult challenge. In this paper, a novel method is proposed to improve the accuracy of vision-based localization systems in feature-poor underwater environments. The traditional stereo visual simultaneous localization and mapping (SLAM) algorithm, which relies on detecting and tracking point features, estimates the camera pose and builds a map of the environment. However, reliable point features are scarce in underwater environments, which degrades the algorithm's performance. To address this problem, a stereo point-and-line SLAM (PL-SLAM) algorithm, which uses point and line features simultaneously, was investigated in this study for localization. Experiments with an AR (augmented reality) marker were carried out to validate the accuracy and effectiveness of the investigated algorithm.