Aerospace, Vol. 9, Issue 12 (2022)

An Integrated UWB-IMU-Vision Framework for Autonomous Approaching and Landing of UAVs

Xin Dong    
Yuzhe Gao    
Jinglong Guo    
Shiyu Zuo    
Jinwu Xiang    
Daochun Li and Zhan Tu    

Abstract

Autonomous approaching and landing of Unmanned Aerial Vehicles (UAVs) on mobile platforms plays an important role in various application scenarios. Such a complicated autonomous task requires an integrated multi-sensor system to guarantee environmental adaptability, in contrast to relying on any single sensor. Multi-sensor fusion perception can compensate for adverse visual conditions, undesired vibrations affecting inertial sensors, and satellite positioning loss. In this paper, a UAV autonomous landing scheme based on multi-sensor fusion is proposed. In particular, an Ultra Wide-Band (UWB) sensor, an Inertial Measurement Unit (IMU), and vision feedback are integrated to guide the UAV to approach and land on a moving platform. In the approaching stage, a UWB-IMU-based sensor fusion algorithm is proposed to provide real-time, highly consistent relative position estimation between the vehicles. This sensor integration addresses the open challenge of inaccurate satellite positioning when the UAV is near the ground, and it can also be extended to satellite-denied environments. Once the landing platform is detected by the onboard camera, the UAV performs autonomous landing. In the landing stage, the vision sensor is engaged: with visual feedback, a deep-learning-based detector and a local pose estimator are enabled as the UAV approaches the landing platform. To validate the feasibility of the proposed landing scheme, both simulation and real-world experiments in extensive scenes are performed. The results show that the proposed scheme lands successfully with adequate accuracy in most common scenarios.
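
The abstract does not describe the fusion algorithm in detail. As an illustrative sketch only, the snippet below shows one common way a loosely coupled UWB-IMU estimator of this kind can be structured: an extended Kalman filter that propagates relative position and velocity with IMU accelerations and corrects them with UWB ranges to anchors on the landing platform. The class name, anchor layout, and noise values are assumptions for illustration, not the paper's actual implementation.

# Minimal sketch of a loosely coupled UWB-IMU fusion filter for relative
# position estimation. This is an assumed structure, not the algorithm
# from the paper; anchor layout and noise parameters are illustrative.
import numpy as np


class UwbImuEkf:
    """EKF with state x = [p (3), v (3)]: relative position and velocity."""

    def __init__(self, anchors, accel_noise=0.5, range_noise=0.1):
        self.anchors = np.asarray(anchors, dtype=float)  # UWB anchor positions (N x 3)
        self.x = np.zeros(6)                             # [px, py, pz, vx, vy, vz]
        self.P = np.eye(6)                               # state covariance
        self.q = accel_noise ** 2                        # IMU acceleration noise variance
        self.r = range_noise ** 2                        # UWB range noise variance

    def predict(self, accel, dt):
        """Propagate the state with an IMU-derived acceleration (attitude
        rotation and gravity compensation assumed to be done upstream)."""
        F = np.eye(6)
        F[0:3, 3:6] = dt * np.eye(3)
        B = np.vstack([0.5 * dt ** 2 * np.eye(3), dt * np.eye(3)])
        self.x = F @ self.x + B @ np.asarray(accel, dtype=float)
        Q = B @ B.T * self.q                             # process noise driven by accel noise
        self.P = F @ self.P @ F.T + Q

    def update_ranges(self, ranges):
        """Correct the state with one UWB range measurement per anchor."""
        p = self.x[0:3]
        for anchor, z in zip(self.anchors, ranges):
            diff = p - anchor
            dist = np.linalg.norm(diff)
            if dist < 1e-6:
                continue
            H = np.zeros((1, 6))
            H[0, 0:3] = diff / dist                      # Jacobian of range w.r.t. position
            S = H @ self.P @ H.T + self.r
            K = self.P @ H.T / S
            self.x = self.x + (K * (z - dist)).ravel()
            self.P = (np.eye(6) - K @ H) @ self.P
            p = self.x[0:3]

In such a layout, predict() would run at the IMU rate and update_ranges() at the slower UWB rate, which is one reason range/inertial fusion can remain consistent when satellite positioning degrades near the ground.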