Algorithms, Vol. 17, Issue 4 (2024)
ARTICLE
TITLE

Testing a Vision-Based Autonomous Drone Navigation Model in a Forest Environment

Alvin Lee, Suet-Peng Yong, Witold Pedrycz and Junzo Watada

Abstract

Drones play a pivotal role in many Industry 4.0 applications. Deploying them in dynamic environments, however, still requires research on finding a clear path for autonomous flight. This paper addresses the problem of finding a navigation path for an autonomous drone based on visual scene information. A deep learning-based object detection approach can localize obstacles detected in a scene. Building on this, we propose a solution framework that combines object detection with a masking step based on color-based segmentation to identify empty areas where the drone can fly; the scene is described using segmented regions and localization points. The proposed approach can be used to remotely guide drones in dynamic environments with poor global positioning system coverage. The simulation results show that the framework, with object detection and the proposed masking technique, supports drone navigation in a dynamic environment based only on the visual input from the front field of view.
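To make the abstract's pipeline concrete, the following minimal Python sketch shows one way obstacle bounding boxes from a detector could be combined with a color-based segmentation mask to pick an empty region to steer toward. It is illustrative only: the detector, the HSV thresholds, and the "largest free region centroid" waypoint rule are assumptions for demonstration, not the parameters used in the paper.

# Illustrative sketch: detector output format, HSV thresholds, and the
# waypoint rule are assumptions, not the paper's actual configuration.
import cv2
import numpy as np

def find_free_space_waypoint(frame_bgr, obstacle_boxes,
                             hsv_lo=(90, 20, 120), hsv_hi=(130, 255, 255)):
    """Combine detected obstacle boxes with color-based segmentation to
    propose a navigation point in the largest empty region of the frame."""
    # 1) Color-based segmentation of candidate free space (e.g., sky-like pixels).
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    free_mask = cv2.inRange(hsv, np.array(hsv_lo), np.array(hsv_hi))

    # 2) Mask out regions the object detector flagged as obstacles (trunks, branches, ...).
    for (x1, y1, x2, y2) in obstacle_boxes:
        cv2.rectangle(free_mask, (x1, y1), (x2, y2), 0, thickness=-1)

    # 3) Clean up the mask and keep the largest connected free region.
    free_mask = cv2.morphologyEx(free_mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(free_mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None  # no clear path visible in the front field of view
    largest = max(contours, key=cv2.contourArea)

    # 4) Use the centroid of that region as the candidate flight direction.
    m = cv2.moments(largest)
    if m["m00"] == 0:
        return None
    return (int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"]))  # pixel coords to steer toward

In use, `obstacle_boxes` would come from any object detector run on the front-camera frame (e.g., a list of (x1, y1, x2, y2) tuples), and the returned pixel coordinates would be mapped to a yaw/pitch command by the flight controller; both of those interfaces are outside the scope of this sketch.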
