Abstract
Adaptive navigation is central to micro aerial vehicles (MAVs) conducting autonomous flights in diverse environments. Different navigation techniques are adopted depending on the availability of navigation signals in the environment, so MAVs must rely on scene recognition to ensure the continuity and reliability of flight. Accordingly, this work investigates a scene recognition method for MAV environment-adaptive navigation. First, we devised the functional intelligence-adaptive navigation (FIAN) scheme by imitating the physiological decision-making process. Then, based on environment-sensitive measurements provided by the environment perception subsystem of FIAN, we proposed a two-level scene recognition method (TSRM) for the decision-making subsystem; it consists of two deep learning frameworks, SceneNet and MobileNet-V2, which extract scene features for accurate recognition of diverse scenes. Furthermore, a four-rotor MAV-Smartphone combined (MSC) platform simulating the owl's omni-directional head-turning behavior was built. The proposed TSRM was evaluated against PSO-SVM and GIST-SVM in terms of accuracy, delay, and robustness. Practical flight tests on the MSC platform show that TSRM achieves higher classification accuracy than PSO-SVM and GIST-SVM and performs smoothly, with self-regulatory adaptation, across diverse environments.