Abstract
Smart farming is currently a major focus of the agricultural industry. Because field environments are complex, the intelligent monitoring models suited to them typically demand high hardware performance, which makes real-time detection of ripe strawberries on a small automatic picking robot difficult. This study proposes YOLOv5-ASFF, a real-time multistage strawberry detection algorithm based on an improved YOLOv5. By introducing an adaptive spatial feature fusion (ASFF) module into YOLOv5, the network adaptively learns spatial fusion weights for the strawberry feature maps at each scale, so that the image feature information of strawberries is captured more fully. To verify the effectiveness and practicality of YOLOv5-ASFF, a strawberry dataset covering a variety of complex scenarios, including leaf occlusion, overlapping fruit, and densely clustered fruit, was constructed for this study. The method achieved an mAP of 91.86% and an F1 score of 88.03%, with an AP of 98.77% for mature-stage strawberries, showing strong robustness and generalization ability and outperforming SSD, YOLOv3, YOLOv4, and YOLOv5s. YOLOv5-ASFF can overcome the influence of complex field environments, improve the detection of strawberries under dense distribution and occlusion, and provide technical support for yield estimation and harvest planning in intelligent strawberry field management.
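For context on the ASFF idea summarized above, the following is a minimal PyTorch sketch of adaptive spatial feature fusion across three feature-pyramid levels. The class name, channel count, interpolation mode, and refinement convolution are illustrative assumptions, not the paper's exact implementation.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class ASFFSketch(nn.Module):
        """Illustrative adaptive spatial feature fusion over three scales.

        Each level is resized to a common resolution, a 1x1 conv predicts a
        per-pixel weight for every level, the weights are softmax-normalized
        across levels, and the maps are summed with those weights.
        """

        def __init__(self, channels: int = 256):
            super().__init__()
            # One 1x1 conv per pyramid level produces a single-channel weight map.
            self.weight_convs = nn.ModuleList(
                [nn.Conv2d(channels, 1, kernel_size=1) for _ in range(3)]
            )
            # A 3x3 conv refines the fused features (assumed, for illustration).
            self.fuse_conv = nn.Conv2d(channels, channels, kernel_size=3, padding=1)

        def forward(self, feats):
            # feats: three maps [B, C, Hi, Wi] from different pyramid levels.
            target_size = feats[0].shape[-2:]
            resized = [
                f if f.shape[-2:] == target_size
                else F.interpolate(f, size=target_size, mode="nearest")
                for f in feats
            ]
            # Per-pixel, per-level weights, normalized across the level dimension.
            weights = torch.softmax(
                torch.cat([conv(f) for conv, f in zip(self.weight_convs, resized)], dim=1),
                dim=1,
            )  # shape [B, 3, H, W]
            fused = sum(weights[:, i : i + 1] * resized[i] for i in range(3))
            return self.fuse_conv(fused)

    if __name__ == "__main__":
        asff = ASFFSketch(channels=256)
        feats = [
            torch.randn(1, 256, 80, 80),
            torch.randn(1, 256, 40, 40),
            torch.randn(1, 256, 20, 20),
        ]
        print(asff(feats).shape)  # torch.Size([1, 256, 80, 80])

In this sketch the softmax over the level dimension is what lets the network weight each scale differently at every spatial position, which is the mechanism the abstract credits for handling occluded and densely distributed fruit.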