Abstract
Early detection and accurate rating of plant disease severity play an important role in protecting crop quality and yield. The traditional method of identifying mummy berry disease (causal agent: Monilinia vaccinii-corymbosi) relies mainly on field surveys by crop protection experts and experienced blueberry growers. Deep learning models could be a more effective approach, but their performance depends heavily on the volume and quality of the labeled data used for training, which must capture the variability of visual symptoms. However, the available dataset for mummy berry disease detection does not contain enough images collected and labeled in real-field environments, which are essential for building highly accurate models. The complex visual characteristics of lesions, caused by overlapping and occlusion of plant parts, also pose a major challenge to the accurate estimation of disease severity. This issue is compounded when spatial variation is introduced by sampling images from different angles and distances. In this paper, we first present a "cut-and-paste" method for synthetically augmenting the available dataset by generating additional annotated training images. Then, a deep learning-based object detection model, Yolov5s-CA, was used; it integrates the Coordinate Attention (CA) module into the Yolov5s backbone to effectively discriminate useful features by capturing both channel and location information. Finally, the GIoU_loss function was replaced by CIoU_loss to improve the bounding box regression and localization performance of the network. The original Yolov5s and the improved Yolov5s-CA models were trained on real, synthetic, and mixed datasets. The experimental results showed not only that the Yolov5s-CA model trained on the mixed dataset outperforms the baseline model trained on real field images alone, but also that the improved model can address the practical problem of detecting diseased plant parts at various spatial scales, with possible overlapping and occlusion, achieving an overall precision of 96.30%. Therefore, our model is a useful tool for estimating mummy berry disease severity in a real field environment.
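To illustrate the cut-and-paste augmentation idea, the sketch below pastes one lesion crop onto a field background image and writes a YOLO-format bounding-box label. It is a minimal sketch, not the authors' pipeline: the RGBA lesion patches, the scale range, the file paths, and the helper name are all assumptions for illustration.

```python
import random
from pathlib import Path
from PIL import Image

def paste_lesion(background_path, lesion_path, out_img, out_label, class_id=0):
    """Paste a lesion crop (RGBA, transparent background) onto a field image
    and write a YOLO-format label for the pasted region.
    Assumes the lesion patch is smaller than the background image."""
    bg = Image.open(background_path).convert("RGB")
    lesion = Image.open(lesion_path).convert("RGBA")

    # Random scale to mimic different shooting distances
    scale = random.uniform(0.5, 1.5)
    lesion = lesion.resize((max(int(lesion.width * scale), 1),
                            max(int(lesion.height * scale), 1)))

    # Random position, keeping the patch fully inside the background
    x = random.randint(0, bg.width - lesion.width)
    y = random.randint(0, bg.height - lesion.height)
    bg.paste(lesion, (x, y), mask=lesion)  # alpha channel masks the paste

    # YOLO label: class x_center y_center width height (all normalized)
    xc = (x + lesion.width / 2) / bg.width
    yc = (y + lesion.height / 2) / bg.height
    bw = lesion.width / bg.width
    bh = lesion.height / bg.height

    bg.save(out_img)
    Path(out_label).write_text(f"{class_id} {xc:.6f} {yc:.6f} {bw:.6f} {bh:.6f}\n")
```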
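For readers unfamiliar with the attention block, the following compact PyTorch sketch shows the standard Coordinate Attention module (factorized average pooling along height and width, a shared 1x1 bottleneck, and per-direction attention maps). The reduction ratio, the activation choice, and the exact placement in the Yolov5s backbone are assumptions, not details taken from this paper.

```python
import torch
import torch.nn as nn

class CoordAtt(nn.Module):
    """Coordinate Attention: pools along height and width separately so the
    attention weights encode both channel and positional information."""
    def __init__(self, channels, reduction=32):
        super().__init__()
        mid = max(8, channels // reduction)
        self.pool_h = nn.AdaptiveAvgPool2d((None, 1))  # keep H, squeeze W
        self.pool_w = nn.AdaptiveAvgPool2d((1, None))  # keep W, squeeze H
        self.conv1 = nn.Conv2d(channels, mid, kernel_size=1)
        self.bn = nn.BatchNorm2d(mid)
        self.act = nn.Hardswish()
        self.conv_h = nn.Conv2d(mid, channels, kernel_size=1)
        self.conv_w = nn.Conv2d(mid, channels, kernel_size=1)

    def forward(self, x):
        n, c, h, w = x.shape
        x_h = self.pool_h(x)                      # (n, c, h, 1)
        x_w = self.pool_w(x).permute(0, 1, 3, 2)  # (n, c, w, 1)
        y = self.act(self.bn(self.conv1(torch.cat([x_h, x_w], dim=2))))
        y_h, y_w = torch.split(y, [h, w], dim=2)
        a_h = torch.sigmoid(self.conv_h(y_h))                      # (n, c, h, 1)
        a_w = torch.sigmoid(self.conv_w(y_w.permute(0, 1, 3, 2)))  # (n, c, 1, w)
        return x * a_h * a_w
```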
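The loss swap can be summarized by the CIoU formulation, which adds a center-distance penalty and an aspect-ratio consistency term on top of IoU. The sketch below assumes boxes in (x1, y1, x2, y2) format and is a generic implementation of CIoU, not the authors' exact code.

```python
import math
import torch

def ciou_loss(pred, target, eps=1e-7):
    """CIoU loss = 1 - IoU + rho^2/c^2 + alpha * v, for (x1, y1, x2, y2) boxes."""
    # Intersection and union
    iw = (torch.min(pred[..., 2], target[..., 2]) - torch.max(pred[..., 0], target[..., 0])).clamp(0)
    ih = (torch.min(pred[..., 3], target[..., 3]) - torch.max(pred[..., 1], target[..., 1])).clamp(0)
    inter = iw * ih
    area_p = (pred[..., 2] - pred[..., 0]) * (pred[..., 3] - pred[..., 1])
    area_t = (target[..., 2] - target[..., 0]) * (target[..., 3] - target[..., 1])
    iou = inter / (area_p + area_t - inter + eps)

    # Squared distance between box centers
    cx_p, cy_p = (pred[..., 0] + pred[..., 2]) / 2, (pred[..., 1] + pred[..., 3]) / 2
    cx_t, cy_t = (target[..., 0] + target[..., 2]) / 2, (target[..., 1] + target[..., 3]) / 2
    rho2 = (cx_p - cx_t) ** 2 + (cy_p - cy_t) ** 2

    # Squared diagonal of the smallest enclosing box
    cw = torch.max(pred[..., 2], target[..., 2]) - torch.min(pred[..., 0], target[..., 0])
    ch = torch.max(pred[..., 3], target[..., 3]) - torch.min(pred[..., 1], target[..., 1])
    c2 = cw ** 2 + ch ** 2 + eps

    # Aspect-ratio consistency term
    w_p, h_p = pred[..., 2] - pred[..., 0], pred[..., 3] - pred[..., 1]
    w_t, h_t = target[..., 2] - target[..., 0], target[..., 3] - target[..., 1]
    v = (4 / math.pi ** 2) * (torch.atan(w_t / (h_t + eps)) - torch.atan(w_p / (h_p + eps))) ** 2
    alpha = v / (1 - iou + v + eps)

    return 1 - iou + rho2 / c2 + alpha * v
```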