Abstract
Plant phenotyping studies the complex characteristics of plants, with the aim of assessing their condition and selecting superior specimens. Recently, a new branch has emerged within the field, namely high-throughput phenotyping (HTP). HTP exploits modern data sampling techniques to gather large amounts of data that can improve the effectiveness of phenotyping. Hence, HTP combines domain knowledge from phenotyping with computer science, engineering, and data analysis techniques. In this scenario, machine learning (ML) and deep learning (DL) algorithms have been successfully integrated with noninvasive imaging techniques, playing a key role in automation, standardization, and quantitative data analysis. This study aims to systematically review two main areas of interest for HTP: hardware and software. For each of these areas, two influential factors were identified: for hardware, platforms and sensing equipment were analyzed; for software, the focus was on algorithms and new trends. The review was conducted following the PRISMA protocol, which allowed a wide selection of papers to be refined into a meaningful dataset of 32 articles of interest. The analysis highlighted the widespread adoption of ground platforms, used in about 47% of the reviewed methods, and of RGB sensors, mainly owing to their competitive cost, high compatibility, and versatility. Furthermore, DL-based algorithms accounted for the largest share (about 69%) of the reviewed approaches, mainly due to their effectiveness and the attention they have received from the scientific community over the last few years. Future research will focus on improving DL models to better handle hardware-generated data. The final aim is to create integrated, user-friendly, and scalable tools that can be deployed and used directly in the field to improve overall crop yield.