The system, which includes an adjustable wheel track, a precision-controlled sensor gimbal, and innovative multisensor fusion algorithms, enables more efficient and precise plant phenotyping, laying the groundwork for advances in crop improvement and sustainable agriculture.
Crop genetic improvement is critical to addressing global food security, but success requires closing the gap between genotype and observable traits. Plant phenomics, the large-scale study of plant traits, provides that link, but standard methods for evaluating crop structure, physiology, and development are labor-intensive and difficult to scale. High-throughput phenotyping (HTP) systems combine sensors and mobile platforms to automate data collection, allowing researchers to examine crops at scale.
Aerial platforms provide broad coverage but are limited by payload and endurance. Ground-based robots offer precision and versatility, yet inflexible chassis designs and limited sensor adaptability frequently constrain them. Building a robust phenotyping robot that can adapt to changing field conditions and integrate multi-source data remains a significant challenge.
Researchers conducted experiments at the National Engineering and Technology Center for Information Agriculture in Rugao, Jiangsu Province, to assess the effectiveness of a newly designed phenotyping robot. The first stage tested the robot's chassis and gimbal using a GNSS-RTK navigation system that tracked speed, trajectory, and chassis posture.
Simulation in Adams software predicted key parameters such as the maximum climbing angle, tipping limits, and obstacle traversal height. Subsequent field tests in dryland and paddy conditions validated these predictions, confirming the chassis's dependability and versatility.
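The article does not report the chassis dimensions behind these figures; as a minimal illustration of what a static tipping limit computes, the following sketch derives the lateral tip-over angle from an assumed track width and center-of-gravity height (both values hypothetical).

```python
import math

# Hypothetical chassis geometry (the article does not publish dimensions):
track_width_m = 1.2   # distance between left and right wheel centers
cg_height_m = 0.5     # height of the center of gravity above ground

# Static lateral tip-over limit: the slope angle at which the gravity
# vector passes outside the downhill wheel contact line.
tip_angle = math.degrees(math.atan((track_width_m / 2) / cg_height_m))
print(f"static lateral tipping limit: {tip_angle:.1f} deg")
```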
In 50 trials, the adjustable wheel track mechanism showed precise closed-loop feedback and an adjustment speed of 19.8 mm/s, confirming its ability to adapt to varying row spacings. Driven by three servo motors under a PID control algorithm, the gimbal delivered rapid and reliable orientation control for mounted sensors, achieving precise angle changes in pitch, roll, and yaw with response times under one second.
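The controller code itself is not published in the article; the sketch below is a minimal, hypothetical illustration of the kind of per-axis PID loop described, with the gains, timing, and toy servo model all assumed for demonstration.

```python
class PIDAxis:
    """Minimal PID loop for one gimbal axis (pitch, roll, or yaw).
    Gains are illustrative placeholders, not values from the paper."""

    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def update(self, setpoint_deg, measured_deg, dt):
        """Return a servo command from the current angle error."""
        error = setpoint_deg - measured_deg
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Example: drive the pitch axis toward a 15-degree setpoint in a simulated loop.
pitch = PIDAxis(kp=2.0, ki=0.1, kd=0.05)
angle, dt = 0.0, 0.01
for _ in range(200):          # 2 s of simulated 100 Hz control
    command = pitch.update(15.0, angle, dt)
    angle += command * dt     # toy plant model standing in for the servo
print(f"final pitch angle: {angle:.2f} deg")
```

In a real gimbal, each of the three axes would run its own loop of this form, with the measured angle coming from an encoder or IMU rather than the toy model above.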
The second stage evaluated multisensor registration and fusion by mounting multispectral, thermal infrared, and depth cameras on the robot and comparing the results with handheld instruments across wheat plots of different varieties, planting densities, and nitrogen levels.
Each sensor type was calibrated to ensure accuracy. Data were acquired at seven time points spanning key wheat growth stages, and pixel-level fusion based on Zhang's calibration method and the BRISK algorithm yielded image registration errors of less than three pixels.
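The fusion pipeline's code is not included in the article, but both techniques it names are standard: Zhang's method is the planar-target camera calibration implemented in OpenCV as cv2.calibrateCamera, and BRISK is a binary feature detector. The sketch below shows a plausible BRISK-based registration between two already-undistorted sensor frames; the file names, match count, and RANSAC threshold are assumptions.

```python
import cv2
import numpy as np

# Hypothetical file names; in practice these would be co-captured frames
# from, e.g., the multispectral and thermal cameras after undistortion.
ref = cv2.imread("multispectral_band.png", cv2.IMREAD_GRAYSCALE)
mov = cv2.imread("thermal_frame.png", cv2.IMREAD_GRAYSCALE)

# Detect BRISK keypoints and binary descriptors in both images.
brisk = cv2.BRISK_create()
kp_ref, des_ref = brisk.detectAndCompute(ref, None)
kp_mov, des_mov = brisk.detectAndCompute(mov, None)

# Hamming-distance brute-force matching suits BRISK's binary descriptors.
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des_mov, des_ref), key=lambda m: m.distance)

# Estimate a homography from the best matches, rejecting outliers with RANSAC.
src = np.float32([kp_mov[m.queryIdx].pt for m in matches[:100]]).reshape(-1, 1, 2)
dst = np.float32([kp_ref[m.trainIdx].pt for m in matches[:100]]).reshape(-1, 1, 2)
H, inliers = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)

# Warp the moving image into the reference frame for pixel-level fusion.
registered = cv2.warpPerspective(mov, H, (ref.shape[1], ref.shape[0]))

# Mean reprojection error over inlier matches approximates registration
# accuracy, the quantity the study reports as under three pixels.
proj = cv2.perspectiveTransform(src, H)
err = np.linalg.norm((proj - dst)[inliers.ravel() == 1], axis=2).mean()
print(f"mean inlier reprojection error: {err:.2f} px")
```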
Correlation analysis showed strong agreement between robot-mounted and handheld measurements, with R² values above 0.98 for spectral reflectance, 0.90 for canopy distance, and 0.99 for temperature readings, and Bland-Altman plots confirmed consistency across all parameters. These findings support the robot's capacity to gather high-throughput phenotypic data accurately, reliably, and efficiently across varied field conditions.
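As a sketch of the agreement statistics used here, with synthetic numbers standing in for the paired robot and handheld measurements (which the article does not reproduce), R² and the Bland-Altman bias and limits of agreement can be computed as follows.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic paired measurements standing in for the study's data:
# the same canopy trait read by the handheld instrument and the robot.
handheld = rng.uniform(0.2, 0.8, size=60)          # e.g., spectral reflectance
robot = handheld + rng.normal(0.0, 0.01, size=60)  # small sensor disagreement

# Coefficient of determination between the two instruments.
r = np.corrcoef(handheld, robot)[0, 1]
print(f"R^2 = {r**2:.3f}")

# Bland-Altman statistics: mean difference (bias) and 95% limits of agreement.
diff = robot - handheld
bias = diff.mean()
loa = 1.96 * diff.std(ddof=1)
print(f"bias = {bias:+.4f}, limits of agreement = [{bias - loa:.4f}, {bias + loa:.4f}]")
```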
By adapting to diverse crops and field settings, the system gives researchers and breeders powerful tools for accelerating the discovery of genes associated with yield, stress resistance, and quality.
Beyond breeding programs, the robot could take on other field tasks, including fertilization, spraying, and weeding, broadening its role in sustainable agriculture. Pixel-level data fusion also supports more accurate predictive models for yield estimation and crop stress detection, bridging the gap between laboratory research and field applications.
The study was financed by the National Key Research and Development Program of China (Grant No. 2021YFD2000101).
Journal Reference:
Su, M., et al. (2025) Design and implementation of a high-throughput field phenotyping robot for acquiring multisensor data in wheat. Plant Phenomics. https://doi.org/10.1016/j.plaphe.2025.100014