High-throughput phenotyping is essential for accelerating crop breeding, but collecting accurate, consistent data in open fields remains a technical hurdle.
Many current robotic systems are built on rigid chassis that cannot adjust to varying crop row widths, limiting their use in diverse environments. Others struggle with sensor flexibility, either lacking the payload to carry multiple instruments or offering no way to adjust sensor position and orientation. Data integration is another bottleneck: most systems cannot fuse information from multiple imaging sources in real time.
To overcome these challenges, the researchers designed a robot specifically for dynamic field conditions. Its gantry-style chassis features a sliding column mechanism powered by electric push rods, allowing the wheel track to adjust steplessly between 1400 and 1600 mm. This adaptability ensures compatibility with a variety of crop row spacings. High ground clearance helps prevent crop damage during operation, while a cross-row walking mode reduces soil compaction. Steering is based on an Ackermann kinematic model, which supports smooth, stable movement across uneven terrain.
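As a rough sketch of the geometry behind that steering model, the snippet below computes the distinct inner- and outer-wheel steering angles Ackermann steering requires for a given turn. The article specifies only the adjustable wheel track (1400 to 1600 mm); the wheelbase and turning radius used here are illustrative assumptions, not figures from the paper.

```python
import math

def ackermann_angles(wheelbase_mm: float, track_mm: float,
                     turn_radius_mm: float) -> tuple:
    """Front-wheel steering angles (degrees) for a turn of the given
    radius, measured to the midpoint of the rear axle. The inner wheel
    follows a tighter arc, so it must steer more sharply than the outer."""
    inner = math.atan(wheelbase_mm / (turn_radius_mm - track_mm / 2))
    outer = math.atan(wheelbase_mm / (turn_radius_mm + track_mm / 2))
    return math.degrees(inner), math.degrees(outer)

# Track taken from the article's adjustable range; wheelbase and
# turning radius are assumed values for illustration only.
inner_deg, outer_deg = ackermann_angles(wheelbase_mm=1200,
                                        track_mm=1500,
                                        turn_radius_mm=4000)
```

Steering each front wheel to its own angle keeps all four wheels rolling about a common turn center, which is what limits tire scrub and slip on uneven field terrain.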
Integrated Design for Precision Sensing and Control
A standout feature of the system is its six-degree-of-freedom sensor gimbal, capable of carrying up to 10 kilograms. Using electric actuators and servo motors, the gimbal provides precise adjustments in height and angle, making it possible to gather consistent, high-resolution data across different sensor types—including RGB, multispectral, and thermal.
To bring this data together, the team developed a novel pixel-level image fusion method. Leveraging Zhang’s calibration and feature point extraction algorithms, they generated a homography matrix that aligns images from different sensors within a shared field of view. This ensures that the robot can collect synchronized, accurately registered data in a single pass, eliminating the need for manual post-alignment and improving analysis efficiency.
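The core of such a registration step can be sketched numerically. The example below fits a homography with a plain direct linear transform (a stand-in for the paper's specific calibration and feature-extraction pipeline) and uses it to map one sensor's pixel coordinates into another's; the corner coordinates are invented for illustration.

```python
import numpy as np

def fit_homography(src: np.ndarray, dst: np.ndarray) -> np.ndarray:
    """Direct linear transform: least-squares 3x3 homography mapping
    src (N x 2) pixel coordinates onto dst (N x 2), for N >= 4 points."""
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        rows.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    _, _, vt = np.linalg.svd(np.asarray(rows, dtype=float))
    H = vt[-1].reshape(3, 3)
    return H / H[2, 2]  # normalise so H[2, 2] == 1

def warp_points(H: np.ndarray, pts: np.ndarray) -> np.ndarray:
    """Map N x 2 points through homography H via homogeneous coordinates."""
    p = np.hstack([pts, np.ones((len(pts), 1))]) @ H.T
    return p[:, :2] / p[:, 2:3]

# Corners of a calibration target as seen by two sensors
# (coordinates are illustrative, not taken from the paper).
thermal_pts = np.array([[10.0, 10.0], [200.0, 12.0],
                        [198.0, 150.0], [8.0, 148.0]])
rgb_pts = np.array([[50.0, 40.0], [620.0, 44.0],
                    [615.0, 460.0], [48.0, 455.0]])

H = fit_homography(thermal_pts, rgb_pts)
aligned = warp_points(H, thermal_pts)  # thermal pixels in RGB coordinates
```

In practice the matrix would be estimated once from calibration-target images (Zhang's method recovers the per-camera intrinsics) and then reused to warp every subsequent frame from one sensor into the other's pixel grid, which is what makes single-pass, registered acquisition possible.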
The robot is operated through an STM32-based motion controller and a remote Android GUI. An onboard industrial computer handles sensor operation and edge computing tasks, enabling real-time processing and reducing the burden of downstream data handling.
Field Testing and Results
To validate the robot’s performance, the team conducted field trials in both dry and paddy field conditions. The chassis adjusted its wheel track smoothly at 19.8 mm per second, and the sensor gimbal proved fast and accurate, completing a 30° angular adjustment in under one second.
The image registration method proved highly accurate, with a root mean square error (RMSE) of no more than 2.458 pixels across 500 samples. Sensor data collected by the robot was then compared with measurements taken using handheld instruments.
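A registration RMSE of this kind is typically computed from the distances between warped feature locations and their reference positions; the checkpoint coordinates below are illustrative, not data from the paper.

```python
import numpy as np

def registration_rmse(registered: np.ndarray, reference: np.ndarray) -> float:
    """Root mean square pixel error between registered feature
    locations and their reference positions (both N x 2 arrays)."""
    sq_dist = np.sum((registered - reference) ** 2, axis=1)
    return float(np.sqrt(sq_dist.mean()))

# Illustrative checkpoints; the paper reports RMSE <= 2.458 px
# over 500 samples.
registered = np.array([[101.0, 52.0], [340.5, 48.0], [338.0, 301.5]])
reference = np.array([[100.0, 50.0], [342.0, 47.0], [337.0, 303.0]])
err = registration_rmse(registered, reference)
```

An error of roughly two pixels means corresponding features from different sensors land within a couple of pixels of each other after warping, tight enough for pixel-level fusion of spectral and thermal data.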
Linear regression analysis showed strong agreement for key indicators such as spectral reflectance, canopy distance, and temperature. Bland-Altman plots further confirmed consistency, with most data points falling within the 95% limits of agreement, demonstrating the system's reliability for high-throughput phenotypic data collection.
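The Bland-Altman agreement check works by computing the mean bias between paired measurements and the 95% limits of agreement (bias plus or minus 1.96 times the standard deviation of the differences). The sketch below runs the calculation on synthetic paired readings standing in for robot versus handheld data; none of the values come from the paper.

```python
import numpy as np

def bland_altman_limits(a: np.ndarray, b: np.ndarray) -> tuple:
    """Mean bias and 95% limits of agreement (bias +/- 1.96 * SD of
    the paired differences) between two measurement methods."""
    diff = np.asarray(a) - np.asarray(b)
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Synthetic paired readings (e.g. canopy temperature in degrees C);
# the small bias and noise levels are assumptions for illustration.
rng = np.random.default_rng(42)
truth = rng.uniform(20.0, 35.0, size=200)
robot = truth + rng.normal(0.1, 0.4, size=200)
hand = truth + rng.normal(0.0, 0.4, size=200)

bias, lo, hi = bland_altman_limits(robot, hand)
diff = robot - hand
within = float(np.mean((diff >= lo) & (diff <= hi)))
```

If the two methods agree well, roughly 95% of the paired differences fall inside the limits, which is the pattern the field trials reported for the robot against handheld instruments.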
A Scalable Solution for Crop Research
By combining mechanical adaptability, precision sensing, and advanced image fusion, the robot offers a robust solution to the practical challenges of field-based phenotyping. It enables researchers to collect high-quality crop data at scale, across different environments, and with minimal manual intervention. This capability is critical not only for accelerating crop breeding but also for ensuring that future agricultural innovations are informed by accurate, field-validated data.
Journal Reference
Su, M., Zhou, D., Yun, Y., Ding, B., Xia, P., Yao, X., Ni, J., Zhu, Y., & Cao, W. (2025). Design and implementation of a high-throughput field phenotyping robot for acquiring multisensor data in wheat. Plant Phenomics, 7(2), 100014. DOI: 10.1016/j.plaphe.2025.100014. https://www.sciencedirect.com/science/article/pii/S2643651525000202