Posted in | News | Agricultural Robotics

New Agricultural Robot Enhances Accuracy and Efficiency in Crop Phenotyping

Researchers have developed a field-ready agricultural robot that combines adjustable hardware with advanced multisensor data fusion, delivering a practical, automated solution for collecting crop data in real-world conditions. Published in Plant Phenomics, the study addresses key limitations of existing phenotyping platforms and offers a reliable tool to support genetic crop improvement and food security efforts.

Study: Design and implementation of a high-throughput field phenotyping robot for acquiring multisensor data in wheat. Image Credit: oticki/Shutterstock.com

Rethinking Phenotyping for Real Field Conditions

High-throughput phenotyping is essential for accelerating crop breeding, but collecting accurate, consistent data in open fields remains a technical hurdle.

Many current robotic systems are built on rigid chassis that can't adjust to varying crop row widths, limiting their use in diverse environments. Others struggle with sensor flexibility—either lacking the payload to support multiple tools or being unable to adjust their position and orientation. Data integration is another bottleneck, as most systems are unable to fuse information from multiple imaging sources in real time.

To overcome these challenges, the researchers designed a robot specifically for dynamic field conditions. Its gantry-style chassis features a sliding column mechanism powered by electric push rods, allowing the wheel track to adjust steplessly between 1400 and 1600 mm. This adaptability ensures compatibility with a variety of crop row spacings. High ground clearance helps prevent crop damage during operation, while a cross-row walking mode reduces soil compaction. Steering is based on an Ackermann kinematic model, which supports smooth, stable movement across uneven terrain.
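The Ackermann model mentioned above relates the two front-wheel steering angles so that all wheels roll around a common turning centre, which is what keeps motion smooth on uneven ground. As a minimal sketch (the wheelbase and turning radius below are hypothetical values, not figures from the study; only the 1500 mm track sits inside the paper's 1400 to 1600 mm range):

```python
import math

def ackermann_angles(wheelbase_mm, track_mm, turn_radius_mm):
    """Inner and outer front-wheel steering angles (radians) for an
    Ackermann-steered chassis turning about a point on the rear-axle line.
    The inner wheel follows a tighter arc, so its angle is always larger."""
    inner = math.atan(wheelbase_mm / (turn_radius_mm - track_mm / 2))
    outer = math.atan(wheelbase_mm / (turn_radius_mm + track_mm / 2))
    return inner, outer

# Hypothetical 1200 mm wheelbase, mid-range 1500 mm track, 5 m turn radius
inner, outer = ackermann_angles(1200, 1500, 5000)
```

Because the inner and outer angles differ, a controller built on this model can command each steered wheel independently and avoid the tyre scrub that a single shared steering angle would cause.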

Integrated Design for Precision Sensing and Control

A standout feature of the system is its six-degree-of-freedom sensor gimbal, capable of carrying up to 10 kilograms. Using electric actuators and servo motors, the gimbal provides precise adjustments in height and angle, making it possible to gather consistent, high-resolution data across different sensor types—including RGB, multispectral, and thermal.

To bring this data together, the team developed a novel pixel-level image fusion method. Leveraging Zhang’s calibration and feature point extraction algorithms, they generated a homography matrix that aligns images from different sensors within a shared field of view. This ensures that the robot can collect synchronized, accurately registered data in a single pass, eliminating the need for manual post-alignment and improving analysis efficiency.
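The core of the fusion step described above is a homography: a 3x3 matrix that maps pixel coordinates in one sensor's image to the corresponding coordinates in another's. The sketch below is an illustration under simplifying assumptions, not the authors' pipeline (which combines Zhang's calibration with feature-point extraction); it estimates a homography from matched point pairs using the standard direct linear transform (DLT):

```python
import numpy as np

def estimate_homography(src, dst):
    """Estimate the 3x3 homography H mapping src -> dst (both Nx2, N >= 4)
    via the direct linear transform: stack two linear constraints per point
    pair, then take the SVD null vector as the flattened matrix."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    _, _, Vt = np.linalg.svd(np.asarray(A, float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]  # normalise so the scale is fixed

def warp(H, pts):
    """Apply H to Nx2 points with the homogeneous divide."""
    p = np.hstack([pts, np.ones((len(pts), 1))]) @ H.T
    return p[:, :2] / p[:, 2:]
```

Once such a matrix is computed from calibration targets, every pixel from one sensor can be projected into the other's field of view, which is what makes single-pass, pixel-registered multisensor capture possible.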

The robot is operated through an STM32-based motion controller and a remote Android GUI. An onboard industrial computer handles sensor operation and edge computing tasks, enabling real-time processing and reducing the burden of downstream data handling.

Field Testing and Results

To validate the robot’s performance, the team conducted field trials in both dry and paddy field conditions. The adjustable chassis performed smoothly, with the wheel track adjusting at 19.8 mm per second, enough to traverse the full 1400–1600 mm range in roughly ten seconds. The sensor gimbal was fast and accurate, achieving 30° of angular adjustment in under one second.

The image registration method proved highly accurate, with a root mean square error (RMSE) of no more than 2.458 pixels across 500 samples. Sensor data collected by the robot was then compared with measurements taken using handheld instruments.
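For context, the RMSE figure quoted above is the root of the mean squared pixel displacement between matched points after registration. A minimal sketch of that metric (an assumption about the exact formula used; the study does not spell it out):

```python
import numpy as np

def registration_rmse(pred, ref):
    """RMSE between matched 2-D point sets (both Nx2, in pixels):
    the square root of the mean squared Euclidean displacement."""
    d = np.asarray(pred, float) - np.asarray(ref, float)
    return float(np.sqrt((d ** 2).sum(axis=1).mean()))
```

An RMSE of about 2.5 pixels across 500 samples means registered features land, on average, within a few pixels of their true positions, tight enough for pixel-level fusion of RGB, multispectral, and thermal imagery.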

Linear regression analysis showed strong agreement for key indicators such as spectral reflectance, canopy distance, and temperature. Bland-Altman plots further confirmed consistency, with most data points falling within the 95% limits of agreement, demonstrating the system's reliability for high-throughput phenotypic data collection.
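The Bland-Altman analysis referenced above compares two paired measurement series (here, robot versus handheld readings) via the mean difference and the 95% limits of agreement, conventionally the mean difference plus or minus 1.96 standard deviations. A short sketch of that computation (the synthetic usage is illustrative, not the study's data):

```python
import numpy as np

def bland_altman_limits(a, b):
    """Bias (mean difference) and 95% limits of agreement between two
    paired measurement series, per the standard Bland-Altman method."""
    diff = np.asarray(a, float) - np.asarray(b, float)
    bias = diff.mean()
    sd = diff.std(ddof=1)  # sample standard deviation of the differences
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Illustrative synthetic data: a handheld-style series with a small offset
rng = np.random.default_rng(0)
robot = rng.normal(0.0, 1.0, 500)
handheld = robot + rng.normal(0.5, 0.2, 500)
bias, lower, upper = bland_altman_limits(robot, handheld)
```

If the measurement methods agree, roughly 95% of the pairwise differences fall between `lower` and `upper`, which is the criterion the study reports its sensors meeting.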

A Scalable Solution for Crop Research

By combining mechanical adaptability, precision sensing, and advanced image fusion, the robot offers a robust solution to the practical challenges of field-based phenotyping. It enables researchers to collect high-quality crop data at scale, across different environments, and with minimal manual intervention. This capability is critical not only for accelerating crop breeding but also for ensuring that future agricultural innovations are informed by accurate, field-validated data.

Journal Reference

Su, M., Zhou, D., Yun, Y., Ding, B., Xia, P., Yao, X., Ni, J., Zhu, Y., & Cao, W. (2025). Design and implementation of a high-throughput field phenotyping robot for acquiring multisensor data in wheat. Plant Phenomics, 7(2), 100014. DOI:10.1016/j.plaphe.2025.100014. https://www.sciencedirect.com/science/article/pii/S2643651525000202

Citations

Please use one of the following formats to cite this article in your essay, paper or report:

  • APA

    Nandi, Soham. (2025, September 11). New Agricultural Robot Enhances Accuracy and Efficiency in Crop Phenotyping. AZoRobotics. Retrieved on September 11, 2025 from https://www.azorobotics.com/News.aspx?newsID=16172.
