
New Crop Monitoring System Offers Precise Trait Analysis

Yan Zhu and Weixing Cao’s team at Nanjing Agricultural University published a study in Plant Phenomics on March 20, 2025, marking a significant step toward scalable, precise crop trait monitoring in real-world agricultural settings.

Three-dimensional diagram illustrating the structure of the phenotyping robot. (a) Three-dimensional rendering of the full robot. (b) Exploded view of the robot’s chassis. (c) Three-dimensional rendering of the chassis structure. (d) Overhead rendering of the wheelbase adjustment mechanism within the chassis. Image Credit: The authors

The system, which includes an adjustable wheel track, a precision-controlled sensor gimbal, and innovative multisensor fusion algorithms, allows for more efficient and precise plant phenotyping, setting the framework for advancements in crop development and sustainable agriculture.

Crop genetic improvement is critical to addressing global food security, but success requires closing the gap between genotypes and observable traits. Plant phenomics, the large-scale study of plant traits, provides that connection, yet standard methods for evaluating crop structure, physiology, and development are labor-intensive and difficult to scale. High-throughput phenotyping (HTP) systems combine sensors and mobile platforms to automate data collection, allowing researchers to examine crops at scale.

Although aerial platforms provide broad coverage, they are limited in payload and endurance. Ground-based robots offer precision and versatility, but inflexible chassis designs and limited sensor adaptability frequently constrain them. Building a robust phenotyping robot that can adapt to changing field conditions and integrate multi-source data remains a significant challenge.

Researchers conducted experiments at the National Engineering and Technology Center for Information Agriculture in Rugao, Jiangsu Province, to assess the effectiveness of a recently designed phenotyping robot. The initial step comprised testing the robot's chassis and gimbal with a GNSS-RTK navigation system that tracked speed, trajectory, and chassis posture.

Simulation in Adams software predicted key parameters such as maximum climbing angle, tipping limits, and obstacle traversal height. Subsequent field tests in dryland and paddy conditions validated these predictions, confirming the chassis’s dependability and versatility.

Across 50 trials, the adjustable wheel track mechanism showed precise closed-loop feedback and an adjustment speed of 19.8 mm/s, confirming its ability to adapt to varying row spacings. Equipped with three servo motors and a PID control algorithm, the gimbal achieved precise angle changes across pitch, roll, and yaw with response times under one second, providing rapid and reliable orientation control for mounted sensors.
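The gimbal's PID-based orientation control can be illustrated with a minimal discrete-time sketch for a single axis. The gains, the 50 Hz loop rate, and the toy rate-driven plant below are illustrative assumptions, not values or models reported in the study.

```python
# Minimal single-axis PID sketch (e.g. gimbal pitch). Gains, loop rate,
# and the toy plant are assumptions for illustration only.

class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, setpoint, measured):
        """One control update: return the commanded rate for this tick."""
        error = setpoint - measured
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Toy plant: the axis angle integrates the commanded rate.
pid = PID(kp=2.0, ki=0.5, kd=0.1, dt=0.02)   # 50 Hz loop, assumed gains
angle = 0.0
for _ in range(500):                          # 10 s of simulated time
    command = pid.step(setpoint=30.0, measured=angle)
    angle += command * 0.02                   # integrate commanded rate
print(f"final angle: {angle:.2f} deg")
```

With these assumed gains the simulated axis settles close to the 30° setpoint; a real gimbal controller would also need output limiting and integral anti-windup.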

The second stage evaluated multisensor registration and fusion by mounting multispectral, thermal infrared, and depth cameras on the robot and comparing the results with handheld instruments across wheat plots spanning different varieties, planting densities, and nitrogen levels.

Calibration procedures were applied to each sensor type to ensure accuracy. Data were acquired at seven key wheat growth stages, and pixel-level fusion using Zhang’s calibration method and the BRISK algorithm yielded image registration errors of less than three pixels.
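The sub-three-pixel figure is a reprojection-style error over matched keypoints. The sketch below shows only that metric: BRISK detection and matching itself would be done with a library such as OpenCV, and the affine transform and point pairs here are made-up illustrative values, not data from the study.

```python
# Sketch of a registration-error metric: map matched source keypoints
# through an estimated 2x3 affine transform and report the mean pixel
# distance to their destination matches. All values are illustrative.
import math

def apply_affine(T, pt):
    """Map a point (x, y) through a 2x3 affine matrix T."""
    x, y = pt
    return (T[0][0] * x + T[0][1] * y + T[0][2],
            T[1][0] * x + T[1][1] * y + T[1][2])

def mean_registration_error(T, src_pts, dst_pts):
    """Mean Euclidean distance, in pixels, between warped source
    keypoints and their matched destination keypoints."""
    errs = [math.dist(apply_affine(T, s), d) for s, d in zip(src_pts, dst_pts)]
    return sum(errs) / len(errs)

# Illustrative transform: a small shift between, say, thermal and RGB frames.
T = [[1.0, 0.0, 2.0],
     [0.0, 1.0, -1.5]]
src = [(10, 10), (100, 40), (60, 200)]
dst = [(12.3, 8.6), (101.8, 38.4), (62.1, 198.7)]
err = mean_registration_error(T, src, dst)
print(f"mean registration error: {err:.2f} px")
```

A pipeline like the one described would compute this error over many keypoint pairs per image pair and accept the registration when the mean falls below the pixel threshold.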

Correlation analysis showed strong agreement between robot-mounted and handheld data, with R² values above 0.98 for spectral reflectance, 0.90 for canopy distance, and 0.99 for temperature readings. Bland-Altman plots confirmed consistency across all parameters. These findings support the robot's capacity to gather high-throughput phenotypic data accurately, reliably, and efficiently across varied agricultural fields.

By adapting to diverse crops and settings, the system gives researchers and breeders powerful tools for accelerating the discovery of genes associated with yield, stress resistance, and quality.

Beyond breeding programs, the robot might be used for other field tasks, including fertilization, spraying, and weeding, increasing its use in sustainable agriculture. Pixel-level data fusion also allows for more accurate prediction models in yield estimation and crop stress detection, bridging the gap between laboratory research and field applications.

The study was financed by the National Key Research and Development Program of China (Grant No. 2021YFD2000101).

Journal Reference:

Su, M., et al. (2025) Design and implementation of a high-throughput field phenotyping robot for acquiring multisensor data in wheat. Plant Phenomics. doi.org/10.1016/j.plaphe.2025.100014.
