
Novel Camera System with Active Machine Vision Module for Agricultural Purposes

Researchers from Pennsylvania State University have developed a novel camera system with active lighting, an important step in the development of machine vision systems that enable robotic devices to reliably “see” the agricultural targets to which they will react.

Doctoral candidate Omeed Mirbod demonstrates how drone imaging would work in an apple orchard to first-year engineering students. This lab photo was taken prior to required indoor masking. Image Credit: Pennsylvania State University

The system employs “overcurrent-driven” LED lights to generate a powerful flash that can fire multiple times per second, enabling dependable daytime imaging, as reported by Daeun Choi, assistant professor of agricultural and biological engineering in the College of Agricultural Sciences.

The technique addresses the changing lighting and color inconsistencies caused by sunlight, and it also eliminates the motion blur caused by vehicle movement and ground vibration.
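The motion-blur problem can be quantified with simple kinematics: blur in pixels is roughly vehicle speed times exposure time divided by the ground distance each pixel covers. The exposure times and pixel resolution below are illustrative assumptions, not figures from the study; only the 4 mph speed comes from the article.

```python
# Illustrative motion-blur estimate. The exposure times and the pixel
# ground coverage are assumed values, not figures from the study.

MPH_TO_MPS = 0.44704
speed_mps = 4 * MPH_TO_MPS      # 4 mph, the camera speed reported in the study
pixel_size_m = 0.5e-3           # assume each pixel covers 0.5 mm of canopy

# Compare an ambient-light exposure with progressively shorter flash-lit ones.
for exposure_s in (1 / 60, 1 / 1000, 50e-6):
    blur_px = speed_mps * exposure_s / pixel_size_m
    print(f"exposure {exposure_s * 1e3:7.3f} ms -> blur ~ {blur_px:6.2f} px")
```

Under these assumptions, a 1/60 s ambient exposure smears the scene across tens of pixels, while a sub-millisecond flash-dominated exposure keeps blur below a single pixel, which is why a short, bright strobe can freeze motion.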

In the future, this system or one like it will likely be used to guide mechanisms that independently perform labor-intensive tasks such as pruning apple trees, estimating fruit yield, thinning fruit and picking mushrooms. The innovative aspect of this research was that the current drawn by the LED lights was increased to six times their normal rating, resulting in increased illuminance.

Daeun Choi, Assistant Professor of Agricultural and Biological Engineering, College of Agricultural Sciences, Pennsylvania State University
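A rough back-of-the-envelope check shows why pulsed overdrive at six times the rated current is feasible: a brief, infrequent flash keeps the time-averaged current far below the LED’s continuous rating. The rating, flash duration and flash rate below are hypothetical values for illustration; only the 6x factor comes from the article.

```python
# Back-of-the-envelope check of pulsed LED overdrive.
# All component values here are hypothetical, not from the study.

RATED_CURRENT_A = 1.0                     # assumed continuous rating of the LED
PEAK_CURRENT_A = 6.0 * RATED_CURRENT_A    # 6x overdrive, as in the article
FLASH_DURATION_S = 0.5e-3                 # assumed 0.5 ms flash
FLASHES_PER_S = 10                        # "multiple times per second"

duty_cycle = FLASH_DURATION_S * FLASHES_PER_S   # fraction of time the LED is on
avg_current = PEAK_CURRENT_A * duty_cycle       # time-averaged current

print(f"duty cycle:      {duty_cycle:.3%}")
print(f"average current: {avg_current:.3f} A (continuous rating: {RATED_CURRENT_A} A)")
```

With these assumed numbers the duty cycle is 0.5%, so the average thermal load stays at a few percent of the continuous rating even though each pulse is six times brighter than steady operation would allow.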

Choi stated that the study is significant because more farmers are looking to adopt precision agriculture and automation technologies to enhance output and efficiency. Driven by factors such as global competition, growing food demand from population growth and consumer expectations of high-quality produce, growers are turning to machine vision systems and remote sensing devices to record and analyze data in agricultural applications.

Omeed Mirbod, a researcher and doctoral student in agricultural and biological engineering, developed a circuit that stores and releases energy to the LEDs to generate a strobe-like effect, along with a controller that synchronizes the strobe with a camera to capture images. Mirbod conceived the LED strobe concept while working at Carnegie Mellon University, where xenon flash lamps were being used for daytime imaging, before coming to Penn State.
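The synchronization Mirbod describes can be sketched as a simple trigger sequence: charge an energy-storage element, then start the camera exposure and fire the LED pulse together so the short flash falls entirely inside the exposure window. The event model and all timing values below are illustrative assumptions; the article does not describe the controller’s actual design.

```python
# Conceptual sketch of strobe/camera synchronization. The event model and
# timing values are illustrative assumptions, not the study's design.

def capture_frame(flash_duration_s=0.5e-3, exposure_s=1e-3):
    """Return the (event, time) sequence for one synchronized frame."""
    t0 = 10e-3  # assumed time to charge the energy-storage circuit
    return [
        ("charge_capacitor", 0.0),                      # bank energy for the pulse
        ("camera_exposure_start", t0),                  # open the shutter
        ("led_flash_start", t0),                        # fire the overdriven pulse
        ("led_flash_end", t0 + flash_duration_s),       # flash ends first...
        ("camera_exposure_end", t0 + exposure_s),       # ...inside the exposure
    ]

for name, t in capture_frame():
    print(f"{t * 1e3:7.3f} ms  {name}")
```

Because the flash is much brighter than ambient light and much shorter than the exposure, the flash duration (not the exposure time) effectively determines both the illumination and the motion blur of the captured frame.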

Artificial intelligence does well with images that are really rich with information, so the important thing is capturing high-quality images. For agriculture, we need images that are invariant to outdoor lighting conditions. If you capture an image in which a fruit is very saturated with light due to the sun, and then capture another one in shadow where there is little sunlight, the artificial intelligence that you're training to detect the fruit might struggle to identify it.

Omeed Mirbod, Researcher and Doctoral Student, Agricultural and Biological Engineering, College of Agricultural Sciences, Pennsylvania State University

The system was tested in an apple orchard over three days in the summer of 2020. Images of several canopy structures were captured throughout the day, under both cloudy and sunny conditions. Use of the LED flashes led to a marked improvement in the brightness and color consistency of the images.

The study was published in Computers and Electronics in Agriculture.

The researchers confirmed the high quality of the images captured by the system. Over the 11-hour period, the images showed an average 85% decrease in standard deviation across the hue, saturation and value channels compared with an auto-exposure setting. Furthermore, the prototype system was able to eliminate motion blur in machine vision images with the camera moving at 4 miles per hour.
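The consistency metric the researchers report can be reproduced on any image set by tracking the spread of per-image HSV statistics across a day of captures: a lower standard deviation means the images look more alike regardless of sunlight. The sketch below uses tiny synthetic single-pixel “images” for illustration, not the study’s data.

```python
# Sketch of the consistency metric: standard deviation of per-image mean HSV
# values across captures. The image data here is synthetic, not study data.
import colorsys
import statistics

def mean_hsv(pixels):
    """Average an image's RGB pixels (0-1 floats) in HSV space."""
    hsv = [colorsys.rgb_to_hsv(r, g, b) for r, g, b in pixels]
    return tuple(statistics.mean(ch) for ch in zip(*hsv))

# Two tiny synthetic "image sets": auto-exposure drifts with the sunlight,
# while the flash-dominated captures stay nearly constant.
auto_exposure = [[(0.9, 0.3, 0.2)], [(0.5, 0.15, 0.1)], [(0.7, 0.25, 0.15)]]
led_flash = [[(0.8, 0.28, 0.18)], [(0.78, 0.27, 0.17)], [(0.79, 0.28, 0.18)]]

for label, images in (("auto", auto_exposure), ("flash", led_flash)):
    values = [mean_hsv(img)[2] for img in images]   # V (brightness) channel
    print(f"{label:5s} V-channel std dev: {statistics.stdev(values):.4f}")
```

In this toy example the flash set’s brightness spread is an order of magnitude smaller than the auto-exposure set’s, which is the kind of reduction the 85% figure quantifies on real orchard images.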

According to Mirbod, the results showed that the LED flash system can reduce the undesirable effects of lighting changes and motion blur that arise from outdoor field conditions. Because most previous research in this area was conducted indoors, this result is significant.

Previous studies used xenon flash lamps for daytime imaging and LEDs for nighttime imaging; this study is notable for evaluating the performance of overcurrent-driven LED lights in daytime imaging applications.

When we apply the same technology to agriculture fields, we encounter lots of difficulties. The most challenging thing is weather and varying sunlight conditions. And when we use a regular camera setup, with a lighting system designed for indoor use, we end up getting really terrible images that are difficult to work with.

Omeed Mirbod, Researcher and Doctoral Student, Agricultural and Biological Engineering, College of Agricultural Sciences, Pennsylvania State University

By developing an active LED lighting machine vision approach, the Penn State researchers aim to guide agricultural robots that can work in the field 24 hours a day, seven days a week.

“So, it won’t matter what time they are working or whether there is much sunlight. Because, at the end of the day, we want to have fully automated systems that can work in the field anytime,” Mirbod concluded.

The research team also included Roderick Thomas, associate teaching professor of agricultural and biological engineering, and Long He, assistant professor of agricultural and biological engineering.

The study was supported by The State Horticultural Association of Pennsylvania and the U.S. Department of Agriculture’s National Institute of Food and Agriculture.

Source: https://www.psu.edu/
