A nearly $1 million grant from the U.S. Department of Agriculture (USDA) is helping University of California researchers refine collaborative robotic technology that could change the way crops are maintained worldwide, saving millions of gallons of water each year and taking precision agriculture to a whole new level.
The three-year Robot-Assisted Precision Irrigation and Diagnostics (RAPID) project is led by UC Merced robotics Professor Stefano Carpin, UC Berkeley Professor Ken Goldberg — director of the People and Robots Initiative at the Center for Information Technology Research in the Interest of Society (CITRIS) and the Banatao Institute — and UC Davis biology and engineering Professor Stavros Vougioukas.
Some studies estimate that 85 percent of the world's freshwater is used for agriculture, and existing methods of maintaining crops could be improved. The researchers plan to show they can conserve more water through co-robotics, with workers interacting with technology for precision irrigation, ideally at the individual plant level.
“This comes from looking for ways to address drought and agricultural concerns,” said Carpin, with the School of Engineering. “Anything we can do to save water is, and will continue to be, very important.”
For now, the research could make wine drinkers happy, too, as the researchers plan to start with grape vines after talking to growers in the Central Valley and Napa Valley about crop concerns.
The RAPID system includes small plastic emitters, which cost about 25 cents each, attached to individual irrigation lines and precisely controlled by handheld devices operated by field workers or mounted on mobile robots. The devices signal the emitters to adjust the amount of water each vine receives.
With the handheld devices, Vougioukas said, the researchers will water each grape vine individually, making sure they can deliver the exact amount of water each vine needs to keep it healthy and achieve its best yield. Goldberg said the co-robotics system is far less expensive than more common retrofits to existing irrigation systems.
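The per-vine watering idea described above can be pictured as a simple scheduling step: compare what each vine should receive with what it has already received, and set its emitter accordingly. The sketch below is purely illustrative; the names (`Vine`, `schedule_irrigation`) and the liters-based bookkeeping are assumptions, not the RAPID project's actual software.

```python
from dataclasses import dataclass

@dataclass
class Vine:
    vine_id: str
    target_liters: float    # water the vine should receive this cycle
    received_liters: float  # water delivered so far

def schedule_irrigation(vines):
    """Return a per-vine emitter setting: remaining liters to deliver."""
    plan = {}
    for v in vines:
        deficit = max(0.0, v.target_liters - v.received_liters)
        plan[v.vine_id] = round(deficit, 2)
    return plan

vines = [
    Vine("row1-v01", target_liters=4.0, received_liters=1.5),
    Vine("row1-v02", target_liters=3.0, received_liters=3.0),  # already satisfied
]
print(schedule_irrigation(vines))  # {'row1-v01': 2.5, 'row1-v02': 0.0}
```

In a fielded system the plan would be pushed to the cheap emitters by a worker's handheld device or a mobile robot rather than printed.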
“We want to start with grapes, because the final product so heavily depends on fine adjustments,” Carpin said. “We’ve grown grapes for thousands of years, and we do it very well, but growing grapes is a tricky art and science. You don’t want too much water in the individual grapes, but not so little that the vines die. You need to stress the vines in exactly the right way to get a quality harvest.”
The USDA grant — a first for Carpin and Goldberg — will support the researchers and their students over the next three years as they optimize RAPID. The project is part of the National Science Foundation’s National Robotics Initiative, the country’s premier robotics program, and involves three of the four UC CITRIS campuses.
Farmers are already using technology such as infrared sensing and drones to monitor fields from the air and identify individual plants that are receiving too much or too little water.
“One remaining challenge is how to close the loop — how to adjust irrigation based on the aerial image data,” Goldberg said. “This co-robotics system is the next generation of precision agriculture.”
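Goldberg's "closing the loop" remark can be sketched as a feedback rule: use an aerial stress reading to nudge each vine's water budget up or down. The stress index, the 0.5 midpoint, and the proportional gain below are all illustrative assumptions, not measurements or parameters from the project.

```python
def adjust_budget(current_liters, stress_index, gain=0.5):
    """Raise a stressed vine's water budget, trim a well-watered one.

    stress_index: hypothetical aerial reading in [0, 1], where values
    above 0.5 suggest under-watering and values below suggest excess.
    """
    correction = gain * (stress_index - 0.5) * current_liters
    return max(0.0, current_liters + correction)

print(adjust_budget(4.0, 0.9))  # stressed vine: budget rises to 4.8
print(adjust_budget(4.0, 0.1))  # over-watered vine: budget drops to 3.2
```

Running such a rule after each aerial survey, then dispatching the updated budgets to the emitters, is one way the imaging and irrigation sides of a precision system could be connected.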
The robots have to be robust and sturdy, Carpin said, and the Central Valley is a good testing area — between the summer heat and the dust, it can be an inhospitable place for sensitive equipment. The tests will determine if the robotics can work as well in different environments as they have in laboratory situations.
The campuses are pursuing a joint patent on the technology, so there are commercial possibilities, Goldberg said. The potential for use in any agricultural application makes it a perfect illustration of the People and Robots Initiative and of CITRIS itself.
“This really is the core of what CITRIS was founded for,” he said. “We want to develop technology that benefits humanity, the economy and the environment.”
See a video by Adriel Olmos/CITRIS and the Banatao Institute that shows how the RAPID system works:
RAPID - Robot Assisted Precision Irrigation Delivery