How do you teach a robot to get into a vehicle and drive it? Three University of Delaware professors plan to figure it out by the end of next year.
Christopher Rasmussen, Ioannis Poulakakis and Herbert Tanner are part of a team competing in a new U.S. Defense Advanced Research Projects Agency (DARPA) challenge. The team, led by Drexel University and drawing members from 10 schools, is one of several groups worldwide working to advance robotics technology for disaster relief.
Groups from academia, industry and the private sector will attempt to design and deploy a robot capable of disaster response in radioactive- or bio-contaminated areas. The robot must drive vehicles, navigate environments, use tools and manipulate equipment.
UD’s researchers will focus on the first steps of the sequence. Mechanical engineering’s Poulakakis and Tanner will program the robot to get in and out of a vehicle, a task that might sound simple but is actually very involved. Meanwhile, Computer and Information Sciences’ Rasmussen will teach the robot to drive the vehicle while factoring in dicey conditions such as obstacles, uneven terrain, weather and poor visibility. The challenge roughly mimics the situation created inside a Japanese nuclear energy plant after the 2011 earthquake and tsunami.
“If you don’t want a person to be in danger you send in a robot,” Rasmussen said. “It’s convenient to have one robot that can do the whole thing.”
The robot must mount, drive and dismount a vehicle; travel across rubble; remove debris; open a door; climb a ladder; use a tool to break through a concrete wall; locate and shut off a leaky valve; and remove and replace a pump. Researchers from each of the partner schools will tackle specific aspects of the challenge.
UD joins Drexel, Columbia University, Georgia Institute of Technology, Indiana University, the Korean Advanced Institute of Science and Technology, Ohio State University, Purdue University, Swarthmore College and Worcester Polytechnic Institute in the challenge. All teams will use similar robots, humanoid models with moving joints that vaguely resemble a small adult.
The Drexel-led group is one of seven teams selected for Track A funding by DARPA and must develop its own robot and operating software.
Teams will work toward two milestones during the 27-month competition, which begins this month. Phase 1 will last 15 months and culminate in a competitive challenge testing the robots’ ability to complete the eight events. DARPA will then select teams to continue into Phase 2 for another head-to-head competition 12 months later.
About the researchers
Ioannis Poulakakis joined UD in 2010 as an assistant professor in the Department of Mechanical Engineering. His expertise lies in formal control synthesis for hopping robot models and in intuitive control design for quadrupedal running machines. Poulakakis believes that legs have the potential to extend the mobility of robots, enabling them to become useful in real-world situations such as search and rescue. He is currently working to develop a family of systematic control strategies that work together with a robot’s natural dynamics to generate fast, reliable and efficient running motions.
Christopher Rasmussen, associate professor of Computer and Information Sciences at UD, is proficient in computer vision, mobile robotics and artificial intelligence. He received a 2006 National Science Foundation Career award to develop algorithms that enable robots to find and follow difficult trails, even in rugged terrain. His primary research has focused on computer vision for field robotics, enabling robots to “see” in outdoor environments.
Herbert Tanner joined UD in 2008 as an assistant professor of mechanical engineering. Tanner’s research expertise lies in multi-robot coordination; motion planning and control; mobile sensor networks; and hybrid and cyber-physical system modeling, analysis and design. He is a 2005 recipient of the National Science Foundation’s Career award in robotics. Tanner’s recent work applies formal language theory and linguistics algorithms to design robots that can “think for themselves.”