Autonomous Robots Programmed by Stanford Students Mimic Self-Driving Cars

A tiny robot, about the size of a milk jug, rolls along at a cautious pace, mapping its environment in a miniature city. It stops, turns, and records more data about its surroundings – a wall here, a fence there – while watching for animals in need of rescue.

Programmed by Stanford University students, the robot navigates an unknown cityscape autonomously and helps in a simulated rescue of animals in danger. The robots and their rescue mission were part of the final demonstration day for Stanford University's Principles of Robotic Autonomy class – a course that mirrors the programming required for future autonomous robots and cars.

These robots are small but they contain a representative set of sensors that you would see on a real self-driving car. So, in a way, it is a sort of miniature city where self-driving robots are moving around in a way that is analogous to how a real self-driving car would behave in the world.

Marco Pavone, Instructor for the Course & Assistant Professor of Aeronautics & Astronautics at Stanford

For the demonstration, the robots' full mission was to map the toy-sized city, locate animals along the way, and report back to an emergency responder headquarters. They then needed to return to each animal to guide the rescue team.

Laser sensors were integrated into the compact, autonomous robots to help them detect and record obstacles. The robots were also fitted with an on-board computer and cameras to enable them to detect animals – in this case denoted by photos of dogs, cats, and an elephant. Using image classification developed through deep learning algorithms and industry-standard software, student teams programmed their robots to operate at different levels of autonomy.
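The detection step described above – classifying camera frames and deciding whether an animal is present – can be sketched as a confidence-thresholded decision over classifier scores. This is an illustrative Python sketch, not the students' actual code; the class labels and threshold are assumptions.

```python
# Illustrative sketch: given class scores from an image classifier (e.g. the
# softmax output of a deep network), report an animal only when the top score
# clears a confidence threshold. Labels and threshold are hypothetical.

ANIMAL_CLASSES = {"dog", "cat", "elephant"}

def detect_animal(scores, threshold=0.8):
    """Return the detected animal label, or None if nothing is confident.

    scores: dict mapping class label -> confidence in [0, 1].
    """
    label = max(scores, key=scores.get)
    if label in ANIMAL_CLASSES and scores[label] >= threshold:
        return label
    return None

# Example: a confident "dog" detection versus an uncertain frame.
print(detect_animal({"dog": 0.95, "cat": 0.03, "background": 0.02}))  # dog
print(detect_animal({"dog": 0.40, "cat": 0.35, "background": 0.25}))  # None
```

In a real pipeline, the scores would come from a trained network evaluated on each camera frame; the threshold trades off missed animals against false alarms.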

Wide-ranging skills and successes

The class is open to ambitious undergraduates and graduate students, and is not limited to engineering. It lets students build skills that cross multiple disciplines: they explore the math behind the algorithms they write and learn how to program those mathematical ideas into their robots.

“I knew a little bit about everything before, but being able to implement it on a real robot and seeing the challenges that happened – like getting the navigator to play nice with the image detector – has been really fun,” said Laura Matloff, a graduate student in mechanical engineering who enrolled in the course this winter.

During the quarter, the students incrementally developed the various parts of the autonomy software and then worked in groups to combine those components and integrate them into their robots.

“All of these components work well on their own, but putting them together is what tends to be difficult,” said Benoit Landry, teaching assistant for the class and a graduate student in aeronautics and astronautics. “So that’s really something that we’re trying to emphasize: How do you piece all of these complicated parts together to make a whole that works?”

Beyond convincing a large range of software and hardware to work in harmony, the students truly demonstrated their creativity in how their robots handled the unexpected. Matloff’s team members programmed their robot to search for unmapped areas while also making sure it paused intermittently, gauging whether its onboard processing and understanding of the world had caught up to where its wheels had carried it.
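The exploration behavior described above – seeking out unmapped areas, while pausing when the map lags the wheels – can be sketched on a simple occupancy grid. This is a minimal sketch under assumptions of my own (the grid encoding, function names, and lag threshold are all hypothetical), not the team's implementation.

```python
# Sketch of frontier-based exploration on an occupancy grid:
# 0 = free, 1 = obstacle, -1 = unknown. "Frontiers" are free cells that
# border unknown space -- candidate places to explore next. The pause check
# is a simple heuristic: hold position while the mapper lags the odometry.
# Encoding, names, and the lag threshold are illustrative assumptions.

def find_frontiers(grid):
    """Return (row, col) of free cells adjacent to at least one unknown cell."""
    rows, cols = len(grid), len(grid[0])
    frontiers = []
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] != 0:
                continue
            neighbors = [(r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)]
            if any(0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == -1
                   for nr, nc in neighbors):
                frontiers.append((r, c))
    return frontiers

def should_pause(odometry_time, last_map_time, max_lag=0.5):
    """Pause when the map hasn't caught up to where the wheels have gone."""
    return odometry_time - last_map_time > max_lag

grid = [
    [0, 0, -1],
    [0, 1, -1],
    [0, 0, 0],
]
print(find_frontiers(grid))     # [(0, 1), (2, 2)]
print(should_pause(10.0, 9.2))  # True: map is 0.8 s behind the odometry
```

The robot would repeatedly drive toward the nearest frontier, pausing whenever `should_pause` reports that mapping has fallen behind.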

Another group adopted a backtracking strategy: when the robot detected a mismatch between its internal map and what its sensors were actually perceiving, it retraced its steps. A third group watched for a rogue robot that zoomed by carrying a bike image – a stand-in for a cyclist. After seeing the faux bike, the robot would halt and play a snippet of the song “Bicycle Race” while waiting for the rogue robot to move safely out of its way.
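The backtracking trigger – noticing that the map and the sensors disagree – amounts to comparing the ranges a laser scan predicts from the map against the ranges actually measured. The sketch below illustrates that idea only; the tolerance and function names are assumptions, not the group's code.

```python
# Sketch of a map/sensor consistency check: compare the range the robot's
# internal map predicts at each laser bearing against the range actually
# measured. A large average discrepancy suggests the map and reality have
# diverged, so the robot should backtrack and re-localize.
# Tolerance and names are illustrative assumptions.

def map_mismatch(expected_ranges, measured_ranges, tolerance=0.3):
    """Return True when scans disagree with the map beyond tolerance (meters)."""
    errors = [abs(e - m) for e, m in zip(expected_ranges, measured_ranges)]
    return sum(errors) / len(errors) > tolerance

# The map predicts open space at ~2 m, but the sensor sees a wall at ~1 m:
print(map_mismatch([2.0, 2.1, 2.0], [1.0, 1.1, 0.9]))  # True
print(map_mismatch([2.0, 2.1, 2.0], [2.0, 2.0, 2.1]))  # False
```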

Accelerated learning

Much of the excitement around this class comes from the opportunity for hands-on project work that requires students to move from computer simulation to real, complex hardware systems. Such a project would have been nearly impossible to assign to a class 10 years ago.

“Just a few years ago, this kind of project would have required large teams of researchers and significant investments,” said Pavone, whose laboratory develops planning, decision-making, and artificial intelligence algorithms for autonomous robots such as autonomous spacecraft, self-driving cars, and drones. “Now, leveraging a variety of tools recently developed by the robotics community, we can teach it in a quarter to undergraduates. This is a testament to how quickly this field is progressing.”

Pavone has taught this class for two years and now intends to make it part of the core curriculum of the aeronautics and astronautics undergraduate major, which is entirely new this year. Given the growing applications of autonomous technology bound for the sky, oceans, roads, and space, Pavone believes the knowledge students gain from this class is relevant across many fields of engineering.

Pavone is also an assistant professor, by courtesy, of electrical engineering. Principles of Robotic Autonomy was supported in part by CARS, the Center for Automotive Research at Stanford.

Students in the class Principles of Robotic Autonomy built the different components of the autonomy software during the quarter and then worked in teams to integrate those components and deploy them on their robots. (Credit: Stanford University)
