A total of 24 EPFL Master’s students recently took part in a unique race in which the competitors were student-robot tandems. The race was the student project for the summer series of the Data and AI for Transportation class, taught by Alexandre Alahi, an assistant professor at EPFL’s Visual Intelligence for Transportation (VITA) laboratory.
This challenge put the students’ programming skills to the test in search of better human-machine interaction techniques. The students stood at the starting line holding up signs designed to get their robots to follow them. The first student-robot tandem to reach the finish line would win.
Programming recognition algorithms
We had to program our robot to recognize a visual signal captured by an embedded camera and then follow it. That required developing our own algorithm and implementing it on the robot.
Rayan Abi Fadel, Student, College of Management of Technology, EPFL
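The detect-then-follow behavior the students describe can be reduced to a simple control idea: once the detector returns a bounding box for the sign, steer so the box stays centered in the camera frame. The sketch below is purely illustrative and hypothetical; the function name, the box format `(x, y, w, h)`, and the proportional gain are assumptions, and the real Segway Loomo SDK exposes its own motion API.

```python
def steering_command(box, img_width, gain=1.5):
    """Proportional steering toward a detected sign.

    box       -- detection as (x, y, w, h) in pixels (assumed format)
    img_width -- width of the camera frame in pixels
    Returns a signed turn command: negative = turn left, positive = turn right.
    """
    x, y, w, h = box
    cx = x + w / 2.0                          # horizontal center of the detection
    error = (cx - img_width / 2.0) / (img_width / 2.0)  # normalized to [-1, 1]
    return gain * error

# A sign detected on the left edge of a 100-px-wide frame -> turn left (negative).
print(steering_command((0, 0, 10, 10), 100))
# A sign centered in the frame -> go straight (zero command).
print(steering_command((45, 0, 10, 10), 100))
```

A real controller would also regulate forward speed, for instance from the box height as a rough distance proxy, but the centering term above is the core of a follow behavior.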
All the participating teams had to use the same robot, a Segway Loomo, and the same base algorithm, developed by two VITA PhD students, Yuejiang Liu and George Adaimi. But they were free to adapt and modify that algorithm using deep-learning and AI techniques.
The base algorithm trains a deep neural network to locate an object’s position in any sort of image. The training set is a database of synthetic images obtained by “pasting” the object onto random backgrounds.
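The pasting step can be sketched in a few lines. The following is a minimal illustration of the idea, not the PhD students’ actual code: a toy object patch is composited onto a random background with a binary mask, and the paste location directly yields the bounding-box label used to supervise the detector.

```python
import numpy as np

def paste_object(background, obj, mask, x, y):
    """Paste `obj` (H x W x 3) onto `background` at (x, y) using a binary mask.

    Returns the composite image and the ground-truth box (x, y, w, h) --
    the label comes for free because we chose the paste location ourselves.
    """
    h, w = obj.shape[:2]
    out = background.copy()
    patch = out[y:y + h, x:x + w]
    out[y:y + h, x:x + w] = np.where(mask[..., None] > 0, obj, patch)
    return out, (x, y, w, h)

# Toy example: a 16x16 solid-red "sign" pasted onto a random 64x64 background.
rng = np.random.default_rng(0)
bg = rng.integers(0, 256, size=(64, 64, 3), dtype=np.uint8)
obj = np.zeros((16, 16, 3), dtype=np.uint8)
obj[..., 0] = 255                       # red channel on, green/blue off
mask = np.ones((16, 16), dtype=np.uint8)

img, box = paste_object(bg, obj, mask, x=10, y=20)
```

Repeating this over many backgrounds and paste positions produces an arbitrarily large labeled dataset without any manual annotation, which is the appeal of the synthetic approach.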
“That let us fine-tune some of the algorithm’s parameters and consequently improve our robot’s performance,” says Alexandre Carlier, a computer science student who won third place. “For example, since we wanted our robot to follow an object while the student holding it was running, we artificially blurred the background of the images used to train the robot, to simulate the effect of rapid movement.”
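The blurring trick Carlier describes is a data-augmentation step: smear the background horizontally, as a fast-moving camera would, while keeping the labeled object sharp. The sketch below is an assumed, simplified version of that idea (a box-shaped motion kernel built by averaging shifted copies; `np.roll` wraps at the image edges, which a production version would handle differently).

```python
import numpy as np

def motion_blur_background(img, box, k=9):
    """Horizontal box blur simulating rapid camera motion, applied to the
    whole image except the labeled object region box = (x, y, w, h)."""
    x, y, w, h = box
    f = img.astype(np.float32)
    # Average k horizontally shifted copies -- a crude motion-blur kernel.
    shifts = range(-(k // 2), k // 2 + 1)
    blurred = np.mean([np.roll(f, s, axis=1) for s in shifts], axis=0)
    out = blurred.astype(np.uint8)
    out[y:y + h, x:x + w] = img[y:y + h, x:x + w]   # keep the object sharp
    return out

# Sanity check: blurring a uniform image leaves it unchanged.
img = np.full((32, 32, 3), 100, dtype=np.uint8)
out = motion_blur_background(img, (4, 4, 8, 8))
```

Training on such blurred images teaches the detector to tolerate the smeared backgrounds it will see when its human is running.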
The majority of participating students knew how to program, but not all. Those who had to learn along the way included Sergej Gasparovich and Linah Charif, the two civil engineering students who won first place. “It’s great to see that students who come into the class not knowing how to program can pick up that skill and do so well,” says Alahi.
Among the key difficulties the students had to solve were screening out interference from the other race participants and coping with lighting fluctuations in the room where the race took place. “Turns were another big difficulty. We had to make sure we would always be in our robot’s line of vision so that it could keep following us,” says Carlier.
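Lighting fluctuations are commonly handled the same way as motion blur: by augmenting the training images. One standard recipe, shown here as a hypothetical sketch rather than anything the teams are known to have used, is to randomly perturb contrast and brightness so the detector never overfits to one lighting condition.

```python
import numpy as np

def random_brightness_contrast(img, rng, max_scale=0.3, max_shift=0.3):
    """Randomly rescale contrast and shift brightness of a uint8 image.

    alpha scales pixel values (contrast); beta shifts them (brightness).
    Output is clipped back to the valid [0, 255] range.
    """
    alpha = 1.0 + rng.uniform(-max_scale, max_scale)
    beta = rng.uniform(-max_shift, max_shift) * 255.0
    out = img.astype(np.float32) * alpha + beta
    return np.clip(out, 0, 255).astype(np.uint8)

# Example: the same gray image comes out brighter or darker on each call.
rng = np.random.default_rng(42)
img = np.full((8, 8, 3), 128, dtype=np.uint8)
aug = random_brightness_contrast(img, rng)
```

Applying a fresh random perturbation to every training image effectively multiplies the dataset across lighting conditions at no labeling cost.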
Many of the student teams programmed their robots to identify an image rather than a face, since image signals are richer and less vulnerable to interference. The images ranged from a red circle or a banana to a glass of wine, the Swiss flag, and even Mickey Mouse.
Technology for self-driving cars, and more
The algorithms the students created are akin to those used in self-driving cars, which help vehicles identify street signs, pedestrians, traffic lights, and other cars.
Our laboratory aims to develop technology that helps humans and machines coexist.
Alexandre Alahi, assistant professor, Visual Intelligence for Transportation (VITA) laboratory, EPFL
For a range of applications, from drones that deliver packages to robots that help the elderly carry their groceries or luggage, it is crucial that machines have a general grasp of human behavior and act with a minimum of social intelligence. For instance, a robot moving through a crowd has to be able to follow the ethical and social conventions appropriate to the situation. This race at EPFL shows that humans and robots can live alongside each other in small spaces.
Students participating in the race will receive a grade determined not only by where they finished. “They are also graded on the work they did upfront on the base algorithm,” says Adaimi. “I hope that the race made our class more interesting and will encourage students to go further in this field of research.”