For a newborn wildebeest or giraffe, birth is a hazardous introduction to the world: predators watch for a chance to attack the herd's weakest member. That is why many species have evolved young that can find their footing within minutes of birth.
It is a remarkable evolutionary feat that has long fascinated biologists and roboticists. Now, a team of researchers at the USC Viterbi School of Engineering believes it is the first to build an AI-controlled robotic limb, driven by animal-like tendons, that can be tripped up and then regain its composure within the time of the next footfall, a task the robot was never explicitly programmed to perform.
Francisco J. Valero-Cuevas, a professor of biomedical engineering and of biokinesiology and physical therapy at USC, working with USC Viterbi doctoral student Ali Marjaninejad and two other doctoral students, Dario Urbina-Melendez and Brian Cohn, has developed a bio-inspired algorithm that can learn a new walking task on its own after just five minutes of free play, and then adapt to other tasks without any further programming.
Their work, featured as the March cover article of Nature Machine Intelligence, opens exciting possibilities for understanding human movement and disability, designing responsive prosthetics, and building robots that can interact with complex, changing environments such as space exploration and search-and-rescue.
"Nowadays, it takes the equivalent of months or years of training for a robot to be ready to interact with the world, but we want to achieve the quick learning and adaptations seen in nature," said senior author Valero-Cuevas, who also has appointments in computer science, electrical and computer engineering, mechanical and aerospace engineering and neuroscience at USC.
Marjaninejad, a doctoral candidate in the Department of Biomedical Engineering at USC and the lead author of the paper, said this innovation is similar to the natural learning that takes place in babies. He explains that the robot was first given time to explore its surroundings through free play, a process known as "motor babbling."
"These random movements of the leg allow the robot to build an internal map of its limb and its interactions with the environment," said Marjaninejad.
The paper's authors state that, in contrast to most recent work, their robots learn by doing, without any prior or parallel computer simulations to guide the learning.
Marjaninejad added that this is especially important because programmers can anticipate and code for many situations, but not for every conceivable one, so pre-programmed robots are inevitably prone to failure.
However, if you let these [new] robots learn from relevant experience, then they will eventually find a solution that, once found, will be put to use and adapted as needed. The solution may not be perfect, but will be adopted if it is good enough for the situation. Not every one of us needs or wants—or is able to spend the time and effort—to win an Olympic medal.
Ali Marjaninejad, Study Lead Author and Doctoral Student, Department of Biomedical Engineering, Viterbi School of Engineering, USC.
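The learning loop described above, a period of free-play "motor babbling" followed by adopting whatever solution proves "good enough," can be sketched in a few lines. This is a minimal illustrative sketch, not the authors' actual algorithm; the `actuate` interface and the scalar command/outcome model are invented here for illustration.

```python
import random

def motor_babble(actuate, n_samples=200, seed=0):
    """Free play: issue random motor commands and record what happens.

    `actuate` is a hypothetical stand-in for the robot's motor/sensor
    interface; it maps a motor command to a sensed outcome.
    """
    rng = random.Random(seed)
    experience = []
    for _ in range(n_samples):
        command = rng.uniform(-1.0, 1.0)  # random tendon activation
        outcome = actuate(command)        # sensed limb response
        experience.append((command, outcome))
    return experience

def adopt_good_enough(experience, target, tolerance=0.1):
    """Adopt the first recorded command whose outcome is 'good enough'
    for the target; fall back to the closest one seen. The adopted
    solution need not be optimal, only workable."""
    for command, outcome in experience:
        if abs(outcome - target) <= tolerance:
            return command
    return min(experience, key=lambda ce: abs(ce[1] - target))[0]

# Toy stand-in for the limb: the sensed outcome is twice the command.
experience = motor_babble(lambda c: 2.0 * c)
command = adopt_good_enough(experience, target=0.5)
```

Because the adopted habit is simply whichever solution the random exploration happened to find, different seeds yield different but equally workable behaviors, which echoes the individualized gaits the researchers describe below.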
Through this process of discovering their bodies and surroundings, the robot limbs built in Valero-Cuevas' lab at USC use their own experience to form a gait pattern that works well enough for them, yielding robots with personalized movements. "You can recognize someone coming down the hall because they have a particular footfall, right?" Valero-Cuevas asks. "Our robot uses its limited experience to find a solution to a problem that then becomes its personalized habit, or 'personality'—We get the dainty walker, the lazy walker, the champ... you name it."
The potential applications for the technology are numerous, especially in assistive technology, where robotic limbs and exoskeletons that are intuitive and responsive to a user's personal needs would be invaluable to people who have lost the use of their limbs.
"Exoskeletons or assistive devices will need to naturally interpret your movements to accommodate what you need," Valero-Cuevas said.
"Because our robots can learn habits, they can learn your habits, and mimic your movement style for the tasks you need in everyday life—even as you learn a new task, or grow stronger or weaker."
According to the authors, the research will also have solid applications in space exploration and rescue missions, enabling robots that can do what needs to be done without being supervised or managed as they venture onto a new planet, or into uncertain and unsafe terrain in the aftermath of a natural disaster. Such robots could adapt to low or high gravity, or to loose rocks one day and mud after it rains, for instance.
The paper's two other authors, doctoral students Brian Cohn and Dario Urbina-Melendez, also contributed to the research.
The ability of a species to learn and adapt its movements as its body and environment change has been a powerful driver of evolution from the start. Our work constitutes a step towards empowering robots to learn and adapt from each experience, just as animals do.
Brian Cohn, Doctoral Candidate in Computer Science, Viterbi School of Engineering, USC.
"I envision muscle-driven robots, capable of mastering what an animal takes months to learn, in just a few minutes," said Urbina-Melendez, a doctoral candidate in biomedical engineering who trusts in the capacity for robotics to take daring inspiration from life. "Our work combining engineering, AI, anatomy and neuroscience is a strong indication that this is possible."