Introducing X1: The First Multirobot System Combining a Humanoid and a Launchable Drone

Caltech’s Center for Autonomous Systems and Technologies (CAST) and the Technology Innovation Institute (TII) in Abu Dhabi have unveiled X1—the world’s first multirobot system that pairs a humanoid robot with a drone that can launch from its back, then drive away on wheels.

Engineers from the CAST/TII collaboration with X1, a humanoid robot that can carry and launch M4, Caltech's morphing robot. Image Credit: Academic Media Technologies/Caltech

This novel system is the result of a three-year collaboration between the two institutions, combining expertise in robotics, AI, autonomous systems, and propulsion. The outcome is a new kind of robotic mobility—one that leverages the strengths of walking, flying, and driving robots in a single integrated package.

Right now, robots can fly, robots can drive, and robots can walk. Those are all great in certain scenarios. But how do we take those different locomotion modalities and put them together into a single package, so we can benefit from the strengths of all of these while mitigating the downfalls that each of them has?

Aaron Ames, Director and Booth-Kresa Leadership Chair, Center for Autonomous Systems and Technologies

To test that concept, the team recently staged a demonstration on Caltech’s campus. The scenario imagined an emergency situation requiring autonomous agents to respond quickly. For the demo, the researchers modified a Unitree G1 humanoid to carry Caltech’s M4 (an existing robot that can fly and drive) like a backpack.

The sequence began at the Gates–Thomas Laboratory, where the humanoid walked through Sherman Fairchild Library and exited to a raised platform. There, it deployed the M4 by bending forward, allowing the robot to launch in drone mode.

After landing, M4 shifted to driving mode to continue toward its target. When it encountered Caltech’s Turtle Pond, M4 switched back to flight mode, soared over the obstacle, and completed the mission near Caltech Hall. A second M4 and the humanoid later regrouped at the emergency site.
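The drive-to-fly handoffs in the demo can be pictured as a simple mode-selection loop. The sketch below is purely illustrative: the `Mode` names, the obstacle labels, and the selection rule are assumptions for the sake of the example, not M4's actual control logic.

```python
from enum import Enum, auto

class Mode(Enum):
    DRIVE = auto()  # wheeled ground travel: energy-efficient, limited by terrain
    FLY = auto()    # rotor flight: terrain-agnostic, but energy-hungry

def choose_mode(obstacle):
    """Pick a locomotion mode for the next leg of the route.

    Illustrative rule of thumb: drive by default, and switch to flight
    only when the path is blocked by something wheels cannot cross
    (e.g. water, a ledge, dense rubble).
    """
    impassable_for_wheels = {"water", "ledge", "rubble"}
    if obstacle in impassable_for_wheels:
        return Mode.FLY
    return Mode.DRIVE

# Rough trace of the demo route: drive toward the target, hop the pond, drive on.
route = [None, "water", None]
modes = [choose_mode(obstacle) for obstacle in route]
print([m.name for m in modes])  # ['DRIVE', 'FLY', 'DRIVE']
```

The point of a rule like this is the trade-off the researchers describe: each modality covers for the weaknesses of the others, so the system only pays flight's energy cost when the ground route is blocked.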

The challenge is how to bring different robots to work together so, basically, they become one system providing different functionalities. With this collaboration, we found the perfect match to solve this.

Mory Gharib, the Hans W. Liepmann Professor, Aeronautics and Medical Engineering, California Institute of Technology

Gharib’s group, which originally developed M4, focuses on multimodal mobility and control systems. The Ames lab contributes deep knowledge of robot locomotion and safe control algorithms, while TII specializes in robotic autonomy and sensing in complex environments. Additional support came from Northeastern University, where engineer Alireza Ramezani’s team works on morphing robot design.

"The overall collaboration atmosphere was great. We had different researchers with different skill sets looking at really challenging robotics problems spanning from perception and sensor data fusion to locomotion modeling and controls, to hardware design," said Ramezani, Associate Professor, Northeastern.

When TII engineers visited Caltech in July 2025, the teams co-developed an upgraded version of M4, incorporating Saluki—a secure flight controller and onboard computer platform built by TII. The next phase will focus on adding advanced sensors, model-based algorithms, and machine learning to help the system navigate and adapt to real-world environments autonomously.

We install different kinds of sensors—lidar, cameras, range finders—and we combine all these data to understand where the robot is, and the robot understands where it is in order to go from one point to another. So, we bring the capability of the robots to move around with autonomy.

Claudio Tortorici, Director, Technology Innovation Institute
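The kind of multi-sensor localization Tortorici describes is often implemented by fusing independent position estimates, weighting each sensor by how noisy it is. The snippet below is a minimal inverse-variance average for a single coordinate; the sensor readings and noise values are invented for illustration and do not represent TII's actual pipeline.

```python
# Minimal illustration of fusing position estimates from several sensors.
# Each sensor reports an estimate and a variance (its self-reported noise);
# inverse-variance weighting trusts the less noisy sensors more.

def fuse(estimates):
    """estimates: list of (position, variance) pairs -> fused (position, variance)."""
    weights = [1.0 / var for _, var in estimates]
    total = sum(weights)
    position = sum(w * p for (p, _), w in zip(estimates, weights)) / total
    return position, 1.0 / total

# Hypothetical 1-D readings: lidar, camera odometry, range finder.
readings = [(10.2, 0.04), (9.8, 0.25), (10.5, 1.0)]
pos, var = fuse(readings)
print(round(pos, 2), round(var, 3))  # 10.16 0.033
```

Note that the fused variance is smaller than any individual sensor's, which is the practical payoff of combining lidar, cameras, and range finders rather than relying on one of them.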

But there’s more to X1 than meets the eye. According to Ames, the humanoid robot isn’t just mimicking pre-recorded human movements—a common approach in robotics. Instead, his team is working toward real-time, adaptive locomotion.

Ames argued, “If we want to really deploy robots in complicated scenarios in the real world, we need to be able to generate these actions without necessarily having human references.”

His lab develops mathematical models that represent the physics of movement, which are then combined with machine learning.

"The robot learns to walk as the physics dictate. So X1 can walk; it can walk on different terrain types; it can walk up and down stairs, and importantly, it can walk with things like M4 on its back," said Ames.

A key goal of the project is to improve safety and reliability in autonomous systems.

"I believe we are at a stage where people are starting to accept these robots. In order to have robots all around us, we need these robots to be reliable," said Tortorici.

That’s a priority for the research team.

"We're thinking about safety-critical control, making sure we can trust our systems, making sure they're secure. We have multiple projects that extend beyond this one that study all these different facets of autonomy, and these problems are really big. By having these different projects and facets of our collaboration, we are able to take on these much bigger problems and really move autonomy forward in a substantial and concerted way," concluded Ames.

A Symphony of Robotic Motion - Collaboration Between Caltech & TII

Video Credit: California Institute of Technology
