
New Interface Allows Users to Direct Drones Through Hand Gestures

Scientists from Skolkovo Institute of Science and Technology (Skoltech) have designed and developed a new interface that enables users to guide a small drone to light-paint patterns or letters via hand gestures.

DroneLight: Drone Draws in the Air using Long Exposure Light Painting and ML. IEEE RO-MAN 2020

Video Credit: Skolkovo Institute of Science and Technology.

Called DroneLight, the new interface can be utilized in entertainment, remote communication, and search and rescue operations. The study was published on a preprint server and presented at the IEEE International Conference on Robot & Human Interactive Communication (IEEE RO-MAN 2020).

Drones are turning out to be ubiquitous both in consumer and industrial applications, and engineers are exploring ways to make the interaction between humans and drones as natural and dependable as possible.

But as the study points out, “up to now, the available technologies have not made it possible to control drones intuitively without special training and additional control equipment.”

Flight control is a challenging task, as the user has to manipulate the joystick to stabilize and navigate drones. Only a very skillful operator can maintain a smooth trajectory, such as drawing a letter, and for the typical user it is almost impossible.

Dzmitry Tsetserukou, Study Co-Author and Professor, Skolkovo Institute of Science and Technology

Tsetserukou, Roman Ibrahimov, and Nikolay Zherdev from the Skoltech Intelligent Space Robotics Laboratory have designed a system that enables simple interaction with a micro-quadcopter with LEDs that can be utilized for light-painting.

The team utilized a 92 x 92 x 29 mm Crazyflie 2.0 quadrotor that weighs only 27 g and is fitted with a light reflector and an array of controllable RGB LEDs.

The control system includes a base station that runs a machine learning algorithm and also a glove fitted with an inertial measurement unit (IMU; an electronic device that monitors the movement of a user’s hand).
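The paper summarized here does not spell out the exact feature pipeline, but a common approach is to collapse a window of IMU samples into a fixed-length feature vector before classification. A minimal sketch, in which the function name and the choice of summary statistics are illustrative assumptions rather than the authors' method:

```python
import numpy as np

def gesture_features(accel, gyro):
    """Collapse a window of IMU samples into a fixed-length feature vector.

    accel, gyro: arrays of shape (n_samples, 3) holding accelerometer and
    gyroscope readings recorded while the user performs a gesture with
    the sensor glove.
    """
    feats = []
    for signal in (accel, gyro):
        feats.extend(signal.mean(axis=0))  # average motion per axis
        feats.extend(signal.std(axis=0))   # variability per axis
        feats.extend(signal.min(axis=0))   # extremes of the gesture
        feats.extend(signal.max(axis=0))
    return np.array(feats)  # 24-dimensional feature vector
```

Summaries like these make gestures of different durations comparable, which is what a fixed-input classifier on the base station needs.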

The Machine Learning Algorithm

The machine learning algorithm matches the users’ gestures to pre-defined patterns or letters and guides the drone to light-paint them. For their experiment, the engineers selected five letters (S, K, O, L, and J) and trained a Random Forest Classifier to link the hand gestures for these letters to the corresponding drone trajectories.
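The study names a Random Forest classifier as the gesture-to-letter mapping step. The sketch below uses synthetic stand-in data (the feature dimensionality and dataset sizes are assumptions, not values from the paper) purely to illustrate how such a classifier turns a gesture feature vector into a letter label:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
letters = ["S", "K", "O", "L", "J"]

# Synthetic stand-in for recorded gesture data: one 24-dimensional
# feature vector per gesture sample (in the real system these would
# come from the glove's IMU readings).
X = rng.normal(size=(250, 24))
y = np.repeat(letters, 50)
# Shift one feature per class so the classes are separable.
X[np.arange(250), np.repeat(np.arange(5), 50)] += 4.0

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X, y)

# A predicted letter then selects the pre-defined drone trajectory
# that light-paints it.
predicted = clf.predict(X[:1])[0]
```

In the full pipeline, the predicted letter is only an index: the drone still flies a pre-defined trajectory for that letter, rather than tracing the hand motion directly.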

The researchers have planned to develop their system by adding more letters to its “alphabet”, introducing more user gestures to the dataset, and producing a more accurate and faster machine learning algorithm.

The most fascinating application can be DroneMessenger, where partners can not only exchange messages and emoji over a distance but also enjoy the light art on a starry night.

Dzmitry Tsetserukou, Study Co-Author and Professor, Skolkovo Institute of Science and Technology

“Another application is a drone show, where an operator can generate dynamic light patterns in the sky in real time. You can also imagine another system, SwarmCanvas, where users located in remote places can draw a joint picture on the canvas of the night sky. Currently, drone show systems just reproduce predesigned trajectories and lighting patterns,” concluded Tsetserukou.

