Scientists Hack Robotic Vacuum Cleaner to Record Music and Speech Remotely

Scientists have shown how commonly used robotic household vacuum cleaners could be remotely hacked to serve as microphones.

Researchers repurposed the laser-based navigation system on a vacuum robot (right) to pick up sound vibrations and capture human speech bouncing off objects like a trash can placed near a computer speaker on the floor. Image Credit: Sriram Sami.

The team—including Nirupam Roy, an assistant professor in the Department of Computer Science at the University of Maryland—gathered data from the laser-based navigation platform in a common vacuum robot and applied signal processing and deep learning methods to recover speech and detect television programs running in the same room as the device.

The study shows that any device employing light detection and ranging (LiDAR) technology could potentially be exploited to collect sound, even if it has no microphone.

The study, a collaboration with Assistant Professor Jun Han at the National University of Singapore, was presented on November 18th, 2020, at the Association for Computing Machinery’s Conference on Embedded Networked Sensor Systems (SenSys 2020).

We welcome these devices into our homes, and we don't think anything about it. But we have shown that even though these devices don't have microphones, we can repurpose the systems they use for navigation to spy on conversations and potentially reveal private information.

Nirupam Roy, Assistant Professor, Department of Computer Science, Institute for Advanced Computer Studies, University of Maryland

A household vacuum bot’s LiDAR navigation system shines a laser beam around a room and detects the reflection as it bounces off nearby objects. Using these reflected signals, the robot maps the room and avoids collisions as it travels around the house.

Privacy experts have suggested that the maps created by vacuum bots, which are typically stored in the cloud, pose potential privacy risks: they could give advertisers access to information such as home size, which suggests income level, and other lifestyle-related details.

Roy and his colleagues wondered whether the LiDAR in these robots might also pose a security risk by serving as a sound recording device in users’ homes or businesses.

Sound waves cause objects to vibrate, and these vibrations produce small variations in the light bouncing off the object. Laser microphones, used for espionage since the 1940s, can convert those variations back into sound waves. However, laser microphones rely on a targeted laser beam reflecting off very smooth surfaces, such as glass windows.
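The underlying principle can be illustrated with a toy simulation (all parameters here are hypothetical, not measurements from the study): a sound wave slightly modulates the intensity of reflected light, and demodulating that intensity recovers a waveform closely matching the original sound.

```python
import numpy as np

# Hypothetical parameters: a 440 Hz tone sampled at 8 kHz for 0.5 s
fs = 8000
t = np.arange(0, 0.5, 1 / fs)
sound = np.sin(2 * np.pi * 440 * t)       # vibration of the reflecting object

# The vibration modulates the intensity of the reflected laser light
baseline = 1.0                            # steady reflected intensity
depth = 0.01                              # tiny modulation depth
reflected = baseline * (1 + depth * sound)
reflected += np.random.normal(0, 0.001, len(t))  # sensor noise

# "Demodulation": remove the DC component and rescale
recovered = reflected - reflected.mean()
recovered /= np.abs(recovered).max()

# The recovered waveform correlates strongly with the original tone
corr = np.corrcoef(sound, recovered)[0, 1]
print(f"correlation with original sound: {corr:.2f}")
```

Even with a modulation depth of only 1%, the tone survives the round trip, which is why a vibrating surface near a sound source can act as an unintended diaphragm.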

By contrast, a vacuum’s LiDAR scans its surroundings with a laser and detects the light scattered back by objects that are irregular in shape and density. The scattered signal reaching the vacuum’s sensor carries only a fraction of the information needed to recover sound waves.

The team was uncertain whether a vacuum bot’s LiDAR system could be manipulated to function as a microphone, and whether its signal could be interpreted as meaningful sound.

First, the team hacked a robot vacuum to show they could control the position of the laser beam and send the sensed data to their laptops over WiFi, without disrupting the device’s navigation.
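The exfiltration step they describe, streaming sensor readings from the device to a nearby laptop, can be sketched with a minimal UDP sender and listener. The addresses and one-float-per-packet format are illustrative assumptions, not the researchers’ actual protocol; loopback stands in for the home WiFi.

```python
import socket
import struct
import threading

HOST, PORT = "127.0.0.1", 9999   # loopback stands in for the home network

received = []
recv_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
recv_sock.bind((HOST, PORT))     # bind before sending so no packets are missed
recv_sock.settimeout(5)

def laptop_listener(n_packets):
    # Each UDP packet carries one float32 intensity sample
    for _ in range(n_packets):
        data, _ = recv_sock.recvfrom(1024)
        received.append(struct.unpack("f", data)[0])

samples = (0.91, 0.93, 0.90, 0.94)   # fake LiDAR intensity readings
listener = threading.Thread(target=laptop_listener, args=(len(samples),))
listener.start()

# The "vacuum" side sends its readings
send_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
for s in samples:
    send_sock.sendto(struct.pack("f", s), (HOST, PORT))
send_sock.close()

listener.join()
recv_sock.close()
print(f"laptop received {len(received)} samples")
```

The point of the demonstration was that this streaming could happen without interfering with the vacuum’s normal navigation duties.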

They then conducted experiments with two sound sources. The first was a human voice reciting numbers played over computer speakers; the second was audio from various television shows played through a TV soundbar.

Roy and his team then captured the laser signal detected by the vacuum bot’s navigation system as it was reflected off a range of objects positioned near the sound source. The objects included a trash can, takeout container, cardboard box, and polypropylene bag—items that might typically be left on a floor.

The team passed the captured signals through deep learning algorithms trained either to match human voices or to identify musical sequences from television shows. Their system, which they named LidarPhone, identified and matched spoken numbers with 90% accuracy. It also identified television shows from just a minute’s worth of recording with over 90% accuracy.
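A heavily simplified version of such a classification step can be sketched as follows: extract spectral features from recovered signals and match them against known classes. The synthetic tones and nearest-centroid classifier here are illustrative stand-ins, not the actual LidarPhone deep learning pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 8000
t = np.arange(0, 0.25, 1 / fs)

def noisy_tone(freq):
    """Stand-in for a low-SNR signal recovered from LiDAR returns."""
    return np.sin(2 * np.pi * freq * t) + rng.normal(0, 0.5, len(t))

def features(sig):
    """Magnitude spectrum as a simple feature vector."""
    return np.abs(np.fft.rfft(sig))

# Two classes, e.g. two spoken digits with different dominant pitch
train = {0: [features(noisy_tone(300)) for _ in range(10)],
         1: [features(noisy_tone(600)) for _ in range(10)]}
centroids = {c: np.mean(v, axis=0) for c, v in train.items()}

def classify(sig):
    f = features(sig)
    return min(centroids, key=lambda c: np.linalg.norm(f - centroids[c]))

# Evaluate on fresh noisy samples
cases = [(noisy_tone(300), 0) for _ in range(10)] + \
        [(noisy_tone(600), 1) for _ in range(10)]
accuracy = np.mean([classify(s) == c for s, c in cases])
print(f"accuracy: {accuracy:.0%}")
```

Spectral structure survives heavy noise, which is the same property that lets a trained model recognize digits or TV audio in a signal far too noisy for a human listener.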

This type of threat may be more important now than ever, when you consider that we are all ordering food over the phone and having meetings over the computer, and we are often speaking our credit card or bank information.

Nirupam Roy, Assistant Professor, Department of Computer Science, Institute for Advanced Computer Studies, University of Maryland

But what is even more concerning for me is that it can reveal much more personal information. This kind of information can tell you about my living style, how many hours I'm working, other things that I am doing. And what we watch on TV can reveal our political orientations. That is crucial for someone who might want to manipulate political elections or target very specific messages to me, Roy added.

The researchers point out that vacuum cleaners are just one example of potential vulnerability to LiDAR-based eavesdropping. Many other devices could be open to similar attacks, such as passive infrared sensors used for motion detection or smartphone infrared sensors used for face recognition.

I believe this is significant work that will make the manufacturers aware of these possibilities and trigger the security and privacy community to come up with solutions to prevent these kinds of attacks.

Nirupam Roy, Assistant Professor, Department of Computer Science, Institute for Advanced Computer Studies, University of Maryland

Source: http://www.umdrightnow.umd.edu/
