Scientists have found a way to trick Alexa and Siri with a laser

A team of researchers from the University of Electro-Communications (Tokyo, Japan) and the University of Michigan (USA) has developed a method for sending voice commands to digital assistants using a laser.

[dropcap]S[/dropcap]cientists call this vulnerability “Light Commands”.

“Light Commands is a vulnerability of MEMS microphones that allows attackers to remotely inject inaudible and invisible commands into voice assistants, such as Google assistant, Amazon Alexa, Facebook Portal, and Apple Siri using light,” the researchers explain.

Takeshi Sugawara

It all started when Takeshi Sugawara, a researcher at the University of Electro-Communications in Tokyo, noticed strange behavior in his iPad.

As it turned out, when a powerful laser was aimed at the tablet’s microphone, the device perceived the light as sound. By varying the laser’s intensity sinusoidally at 1,000 cycles per second, Sugawara produced a tone that the iPad’s microphone picked up and converted into an electrical signal.


After six months of work, Sugawara and a team of researchers at the University of Michigan turned this photoacoustic effect into something far more serious.

Scientists learned to use a laser to “talk” to any device that accepts voice commands, including smartphones, Amazon Echo smart speakers, Google Home devices, and Facebook Portal video-calling devices. The researchers were able to send light commands over hundreds of meters and use them to open garage doors, make online purchases, and more.

The experiments showed that if you aim a laser at a microphone and modulate its intensity at a given frequency, the light causes the microphone’s diaphragm to vibrate at that same frequency. The researchers then shaped the laser’s intensity over time to match the frequencies of a human voice.

As a result, the microphone converted the modulated light into an electrical signal just as it would sound waves, and the device received silent voice commands.
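The modulation described above is simple amplitude modulation: the laser’s optical power is driven by a DC bias plus the scaled audio waveform. The sketch below illustrates the idea in Python; the function and parameter names are illustrative, not taken from the researchers’ actual tooling.

```python
import math

def am_laser_intensity(audio, bias=0.5, depth=0.4):
    """Map audio samples in [-1, 1] to normalized laser drive levels.

    The Light Commands effect relies on amplitude-modulating the laser's
    optical power with the audio waveform: the MEMS microphone responds
    to the changing light intensity as if it were sound pressure.
    (Hypothetical helper for illustration, not from the paper.)
    """
    out = []
    for sample in audio:
        level = bias + depth * sample           # AM: DC bias + scaled audio
        out.append(min(1.0, max(0.0, level)))   # clamp to the drive range
    return out

# A 1 kHz test tone sampled at 48 kHz, like the tone in Sugawara's
# first experiment (two full cycles here).
sample_rate = 48_000
tone = [math.sin(2 * math.pi * 1000 * n / sample_rate) for n in range(96)]
drive = am_laser_intensity(tone)
```

The drive levels stay within the laser’s normalized output range, swinging around the DC bias at the tone’s frequency, which is exactly the vibration the microphone’s diaphragm then reproduces.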

If an infrared laser is used, the attack on a device’s microphone is not only inaudible but also invisible.

According to researchers, this can present a serious hazard.

“By shining the laser through the window at microphones inside smart speakers, tablets, or phones, a far away attacker can remotely send inaudible and potentially invisible commands which are then acted upon by Alexa, Portal, Google assistant or Siri,” the researchers report.

Making things worse, once an attacker has gained control over a voice assistant, they can use it to break into other systems. For example, the attacker can:

  1. Control smart home switches;
  2. Open smart garage doors;
  3. Make online purchases;
  4. Remotely unlock and start certain vehicles;
  5. Open smart locks by stealthily brute-forcing the user’s PIN.
[box]Is it possible to mitigate this issue?

An additional layer of authentication can be effective at somewhat mitigating the attack. Alternatively, if the attacker cannot eavesdrop on the device’s response, having the device ask the user a simple randomized question before executing a command can be an effective way of preventing the attacker from achieving successful command execution.[/box]
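The randomized-question mitigation can be sketched as a simple challenge-response gate. This is a minimal illustration of the idea, assuming hypothetical `speak` and `listen` callbacks; no real assistant exposes this exact API.

```python
import random

def confirm_sensitive_command(command, speak, listen):
    """Ask a simple randomized question; run `command` only on a match.

    Sketch of the mitigation described above: before executing a
    sensitive command, the assistant speaks a random challenge that a
    laser-wielding attacker outside the window cannot hear, so the
    attacker cannot inject the correct answer. (Hypothetical helper
    names, not any assistant's actual API.)
    """
    a, b = random.randint(2, 9), random.randint(2, 9)
    speak(f"To confirm, what is {a} plus {b}?")
    answer = listen()
    if answer.strip() == str(a + b):
        return command()
    speak("Confirmation failed; command cancelled.")
    return None
```

A legitimate user in the room hears the question and answers it; a light-injection attacker who cannot eavesdrop on the spoken challenge can only guess, so the sensitive command is refused.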

About the author

Sophia Zimmerman

High-quality tech & computer security copywriter, SEO editor & online marketing consultant
