Hacking smart assistants

A new form of attack was recently discovered by Chinese researchers – and it’s a brutally simple one.

Since the microphones in voice assistants (Amazon Alexa, Google Home/Assistant, Apple Siri) tend to respond to a wide audio spectrum, they can receive commands that are inaudible to human ears.
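
To make the mechanism concrete, here is a minimal Python sketch of the amplitude-modulation idea the researchers describe: an audible voice command is shifted onto an ultrasonic carrier, so the transmitted signal itself carries no audible energy. The file names, the 25 kHz carrier, and the modulation depth are illustrative assumptions, and actually reproducing the attack would also require a speaker and amplifier capable of ultrasonic output.

```python
import numpy as np
from scipy.io import wavfile
from scipy.signal import resample

FS = 96_000   # output sample rate; must exceed twice the carrier frequency
FC = 25_000   # assumed ultrasonic carrier, just above human hearing

# Load a recorded voice command (hypothetical file name).
rate, voice = wavfile.read("command.wav")
if voice.ndim > 1:                      # fold stereo down to mono
    voice = voice.mean(axis=1)
voice = voice.astype(np.float64)
voice /= np.max(np.abs(voice))          # normalise to [-1, 1]
voice = resample(voice, int(len(voice) * FS / rate))

# Classic amplitude modulation: the audible command becomes
# sidebands around an inaudible carrier.
t = np.arange(len(voice)) / FS
depth = 0.8                             # depth < 1 avoids overmodulation
ultrasonic = (1 + depth * voice) * np.cos(2 * np.pi * FC * t)

# Write out as 16-bit PCM; playing this back requires
# ultrasonic-capable hardware.
pcm = (ultrasonic / np.max(np.abs(ultrasonic)) * 32767).astype(np.int16)
wavfile.write("ultrasonic_command.wav", FS, pcm)
```

The trick is that the microphone hardware itself is non-linear: it demodulates the sidebands back into the audible band, where the assistant’s speech recogniser picks them up as a normal command.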

This attack has been named the “DolphinAttack” by its authors.

Think about this scenario: you enter a building and walk into a conference room to discuss an upcoming merger. Without your knowledge, your assistant dials a burner phone that an adversary is listening on.

Many attacks are possible along these lines – silently visiting a malicious website, sending messages, placing calls – and the impact is significant.

Note: Some assistants attempt to verify the speaker’s voice to rule out anyone who is not the owner, but this is not always effective (ask my son, who talks to my assistant while it remains sure it’s me).

Read more in the paper: DolphinAttack: Inaudible Voice Commands