How malicious sounds can take control of a self-driving car

Techniques for breaking into electronic devices are becoming more sophisticated. The threat now extends beyond simple gadgets to the control systems of autonomous vehicles, as recent research by scientists at the University of Michigan has confirmed. In this case, a specially crafted sound signal served as the "hacking tool".

In the "risk group" are the numerous devices with built-in accelerometers, sensors that estimate an object's acceleration in real time. These include practically all modern gadgets, drones, autonomous cars, implantable medical electronics, and various industrial systems.

The scientists found that by targeting these devices with a precisely tuned acoustic tone, an attacker can force them to report false, and moreover attacker-controlled, readings. As a demonstration, they played a melody from a cheap speaker and caused devices to produce thousands of false readings.
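The core idea can be sketched in a short simulation. This is not the researchers' code; the sampling rate, tone frequency, and amplitude below are illustrative assumptions. It shows how a high-frequency vibration induced in the sensor can alias, once sampled, into a slow signal that looks like genuine motion:

```python
import numpy as np

# Illustrative sketch (assumed parameters, not the actual attack code):
# an accelerometer sampled at rate fs sees an acoustic tone near a
# multiple of fs alias down to a slow, attacker-chosen waveform.
fs = 100.0        # sensor sampling rate, Hz (assumed)
f_tone = 1002.0   # attack tone, Hz: 2 Hz away from 10 * fs
t = np.arange(0, 2.0, 1.0 / fs)  # 2 seconds of samples

true_accel = np.zeros_like(t)                     # device is actually at rest
injected = 0.5 * np.sin(2 * np.pi * f_tone * t)   # resonance-induced vibration
reading = true_accel + injected                   # what the sensor reports

# The sampled tone is indistinguishable from genuine 2 Hz motion:
alias_hz = abs(f_tone - round(f_tone / fs) * fs)
print(f"tone at {f_tone} Hz aliases to {alias_hz} Hz in the readings")
```

By choosing the tone frequency relative to the sensor's sampling rate, the attacker chooses the shape of the false signal, which is why the readings are not just noisy but controllable.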

In a follow-up experiment, the scientists took control of an Android smartphone running a toy-car remote-control app that steers the car according to the phone's tilt. They played a specially crafted audio file; the sound made the accelerometer report readings of the attackers' choosing, which the app forwarded to the car's control system. As a result, they gained complete control over the toy's movement.
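To see why spoofed accelerometer data translates directly into motion, consider how such an app might work. The function below is a hypothetical sketch; the thresholds, axis conventions, and command names are assumptions, not the actual app's logic. It maps a reported gravity vector to a drive command, so whoever controls the readings controls the car:

```python
import math

# Hypothetical tilt-steering logic (illustrative names and thresholds):
# the app only sees accelerometer values, so spoofed values steer the car.
def tilt_to_command(ax: float, ay: float, az: float) -> str:
    """Map a reported gravity vector (m/s^2) to a steering command."""
    roll = math.degrees(math.atan2(ay, az))                     # left/right tilt
    pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))   # forward/back tilt
    if pitch > 15:
        return "forward"
    if pitch < -15:
        return "reverse"
    if roll > 15:
        return "right"
    if roll < -15:
        return "left"
    return "stop"

# Spoofed readings drive the car even though the phone never moves:
print(tilt_to_command(0.0, 0.0, 9.81))   # phone flat -> "stop"
print(tilt_to_command(-4.0, 0.0, 9.0))   # injected forward tilt -> "forward"
```

The app has no way to distinguish a real tilt from an acoustically injected one, which is exactly the trust gap the researchers exploited.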

There is still a long way to go before an attacker with a boombox on his shoulder can hijack a driverless car at an intersection. Nevertheless, the researchers already want to warn developers of autonomous systems against relying entirely on sensor data.