Voice Assistant Is Suspected to Be a Traitor

SEATTLE - May 28, 2018

"The walls have ears," people would say when they wanted to protect a secret or a private conversation. Now, in the age of smart devices, the fear is that the devices themselves are the ones revealing those private conversations.

The case of a family from Seattle, Washington, in the United States, who suffered a security breach in their own home through an Amazon personal assistant known as Echo, has set off alarms. The family was at home when they received a call from a friend who said: "Turn off Alexa. You are being hacked." Apparently the device had recorded a family conversation and sent it to someone on the device's contact list, without any authorization.

It is not the first time something like this has happened with these devices. Last month researchers reported a flaw in Alexa, Amazon's voice assistant, which activated when it misheard a command in a live conversation. According to a statement sent to The Washington Post, Amazon spokespeople explained that Echo woke up when it heard a sound resembling "Alexa." It then heard what it took to be a command to send a message, to which it answered "To whom?" From that same conversation it picked out the name of the contact to whom the message was sent. "These machines can interpret human voices in the wrong way," says Wenchao Zhou, an expert in computer science at Georgetown.

Listening when it should not is not a problem exclusive to Amazon. Last year Google had trouble with its Home Mini, a voice-controlled speaker used to play music and manage electronic devices, when it was found to be constantly recording sounds at home and sending them to Google. Recently, researchers found that Siri, Alexa and the Google Assistant would obey hidden audio instructions undetectable to the human ear.

Although Amazon says it is evaluating these cases so that they do not happen again, the incident puts a spotlight on the hot topic of security in these devices, part of what is called the Internet of Things, a term used to describe appliances that have sensors connected to the network. In March of this year, Kaspersky Lab found a security breach in smart cameras frequently used to monitor babies when they are alone, or for the safety of homes and offices. These vulnerabilities would allow hackers to remotely access the cameras' audio and video streams, disable them remotely, or execute arbitrary malicious code.

The concern is greater with voice assistants such as Siri, Alexa and Cortana, because almost everyone has them active on a mobile phone without knowing that they can listen to conversations and send them to anyone on the contact list. After all, they are computers with microphones and speakers, connected to the network, which, experts say, could turn them into espionage devices. According to Geoffrey Fowler, technology columnist at The Washington Post, these devices are always awake, passively listening for a command that activates them, such as "Alexa," "O.K. Google" or "Hey Siri." The problem is that these devices are not perfect and do not respond only when the user wants them to. This is partly because separating a voice command from a similar-sounding phrase in a conversation at home is not easy. Amazon's spokespeople, however, say that these events are very rare and that they will work to make them even rarer.
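To see why separating a wake word from similar-sounding speech is hard, here is a deliberately simplified toy sketch in Python. It is not any vendor's real detection algorithm (real systems work on audio, not text); it compares text similarity with the standard library's difflib, and the wake words and the 0.75 threshold are illustrative assumptions.

```python
import difflib

# Illustrative wake phrases and a hypothetical activation threshold.
WAKE_WORDS = ["alexa", "ok google", "hey siri"]
THRESHOLD = 0.75

def wake_score(heard: str) -> float:
    """Best similarity between an overheard utterance and any wake word."""
    heard = heard.lower().strip()
    return max(difflib.SequenceMatcher(None, heard, w).ratio()
               for w in WAKE_WORDS)

def would_activate(heard: str) -> bool:
    """A toy device 'wakes up' when similarity crosses the threshold."""
    return wake_score(heard) >= THRESHOLD

print(would_activate("Alexa"))         # True: the intended command
print(would_activate("Alexia"))        # True: a name in conversation, false trigger
print(would_activate("good morning"))  # False: clearly different speech
```

Any threshold forces a trade-off: lower it and ordinary conversation triggers the device; raise it and genuine commands are missed. That trade-off, not carelessness, is why these assistants sometimes wake when no one called them.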

It is no secret that modern equipment allows remote activation of a phone's microphone and camera, which can lead to unauthorized interception of conversations and unauthorized photos and video. It is also possible to pick up the signal from a mobile phone's antenna and microphone and intercept it before it is accepted by the nearest GSM station.

A special device called an "IMSI catcher" (from International Mobile Subscriber Identity, a unique identifier written to the SIM card) presents itself to nearby mobile phones as a real base station of the cellular network. This trick is possible because, in the GSM standard, a mobile phone is obliged to authenticate itself at the request of the network, but the network itself (the base station) does not have to confirm its authenticity to the phone. Once a mobile phone accepts the IMSI catcher as its base station, the device can deactivate the encryption enabled by the subscriber and operate with an ordinary open signal, relaying it onward to the real base station. Today this trick is used successfully by American police. According to the Wall Street Journal, the U.S. Department of Justice collects data from thousands of American citizens' mobile phones through devices that mimic cell towers. These devices, known as dirtboxes, are mounted aboard Cessna aircraft and are designed to catch persons suspected of committing crimes. According to sources familiar with the project, the program has been in service with the U.S. Marshals Service since 2007 and covers most of the country's population.
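The asymmetry described above can be sketched in a few lines of Python. This is a conceptual model only, not real GSM signaling: the class names, the sample IMSI and the station names are invented for illustration. The point it captures is that the phone proves its identity on request, but never demands proof from the station, so a fake tower is attached to exactly like a real one.

```python
class BaseStation:
    """A cell tower -- real or an IMSI catcher; the phone cannot tell."""
    def __init__(self, name: str, genuine: bool):
        self.name = name
        self.genuine = genuine  # known to us, invisible to the phone

    def request_auth(self, phone: "Phone") -> str:
        # The network challenges the phone; the phone always answers.
        return phone.prove_identity()

class Phone:
    def __init__(self, imsi: str):
        self.imsi = imsi

    def prove_identity(self) -> str:
        # One-way authentication: the phone identifies itself...
        return self.imsi

    def attach(self, station: BaseStation) -> str:
        # ...but never asks the station to prove anything in return.
        station.request_auth(self)
        return station.name  # attaches to whichever signal is strongest

phone = Phone(imsi="310150123456789")   # sample IMSI, illustrative
tower = BaseStation("real-tower", genuine=True)
catcher = BaseStation("dirtbox", genuine=False)

print(phone.attach(tower))    # real-tower
print(phone.attach(catcher))  # dirtbox -- accepted just the same
```

Because `attach` never consults `genuine`, the catcher succeeds without breaking any cryptography; it simply exploits the fact that GSM authentication runs in one direction only.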

Meanwhile, everyone who has these technologies at home or on a cell phone is wondering what to do so that what happened to the Seattle family does not happen to them. Fowler recommends disabling some features of the Alexa application, such as allowing purchases with voice commands. Another option is to turn off automatic activation of Alexa. It is also possible to mute the device and disconnect the microphone. The most radical option is to turn the devices off entirely, as did the Seattle family, which is now asking Amazon for its money back.

Author: USA Really