Amazon Alexa user received audio files of a stranger at home with a female companion
- Company blames ‘human error’ for sending a German customer recordings made by a device belonging to someone else
- Privacy problem highlights the risk of keeping an always-on, internet-connected microphone in intimate spaces
When a person using Amazon.com’s voice assistant in Germany asked to listen to his archive of recordings, he got much more than he was expecting.
Besides receiving his own audio history captured by a home microphone, the user also gained access to 1,700 audio files from a person he did not know.
Amazon sent the man a link that contained a stranger’s recordings, allowing him to listen to another man speaking inside his home with a female companion, according to the German trade magazine c’t.
“This was an unfortunate case of human error and an isolated incident,” Amazon told The Washington Post in a statement on Thursday. “We have resolved the issue with the two customers involved and have taken steps to further improve our processes. We were also in touch on a precautionary basis with the relevant regulatory authorities.”
The first man notified Amazon of the improperly shared recordings, according to the report. Amazon deleted the files from the link the company had accidentally shared with him, but the violation of privacy had already transpired: after Amazon sent the link, the user downloaded the stranger’s audio recordings to his computer.
The incident in Germany follows a widely covered Alexa privacy mishap that occurred much closer to Amazon’s home. Earlier this year, a family in Portland, Oregon, discovered their Alexa-powered Echo device had recorded their private conversation and sent it to a random person in their contacts list.
The disturbing event, first reported by Washington state’s KIRO 7, went viral, highlighting the risk of keeping an always-on, internet-connected microphone in someone’s most intimate spaces.
The mistake also drew attention to the uncanny readiness with which American consumers have accepted microphone-linked voice assistants into their homes and how the devices actually work.
Voice-based devices like Amazon’s Echo and Google Home are always “awake”, passively listening for the command that activates them. Users can mute the devices, and they can also review past recordings and delete them. Google and Amazon keep a copy of every conversation.
Amazon has not disclosed how many times users have been improperly granted access to another person’s recordings. Amazon said the company apologised to the person who received the audio files and to the person whose conversations were accidentally shared.