Police in Bentonville, Arkansas, would love to know whether an Amazon Echo picked up the sounds of a crime. Amazon isn't playing along. As reported by the Washington Post, the investigation highlights the privacy issues around increasingly popular smart-home products.
The Amazon Echo in question was in a home where a murder took place in November. Victor Collins was found dead in a hot tub after a social gathering that ran late into the previous night.
Police found the Echo among many other smart devices in the home, but the Echo is the only one that records what you say. Specifically, it's always listening for its wake word, "Amazon" or "Alexa," which cues it to start recording. The recording includes a short snippet of what you said just before the wake word, as well as the command that follows. The device sends your command to cloud servers, where your speech is interpreted and a response is sent back to the device.
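To make that storage behavior concrete, here's a minimal, purely illustrative sketch of how wake-word-gated recording can work. This is not Amazon's code; the wake-word detector, frame size, buffer length, and cloud call (detect_wake_word, capture_command, send_to_cloud, WAKE_WORDS, PRE_ROLL_FRAMES) are all invented placeholders. The point it illustrates is that audio sits only in a small, constantly overwritten local buffer until a wake word is heard, and only then is anything captured and sent off.

```python
from collections import deque

WAKE_WORDS = {"alexa", "amazon"}   # hypothetical wake words, per the article
PRE_ROLL_FRAMES = 8                # size of the rolling "what you said before" buffer


def detect_wake_word(frame: str) -> bool:
    # Placeholder detector; a real device runs an on-device keyword model.
    return frame.lower() in WAKE_WORDS


def send_to_cloud(frames: list) -> str:
    # Placeholder for the cloud round trip that interprets the command.
    return "response to: " + " ".join(frames)


def capture_command(stream, max_frames: int = 5) -> list:
    # Grab the frames spoken after the wake word (simplified cutoff).
    return [frame for _, frame in zip(range(max_frames), stream)]


def listen(stream):
    pre_roll = deque(maxlen=PRE_ROLL_FRAMES)  # constantly overwritten; nothing persists
    for frame in stream:
        if detect_wake_word(frame):
            # Only now is anything kept: the pre-roll snippet, the wake word,
            # and the command that follows are handed off for interpretation.
            command = list(pre_roll) + [frame] + capture_command(stream)
            print(send_to_cloud(command))
            pre_roll.clear()
        else:
            pre_roll.append(frame)  # dropped unless a wake word arrives soon after


if __name__ == "__main__":
    # Simulated microphone input, one "frame" per word.
    listen(iter(["hello", "there", "alexa", "play", "country", "music"]))
```

In a sketch like this, frames that never precede a wake word are simply overwritten, which is why the recordings investigators are after would only exist if someone actually addressed the device.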
According to Amazon’s Alexa FAQ, your commands are stored in the cloud and organized for easy review. You can delete recordings, and you can also turn off the Echo’s microphone, though that obviously compromises the usefulness of the device.
The question in this case is whether anyone issued a command to the Echo around the time of the murder. Without that wake word, the device stores nothing.
According to the Washington Post, Amazon has refused to comply with a warrant issued by law enforcement, and an Amazon spokesperson declined to comment.
Why this matters: Amazon Echo and Google Home both reportedly sold well over the holidays, so chances are you or someone you know has a smart-home assistant sitting on the coffee table. The conveniences they offer are impressive, if not addictive, but the data they collect raises privacy issues. Maybe no one cares how many times you asked for country music or the weather, but the fact that the data is stored and could be accessed, even by someone who simply knows how to get into your account, is important to understand.