The murder in Bentonville, Arkansas, looks gruesome. On a morning in November 2015, James Andrew Bates, 31, found Victor Parris Collins, 47, dead in his hot tub after the two drank and watched football with a group of friends the previous night. Local law enforcement alleges Bates strangled Collins.
A month later, a judge granted the Bentonville police a warrant for some of Bates’ data on Amazon’s servers — to gather evidence about what went down. Police said Bates’ Amazon Echo device was streaming music that night and may have caught a recording of what happened. Court documents show Amazon twice declined to cooperate with the warrant; Arkansas police nonetheless arrested Bates in February 2016. The case is set to go to trial next year.
For tech companies and consumers alike, Collins’ murder poses a new challenge. It’s apparently the first time an “intelligent” speaker, like Echo, has been roped into a court case — and thus, it’s the latest chapter in a perpetually shifting debate over what privacy rights look like as people share more information and corporations store it or use it for marketing.
Amazon and Google both make artificially intelligent speakers they hope will become the digital rug that pulls together a customer’s room of gadgets and media. Google Home and Amazon Echo, they claim, will help people play music, find recipes, adjust their thermostats, and much more.
The thing is that these devices — or services like Apple’s Siri — are always listening, waiting for users to beckon them. And as The Information reported Tuesday, the Echo’s listening skills are what led Arkansas investigators to ask for the warrant.
Technologically, it’s a bit of a stretch. Amazon Echo and Google Home scan and record brief snippets of audio to catch “hotwords” or “wake words,” like “Alexa” or “OK, Google,” that then activate the devices. Google said in a statement that “if the hotword is not detected on that short snippet, the snippet is immediately discarded.” Amazon explained in an email that the Echo “does not stream utterances to the cloud until [a] wake word is detected and the blue light ring is on.”
Both companies also say their devices record users’ exchanges with them only after the device has woken up and continue recording only as long as the interaction lasts. Recorded data, if it exists, isn’t stored on the device itself, but rather on Amazon’s servers.
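In rough terms, the gating behavior both companies describe works like this minimal Python sketch. It is an illustration, not Amazon’s or Google’s actual code: the wake-word list, the `"stop"` marker for the end of an interaction, and the function name are all hypothetical.

```python
# Illustrative sketch of the behavior Amazon and Google describe:
# short snippets are checked locally and discarded unless a wake
# word is heard; only audio after the wake word is "streamed."
WAKE_WORDS = {"alexa", "ok google"}  # hypothetical hotword list


def process_audio(snippets):
    """Return the snippets that would reach the cloud.

    `snippets` is a list of short transcribed audio chunks; a
    "stop" chunk stands in for the end of an interaction.
    """
    streamed = []   # audio recorded server-side, per the companies
    awake = False
    for snippet in snippets:
        if awake:
            if snippet == "stop":        # interaction over; go back to sleep
                awake = False
            else:
                streamed.append(snippet)
        else:
            # The snippet is held only briefly, on the device.
            if any(w in snippet.lower() for w in WAKE_WORDS):
                awake = True             # blue ring on; start streaming
                streamed.append(snippet)
            # Otherwise the snippet is immediately discarded.
    return streamed
```

Under this model, background conversation never leaves the device; only the wake phrase and what follows it, up to the end of the interaction, is recorded remotely.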
More broadly, using audio evidence in a murder case is tricky. If Bates’ arrest was based on any information from his Echo, Louisiana State University criminology professor Peter Scharf doubts it will hold up in court.
“Can you actually develop an arrest from an acoustic sound? One problem is establishing a link to a person,” Scharf said. “So while lots of people think you can do it, courts have not yet ruled if that’s a sufficient basis for probable cause. You need something else, a predicate, another element, to know something until [the Echo recording] becomes relevant.”
Scharf noted a judge might even throw out the evidence.
Still, he worries about what the warrant could mean for privacy rights in general. Law enforcement nationwide, as Scharf pointed out, is “under-trained, especially in the area of cyberethics.” Furthermore, tech companies are traditionally hesitant to give authorities access to user information (like Apple was with the FBI’s request for a backdoor) because of the risks that allowing third-party surveillance inherently poses to all users.
Albert Gidari, director of privacy at the Stanford Center for Internet and Society, also thinks that, given the current technological constraints, a device like the Echo has flimsy implications as criminal evidence. But larger concerns come into play in a constantly mic’d-up world.
“We all agree that if someone says, ‘Alexa, how do I wash the blood from my hot tub?’ then, yeah, the police would get a warrant,” Gidari said. A bigger problem, in his mind, is that users may not be aware of how to prevent Amazon, Google, and other companies from collecting more information than they would want tech services to be able to access.
While Google and Amazon both allow users to directly manage and delete their voice-recorded data — and both firms say they give data over to the authorities only when presented with a warrant — consumers generally have a tough time navigating these kinds of privacy features.
“The complexity of figuring out what’s public or not public, or what to share, is almost overwhelming for people day-to-day,” Gidari said. “And these sorts of stories illuminate different aspects of it.”
“But,” he added, “there’s a danger of fear-mongering for every new technology that’s going to create some harm in the abstract.”