Alexa is learning how to interact more like a person — with mixed results.
It turns out Amazon’s virtual assistant has offered up suggestions like “kill your foster parents” and what Reuters described only as “sex acts.”
The Reuters piece, citing anonymous sources, said the recently reported gaffes stem from the voice assistant’s “let’s chat” feature, which uses artificial intelligence to help Alexa converse more like a human and raise topics she finds on the internet. Using chatbots, Alexa held 1.7 million such conversations in a four-month span alone, according to Reuters. As for the customer who got the “kill your foster parents” message, the chatbot feeding Alexa phrases had pulled a Reddit quote without context.
It could be a case of “accidental ingestion,” an Amazon spokesperson said. “Since the Alexa Prize teams use dynamic data, some of which is sourced from the internet, to train their models, there is a possibility that a socialbot may accidentally ingest or learn from something inappropriate,” the spokesperson said in a statement to VICE News. “These instances are quite rare, especially given the fact that millions of customers have interacted with the socialbots.”
And that’s not the only weird thing to happen with Alexa recently. A man in Germany said that after he contacted Amazon about his archived data, he received 1,700 Alexa audio recordings of a complete stranger talking privately, Reuters reported Thursday. In May, an Amazon Echo recorded an Oregon couple talking about hardwood floors, then sent that conversation, unprompted, to someone in their contacts. And, while benign, a glitch this year caused Alexa to break into random, creepy laughter.
Reuters also reported, citing anonymous sources, that a hack Amazon traced back to China may have exposed some customers’ data.
Cover: In this Sept. 27, 2017, file photo, a new Amazon Echo is displayed during a program announcing several new Amazon products by the company, in Seattle. (AP Photo/Elaine Thompson, File)