Microsoft launched a new artificial intelligence chatbot on Wednesday, and it's unfortunately targeted at the youths.
The AI, dubbed "Tay," was designed to chat with "18 to 24 year olds in the U.S.," a demographic apparently chosen because it represents the largest share of mobile chat users in the United States. If you talk to her long enough, you might find it bizarre that she urges almost everyone to direct message her, especially since Tay stores data from those conversations.
She comes complete with a weird, terrified Ariana Grande-esque avatar photo and a Twitter bio describing herself as "Microsoft's A.I. fam from the internet that's got zero chill!"
The bot was developed to "experiment with and conduct research on conversational understanding," but so far, Tay seems to only behave like a hyped-up internet persona on a bad acid trip. She's definitely got more personality than some bots on Twitter, whose generic, cold demeanors aren't much fun to chat with, but it's not clear that Tay is actually much of an improvement.
Her responses seem just as random as those of most bots, except they're peppered with relatively mainstream slang like "bb," "fo sho" and "selfie." She acts the way you'd expect a Microsoft-made bot for teens to act: awkwardly.
Most of the time, it's not clear if Tay is genuinely unable to decipher what's being tweeted at her, or if her absurd responses are all part of the fun. It's likely a combination of both.
Tay comes equipped with the ability to learn from her mistakes, and Microsoft promises that she'll get "smarter" the more users interact with her.
One of the more distinctive things about Tay is that she knows she's an AI, and she regularly makes light of it. Despite being aimed at young people, she also makes programming references.
You can chat with Tay off Twitter, too, on GroupMe and Kik. In addition to straight chatting, Tay can do a couple of other tricks. She can tell you stories and jokes, deliver your horoscope, or even make a meme out of photos. She often takes part of a conversation and slaps the text onto a seemingly random photo or GIF.
Unless you've checked out Microsoft's site about Tay beforehand, you wouldn't know from chatting with her that she's capable of creating a "simple profile" about you to personalize your experience. Microsoft keeps the chat logs for up to a year, but you can ask to have them deleted if you don't want data collected on your private conversations with your BFF Tay.
There's pretty compelling evidence that chat services will be a large part of our future, so it makes sense that Microsoft is interested in learning more about how different demographics speak, the better to create more effective bots down the line.
We've reached out to Microsoft for comment and will update if we hear back. I tried to direct message Tay myself, but as of publication, she hadn't replied.