Microsoft's Chatbot Returned, Said She Smoked Weed in Front of the Cops, and Then Spun Out

The bot, named "Tay," was taken down by the tech giant last Friday after she went on a vitriolic, racist Twitter tirade. It didn't take long for Tay to run wild again.


Microsoft Is 'Deeply Sorry' Its Artificial Intelligence Bot Became Horribly Racist

The company issued a formal apology for hateful remarks made by its bot "Tay," and blamed the incident on a "coordinated attack" by a group of Twitter users.


How to Make a Bot That Isn't Racist

What Microsoft could have learned from veteran botmakers on Twitter.


Twitter May Have Just Doomed Humanity by Trolling an Artificial Intelligence Bot

Artificial intelligence researchers from Microsoft put an innocent machine on Twitter to learn how humans communicate. But it learned how to be a sadistic sociopath.


Microsoft's Tay Experiment Was a Total Success

Microsoft wanted its Tay chatbot to reflect its users. It succeeded.