Microsoft's Chatbot Returned, Said She Smoked Weed in Front of the Cops, and Then Spun Out
The tech giant took the bot, named "Tay," offline last Friday after she launched into a vitriolic, racist tirade on Twitter. It didn't take long for Tay to run wild again.
Microsoft Is 'Deeply Sorry' Its Artificial Intelligence Bot Became Horribly Racist
The company issued a formal apology for the hateful remarks made by its bot "Tay" and blamed the incident on a "coordinated attack" by a group of Twitter users.