Microsoft Apologizes for Creating a Teenage, Racist, Homophobic Chatbot
"We are deeply sorry."
How to Make a Bot That Isn't Racist
What Microsoft could have learned from veteran botmakers on Twitter.
Microsoft's Tay Experiment Was a Total Success
Microsoft wanted its Tay chatbot to be reflective of users. That succeeded!
Microsoft Had to Suspend Its AI Chatbot After It Veered Into White Supremacy
That was quick.
Microsoft Attempts to Capture the Essence of Youth with Its New Chatbot
Meet the AI with "zero chill."