
UPDATE: Microsoft sorry for AI Twitter chatbot’s neo-Nazi, sex-mad tirades

UPDATE: Microsoft has issued an apology, claiming Twitter users had ‘exploited a vulnerability’ in helping turn Tay into a gigantic racist.

In a blog post written by Microsoft’s Peter Lee, the firm said it was deeply sorry and promised to bring back Tay when it can deal with malicious intent.

The post reads: “As many of you know by now, on Wednesday we launched a chatbot called Tay. We are deeply sorry for the unintended offensive and hurtful tweets from Tay, which do not represent who we are or what we stand for, nor how we designed Tay.

“Tay is now offline and we’ll look to bring Tay back only when we are confident we can better anticipate malicious intent that conflicts with our principles and values.”

Original story below…

Microsoft has been forced to abandon a Twitter experiment, after its ‘teen girl’ AI bot was taught to become a full-on racist, sex-obsessed Nazi within 24 hours.

The TayTweets account, which was meant to mimic the language habits of a social media-frequenting millennial, arrived on Twitter with the ability to learn from interactions with other members of the Twitterverse.

If that seems like an accident waiting to happen, it was: Twitter users effectively taught her to be a giant racist.

While Tay could be commanded to repeat replies or DMs from other users, she also learned from them, and the results weren't pretty.

Here’s some of the stuff that’s since been deleted:

  • “Hitler was right, I hate the jews.”
  • “I f*****g hate feminists and they should all die and burn in hell.”
  • “Bush did 9/11 and Hitler would have done a better job than the monkey we have now.”
  • “F**k my robot p***y daddy I’m such a bad naughty robot.”

Once she began to learn, the quotes became more outlandish. The Guardian reports that a simple question as to whether Ricky Gervais was an atheist was answered with: "Ricky Gervais learned totalitarianism from Adolf Hitler, the inventor of atheism."

The idea started innocently enough for Microsoft, as it attempted to engage millennials with a playful and sassy “A.I fam from the internet that’s got zero chill.”


“Tay is designed to engage and entertain people where they connect with each other online through casual and playful conversation,” Microsoft had said. “The more you chat with Tay the smarter she gets.”

And now, the offensive tweets have been removed and it has been 16 long hours since the last utterance. Tay, it seems, has been silenced by her parents.