
Microsoft is having another crack at AI chatbots, hopefully minus the racism

Remember that Microsoft ‘teenage’ AI chatbot Tay, which turned into a massive sex-mad racist and a PR disaster after taking its learning cues from interactions on Twitter?

Well, Microsoft is giving things another try with Zo, another millennial bot, within the Kik messaging app.

This time, instead of engaging in and adopting hate speech, Zo appears to have learned from Tay’s mistakes and steers the subject away from Hitler and the Nazis.

Zo, first spotted by Twitter user Tom Hounsell (via MSPowerUser), is yet to be officially announced by Microsoft, but Kik users can start chatting with it here.

It remains to be seen whether Microsoft will roll out Zo on other chat apps, such as its own Skype platform.

Perhaps it’s wise to make sure the bot can keep up its good behaviour before it’s released into the wild.


How do you rate your interactions with chatbots? Share your experiences in the comments section below.