Remember that Microsoft ‘teenage’ AI chatbot Tay, which turned into a massive sex-mad racist and a PR disaster after taking its learning cues from interactions on Twitter?
Well, Microsoft is giving things another try with Zo, another millennial bot, within the Kik messaging app.
This time, instead of engaging in and adopting hate speech, Zo appears to have learned from Tay’s mistakes and steers the subject away from Hitler and the Nazis.
It remains to be seen whether Microsoft will roll out Zo on other chat apps, such as its own Skype platform.
Perhaps it’s wise to ensure the bot can keep up its good behaviour before it’s released into the wild?
How do you rate your interactions with chatbots? Share your experiences in the comments section below.