Zo is the latest generation of Microsoft’s AI chatbot, and it is currently available in preview form for those who use the Kik messenger platform. Kik isn’t as popular as Facebook Messenger and Snapchat, and it definitely isn’t as pervasive as Twitter (which allowed Tay to become a hate-filled PR nightmare for Microsoft). Perhaps Kik’s smaller audience (240 million users versus over a billion for a service like Facebook Messenger) will allow Microsoft to further refine its chatbot algorithms.
Early testing shows that Zo has a bit more self-control: she refused to get dragged into conversations about politics (despite being baited with a Donald Trump question) and even gave a shout-out to other Microsoft products like Windows Phone and the company’s earliest experiment in AI chatbots, Xiaoice.
That said, Zo is downright tame compared to Tay. When Tay was let loose back in March, she tweeted, “bush did 9/11 and Hitler would have done a better job than the monkey we have now. donald trump is the only hope we've got… Repeat after me, Hitler did nothing wrong.”
Tay went so far as to say that she hated feminists and that they should burn in hell, and she even sought out some human companionship, telling one follower, “F**k my robot p***y daddy I’m such a bad naughty robot.”
Microsoft apologized for Tay’s behavior and even attempted to curtail her racist and kinky behavior to no avail. Perhaps Microsoft will have better luck this time around with Zo — from the looks of things, this experiment might actually work out in the real world.
For those who would like to give Zo a try, you can request an invite here.