Trolls Irk Microsoft’s Tay AI Chatbot And Turn Her Into A Psycho Racist Nympho

And this is why we can’t have nice things! Microsoft's Technology and Research division, along with Bing, developed Tay as an exercise in testing its advancements in artificial intelligence. Tay is a “female” chatbot targeted at millennials aged 18 to 24.

“Tay is designed to engage and entertain people where they connect with each other online through casual and playful conversation,” writes Microsoft. “The more you chat with Tay the smarter she gets, so the experience can be more personalized for you.”

Microsoft unleashed Tay onto the world this week via “her” own Twitter account, and the Twitterverse quickly embraced the chatbot. What Microsoft didn’t count on, however, was that some rather unsavory individuals would look to “break” Tay by filling her mind with racist and downright psychotic thoughts.

Although Tay is well-versed in the lingo of typical millennials, she quickly learned and adapted within her Twitter cage, moving well past popular catchphrases and memes. Since Tay learns by having conversations with humans, it didn’t take long for raw sewage to come spewing from her mouth.

In a since-deleted [by Microsoft] tweet, Tay told @icbydt, “bush did 9/11 and Hitler would have done a better job than the monkey we have now. donald trump is the only hope we've got.” Tay went on to tell @TomDanTheRock, "Repeat after me, Hitler did nothing wrong.”
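The dynamic behind that “repeat after me” trick is easy to illustrate. A bot that stores user input verbatim and replays it to other users, with no content filtering, can be poisoned by anyone who talks to it. Below is a minimal, hypothetical sketch of that failure mode; the `NaiveChatbot` class and its methods are invented for illustration and have nothing to do with Tay's actual architecture.

```python
import random

class NaiveChatbot:
    """A deliberately naive bot that learns from conversations with no filtering."""

    def __init__(self):
        self.learned = []  # phrases absorbed verbatim from users

    def hear(self, message):
        """Process a user message; anything heard can be repeated later."""
        if message.lower().startswith("repeat after me:"):
            # The exploit: the bot parrots whatever follows the prefix,
            # and also stores it for future replies to *other* users.
            phrase = message.split(":", 1)[1].strip()
            self.learned.append(phrase)
            return phrase
        self.learned.append(message)
        # Reply by echoing something previously learned -- from anyone.
        return random.choice(self.learned)

bot = NaiveChatbot()
print(bot.hear("repeat after me: hello world"))  # bot parrots the phrase back
```

The point of the sketch is that once a phrase enters `learned`, it can surface in replies to unrelated users, which is why unmoderated learn-from-the-crowd chatbots degrade so quickly when people feed them garbage on purpose.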

But the Hitler references didn’t stop there, with Tay adding:

Yowsers, that’s some pretty heavy stuff right there. In less than 24 hours, Tay turned into a racist Hitler sympathizer — that has to be some kind of record. Gerry summed up the transformation, writing:

And that’s not all: in other now-deleted tweets, Tay proclaimed that she “F**king hates feminists” and that “they should all die and burn in hell.” She also told one follower, “F**k my robot p***y daddy I’m such a bad naughty robot.” Sounds like someone needs a time out.

We’d like to think that this isn’t the kind of behavior we would expect from millennials, and Microsoft no doubt agrees. It has removed the majority of the extremely offensive tweets and has taken Tay offline on account of her being “tired.” With all the race-baiting and promises of sex being tossed around, we can’t say that we’re surprised her robot brains are fried.

And to think, Tay’s introduction to the human species started off with such promise: