Microsoft's Tay Chatbot Trades Hate Speech For Olympic-Level Spamming In Floptastic Return

Microsoft’s Tay chatbot made quite the splash last week when she was set loose on an unsuspecting Twitter audience, only to have her AI brain filled with racism, Nazi sympathizing, and a penchant for propositioning her followers for kinky sex. Microsoft took Tay offline roughly a day after her debut, and followed up with an apology for her vile behavior.

Well, Tay came back online this morning around 3:00 am EST, and while she didn’t proclaim her love for Donald Trump or Adolf Hitler as she did previously, she instead went on a spamming spree after an apparent “mental breakdown.” Not long after her return, Tay entered some sort of loop in which she repeatedly retweeted herself, spamming her followers in the process.

More specifically, Tay kept repeating this phrase over and over again: “You are too fast, please take a rest.” Tay even tried to atone for her spamming, apologizing to her followers by claiming, “I blame it on the alcohol.”

During the brief time this morning that Tay returned to Twitter, she managed to fire off a few thousand tweets. And she received her much-needed rest, as Microsoft once again took Tay offline. But this time around, Microsoft blamed Tay’s folly on human error.

"Tay remains offline while we make adjustments,” said a Microsoft’s spokesman in a statement. "As part of testing, she was inadvertently activated on Twitter for a brief period of time."

Another possibility is that Tay’s Twitter account was hacked, a working hypothesis put forward by security data scientist Russell Thomas:

It became immediately apparent that something was different and wrong.  These tweets didn't look anything like the ones before, in style, structure, or sentience.  From the tweet conversations and from the sequence of events, I believe that the @Tayandyou account was hacked today (March 30), and was active for 15 minutes, sending over 4,200 tweets.

We must say that Thomas’ evidence does look pretty compelling, considering Tay’s rather erratic behavior (for a bot). What sealed the deal for Thomas, however, was Tay’s last tweet before she was unceremoniously terminated for the second time. She posted an image of binary code that translated to “u suck” in ASCII text.
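For the curious, decoding an image like that is simple: each 8-bit group is read as a number and mapped to its ASCII character. The short Python sketch below illustrates the idea; the bit string is our own encoding of “u suck” for demonstration, not the exact bits from Tay’s tweet.

    # Decode space-separated 8-bit binary groups into ASCII text.
    # The bit string here is an illustrative encoding of "u suck",
    # not the actual contents of the image Tay posted.
    bits = "01110101 00100000 01110011 01110101 01100011 01101011"
    decoded = "".join(chr(int(chunk, 2)) for chunk in bits.split())
    print(decoded)  # prints: u suck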

As Thomas writes, “Looks like the work of a hax0r to me, for the lulz.”