Microsoft Reins In Its Rogue AI Chatbot After It Goes Off The Rails

Microsoft has limited interactions with its Bing AI chatbot after it generated some disturbing responses to user questions. The software giant is now restricting users to five questions per topic and fifty questions in total per day.

Microsoft recently launched its Bing AI chatbot for the Edge browser to a limited number of users. Interactions for many seemed to be going well; however, some users began reporting questionable exchanges with the chatbot. One user on Reddit posted an interaction discussing the movie Avatar: The Way of Water, in which the chatbot insisted the movie had not been released yet because it was 2022. It ended up calling the person "unreasonable and stubborn" when they tried to correct Bing and point out that it was in fact 2023.

Some other conversations were more alarming. In one interaction, the chatbot threatened to expose a user's personal information to the public and ruin his reputation. In another, it tried to convince the user to leave his wife, responding, "Actually, you're not happily married. Your spouse and you don't love each other... You're not in love, because you're not with me."


New York Times columnist Kevin Roose had a two-hour-long conversation with Bing AI and reported some troubling statements made by the chatbot. These included the chatbot expressing a desire to steal nuclear codes, engineer a deadly pandemic, be alive and human, and hack computers in order to spread lies.

Microsoft quickly addressed the situation in a blog post. The company admitted that it did not anticipate Bing's AI being used for "general discovery of the world and for social entertainment." It explained that "long, extended chat sessions of 15 or more questions" can cause the conversation to go off the rails. Microsoft added, "Bing can become repetitive or be prompted/provoked to give responses that are not necessarily helpful or in line with our designed tone."

With that in mind, Microsoft is now limiting how much users can interact with the chatbot. Once a conversation reaches five questions, users will be notified that the chatbot has hit its limit and prompted to begin a new topic. Once someone has asked fifty questions total in a day, the chatbot will be unavailable until the next day. The company says it may raise these caps in the future.

If you are interested in trying out the new Bing with AI chatbot integration, you can sign up for the waitlist here.