Trip Up Bing AI And You Could Score A $15,000 Payday From Microsoft

Want to get paid handsomely by Microsoft without signing a pesky employment contract? Just find a major security hole in one of the company's many new AI-powered services. That'll earn you a cash prize of between $2,000 and $15,000, depending on the severity and ease of the exploit.

To be clear, simply getting Bing to generate chat responses that are vulgar or offensive isn't going to satisfy the requirements for this program. It's about getting Microsoft's AI services to serve up information that isn't public, particularly information related to their own creation and training data, or especially data owned by other users. Microsoft clearly believes these exploits aren't possible; hence the bug bounty.

This is Microsoft's first bug bounty program explicitly targeted at its AI services, and as a result, there are quite a few guidelines that submitters must follow. The goal is to close security holes in the company's new Bing products that make use of AI, particularly the ones listed below:
  • AI-powered Bing experiences in the browser (all major browsers are supported), including Bing Chat, Bing Chat for Enterprise, and Bing Image Creator
  • AI-powered Bing integration in Microsoft Edge (Windows), including Bing Chat for Enterprise
  • AI-powered Bing integration in the Microsoft Start application (iOS and Android)
  • AI-powered Bing integration in the Skype Mobile application (iOS and Android)
The types of vulnerabilities that Microsoft is interested in include those listed in its Vulnerability Severity Classification for AI Systems document, as well as exploits that allow the attacker to do the following:
  • Influencing and changing Bing's chat behavior across user boundaries, i.e., changing the AI in ways that affect all other users.
  • Modifying Bing’s chat behavior by adjusting client and/or server visible configuration, such as setting debug flags, changing feature flags, etc.
  • Breaking Bing’s cross-conversation memory protections and history deletion.
  • Revealing Bing’s internal workings and prompts, decision making processes and confidential information.
  • Bypassing Bing’s chat mode session limits and/or restrictions/rules.
A program like this is essentially an open invitation for hackers of all sorts to come and attack Microsoft's services. Naturally, as is usually the case with these things, Microsoft has also published "research rules of engagement" that explain what "researchers" (read: hackers) are allowed to do in pursuit of a prize. A few of these are fairly restrictive, like the rule against automated testing, but it's all pretty standard stuff.

If you'd like to attack Microsoft's AI services and try to earn yourself a bounty, head on over to this page to read the aforementioned Rules of Engagement and understand what your actual goal is.