Artists Are Using Nightshade Data Poisoning Tool In Escalating War With Generative AI
Artists are battling to keep their work from being scraped and used without consent to train generative AI, with some going as far as filing lawsuits. With OpenAI recently giving ChatGPT the ability to browse the live internet, the battle will likely only escalate. However, artists now have a new tool to help protect their work: Nightshade.

[Image: Example of AI-generated art.]
The developers of Nightshade describe it as "an optimized prompt-specific poisoning attack where poison samples look visually identical to benign images with matching text prompts." Fewer than 100 poison samples are enough to corrupt a targeted prompt in Stable Diffusion SDXL. The creators of Nightshade add that it should only be deployed as a last line of defense against web scrapers that ignore opt-out/do-not-crawl directives.
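To make the idea concrete, here is a minimal, hypothetical sketch of what a prompt-specific poisoning loop could look like: a benign image is perturbed within a small pixel budget, so it still looks unchanged to a human, until a feature extractor sees it as a different concept. The encoder, image tensors, and parameter values below are placeholder assumptions for illustration; this is not Nightshade's actual implementation.

    import torch
    import torch.nn.functional as F

    def craft_poison(benign_img, anchor_img, encoder, eps=0.05, steps=200, lr=0.01):
        """Conceptual sketch of a prompt-specific poison sample.

        benign_img: image whose caption matches the poisoned prompt
                    (e.g. a photo captioned "dog"), shape (1, 3, H, W), values in [0, 1].
        anchor_img: image of the concept the model should learn instead
                    (e.g. a cat), same shape.
        encoder:    any differentiable image feature extractor (a placeholder here).
        eps:        pixel-space budget keeping the poison visually identical
                    to the benign image.
        """
        delta = torch.zeros_like(benign_img, requires_grad=True)
        opt = torch.optim.Adam([delta], lr=lr)
        with torch.no_grad():
            target_feat = encoder(anchor_img)          # features to imitate
        for _ in range(steps):
            opt.zero_grad()
            poison = (benign_img + delta).clamp(0, 1)  # keep a valid image
            loss = F.mse_loss(encoder(poison), target_feat)
            loss.backward()
            opt.step()
            with torch.no_grad():
                delta.clamp_(-eps, eps)                # keep change imperceptible
        return (benign_img + delta).clamp(0, 1).detach()

The actual tool reportedly uses a perceptual similarity constraint rather than a raw pixel budget, but the loop above captures the core idea: the image a human sees and the features a model learns from it can be decoupled.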
Ben Zhao, a professor at the University of Chicago who led the team that created Nightshade, remarked that he hoped the new technology would help tip the balance of power back toward artists. According to an article by MIT Technology Review, his team is also responsible for creating Glaze, a tool that lets artists "mask" their personal style so that models trained on scraped copies of their work cannot mimic it.
Junfeng Yang, a computer science professor at Columbia University, remarked that Nightshade will make AI companies think twice, because taking an artist's work without consent now carries the possibility of "destroying their entire model."