With generative AI tools like Midjourney, Stable Diffusion and DALL-E fueling an onslaught of images created from text prompts, a growing number of artists have expressed concern that their work is getting scraped from the internet to train AI models without permission, credit or compensation.
Enter Nightshade, a new tool out of the University of Chicago that aims to help artists safeguard their work before they upload it online, where it could be ingested into AI training sets. The tool “poisons” digital images with subtle, invisible pixel-level changes that cause AI models to misinterpret them.
An artist, for example, might draw a picture that’s clearly a flower to anyone who looks at it. With Nightshade applied, an AI model will recognize the image as something altogether different, like a truck, corrupting the model’s accuracy.
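Nightshade’s actual algorithm hasn’t been published, but the general idea of an image-poisoning attack can be sketched in a few lines. The snippet below is a hypothetical illustration, not Nightshade’s method: it adds a perturbation too small for a human to notice, then pairs the image with a deliberately wrong label, so a model trained on the pair learns the wrong association.

```python
import numpy as np

# Conceptual sketch of image poisoning (NOT Nightshade's unpublished
# algorithm): perturb an image imperceptibly, then mislabel it.

rng = np.random.default_rng(0)

# A stand-in "flower" image: 64x64 RGB pixel values in [0, 1].
flower = rng.random((64, 64, 3))

# A perturbation far below what a human viewer would notice.
epsilon = 0.004
perturbation = rng.uniform(-epsilon, epsilon, size=flower.shape)
poisoned = np.clip(flower + perturbation, 0.0, 1.0)

# Humans still see a flower; the training pair now says "truck",
# nudging any model trained on it toward the wrong concept.
training_pair = (poisoned, "truck")

# The two images remain visually indistinguishable.
print(np.abs(poisoned - flower).max() <= epsilon)  # True
```

In practice, attacks like Nightshade craft the perturbation with an optimization procedure targeting a specific model’s feature space, rather than using random noise; the random perturbation here only stands in for that step.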
“Power asymmetry between AI companies and content owners is ridiculous,” the team behind the tool said on Twitter, now X.
Some artists, to be sure, say they’re excited about AI’s creative possibilities, but others fear it jeopardizes their livelihood. Polish digital artist Greg Rutkowski doesn’t blame those who use his name as a text prompt. For them, “it’s a cool experiment,” he told MIT Technology Review. “But for me and many other artists, it’s starting to look like a threat to our careers.”
Datasets for large AI models can consist of billions of images, so the more poisoned images get scraped into a model’s training data, the more its accuracy degrades, notes the publication, which introduced Nightshade to the world this week.
OpenAI, maker of ChatGPT and DALL-E, did not immediately respond to a request for comment.
A team led by Ben Zhao, a University of Chicago computer science professor, created Nightshade with input from artists. The researchers detail their work in a paper that’s been submitted for peer review.
Nightshade, Zhao said in an email interview, should provide a strong disincentive for AI companies that train their algorithms on images scraped from anywhere online.
The system “is meant to affect model trainers who disregard copyright, opt-out lists and do-not-scrape directives,” he added. “Ethical companies should have nothing to worry about from Nightshade.”
The tool hasn’t yet been released in any form.
That, however, did not stop a company called Turing’s Solutions, which offers consulting on AI-related matters, from publishing an online article about an “antidote” to Nightshade a day after news of it broke.
“As AI models continue to ingest vast amounts of online data, there is a risk of poisoning attacks—malicious data designed to mislead and corrupt AI systems,” reads the post. “The answer,” the post continues, “is not more poisoning but instead releasing ‘antidotes’ to detect and filter out poisoned content.”
Zhao was quick to debunk the existence of the countermeasure, starting with the fact that Nightshade isn’t out yet.
“You can be certain anyone offering an ‘antidote’ is lying,” he wrote on X. “You can’t test, much less ‘cure’ what you don’t have.”
‘More Tools To Fight Back’
In March, the University of Chicago released a similar AI-thwarting product called Glaze “to protect artists against invasive AI” by essentially masking their personal style. Time magazine this week named Glaze one of the best inventions of 2023. But Nightshade takes Glaze’s capabilities a step further.
“Glaze only protects artists against models that fine tune their work,” Zhao said. “It does nothing to prevent their work from being scraped without their consent and used to train base models.”
Nightshade might be integrated into Glaze as an optional enhancement, and its creators say they might also release an open-source version.
But with AI advancing as quickly as it is, not everyone’s convinced Nightshade will be able to outrun the technology, or even that it should.
“Why poison your art for one generation of AI? It may be the only thing that carries your art outside of and long after humanity,” one X user wrote in a thread responding to the Nightshade team.
Many artists, however, thanked the team for its innovation and asked when they could start using Nightshade to poison their own work.
“Not all heroes wear capes,” one X user wrote. “Glad that the real artists are getting more tools to fight back.”