Artists 'Poison' AI Image Tools

By John Lister

Disgruntled artists have exploited a flaw in artificial intelligence image generators to "poison" their learning. They are seeking to retaliate against AI operators who use artwork without permission to train their tools.

Like most generative AI, image tools learn by analyzing millions of pictures and their accompanying captions. From this, the AI builds its own internal checklist of what particular words and phrases mean. It also learns styles of imagery and art.

While some AI tools are "trained" with permission on libraries of licensed images, others simply trawl the Internet and use any images they can find. That's created a legal and ethical debate about whether such activity is plagiarism or simply using art as inspiration, in the same way a human painter might produce something in the style of a particular artist.

Nightshade Is Deadly

The artists are now fighting back with a tool called Nightshade. It makes it possible to alter an image by changing a limited number of pixels. A human who sees the image will either not spot the alterations or will mentally correct for them, meaning they still see and recognize the image in its "correct" form.

However, the altered image will muddy the waters in the AI model's understanding of what particular objects and styles look like. This effect is compounded as the model sees more altered images. The creators say a model may need to see fewer than 100 "poisoned" images before its understanding of a particular term is corrupted and it can no longer consistently generate "correct" images for it.
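The general idea of a small, bounded pixel change can be sketched in a few lines of Python. This is a conceptual toy only, not Nightshade's actual method (which carefully optimizes perturbations to mislead specific model concepts); the `EPSILON` budget and the `poison` function are illustrative assumptions.

```python
# Conceptual sketch only -- NOT Nightshade's real algorithm.
# Shows the basic constraint: change each pixel by at most EPSILON
# so a human barely notices, while the values still differ for a model.

EPSILON = 4  # assumed maximum per-pixel change, on a 0-255 scale

def poison(pixels, perturbation):
    """Apply a perturbation, clamped to +/-EPSILON and the valid 0-255 range."""
    out = []
    for p, d in zip(pixels, perturbation):
        d = max(-EPSILON, min(EPSILON, d))   # keep the change imperceptible
        out.append(max(0, min(255, p + d)))  # stay a valid pixel value
    return out

original = [120, 121, 119, 200]
poisoned = poison(original, [3, -9, 2, 5])   # -9 and 5 get clamped to -4 and 4

# Every pixel differs from the original by at most EPSILON.
assert all(abs(a - b) <= EPSILON for a, b in zip(original, poisoned))
```

The clamping step is the key point: the alteration stays below the threshold of human notice, yet across many such images the shifted pixel statistics can skew what a model associates with a caption term.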

The creators say Nightshade should be a last line of defense for image creators. They argue it is justified in cases where AI tools ignore a "do not crawl" signal in a web page's code that asks automated tools not to collect its content.
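The "do not crawl" signal referred to here is typically a robots.txt file placed at a site's root. A hypothetical example asking crawlers to stay away from an image directory might look like the following (the directory name is illustrative; compliance is voluntary, which is the creators' point):

```text
# robots.txt -- a request to crawlers, not an enforcement mechanism
User-agent: *
Disallow: /artwork/
```

Scrapers that honor the Robots Exclusion Protocol will skip the listed paths; those that ignore it are the operators the Nightshade creators say deserve the countermeasure.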

Ethical Dispute

The debate looks set to continue. One argument against such tools is the idea that anything on the Internet is publicly available and thus fair game, in the same way that art students can study paintings on public display. Another is that such tools reduce the benefits of AI image generation.

One counter-argument is that tools can still work well when trained on licensed image databases, which have enough variety to provide the necessary data. For example, Adobe is satisfied its AI tool got enough information by analyzing the Getty and Shutterstock image libraries.

What's Your Opinion?

Are the artists right to be upset? Is "Nightshade" an appropriate response? Do you expect to see more efforts to disrupt AI tools in the future?



buzzallnight:

Ai should be smart enough to figure this out...

doctordemando:

Show Hugh the unsolvable geometric formula to poison the borg FTW.

anniew:

Apparently AI is not smart enough to figure it out, or someone intentionally changes certain art. I just read a long article this morning pointing out all the theological mistakes in a picture of St. Michael the Archangel, AI generated. (Any Christian knows this angel from the Bible, and would know of the symbolism described in the article.) So there are 2 options, as stated in my opening sentence.