Artists have been dealing with a whole new problem in recent months – their work is being copied without consent and used to train AI image generators. The latest effort to protect artistic creations is the Nightshade tool, which lets artists add invisible changes to the pixels of their artwork that corrupt the training data AI models learn from, reports MIT Technology Review. Nightshade comes at a time when major companies like OpenAI and Meta are facing lawsuits over copyright infringement and art theft.

University of Chicago professor Ben Zhao created Nightshade in an effort to put control over artwork back into the hands of its creators. His team tested the tool on the latest Stable Diffusion models and on an AI model they trained themselves.

Nightshade essentially “poisons” the AI, changing how the machine learning model interprets its training data and what the resulting generated images look like. For example, Nightshade can cause the AI to respond to a prompt for an image of a handbag by producing a toaster, or to generate a picture of a dog instead of a picture of a cat.
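The effect can be illustrated with a toy analogy. This is not Nightshade's actual algorithm (which applies imperceptible perturbations to image pixels to mislead diffusion models); it is a minimal sketch of the underlying idea of label-consistent data poisoning, using made-up 2-D "feature" vectors and a simple centroid-based model. All names and numbers are hypothetical.

```python
# Toy sketch of data poisoning (NOT Nightshade's real method):
# samples keep the label "handbag", but their features are shifted
# toward the "toaster" region, so the model's notion of "handbag" drifts.

def centroid(samples):
    """Average a list of 2-D feature vectors."""
    n = len(samples)
    return tuple(sum(s[i] for s in samples) / n for i in range(2))

def dist(a, b):
    """Euclidean distance between two 2-D points."""
    return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

# Clean training data: handbag features cluster near (1, 0),
# toaster features near (0, 1).
clean_handbag = [(1.0, 0.1), (0.9, 0.0), (1.1, 0.2)]
toaster       = [(0.0, 1.0), (0.1, 0.9), (0.2, 1.1)]

# Poisoned samples: still labelled "handbag", but with toaster-like
# features (in the real attack the perturbation is invisible to humans;
# it is exaggerated here so the effect is obvious).
poison = [(0.1, 1.0), (0.0, 0.9), (0.2, 1.0), (0.1, 1.1)]

clean_model    = centroid(clean_handbag)            # what "handbag" means, clean
poisoned_model = centroid(clean_handbag + poison)   # after poisoning
toaster_center = centroid(toaster)

# The poisoned model's "handbag" concept sits much closer to toasters.
print(dist(clean_model, toaster_center))
print(dist(poisoned_model, toaster_center))
```

With enough poisoned samples mixed into the training set, the averaged "handbag" concept lands near the toaster cluster, which is the toy counterpart of the model generating a toaster when asked for a handbag.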

Technologies like Nightshade could go a long way toward pushing major AI players to compensate artists fairly for the works they use. Removing the poison is hard: it would require finding and deleting each corrupted sample in the training data, which is a very challenging task.