Creators Can Fight Back Against AI With Nightshade

If an artist were to make use of a piece of intellectual property owned by a large tech company, they would risk facing legal action. Yet many creators are unhappy that those same tech companies are using their IP on a grand scale as training material for generative AI. Can they fight back?
Perhaps now they can, thanks to Nightshade, from a team at the University of Chicago. It’s a piece of software for Windows and macOS that poisons an image with imperceptible changes to its pixels, so that an AI model classifies it as something entirely different from what a human sees.
The idea is that creators run it on their artwork before posting it online, leaving the poisoned images for unsuspecting AIs to assimilate. The team’s example is a picture of a cow poisoned so that the model sees it as a handbag; feed a model enough such images and it learns the wrong association, returning a picture of a handbag when asked for one of a cow. If enough of these poisoned images are put online, the risk of training on scraped images becomes too high, and the hope is that AI companies would then be forced to take the IP of their source material seriously.
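To make the mechanism a little more concrete, here is a minimal sketch of the general idea of a small, targeted perturbation, using a standard PGD-style attack on an off-the-shelf image classifier in PyTorch. To be clear, this is not Nightshade’s actual algorithm; the model, input filename, target class index, and perturbation budget below are all illustrative assumptions.

```python
# Minimal PGD-style sketch: nudge an image toward a "decoy" class with a small,
# bounded pixel change. NOT Nightshade's actual algorithm; the model, filename,
# target class index, and budget are illustrative assumptions.
import torch
import torch.nn.functional as F
from PIL import Image
from torchvision.models import resnet18, ResNet18_Weights
from torchvision.transforms import functional as TF

model = resnet18(weights=ResNet18_Weights.DEFAULT).eval()
mean = torch.tensor([0.485, 0.456, 0.406]).view(1, 3, 1, 1)   # ImageNet stats
std = torch.tensor([0.229, 0.224, 0.225]).view(1, 3, 1, 1)

img = Image.open("artwork.jpg").convert("RGB")                # hypothetical input
x = TF.to_tensor(TF.resize(img, [224, 224])).unsqueeze(0)     # pixels in [0, 1]

TARGET = torch.tensor([414])   # arbitrary decoy class index; any class works
EPS = 8 / 255                  # max per-pixel change, hard to see by eye
STEPS, STEP_SIZE = 50, 1 / 255

delta = torch.zeros_like(x, requires_grad=True)
for _ in range(STEPS):
    loss = F.cross_entropy(model((x + delta - mean) / std), TARGET)
    loss.backward()
    with torch.no_grad():
        delta -= STEP_SIZE * delta.grad.sign()      # step toward the decoy class
        delta.clamp_(-EPS, EPS)                     # keep the change imperceptible
        delta.copy_((x + delta).clamp(0, 1) - x)    # keep pixel values valid
        delta.grad.zero_()

poisoned = (x + delta).detach()
print("classifier now sees class:",
      model((poisoned - mean) / std).argmax(dim=1).item())
TF.to_pil_image(poisoned.squeeze(0)).save("poisoned.png")
```

Nightshade itself works against the feature extractors used by text-to-image models rather than a simple classifier, but the underlying principle of a small, targeted perturbation that shifts what a model learns is the same.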
For this to work, enough creators need to take up the software, and we are guessing that an inevitable result will be an arms race between AI companies and image poisoners. One thing is certain, though: as the AI hype fuels the growth of generative AI systems, creators, whether they are major publishers, your favourite human-generated tech news website, or someone drawing a cartoon strip in their bedroom, deserve not to have their work stolen in this way.