Industry News

By Ricardo Eloy

New tool lets artists "poison" their images against AI

A new tool called Nightshade lets artists alter the pixels in their artwork so that the images become harmful to AI models that scrape them for training without permission.
[Image: Stephanie Arnett/MITTR | Rijksmuseum, Envato]
Nightshade disrupts AI training by altering images in a way that is invisible to humans but confounds AI models, causing them to learn incorrect associations. Combined with Glaze, another tool developed by the same team, it allows artists to upload their work online masked with a different art style, deterring scraping by AI companies.

Once an AI model ingests these "poisoned" images, its performance degrades: it may, for example, start identifying dogs as cats or cars as houses. The tool aims to challenge the unethical scraping of artists' work for AI training by rendering the scraped data harmful to the models trained on it, thereby acting as a deterrent against copyright infringement.
[Image courtesy of the researchers]
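Nightshade's exact algorithm isn't described in this article, but the broader family of techniques it belongs to, adversarial pixel perturbation, is easy to illustrate. The sketch below is a generic, hypothetical PyTorch example, not Nightshade's method: it nudges an image's pixels within a tiny, human-invisible budget so that a stand-in classifier drifts toward a wrong label, the "dogs as cats" effect described above. The model, the image, and the class index are all placeholders.

```python
# Hypothetical sketch of image "poisoning" in the broadest sense:
# shift pixels within a small, visually imperceptible budget so a model
# associates the image with the wrong label. This is NOT Nightshade's
# actual algorithm; it is a generic targeted-perturbation example.

import torch
import torch.nn.functional as F
import torchvision.models as models

def poison_image(image, target_label, model, epsilon=4 / 255, steps=10):
    """Nudge `image` toward `target_label` while keeping every pixel
    within +/- epsilon of the original (a PGD-style targeted step)."""
    original = image.clone().detach()
    poisoned = image.clone().detach()
    for _ in range(steps):
        poisoned.requires_grad_(True)
        loss = F.cross_entropy(model(poisoned), target_label)
        grad, = torch.autograd.grad(loss, poisoned)
        # Descend the loss gradient so the model's prediction moves
        # toward the attacker-chosen target label.
        poisoned = poisoned.detach() - (epsilon / steps) * grad.sign()
        # Project back into the imperceptibility budget and valid range.
        poisoned = original + (poisoned - original).clamp(-epsilon, epsilon)
        poisoned = poisoned.clamp(0.0, 1.0)
    return poisoned

if __name__ == "__main__":
    model = models.resnet18(weights=None).eval()  # stand-in classifier
    dog_image = torch.rand(1, 3, 224, 224)        # stand-in "dog" photo
    cat_label = torch.tensor([281])               # ImageNet "tabby cat" index
    poisoned = poison_image(dog_image, cat_label, model)
    print((poisoned - dog_image).abs().max())     # perturbation stays tiny
```

Real systems like Nightshade reportedly target generative text-to-image models rather than simple classifiers, but the core idea is the same: pixel changes too small for humans to notice can have an outsized effect on what a model learns.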
An interesting aspect of these new tools is that the University of Chicago team behind them is releasing them as open source, which means the more people use them, the more powerful they become: the more "poisoned" images a model scrapes, the greater the damage.

You can learn more about how this new tool works in the MIT Technology Review article cited below.

Source: MIT Technology Review


About the author

Ricardo Eloy

CGarchitect Editor/3D Specialist at Chaos

São Paulo, BR