University of Chicago researchers seek to “poison” AI art generators with Nightshade

On Friday, a team of researchers at the University of Chicago released a research paper outlining "Nightshade," a data poisoning technique aimed at disrupting the training process for AI models, report MIT Technology Review and VentureBeat.

Nightshade ‘poisons’ AI models to fight copyright theft

University of Chicago researchers have unveiled Nightshade, a tool designed to disrupt AI models attempting to learn from artistic imagery.

The tool – still in development – allows artists to protect their work by subtly altering pixels in …
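To illustrate the general idea of a near-imperceptible pixel perturbation, here is a minimal sketch in Python using NumPy and Pillow. It is not Nightshade's actual method, which optimizes targeted perturbations to corrupt a model's learned concept associations; here the noise is random rather than targeted, and the function name `perturb_image`, the `epsilon` bound, and the file names are hypothetical.

```python
import numpy as np
from PIL import Image


def perturb_image(path_in: str, path_out: str, epsilon: int = 3, seed: int = 0) -> None:
    # Illustrative only: Nightshade computes optimized, targeted perturbations;
    # this toy version adds bounded random noise to show how small pixel
    # changes can stay below the threshold of human perception.
    rng = np.random.default_rng(seed)
    img = np.asarray(Image.open(path_in).convert("RGB"), dtype=np.int16)

    # Per-channel noise bounded to [-epsilon, +epsilon]; at epsilon ~ 3 out of
    # 255 intensity levels the change is hard to see by eye.
    noise = rng.integers(-epsilon, epsilon + 1, size=img.shape, dtype=np.int16)
    poisoned = np.clip(img + noise, 0, 255).astype(np.uint8)

    Image.fromarray(poisoned).save(path_out)


# Hypothetical file names, for demonstration only.
perturb_image("artwork.png", "artwork_shaded.png")
```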
