A new tool created by a team at the University of Chicago makes imperceptible changes to an image's pixels, helping creatives protect their work from AI image generators by effectively poisoning (corrupting) the AI's training data.