Image: University of Chicago
Researchers at the University of Chicago developed a tool called Nightshade that "poisons" images used to train AI models, interfering with the art those models generate.