MIT’s “PhotoGuard” Protects Images From Unauthorized AI Edits

The technique invisibly alters select pixels to throw off generative AI models.

As AI continues to develop rapidly, generative tools are gaining the power to create and manipulate images, with Shutterstock and Adobe among the companies leading the way. Despite the obvious power of these algorithms, the technology has pitfalls, one of which is the unauthorized manipulation of copyrighted artwork and images.

MIT CSAIL thinks it has the answer to this growing problem in the form of PhotoGuard, a new technique that alters select pixels in an image to disrupt AI’s ability to understand what the image is.

The altered pixels are known as "perturbations" and are invisible to the human eye, but AI models readily pick them up as they scan the color and position of every pixel in an image. Any edits an AI tries to make to a protected image also apply to the perturbed pixels, so PhotoGuard causes the final image to come out unrealistic or broken.
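To picture what such a perturbation looks like in practice, here is a minimal sketch in Python with PyTorch. The epsilon budget, image size, and random noise are illustrative assumptions rather than PhotoGuard's actual parameters; the point is only that every pixel moves by at most a tiny, invisible amount.

import torch

# Illustrative sketch: a perturbation whose every pixel channel moves by at
# most epsilon, small enough that the change is invisible to a human viewer.
epsilon = 2 / 255  # assumed budget, not PhotoGuard's actual setting

image = torch.rand(1, 3, 512, 512)  # stand-in for a real photo, values in [0, 1]
delta = torch.empty_like(image).uniform_(-epsilon, epsilon)

protected = (image + delta).clamp(0.0, 1.0)  # looks identical to the original
assert (protected - image).abs().max() <= epsilon + 1e-6

A real protection scheme would optimize the perturbation against a specific model rather than sampling it at random, but the invisibility constraint is the same.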

"The encoder attack makes the model think that the input image is some other image," explained MIT student and lead author of the paper, Hadi Salman, "whereas the diffusion attack forces the diffusion model to make edits towards some target image." The technique sounds complex, but it could stop malicious actors who try to reverse-engineer protected images by adding minor edits of their own to circumvent copyright protections.
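Based on Salman's description, the encoder attack can be pictured as a projected-gradient loop like the hypothetical sketch below: it nudges the image, within an invisible budget, until a differentiable image encoder embeds it like some other target image. The encoder function, step count, learning rate, and budget here are assumptions for illustration, not the paper's actual optimization settings.

import torch

def encoder_attack(encoder, image, target, epsilon=4 / 255, steps=100, lr=0.01):
    """Hypothetical sketch of the encoder attack described above: perturb
    `image` (within an invisible budget) so that `encoder`, any differentiable
    image-to-embedding model, embeds it like `target` instead."""
    delta = torch.zeros_like(image, requires_grad=True)
    target_emb = encoder(target).detach()

    for _ in range(steps):
        # Pull the perturbed image's embedding toward the target's embedding.
        loss = torch.norm(encoder(image + delta) - target_emb)
        loss.backward()
        with torch.no_grad():
            delta -= lr * delta.grad.sign()  # signed gradient step
            delta.clamp_(-epsilon, epsilon)  # keep the change imperceptible
            delta.grad.zero_()
    return (image + delta).detach().clamp(0.0, 1.0)

The diffusion attack works along similar lines but backpropagates through the full diffusion editing process, steering its output toward a target image, which is more expensive but attacks the editing pipeline end to end.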

"A collaborative approach involving model developers, social media platforms, and policymakers presents a robust defense against unauthorized image manipulation. Working on this pressing issue is of paramount importance today," Salman said in a recent press release. "And while I am glad to contribute towards this solution, much work is needed to make this protection practical. Companies that develop these models need to invest in engineering robust immunizations against the possible threats posed by these AI tools."
