Photographers are claiming that Meta is tagging their real photos as “Made with AI.”
Meta introduced this automated tag to help people understand when what they're seeing is real and when it's computer generated. But even AI isn't good at telling what is or isn't AI-generated. As a result, users like a former White House photographer and a professional cricket team have ended up posting real photos that get flagged as AI-generated.
Former White House photographer Pete Souza told TechCrunch that he thinks this might be a result of the basic editing photographers do before uploading a photo. So if you open an image in Photoshop, crop it, and export it as a new .JPG file, Instagram's detectors might go off. This makes sense, since Instagram says its AI examines an image's metadata to determine whether it is AI-generated.
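To see why a simple crop-and-export could trip such a system, here is a minimal, hypothetical sketch of metadata-based flagging. The assumption (not Meta's actual method) is that editing tools embed identifying strings in a file's metadata, such as an XMP `CreatorTool` field or C2PA content-credential data, and that a naive detector keys on those strings; the marker list below is purely illustrative.

```python
# Illustrative markers an editor might leave in a file's embedded metadata.
# These are assumptions for the sketch, not Meta's real detection signals.
EDITOR_MARKERS = [b"xmp:CreatorTool", b"Adobe Photoshop", b"c2pa"]

def metadata_markers(image_bytes: bytes) -> list[str]:
    """Return which known editor/AI markers appear in the raw file bytes."""
    return [m.decode() for m in EDITOR_MARKERS if m in image_bytes]

# A photo that was merely cropped and re-exported still carries the
# editor's metadata, so a naive detector could flag it:
exported = b"\xff\xd8...<x:xmpmeta>Adobe Photoshop 25.0</x:xmpmeta>..."
print(metadata_markers(exported))  # ['Adobe Photoshop']
```

Under this sketch, the detector cannot distinguish a crop from generative editing, since both leave the same editor fingerprint behind, which would explain the false positives photographers are reporting.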
As election season approaches, it will become increasingly important for massive social platforms like Meta to adequately moderate AI-generated content. But as it stands, does a "Made with AI" tag really accomplish anything if it's not reliable?