“Google says that it plans to roll out changes to Google Search to make clearer which images in results were AI-generated or edited by AI tools.

In the next few months, Google will begin to flag AI-generated and -edited images in the ‘About this image’ window on Search, Google Lens, and the Circle to Search feature on Android. Similar disclosures may make their way to other Google properties, like YouTube, in the future; Google says it’ll have more to share on that later this year.

Crucially, only images containing ‘C2PA metadata’ will be flagged as AI-manipulated in Search. C2PA, short for Coalition for Content Provenance and Authenticity, is a group developing technical standards to trace an image’s history, including the equipment and software used to capture and/or create it.

Companies including Google, Amazon, Microsoft, OpenAI, and Adobe back C2PA. But the coalition’s standards haven’t seen widespread adoption. As The Verge noted in a recent piece, the C2PA faces plenty of adoption and interoperability challenges; only a handful of generative AI tools, along with cameras from Leica and Sony, support the group’s specs.”

From TechCrunch.
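
For context on what ‘C2PA metadata’ means in practice: in JPEGs, C2PA provenance manifests are carried in JUMBF boxes embedded in APP11 marker segments, with the manifest store labelled “c2pa”. The sketch below is only a rough heuristic along those lines, not how Google or any C2PA SDK does it: it scans a JPEG’s APP11 segments for that label and reports whether a manifest appears to be present, without validating hashes, signatures, or certificates. The command-line usage at the bottom is illustrative.

```python
"""
Rough heuristic check for embedded C2PA provenance metadata in a JPEG.

This is a sketch, not a verifier: it only looks for the "c2pa" label
inside APP11 segments. Real validation (content hashes, claim
signatures, certificate chains) requires a full C2PA SDK.
"""
import struct
import sys


def jpeg_has_c2pa(path: str) -> bool:
    with open(path, "rb") as f:
        data = f.read()

    # A JPEG file starts with the SOI marker 0xFFD8.
    if not data.startswith(b"\xff\xd8"):
        raise ValueError("not a JPEG file")

    offset = 2
    while offset + 4 <= len(data):
        # Each segment starts with 0xFF followed by a marker byte.
        if data[offset] != 0xFF:
            break
        marker = data[offset + 1]
        # Skip fill bytes (runs of 0xFF) between segments.
        if marker == 0xFF:
            offset += 1
            continue
        # SOS (0xDA) starts the entropy-coded image data; metadata
        # segments appear before it, so stop scanning here.
        if marker == 0xDA:
            break
        # Standalone markers (TEM, RSTn, SOI, EOI) have no length field.
        if marker == 0x01 or 0xD0 <= marker <= 0xD9:
            offset += 2
            continue
        # Other markers carry a 2-byte big-endian length that includes
        # the length field itself but not the marker bytes.
        (length,) = struct.unpack(">H", data[offset + 2 : offset + 4])
        payload = data[offset + 4 : offset + 2 + length]
        # C2PA manifests live in APP11 (0xEB) segments as JUMBF boxes
        # whose description box is labelled "c2pa".
        if marker == 0xEB and b"c2pa" in payload:
            return True
        offset += 2 + length
    return False


if __name__ == "__main__":
    print(jpeg_has_c2pa(sys.argv[1]))
```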