
Meta Introduces AI-Generated Content Labeling for Transparency on Instagram and Facebook

The Meta group will shortly be rolling out a tool capable of automatically flagging AI-generated or altered content on its social platforms.

On Meta's platforms, a label for AI-generated content

Following in the footsteps of other players, Meta is gradually developing methods to regulate the use of generative AI on its platforms. In a blog post published this Tuesday, February 6, the American group announces the introduction, "in the coming months", of a tool to automatically label AI-generated content on Facebook, Threads, and Instagram.


To develop this solution, the company says it is "working with industry partners to establish common technical standards to signal that content has been generated by AI", explains Nick Clegg, the Menlo Park-based group's president of global affairs.

Meta has already developed a similar tool for its own image-generating AI. The announcement is therefore hardly a surprise: since the US launch of Imagine with Meta, the group has, "for reasons of transparency and traceability", applied two watermarks to images produced by its image generator, one invisible and one visible. Now, however, the company aims to design, together with other players in the sector, "a cutting-edge tool" capable of automatically labeling images from competing solutions such as Midjourney, Adobe Firefly, or OpenAI's DALL-E.

Without detailing a precise roadmap, Meta hopes to have the tool in place as soon as possible, ahead of several major elections, including the US presidential election, in which generative AI could be exploited for disinformation. "We're developing this tool right now, and soon we'll be applying labels in all the languages supported by the applications," Nick Clegg continues.

"Generative AI tools offer immense possibilities, and we believe it's both possible and necessary for these technologies to be developed responsibly," Nick Clegg concludes.

"Digitally created or modified" audio and video content will also be flagged

While the company appears able to identify the traces of AI in visual content, it admits it has difficulty detecting the same signals in audio or video. To compensate, Meta has announced that it is working, in parallel, on a feature allowing users to disclose when content was made with the help of artificial intelligence. This disclosure will be mandatory on the group's platforms, and Meta reserves the right to apply sanctions if a poster fails to indicate that published content has been "created or modified digitally".

"This approach represents the pinnacle of what is currently technically possible. But it's not yet possible to identify all AI-generated content," Nick Clegg concedes in his blog post.

AI-generated content is a major challenge for social platforms

The proliferation of AI-generated content, particularly deceptive or manipulated material, is not a concern for Meta alone. Last November, YouTube announced a series of measures to regulate the use of generative AI on its platform, including specific wording to alert users that the content they are viewing has been digitally generated or modified. Since September, TikTok has likewise required a disclosure indicating that content has been altered or created using artificial intelligence.
