Google to Mandate Disclosures for AI Creations on Political Advertisements by November: Details


Google already bans manipulating digital media to deceive or mislead people about politics, social issues, or matters of public concern.
By Agence France-Presse | Updated: 7 September 2023

Google on Wednesday said it will mandate that political advertisements on its platforms disclose when images and audio have been altered or created using tools such as artificial intelligence (AI). The change to Google’s ad policy is to take effect in November, about a year ahead of what is likely to be a contentious US presidential election and as fears mount that generative AI will be used to mislead voters.

“For years we’ve provided additional levels of transparency for election ads,” a Google spokesperson said in response to an AFP query. “Given the growing prevalence of tools that produce synthetic content, we’re expanding our policies a step further to require advertisers to disclose when their election ads include material that’s been digitally altered or generated.”

In June, a Ron DeSantis campaign video attacking former US President Donald Trump featured images bearing markings of having been created using AI, an AFP Fact Check team determined.

The video, shared in a post on X, formerly known as Twitter, contained photos that appeared to have been altered to show Trump embracing Anthony Fauci, a key member of the US coronavirus task force, and kissing him on the cheek, according to AFP Fact Check. Google’s ad policies already ban manipulating digital media to deceive or mislead people about politics, social issues, or matters of public concern.

Demonstrably false claims that could undermine participation or trust in the election process are also forbidden, according to the internet giant's ad policy. Google requires political ads to disclose who paid for them, and makes information about the messages available in an online ads library.

The coming update will require election-related ads to “prominently disclose” if they contain “synthetic content” that depicts real or realistic-looking people or events, according to Google. The tech titan said it continues to invest in technology to detect and remove such content.

Disclosures of digitally altered content in election ads must be “clear and conspicuous,” and put where they are likely to be noticed, according to Google. Examples of what would warrant a label included synthetic imagery or audio showing a person saying or doing something they did not do, or depicting an event that did not occur.

Google suggested labels such as “This image does not depict real events” or “This video content was synthetically generated.”