Political adverts must disclose digital alterations, Meta says

Meta will require advertisers to disclose when they digitally create or alter political adverts to run on its platforms, amid concerns that new artificial intelligence technology will facilitate a deluge of deepfakes and misinformation in the run-up to the 2024 US presidential election. 

Advertisers must reveal when they digitally create or tweak political adverts that depict real people saying or doing something they did not, as well as when events or people are entirely fabricated, the company announced on Wednesday. Similarly, altered footage of real events must be disclosed, as must digitally generated audio or video purporting to depict a real event.

The global policy, which will come into force in the new year, also applies to adverts about social issues. 

The policy comes after Meta, which owns Instagram, Facebook and WhatsApp, last month debuted generative AI tools that allow advertisers to automatically generate new backgrounds and text variations, and to resize their adverts to fit multiple formats. Meta has prohibited political advertisers from using these tools, a move first reported by Reuters.

Social media platforms have poured investment into generative AI in a bid to capitalise on the hype around OpenAI’s ChatGPT, a consumer-facing chatbot released last year.

But there have been growing fears that the technology could be used to spread election-related misinformation and disinformation, particularly ahead of next year’s US election. Last week, US president Joe Biden issued an AI executive order directing the commerce department to craft guidance on labelling AI-generated content in a bid to tackle “fraud and deception”, including deepfakes.

In September, YouTube parent Google was the first big digital advertising platform to require advertisers to “prominently disclose when their ads contain synthetic content that inauthentically depicts real or realistic-looking people or events”.

In early October, US senator Amy Klobuchar and Yvette Clarke, a House member from New York, wrote a letter to Meta chief executive Mark Zuckerberg demanding information on the company’s “efforts to address these threats to our free and fair elections”. Linda Yaccarino, X chief executive, was also sent the letter and has met with lawmakers to discuss the issue, according to one person familiar with the matter. However, X, formerly Twitter, has not made any changes to its policies. X did not respond to a request for comment.

If advertisers repeatedly breach Meta’s disclosure rules, they could face penalties. However, the requirements do not apply if the content is generated or tweaked in ways that are inconsequential or immaterial to the claims made in the advert — for example, if an image has been cropped or sharpened.
