
Meta Is Allowing Political Ads That Claim the 2020 Election Was Stolen

Koshiro K / Lyonstock / Shutterstock

Meta will allow political ads that question the outcome of the 2020 U.S. presidential election on its platforms as candidates run for president in 2024.

Meta will allow political ads that question the outcome of the 2020 U.S. presidential election on its platforms, the company has announced.

The move is part of a broader rollback of election-related content moderation across major social media platforms over the past year, ahead of the 2024 U.S. presidential election.

Following the January 6, 2021 attack on the U.S. Capitol, which was fueled by baseless claims about 2020 election fraud, pressure on tech companies to combat election misinformation increased.

Dozens of lawsuits attempted to challenge the 2020 presidential election results, but they were dismissed at the state and federal levels across the country.

Under the policy, however, Meta, the parent company of Facebook and Instagram, will now be able to profit directly from political ads that promote false claims about the legitimacy of the 2020 election.

Although the company will allow political advertisements to claim that past elections, including the 2020 election, were rigged, it will not allow ads that “call into question the legitimacy of an upcoming or ongoing election.”

Meta In Hot Water

The policy was part of an August 2022 announcement about the company’s approach to last year’s midterm elections, in which it said it would prohibit ads targeting users in the United States, Brazil, Israel, and Italy that discourage people from voting, question the legitimacy of an upcoming or ongoing election, or prematurely claim an election victory.

In addition, Meta, YouTube, and X, formerly known as Twitter, have all reinstated former U.S. President Donald Trump’s accounts since last fall. After reinstating Trump’s account in January, Meta said it would not punish the former president for attacking the 2020 election results, but assured that it would not allow him to call into question any upcoming elections.

In June, YouTube reversed a policy it had put in place more than two years earlier, deciding to no longer remove content featuring false claims that the 2020 election was stolen. However, the company still prohibits content that misinforms users about how and when to vote or pushes false claims that could discourage voting or otherwise “encourage others to interfere with democratic processes.”

This policy change, however, does not apply to YouTube’s ad policies, company spokesperson Michael Aciman confirmed to CNN on Wednesday. YouTube’s ad rules will continue to prohibit false claims that could undermine participation or trust in electoral and democratic processes.

Meta also said earlier this month that, starting next year, it will require political advertisers around the world to disclose whether artificial intelligence was used in their ads. The requirement is part of an effort to curb “deepfakes” and other digitally altered, misleading content.

The company will also prohibit political advertisements from using the company’s new artificial intelligence tools that help brands generate text, backgrounds, and other marketing content.
