Meta announces ending fact-checking program in the US

The platform will shift to a community-based system, Community Notes, where users can contribute to rating and providing context for content

Meta has taken significant steps to redefine how it handles content moderation across its platforms, including Facebook, Instagram, and Threads. The company on Tuesday announced the end of its third-party fact-checking program in the US and will shift to a community-based system, Community Notes, where users can contribute to rating and providing context for content.

The changes mark an attempt to return to the commitment to free expression that Mark Zuckerberg set out in his 2019 Georgetown speech.

"We’ve reached a point where it’s just too many mistakes and too much censorship. Even if we accidentally censor just one percent of posts, that’s millions of people," he said.

Here are the key points of the new policy overhaul:

Community Notes to Replace Fact-Checkers

Meta plans to phase out fact-checkers and replace them with a community-driven system called "Community Notes." The move is a response to the growing distrust in traditional fact-checking mechanisms, which Meta admits have been politically biased in the past.

"The fact checkers have just been too politically biased and have destroyed more trust than they’ve created," Zuckerberg said. "So, over the next couple of months, we’re going to phase in a more comprehensive community notes system."

Simplification of Content Policies

Meta will simplify its content policies and remove restrictions on sensitive topics such as immigration and gender, which the company believes have become overly restrictive and out of touch with mainstream discourse.

"What started as a movement to be more inclusive has increasingly been used to shut down opinions and shut out people with different ideas," Zuckerberg said. "We want to make sure that people can share their beliefs and experiences on our platforms."

Adjusting Content Moderation Filters 

In a move aimed at reducing unnecessary censorship, Meta will scale back its automated filters for less severe policy violations. Instead, the company will rely on user reports to flag such content before taking action. This shift is designed to reduce the number of innocent posts mistakenly removed.

"We used to have filters that scanned for any policy violation. Now, we’re going to focus those filters on tackling illegal and high-severity violations," Zuckerberg explained. "By dialing them back, we’re going to dramatically reduce the amount of censorship on our platforms."

Reintroducing Civic Content

After a period of limiting political posts to address user stress, Meta has decided to bring back more civic and political content, responding to feedback from users who want to engage with political discussions again.

"It feels like we’re in a new era now, and we’re starting to get feedback that people want to see this content again," Zuckerberg added. "We’re going to start phasing this back into Facebook, Instagram, and Threads while working to keep the communities friendly and positive."

Relocation of Trust and Safety Teams

Meta will relocate its trust and safety and content moderation teams from California to Texas, with the goal of creating a new environment for content review and moderation that better aligns with the company’s updated approach.

"Our U.S.-based content review will be based in Texas moving forward," said Zuckerberg. "This shift will help streamline our operations and improve the overall process."