Meta Overhauls Fact-Checking for User-Centric Moderation

Meta Adopts a Community-Driven Approach

Meta, the tech giant behind Facebook and Instagram, has announced a major policy shift by discontinuing its U.S. fact-checking program. The company will now rely on a community-based system inspired by X’s “Community Notes” feature.

This marks a departure from Meta’s earlier focus on strict content moderation. CEO Mark Zuckerberg emphasized the need to reduce censorship errors, stating, “We are committed to restoring free expression while minimizing mistakes in content management.”

What’s Changing?

The revamped policy will:

  • Enable users to identify and add context to misleading posts.
  • Replace external fact-checkers with community input.
  • Move content policy teams from California to Texas and other U.S. hubs.

Meta’s automated systems will continue targeting high-severity violations, such as those related to terrorism and drug trafficking.

Stakeholders React to the New System

The decision has elicited mixed responses. Partner organizations like Check Your Fact expressed surprise, citing potential operational disruptions. Conversely, Meta’s Oversight Board welcomed the shift as a step toward transparency.

Critics argue the move may weaken the fight against misinformation. “Disinformation evolves rapidly, and this approach might be insufficient,” warned Ross Burley of the Centre for Information Resilience.

Meta vs. X: Competing Models of Moderation

This change positions Meta closer to X’s community-centric model, which has faced regulatory challenges in Europe. Meta plans to introduce “Community Notes” in the U.S., refining the system over time.

Given Meta's position as one of the largest social platforms globally, the success or failure of this initiative could redefine content moderation standards worldwide.