Meta Transitions from Fact-Checking to Community Notes in Anticipation of Possible Second Trump Term


In a video posted to his Facebook page on Tuesday, Meta CEO Mark Zuckerberg announced that the platform will discontinue its use of fact-checkers.

“It’s time to reconnect with our foundational principles of free expression,” Zuckerberg wrote in the video’s caption. “We’re substituting fact-checkers with Community Notes, streamlining our policies, and concentrating on minimizing errors. Looking forward to this new chapter.”

Meta is also reversing earlier changes that had limited the amount of political content in users’ feeds.

“Fact-checkers have shown excessive political bias and have damaged more trust than they’ve gained,” Zuckerberg said. “What started as a move to promote inclusivity has gradually been twisted to stifle differing views and exclude individuals with contrasting perspectives. It’s gone too far.”

Zuckerberg acknowledged the “tradeoff” involved in promoting political posts while eliminating fact-checkers. He framed the change as a stand against censorship, though critics argue it raises concerns about unchecked misinformation. Berin Szóka, President of TechFreedom, previously told *Mashable*, “Censorship is merely content moderation that someone finds objectionable.”

The shift has sparked considerable debate among experts and advocacy organizations. Nora Benavidez, Senior Counsel and Director of Digital Justice and Civil Rights at Free Press, criticized the decision in a press release, saying, “Content moderation has never aimed to stifle free speech; it was created by platforms to encourage dialogue and uphold truth for users.” She further argued that Meta’s new stance prioritizes avoiding responsibility for user safety and aligns the company with political actors who resist accountability.

The Real Facebook Oversight Board, an independent group established to critique Meta’s policies, also denounced the decision. In a statement, Ben Wyskida of the group described the move as “a withdrawal from any rational and secure strategy to content moderation.” He warned that Meta’s actions, along with comparable changes at Twitter under Elon Musk, represent a dangerous shift that could open the door to unchecked propaganda and misinformation.

Critics have drawn parallels between Zuckerberg’s decision and Musk’s approach at Twitter (now X), where the rollback of content moderation tools led to a surge in misinformation. Concerns are growing that Meta’s platforms, already linked to the spread of conspiracy theories, violence, and radicalization in regions such as India and Myanmar, could see similar outcomes.

As Meta enters this new phase, its decision to eliminate fact-checkers and loosen restrictions on political content has stoked fears of a broader erosion of trust and accountability in the digital sphere.