Meta’s Fact-Checking Program to End on Monday


Meta will officially end its fact-checking program in the United States on Monday across all of its platforms: Facebook, Instagram, and Threads. The change was confirmed by Joel Kaplan, Meta’s recently appointed chief global affairs officer, in a post on X (formerly Twitter) on Friday.

“By Monday afternoon, our fact-checking initiative in the US will be officially concluded. That means no new fact checks and no fact checkers,” Kaplan stated. “Instead of fact checks, the first Community Notes will gradually start appearing across Facebook, Threads & Instagram, without any penalties involved.”

The change is part of a broader policy shift first announced in early January by Meta CEO Mark Zuckerberg. At the time, Zuckerberg said the company would replace its third-party fact-checking program with a crowd-sourced system modeled on X’s Community Notes. He defended the decision by criticizing traditional fact-checkers as “too politically biased,” asserting that they had “destroyed more trust than they’ve established.”

Meta says the transition is meant to promote free speech and reduce political censorship. Critics, however, including civil rights advocates and digital policy specialists, warn that the change could open the door to unchecked misinformation, propaganda, and disinformation, problems that already plague the platform. They point to X under Elon Musk, where a comparable system has coincided with a rise in both misinformation and hate speech.

Meta has begun beta testing Community Notes and is encouraging users to sign up as contributors. To qualify, users must be at least 18 years old, have an account that is more than six months old, and be in “good standing” with the platform.

Even with the introduction of Community Notes, Meta has confirmed that the system will not apply to paid advertisements. This means users can still promote misleading or offensive content, provided they are willing to pay for it.

The end of fact-checking is only one of several recent policy reversals at Meta. Around the same time, the company also dismantled its Diversity, Equity, and Inclusion (DEI) programs and loosened its content moderation rules on hate speech, raising further concerns about the platform’s commitment to responsible content governance.