YouTube Quietly Relaxed Its Content Moderation Guidelines


In the weeks before Donald Trump's second inauguration, YouTube quietly loosened its video moderation policies, according to a report by The New York Times. The revised guidelines instruct moderators not to delete videos that break YouTube's rules—which prohibit nudity, graphic violence, hate speech, and misinformation—if the videos are deemed to be in the public interest. Previously, a video could stay up if no more than 25 percent of its content violated the rules; now, up to 50 percent of a video can violate the guidelines and still remain available, according to the Times.

YouTube considers subjects such as "elections, ideologies, movements, race, gender, sexuality, abortion, immigration, censorship, and other matters" to be public interest material, according to the Times, which reviewed the training documents describing the change.

YouTube, which is owned by Google parent company Alphabet, made the change in mid-December by updating its moderator training documents. Similarly, Meta, which owns Facebook and Instagram, ended fact-checking of social media posts in January, around the time of Trump's inauguration. Many Republicans, including Trump, have pushed tech companies to reduce or eliminate content moderation. X ended fact-checking after Elon Musk acquired the platform in 2022, relying instead on its crowdsourced Community Notes feature. But unlike X and Meta, which publicly announced their moderation changes, YouTube did not.

The Times cited examples of YouTube's updated policy in action, including a 43-minute video about Trump's cabinet appointees that stayed online despite containing a slur directed at a transgender person. In another example, a South Korean video was left up even though it featured a commentator describing a hypothetical scenario in which a politician is executed by guillotine. YouTube justified keeping the video online on the grounds that "execution by guillotine is not feasible."

When asked for comment, YouTube said it removed 192,856 videos in the first quarter of 2025, a 22 percent increase over the same period the previous year. A YouTube spokesperson also pointed users concerned about the policy change to the company's Community Guidelines Transparency Report for further details.

"We consistently refresh our Community Guidelines to respond to the content we observe on YouTube," YouTube spokesperson Nicole Bell said in a statement to Mashable. "As examples, earlier this year, we eliminated our remaining COVID-19 guidelines and instituted new safeguards concerning gambling content. The New York Times article pertains to a different facet of our strategy: our enduring practice of implementing exceptions to our policies for content that is in the public interest or possesses EDSA (educational, documentary, scientific, artistic) significance. These exceptions apply to a minor percentage of videos on YouTube, yet are crucial for keeping important content accessible. This method permits us to avoid removing, for instance, an extensive news podcast for featuring a brief clip of violence. We regularly revise our guidance on these exceptions to accommodate the new kinds of discussions and content (like the emergence of long podcast content) that we encounter on the platform, along with the feedback from our global creator community. Our objective remains unchanged: to uphold free expression on YouTube."