NATIONWIDE — YouTube on Thursday unveiled new measures aimed at curbing dangerous conspiracy theories on its platform, specifically naming QAnon and its baseless “Pizzagate” theory as content that will be strictly prohibited. 


What You Need To Know

  • YouTube announced Thursday that it has updated its policies to further curb conspiracy content like QAnon

  • The update specifically prohibits content that targets an individual or group with conspiracy theories used to justify real-world violence

  • YouTube said the new policies will take effect immediately

  • Other social media companies like Twitter and Facebook have also taken steps to curb QAnon content on their platforms 

QAnon refers to the far-right conspiracy theory that paints President Donald Trump as a secret warrior against a supposed child-trafficking ring run by celebrities and “deep state” government officials. “Pizzagate” was a widely debunked theory, which went viral during the 2016 presidential election, that claimed to link high-ranking Democrats to a child sex ring at a Washington, D.C., pizza restaurant.

BuzzFeed News recently announced that it would begin referring to QAnon as a "collective delusion," citing, among other reasons, the fact that the FBI has labeled it a domestic terror threat.

The announcement is an expansion of the company’s previous anti-hate policies, which YouTube updated nearly two years ago in an attempt to “limit the reach of harmful misinformation” on the platform. The update specifically bans videos that target or harass individuals or groups.

“Today we're further expanding both our hate and harassment policies to prohibit content that targets an individual or group with conspiracy theories that have been used to justify real-world violence,” the company wrote in a statement on its blog. “One example would be content that threatens or harasses someone by suggesting they are complicit in one of these harmful conspiracies, such as QAnon or Pizzagate.”

YouTube stopped short of banning QAnon content altogether, saying: “As always, context matters, so news coverage on these issues or content discussing them without targeting individuals or protected groups may stay up.” 

The new policy takes effect immediately, and the company will ramp up efforts to curb QAnon content in the coming weeks. It builds on YouTube’s existing enforcement, under which the platform has removed tens of thousands of QAnon videos and banned associated channels.

YouTube’s updated policy follows similar recent actions against QAnon by several other major social media companies.

In early October, Facebook said it will ban groups that openly support QAnon. The company said that it will remove Facebook pages, groups and Instagram accounts for “representing QAnon” — even if they don’t promote violence. The social network said it will consider a variety of factors to decide if a group meets its criteria for a ban, including its name, the biography or “about” section of the page, and discussions within the page, group or Instagram account.

Mentions of QAnon in a group focused on a different subject won’t necessarily lead to a ban, Facebook said. Administrators of banned groups will have their personal accounts disabled as well.

Twitter has not banned QAnon from its site entirely, but says it does not make QAnon tweets or accounts visible in searches or recommendations. The site’s hate speech policies do, however, call for suspension of accounts “engaged in violations of our multi-account policy, coordinating abuse around individual victims, or are attempting to evade a previous suspension,” a move specifically aimed at conspiracy theorist groups like QAnon.

Views of QAnon tweets have dropped by 50 percent since Twitter’s new rules took hold in July. The company did not address, on the record, questions about ads running on QAnon pages.

The Associated Press contributed to this report.