Managing harmful conspiracy theories on YouTube

Managing misinformation and harmful conspiracy theories is challenging because the content is always shifting and evolving. To address this kind of content effectively, it’s critical that our teams continually review and update our policies and systems to reflect the frequent changes. Today, we are taking another step in our efforts to curb hate and harassment by removing more conspiracy theory content used to justify real-world violence. This builds on our work over the last several years to strengthen and evolve our policies and enforcement — work that has been organized around four pillars: removing violative content, reducing the spread of harmful misinformation, raising authoritative voices, and rewarding trusted creators.

Nearly two years ago, we took a major step to limit the reach of harmful misinformation by updating our recommendations system. This resulted in a 70% drop in views coming from our search and discovery systems. In fact, when we looked at QAnon content, we saw that the number of views coming from non-subscribed recommendations to prominent Q-related channels has dropped by over 80% since January 2019.

Additionally, we’ve removed tens of thousands of QAnon videos and terminated hundreds of channels under our existing policies, particularly those that explicitly threaten violence or deny the existence of major violent events. All of this work has been pivotal in curbing the reach of harmful conspiracies, but there’s even more we can do to address certain conspiracy theories that are used to justify real-world violence, like QAnon.

Today, we’re further expanding both our hate and harassment policies to prohibit content that targets an individual or group with conspiracy theories that have been used to justify real-world violence. One example would be content that threatens or harasses someone by suggesting they are complicit in one of these harmful conspiracies, such as QAnon or Pizzagate. As always, context matters, so news coverage on these issues or content discussing them without targeting individuals or protected groups may stay up. We will begin enforcing this updated policy today, and will ramp up in the weeks to come.

Due to the evolving nature and shifting tactics of groups promoting these conspiracy theories, we’ll continue to adapt our policies to stay current and remain committed to taking the steps needed to live up to this responsibility.
