An update on our commitment to fight violent extremist content online

By The YouTube Team

In June, we announced four steps we’re taking to combat terrorist content on YouTube:
  1. Better detection and faster removal powered by machine learning;
  2. More expert partners to help identify violative content;
  3. Tougher standards for videos that are controversial but do not violate our policies; and
  4. Amplified voices speaking out against hate and extremism.
We shared our progress across these steps in August and wanted to update you again on where things are today.

Better detection and faster removal

We’ve always used a mix of human flagging and human review together with technology to address controversial content on YouTube. In June, we introduced machine learning to flag violent extremist content and escalate it for human review. We continue to get faster:
  • Over 83 percent of the videos we removed for violent extremism in the last month were taken down before receiving a single human flag, up 8 percentage points since August.
  • Our teams have manually reviewed over a million videos to improve this flagging technology by providing large volumes of training examples.
Inevitably, both humans and machines make mistakes, and as we have increased the volume of videos for review by our teams, we have made some errors. We know we can get better and we are committed to making sure our teams are taking action on the right content. We are working on ways to educate those who share video meant to document or expose violence on how to add necessary context.

More experts

Outside experts are essential to advising us on our policies and flagging content for additional inputs that better train our systems. Our partner NGOs bring expert knowledge of complex issues like hate speech, radicalization, and terrorism.

We have added 35 NGOs to our Trusted Flagger program, which is 70 percent of the way towards our goal. These new partner NGOs represent 20 different countries and include NGOs like the International Center for the Study of Radicalization at King’s College London and The Wahid Institute in Indonesia, which is dedicated to promoting religious freedom and tolerance.

Tougher standards

We started applying tougher treatment to videos that aren’t illegal and don’t violate our Guidelines, but contain controversial religious or supremacist content. These videos remain on YouTube, but they sit behind a warning interstitial, aren’t recommended or monetized, and lack key features including comments, suggested videos, and likes. This is working as intended: it helps us strike a balance between upholding free expression, by preserving a historical record of content in the public interest, and keeping these videos from being widely spread or recommended to others.

Amplify voices speaking out against hate and extremism

We continue to support programs that counter extremist messages. We are researching how to expand Jigsaw's Redirect Method to new languages and search terms. We’re heavily investing in our YouTube Creators for Change program to support Creators who are using YouTube to tackle social issues and promote awareness, tolerance, and empathy. Every month these Creators release exciting and engaging new videos and campaigns to counter hate and social divisiveness:
  • In September, three of our fellows, from Australia, the U.K., and the U.S., debuted their videos on the big screen at the Tribeca TV Festival, tackling topics like racism, xenophobia, and the experiences of first-generation immigrants.
  • Local YouTube Creators in Indonesia partnered with the MAARIF Institute and YouTube Creators for Change Ambassador, Cameo Project, to visit ten different cities and train thousands of high school students on promoting tolerance and speaking out against hate speech and extremism.
  • We’re adding two new local Creators for Change chapters, in Israel and Spain, to the network of chapters around the world.
In addition to this work supporting voices that counter hate and extremism, last month Google.org announced a $5 million innovation fund with the same goal. This funding will support technology-driven solutions as well as grassroots efforts, like community youth projects, that help build communities and promote resistance to radicalization.

Terrorist and violent extremist material should not be spread online. We will continue to invest heavily in fighting the spread of this content, provide updates to governments, and collaborate with other companies through the Global Internet Forum to Counter Terrorism. There remains more to do, so we look forward to continuing to share our progress with you.

The YouTube Team