Responsible policy enforcement during COVID-19

Earlier this year, we shared some of the steps we have taken to protect our employees and extended workforce during the COVID-19 pandemic. One major step was to rely more on technology to quickly identify and remove content that violates our Community Guidelines, so that our content review teams could safely remain at home. The second quarter of 2020 was the first full quarter we operated under this modified enforcement structure. Because of choices we made to prioritize the safety of the community, we removed more videos from YouTube than we have ever removed in a single quarter. Today, we’re releasing full Q2 data in our quarterly Community Guidelines Enforcement Report.

Prioritizing the safety of the YouTube Community


We normally rely on a combination of people and technology to enforce our policies. Machine learning helps detect potentially harmful content, and then sends it to human reviewers for assessment. Human review is not only necessary to train our machine learning systems, it also serves as a check, providing feedback that improves the accuracy of our systems over time. Each quarter, millions of videos that are first flagged by our automated systems are later evaluated by our human review team and determined not to violate our policies. 


When reckoning with greatly reduced human review capacity due to COVID-19, we were forced to choose between potential under-enforcement and potential over-enforcement. One option was to dial back our technology and limit our enforcement to only what our diminished review capacity could handle. This would maintain a high level of accuracy, but would result in less content being removed from YouTube, including some content that violates our policies. The other option was to use our automated systems to cast a wider net so that as much potentially harmful content as possible would be quickly removed from YouTube, with the knowledge that many videos would not receive a human review, and that some videos that do not violate our policies would be removed.


Because responsibility is our top priority, we chose the latter: using technology to help with some of the work normally done by reviewers. The result was an increase in the number of videos removed from YouTube, more than double the number we removed in the previous quarter. For certain sensitive policy areas, such as violent extremism and child safety, we accepted a lower level of accuracy to make sure that we were removing as many pieces of violative content as possible. This also means that, in these areas specifically, a higher amount of content that does not violate our policies was removed as well. The decision to over-enforce in these policy areas, out of an abundance of caution, led to a more than 3x increase in removals of content our systems suspected was tied to violent extremism or was potentially harmful to children. The latter includes dares, challenges, or other innocently posted content that might endanger minors.

Minimizing disruption to creators


Because we began to rely more on automation, we also took steps to minimize disruption to our creators. As we shared earlier, we decided not to issue strikes on content removed without human review, except in cases where we have very high confidence that it violates our policies.


Additionally, we’ve always given creators an easy way to appeal if they believe their video was removed in error. Knowing that decisions made by our systems are in some cases less accurate than human review, we prepared for more appeals and dedicated extra resources to make sure they were reviewed quickly. Though appeals remain a small fraction of total removals (less than 3% of video removals), we saw both the number of appeals and the reinstatement rate double from the previous quarter: the share of appealed videos reinstated rose from 25% in Q1 to 50% in Q2.


The impact of COVID-19 has been felt in every part of the world, and in every corner of our business. Through these challenging times, our commitment to responsibility remains steadfast. We’ve taken extraordinary steps to make sure we live up to that commitment: protecting viewers by quickly removing content that violates our policies, and minimizing the disruption felt by creators. We are continuing to improve the accuracy of our systems and, as reviewers are able to return to work, we are deploying them to the highest-impact areas. We’ll continue to regularly update the community on our progress.