Updates on our efforts to make YouTube a more inclusive platform
Since its early days, YouTube has always strived to be a place where creators of all backgrounds can have a voice, find a community and even build a business, including many who may be underrepresented or might not otherwise have had a platform. We’re committed to supporting the diverse creator communities on YouTube and their continued success.
As our CEO, Susan Wojcicki, wrote in June, we’re examining how our policies and products are working for everyone — and specifically for the Black community — and working to close any gaps. Today, I want to share an update on that progress. While the work we're sharing today is anchored in this effort, we think these changes will ultimately benefit the entire YouTube community.
Removing harmful and hateful comments
We know that comments play a key role in helping creators connect with their community, but issues with the quality of comments are also one of the most consistent pieces of feedback we receive from creators. We have been focused on improving comments with the goal of driving healthier conversations on YouTube. Over the last few years, we launched new features to help creators engage with their community and shape the tone of conversations on their channels.
We’ve heard from creators that while these changes helped them better manage comments and connect with their audience, there’s more we can do to prevent them from seeing hurtful comments in the first place. To address that, we’ll be testing a new filter in YouTube Studio for potentially inappropriate and hurtful comments that have been automatically held for review, so that creators don’t ever need to read them if they don’t want to. We’ll also be streamlining the comment moderation tools to make this process even easier for creators.
To encourage respectful conversations on YouTube, we’re launching a new feature that will warn users when their comment may be offensive to others, giving them the option to reflect before posting.
In addition, we’ve also invested in technology that helps our systems better detect and remove hateful comments by taking into account the topic of the video and the context of a comment. These efforts are making an impact. Since early 2019, we've increased the number of daily hate speech comment removals by 46x. And in the last quarter, of the more than 1.8 million channels we terminated for violating our policies, more than 54,000 terminations were for hate speech. This is the most hate speech terminations in a single quarter and 3x more than the previous high from Q2 2019 when we updated our hate speech policy.
Continuing our commitment to support creators
Our goal is to make YouTube a place where creators can thrive in the long term, and we’ve done extensive work in this area, but we've heard concerns across various communities about their ability to grow their channels. We want to ensure our systems do not reflect unintentional bias, but our existing process is limited because we only have information about content, not identifying information about the creators themselves.
To better evaluate a concern from a specific creator community (e.g., concerns that our monetization systems are working differently for different creators), we need to have data about which videos come from which communities. Today, we can identify what a video is about, but this does not take into account who the creator is or how they identify. For example, our systems can evaluate how videos about Black Lives Matter are performing against other content on YouTube regardless of the creator, but we’re currently not able to evaluate growth for Black beauty creators, LGBTQ+ talk show hosts, female vloggers or any other community.
Today, we’re announcing a new effort to help us more proactively identify potential gaps in our systems that might impact a creator’s opportunity to reach their full potential. Starting in 2021, YouTube will ask creators on a voluntary basis to provide us with their gender, sexual orientation, race and ethnicity. We’ll then look closely at how content from different communities is treated in our search and discovery and monetization systems. We’ll also be looking for possible patterns of hate, harassment, and discrimination that may affect some communities more than others. This survey will be an additional way for creators to participate in initiatives that YouTube hosts, like #YouTubeBlack creator gatherings and FanFest, if they’re interested.
Identity is something that is inherently personal, and sharing this information should always be optional. Our creators’ privacy and ability to provide consent for how their information is used is critical. In the survey, we will explain how information will be used and how the creator controls their information. For example, the information gathered will not be used for advertising purposes, and creators will have the ability to opt out and delete their information entirely at any time.
YouTube is consulting with creators as we develop our survey, which will launch initially in the US in early 2021, and we’ll continue this project with the guidance of civil and human rights experts. If we find any issues in our systems that impact specific communities, we’re committed to working to fix them. And we’ll continue to share our progress on these efforts with you.
The steps we’re announcing today are part of our ongoing work to ensure that YouTube continues to be a platform where creators of all backgrounds can thrive. We appreciate the partnership of the Black, LGBTQ+ and Latinx creator communities who have consulted with us in these efforts. Thank you for sharing your perspectives with us and helping to make YouTube a better place for everyone.