
Racial justice, equity and product inclusion: How we’re prioritizing safety for our creators and artists

The first in our ongoing series of updates on how YouTube is driving racial justice, equity and product inclusion.

Creators and artists deserve a platform that feels safe, inclusive and equitable. That’s why two years ago, inspired by the global racial justice movement and the reckoning that catalyzed change around the world, we established a dedicated Racial Justice, Equity and Product Inclusion team.

That moment provided an important opportunity for us at YouTube to explore the practices, policies and entrenched norms that could reproduce bias and inequity on our platform. We have also dedicated resources across our engineering, product and business teams to drive this work.

Since then, we’ve shared progress around our work to protect creators on the platform, elevate Black creators and artists and make YouTube a more inclusive platform. But even with this momentum, we know there’s more to be done.

Over the coming weeks, we’re going to offer an inside look at how we’re driving continued progress for underrepresented communities in each of our key pillars: safety, equity and advocacy.

In this blog, we take a closer look at our first pillar: safety.


Ensuring creators are safe on YouTube

Over the past few years, we spent time talking with historically underrepresented creators about safety, and one of their key concerns was the rise in comments they saw across social platforms that were hurtful and even hateful. These creators valued YouTube’s commitment to remove any content that violates our Community Guidelines, but they also asked that we introduce updates that would make it possible to easily hide comments that are offensive to them personally.

In response to their feedback, our teams continue to invest in system improvements while also launching new features designed to help creators more easily moderate their comments. Here’s a look at some of that important progress.

● Creators can now catch even more potentially inappropriate or spam comments by selecting a new, optional “Increase Strictness” setting in their comment settings. These comments are held for review in YouTube Studio, where creators can choose to approve, remove or report them.

● We also updated YouTube Studio so that comments likely to be more hurtful than others are now placed in a separate, hidden section at the bottom of the “Held for review” tab. This way, creators can choose to ignore them completely and leave them unreviewed if they prefer.

● Finally, we started rolling out Channel Guidelines, which let creators clearly communicate what is and is not OK in their comments section.


In addition to these updates, over the past few months we’ve worked to deeply understand how we can better support creators when they experience unwanted behavior. In a recent survey, we learned that while 95 percent of creators said they had experienced unwanted behavior across multiple social platforms, only 50 percent said they had access to the resources or support needed to handle these interactions.

So in collaboration with creators and third-party experts like ConnectSafely, The Family Online Safety Institute and the National Cybersecurity Alliance, we collected in-depth information and tips on topics like staying safe when starting out as a new creator, what to do as your channel grows and how to navigate bullying, trolling, account hijacking and more. Creators can now access all of this information in our new Creator Safety Center, which launches today.


Our Racial Justice, Equity and Product Inclusion team remains committed to exploring and addressing bias and inequity on our platform and we look forward to sharing more progress on this work over the coming weeks.