Free Speech and Corporate Responsibility Can Coexist Online

CEO Susan Wojcicki shares three principles that should guide discussions about the regulation of online speech.

This op-ed was originally published in the Wall Street Journal.

When I was growing up, every time I wrote a letter to my grandfather I worried it might be censored. My father had fled communist Poland for the U.S., but my grandfather was unable to escape and still lived behind the Iron Curtain. I learned very young that it can be dangerous when governments reach too far.

As CEO of YouTube, I grapple every day with issues of free expression and responsibility. Companies, civil society and governments face unprecedented challenges as they sort through complicated questions about where to draw the lines on speech in the 21st century. Policy makers around the world are introducing regulatory proposals: some argue that too much content is left up on platforms, while others say too much is taken down. At YouTube, we’re working to protect our community while enabling new and diverse voices to break through. Three principles should guide discussions about the regulation of online speech.

First, the open internet has transformed society in incredible ways. The Group of Seven leaders reaffirmed the fundamental value of openness in a recent statement. YouTube makes information available to anyone with an internet connection. People around the world come to YouTube to find information, to learn and to build community. But creating a space that’s open to everyone means that bad actors will sometimes cross the line.

YouTube has always had community guidelines that set the rules of the road. We remove content that could cause real harm, such as violent extremism, copyright infringement and dangerous pranks. Some of our decisions are controversial, but we apply our policies equally, regardless of who posts the content or the political viewpoint expressed. At the same time, we embrace the inherent complexity and messiness of the internet. Stripping away everything that’s controversial could silence important voices and ideas.

The second principle: Democratic governments must provide companies with clear guidelines about illegal speech. That helps us remove illegal content more quickly and efficiently. These laws must be grounded in international norms as officials balance the right to information with the risk of harm. The rules governing the internet are regularly updated, from copyright to elections and political campaigning. YouTube is willing to work with governments to address these and other issues.

But not everything about content moderation will be overseen by governments, which is why I believe strongly in the third principle: Companies should have flexibility to develop responsible practices to handle legal but potentially harmful speech. Some policy makers are debating what legal speech should be allowed on platforms, but such prescriptive rules could have serious consequences.

Say officials decide to regulate legal content they consider graphic. That may lead to the removal of protest footage, video games and music videos. Evidence on YouTube helped prosecutors in Sweden hold the Syrian regime and rebel fighters accountable for war crimes. What if those videos had been taken down because they were deemed too graphic?

Companies also need to be able to act fast when new threats arise. Last year when mobile-phone towers in the U.K. were set on fire after a conspiracy theory blamed Covid-19 on 5G networks, we updated our policies in a single day to remove the harmful content. Our community counts on us to take action, and we need to continue to be able to move quickly.

Some may say that governments should oversee online speech, but we need flexibility to strike the right balance between openness and responsibility. When we get it wrong or lean too heavily in either direction, our business and the millions of creator small businesses built on YouTube are hurt. Advertisers have pulled spend from YouTube when their ads ran next to problematic content.

We work hard every day to be responsible, and our advertisers, users and creators hold us accountable. We’re working with the Global Alliance for Responsible Media to develop industry definitions of content not suitable for advertising. In addition, we’re a founding member of the Global Internet Forum to Counter Terrorism, an organization that works to prevent violent extremists from exploiting digital platforms. We also provide users with tools and controls to manage their experience on YouTube.

Managing our platform responsibly is good for business. We’re also working to provide more transparency about our efforts. We recently released our Violative View Rate, which estimates how frequently viewers see content that violates our policies. The rate fell by more than 70% compared with 2017, thanks in large part to investments in machine learning that help flag potentially violative content. In the first quarter, the rate was 0.16% to 0.18%. That means that out of every 10,000 views on YouTube, 16 to 18 came from violative content.
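
To make the cited figures concrete, here is a minimal sketch of the arithmetic behind a rate like this. It is an illustration only, not YouTube’s actual methodology, which estimates the Violative View Rate from human review of sampled content:

```python
# Illustrative sketch only: the real Violative View Rate is estimated from
# human review of a sample of views; this just shows the arithmetic above.

def violative_view_rate(violative_views: int, total_views: int) -> float:
    """Fraction of views that landed on policy-violating content."""
    return violative_views / total_views

# The figures cited above: 16 to 18 violative views per 10,000 total views.
for violative in (16, 18):
    rate = violative_view_rate(violative, 10_000)
    print(f"{violative} per 10,000 views -> {rate:.2%}")  # 0.16% and 0.18%
```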

The stakes are high for updating our approach to online speech. Overregulation of legal content would have a chilling effect on speech and could rob us of the next big idea or great discovery. I’m confident there is a way forward that both keeps our community safe and allows for free expression.
