
Supporting the 2024 United States election

As the US election season kicks into gear, citizens across the country come to YouTube for news and information, from voter registration to the location of their nearest polling place. Candidates across the political spectrum also use YouTube for campaign outreach, including to reach voters who are harder to engage through traditional campaign methods. I wanted to share an overview of how we’re connecting people to authoritative election news and information on YouTube.


Elevating high-quality election content

We dedicate significant time and resources to establishing and refining the policies and systems that connect people to trustworthy, high-quality election news and information on YouTube. The 2024 election is no exception, and our elections-focused teams have been working nonstop to make sure we have the right policies and systems in place.

YouTube has long been a platform for political discussion and debate. We believe voters should hear all sides of a candidate’s platform so they can make informed decisions, even when those views may be controversial or questionable. That’s why, when it comes to political speech on YouTube, we give extra consideration to content with educational or documentary value, such as news coverage. But this isn’t a free pass to spread harmful misinformation or promote hateful rhetoric.

Everyone is subject to our Community Guidelines, from private citizens to the most visible public figures. And our policies apply to all forms of content, including election content, regardless of the political viewpoints expressed, the language the content is in, or how the content is generated. Content that misleads voters on how to vote or encourages interference in the democratic process is not allowed on YouTube. And we quickly remove content that incites violence, encourages hatred, promotes harmful conspiracy theories, or threatens election workers. At the same time, our systems recommend election news and information from authoritative sources and display information panels at the top of search results and below videos to provide even more context.

When it comes to advertising and monetization, we’re even more stringent. We don’t allow ads promoting demonstrably false claims that could undermine trust or participation in elections, nor do we allow ads to run on videos with this type of content. Our ads and monetization policies apply to all advertisers and creators on YouTube, across the political spectrum.

Addressing AI-generated election misinformation

Challenges posed by generative AI have been an ongoing area of focus for YouTube, but we know AI introduces new risks that bad actors may try to exploit during an election. Our misinformation policies prohibit technically manipulated content that misleads users and could pose a serious risk of egregious harm. And for election ads, we require advertisers to disclose when their ads include digitally altered or generated materials.

But as AI becomes more sophisticated, it may be difficult for viewers to discern when the content they’re watching is real. As we shared previously, over the coming months we’ll require creators to disclose when they’ve created realistic altered or synthetic content. This includes election content, and we may take action against creators who consistently fail to disclose this information. We’ll also take the additional step of labeling altered or synthetic election content that doesn’t violate our policies, to clearly indicate to viewers that some of the content was altered or generated digitally. This label will be displayed in both the video player and the video description, and will surface regardless of the creator, political viewpoint, or language.

Mock showing the label added to the video player and description panel.


Combating influence operations from foreign adversaries

We’ve long worked closely with Google’s Threat Analysis Group (TAG) to combat government-backed attacks. Together, we identify coordinated influence operations on YouTube and terminate their channels and accounts. This includes government-backed hacking designed to interfere with elections, including the upcoming 2024 elections. Through TAG, we also work with other technology companies to share intelligence and best practices, and share threat information with law enforcement. TAG discloses coordinated influence operation campaigns terminated on Google platforms, including YouTube, in their monthly Bulletin.


Supporting elections around the world

YouTube is a global video streaming platform, and our policies and systems are developed and implemented with this worldwide view from the start. The work of our elections-focused teams is continuous, whether or not it’s an election year. Our policies are enforced across languages and locales, and our systems connect voters to authoritative news and information about the elections most relevant to them in every country where we operate. That holds true for 2024, and we’re closely monitoring real-time developments around the world.

As we head towards Election Day in the US, dedicated teams across YouTube and Google are working to refine and improve our efforts. We remain vigilant as the election unfolds, identifying emerging trends around inappropriate content and problematic behaviors so we can address them quickly, before they escalate. Our commitment to connecting people to high-quality content and protecting our platform throughout the 2024 election is steadfast, and we’ll continue to share updates on our efforts.
