What Is Political Bias on YouTube?
More moderation is associated with more hate speech and misinformation, not politics.

In August 2018, President Donald Trump claimed that social media was “totally discriminating against Republican/Conservative voices.” There was not much new about this: for years, conservatives have accused tech companies of political bias. Just last July, Senator Ted Cruz (R-Texas) asked the FTC to investigate the content moderation policies of tech companies like Google. A day after Google’s vice president insisted that YouTube was apolitical, Cruz claimed that political bias on YouTube was “massive.” But the data doesn’t back Cruz up—and it’s been available for a while. While the actual policies and procedures for moderating content are often opaque, it is possible to look at the outcomes of moderation and determine whether they show signs of bias.
And, last year, computer scientists decided to do exactly that. Motivated by the long-running argument in Washington, DC, computer scientists at Northeastern University investigated political bias in YouTube’s comment moderation. The team analyzed 84,068 comments on 258 YouTube videos. At first glance, comments on right-leaning videos seemed more heavily moderated than those on left-leaning ones. But when the researchers also accounted for factors such as the prevalence of hate speech and misinformation, they found no difference between comment moderation on right- and left-leaning videos. “There is no political censorship,” said Christo Wilson, one of the co-authors and an associate professor at Northeastern University.
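The study’s exact statistical models are not reproduced in this excerpt, but its core idea, that a raw gap in moderation rates can vanish once a confounder such as hate speech is controlled for, is easy to illustrate. Below is a minimal, self-contained sketch on synthetic data; every rate in it is invented for illustration and none are the study’s numbers:

```python
import random

random.seed(0)

# Synthetic comments: each has a political lean, a hate-speech flag,
# and a moderation outcome. All rates below are invented purely for
# illustration; none of these are the study's numbers.
def make_comment(lean):
    # Suppose hate speech is (hypothetically) more prevalent on one side...
    hate = random.random() < (0.30 if lean == "right" else 0.10)
    # ...but moderation depends only on hate speech, never on lean.
    moderated = random.random() < (0.80 if hate else 0.05)
    return {"lean": lean, "hate": hate, "moderated": moderated}

comments = [make_comment(lean) for lean in ("left", "right") for _ in range(50_000)]

def rate(subset):
    return sum(c["moderated"] for c in subset) / len(subset)

for lean in ("left", "right"):
    side = [c for c in comments if c["lean"] == lean]
    hate = [c for c in side if c["hate"]]
    clean = [c for c in side if not c["hate"]]
    # Raw rates differ between sides, but the within-stratum rates match.
    print(f"{lean:>5}  raw={rate(side):.3f}  "
          f"hate={rate(hate):.3f}  non-hate={rate(clean):.3f}")
```

The raw moderation rates differ between the two sides, but within each stratum (hate speech vs. not) they are essentially identical, which is the pattern the Northeastern team reported.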
“In fact, YouTube appears to just be enforcing their policies against hate speech, which is what they say they’re doing.” Wilson’s collaborators on the paper were graduate students Shan Jiang and Ronald Robertson.

A separate audit, published in PNAS Nexus, approached the question from the recommendation side. Three hundred sixty bots simulated new YouTube users, isolating the effects of onboarding and of subsequent algorithmic personalization. The 360 bots were divided into six political affiliations: Far Left, Left, Center, Anti-woke, Right, and Far Right. The researchers varied the videos the bots watched and tracked the top 20 algorithmic recommendations at each step. Their main findings (a toy version of the measurements involved is sketched after this list):

- Before any videos were watched: only 3% of the initial recommendations were “News and Politics.” Of those, 51% were classified as Center, 42% Left, and 6% Right, and the Left-leaning videos were viewed far more frequently than those of any other political class.
- Building the bot’s political persona: most recommendations quickly matched the bot’s political affiliation. Even after viewing a single video, recommendations matched 78% of the time for bots on the Left and 70% for the Anti-woke. Bots at the political extremes were offered more of the Left or Right videos, and the classification of the recommended videos also varied.
- Escaping the now-established political persona: each bot then watched videos of a political affiliation other than the one it had established. The speed of escape, measured as the number of videos a bot had to watch before its recommendations matched the new persona, was asymmetric.
- Algorithmic path: the researchers then looked at how recommendations changed after the attempt to escape the bot’s initial political affiliation. The strongest tendency was a return to Center or Left recommendations; the weakest pull was toward the Far Left or Far Right. In other words, the algorithm drifted back toward the “political mean” and away from the extremes.
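The audit’s pipeline is not published in this excerpt, but its two key measurements, the share of top-20 recommendations matching a bot’s persona and the “speed of escape,” can be expressed compactly. The following is a toy sketch; the `recommend` function is an invented stand-in, not YouTube’s actual algorithm:

```python
import random

random.seed(1)

LEANS = ["Far Left", "Left", "Center", "Anti-woke", "Right", "Far Right"]

def recommend(history, k=20):
    """Invented stand-in recommender (not YouTube's algorithm): it
    mostly echoes recent watches, with a residual pull toward the
    Center/Left 'political mean' that the audit reports."""
    if not history:
        return random.choices(LEANS, weights=[1, 8, 10, 1, 2, 1], k=k)
    recent = history[-5:]
    recs = []
    for _ in range(k):
        if random.random() < 0.75:        # echo the recent watch history
            recs.append(random.choice(recent))
        else:                             # drift back toward the mean
            recs.append(random.choice(["Center", "Left"]))
    return recs

def match_rate(recs, lean):
    """Share of a recommendation list matching a given lean."""
    return sum(r == lean for r in recs) / len(recs)

def escape_speed(start, target, max_videos=100):
    """Number of `target`-leaning videos a bot must watch before a
    majority of its top-20 recommendations match the new persona."""
    history = [start] * 20                # persona already established
    for n in range(1, max_videos + 1):
        history.append(target)
        if match_rate(recommend(history), target) > 0.5:
            return n
    return max_videos

print("Left -> Right:", escape_speed("Left", "Right"))
print("Right -> Left:", escape_speed("Right", "Left"))
```

Because the toy recommender drifts toward Center and Left, escaping toward the Left takes fewer videos than escaping toward the Right, mirroring the asymmetry the study describes.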
YouTube is used by 71% of Americans and is a source of news for 26% of US adults. The platform undoubtedly plays an important role in shaping America’s views on a range of political and cultural topics. Yet while YouTube’s impact continues to grow, options for understanding the content and ideas being shared on the platform are lacking.
That is why we built Transparency.tube. By categorizing, indexing, and analyzing over 7,300 of the largest English-language YouTube channels actively discussing political and cultural issues, we aim to provide the data necessary to better understand this space. The huge amount of politically oriented content on YouTube, 1.9 million videos in 2020 alone in our dataset, makes it impossible for any individual to fully track what is occurring at any given moment. There has traditionally been an absence of reliable data on the internal and external workings of YouTube. Consequently, reporters have historically resorted to anecdotal evidence and small samples of videos when covering the platform. Further, some narratives have not kept pace with recent changes to the platform, such as updates to the recommendation system and the evolution of its political content.
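Transparency.tube’s own pipeline is not shown here, but the channel-indexing step it describes can be approximated with the public YouTube Data API v3. A minimal sketch, assuming you have an API key (the key and channel IDs below are placeholders):

```python
import requests

API_KEY = "YOUR_API_KEY"   # placeholder; requires a Google Cloud API key
CHANNEL_IDS = ["UC..."]    # placeholder channel IDs to index

def fetch_channel(channel_id):
    """Fetch one channel's metadata and statistics from the Data API."""
    resp = requests.get(
        "https://www.googleapis.com/youtube/v3/channels",
        params={"part": "snippet,statistics", "id": channel_id, "key": API_KEY},
        timeout=10,
    )
    resp.raise_for_status()
    item = resp.json()["items"][0]
    stats = item["statistics"]
    return {
        "title": item["snippet"]["title"],
        # Subscriber counts can be hidden by the channel owner.
        "subscribers": int(stats.get("subscriberCount", 0)),
        "videos": int(stats["videoCount"]),
    }

index = {cid: fetch_channel(cid) for cid in CHANNEL_IDS}
print(index)
```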
One solution for dealing with such a large amount of content is to leverage AI and machine learning. As one of the worldwide leaders in this domain, Google clearly has the ability to thoroughly analyze the political and cultural ideas being shared on YouTube. However, it has little incentive to do so. YouTube has been the target of intense criticism from both the left and the right, centered on the content it allows on the platform and how that content is promoted through its recommendation system. As a result, YouTube has been reticent about sharing data on hot-button topics.
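Nothing in this piece specifies a model, but the kind of machine-learning triage it alludes to can be sketched in a few lines with scikit-learn. The labeled examples below are invented and far too few for a real classifier; a production system would need thousands of human-labeled titles or transcripts per class:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Invented toy training data: video titles with hand-assigned leans.
titles = [
    "Why the minimum wage must rise",
    "Universal healthcare explained",
    "The case against gun control",
    "Why lower taxes grow the economy",
]
labels = ["left", "left", "right", "right"]

# TF-IDF features feeding a linear classifier, a common text baseline.
clf = make_pipeline(TfidfVectorizer(), LogisticRegression())
clf.fit(titles, labels)

print(clf.predict(["Should we raise taxes on the wealthy?"]))
```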
Political bias refers to the tendency of individuals or media outlets to favor one political ideology or party over others, leading to a distorted presentation of information. This bias can manifest in various forms, such as selective reporting, slanted language, and framing issues in a way that promotes a specific political agenda. Understanding political bias is essential for evaluating the credibility of information sources and recognizing how biases shape public perception and discourse.
Confirmation Bias: The psychological tendency to search for, interpret, and remember information in a way that confirms one’s pre-existing beliefs.
Media Literacy: The ability to access, analyze, evaluate, and create media in various forms, empowering individuals to critically engage with media content.
Framing: The way information is presented or structured in media narratives, which can influence how audiences perceive and interpret issues.

Podcasting on Rumble, a different video platform, brings in controversial figures who often share polarizing and misleading ideas. This is quite different from YouTube, which has stricter rules about the content that can be shared. With podcasts playing a growing role in shaping political discussion, particularly through popular personalities like Joe Rogan and Andrew Tate, it is crucial to examine the political biases and content styles of these platforms. In this study, we analyze over 13,000 podcast videos from both YouTube and Rumble.
Our main focus is on their political content and audience dynamics. Using methods such as speech-to-text transcription and topic modeling, we examine three main areas: the political bias of the podcasts, the type of content that attracts viewers, and the use of visuals in the podcasts. Our findings highlight a clear right-wing lean in Rumble’s podcasts, while YouTube’s content is more diverse and less politically charged. In today’s society, visuals are a key part of how we communicate and engage with information. The rise of social media and video content has changed the way we consume media, making video podcasts increasingly popular. Their popularity has surged, helping YouTube catch up with Spotify in 2022 and become the leading platform for podcasts in 2023.
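The paper’s exact toolchain is not given in this excerpt. A minimal version of the transcribe-then-topic-model pipeline it describes might look like the following, using the open-source openai-whisper package for speech-to-text and scikit-learn’s LDA for topic modeling; the file list and parameter choices are placeholders:

```python
import whisper                                   # pip install openai-whisper
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# 1) Speech-to-text: transcribe each episode's audio track.
model = whisper.load_model("base")
episodes = ["episode_001.mp3"]                   # placeholder file list
transcripts = [model.transcribe(path)["text"] for path in episodes]

# 2) Topic modeling: fit LDA over the transcript corpus.
vec = CountVectorizer(stop_words="english", max_features=5000)
doc_term = vec.fit_transform(transcripts)
lda = LatentDirichletAllocation(n_components=10, random_state=0)
lda.fit(doc_term)

# Print the top words per topic for manual inspection and labeling.
words = vec.get_feature_names_out()
for i, topic in enumerate(lda.components_):
    top = [words[j] for j in topic.argsort()[-8:][::-1]]
    print(f"topic {i}: {', '.join(top)}")
```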
One notable example of this trend is Joe Rogan, known for his controversial statements and actions, such as using racial slurs. In 2020, he moved his content exclusively to Spotify after accepting a deal reported to be worth $200 million. Interestingly, he later declined a $100 million offer from Rumble. Another notable figure is Andrew Tate, who has been banned from multiple social media platforms for his misogynistic views. Despite these bans, he has gained over 1.7 million followers on Rumble. The list of controversial figures on Rumble also includes former President Trump and conspiracy theorist Alex Jones, both of whom have been banned from other platforms.
Despite claims of neutrality from Rumble's owner, research suggests that a significant portion of its user base leans toward the Republican Party.