Are Social Media Algorithms Fueling Political Polarization?

Bonisiwe Shabane

New research shows the impact that social media algorithms can have on partisan political feelings, using a new tool that hijacks the way platforms rank content. How much does someone’s social media algorithm really affect how they feel about a political party, whether it’s one they identify with or one they feel negatively about? Until now, the answer has escaped researchers because they’ve had to rely on the cooperation of social media platforms. New multi-university research published Nov. 27 in Science, co-led by Northeastern University researcher Chenyan Jia, sidesteps this issue by installing an extension on consenting participants’ browsers that automatically reranks the posts those users see in real time. Jia and her team discovered that after one week, users’ feelings toward the opposing party shifted by about two points — an effect normally seen over three years — revealing how strongly algorithms can shape partisan feelings.

A study shows that the order in which platforms like X display content to their users affects their animosity towards other ideological groups. A team of U.S. researchers has shown that the order in which political messages are displayed on social media platforms does affect polarization — one of the most debated issues since the rise of social media. The phenomenon is equally strong regardless of the user’s political orientation, the academics note in an article published on Thursday in Science. Social media is an important source of political information. For hundreds of millions of people worldwide, it is even the main channel for political engagement: they receive political content, share it, and express their opinions through these platforms.

Given the relevance of social media in this sphere, understanding how the algorithms that operate on these platforms work is crucial — but opacity is the norm in the industry. That makes it extremely difficult to estimate the extent to which the selection of highlighted content shapes users’ political views. How did the researchers overcome algorithmic opacity to alter the order of posts that social media users see? Tiziano Piccardi from Stanford University and his colleagues developed a browser extension that intercepts and reorders the feed (the timeline of posts) of certain social networks in real time. The tool uses a large language model (LLM) to assign a score to each piece of content, measuring the extent to which it contains “antidemocratic attitudes and partisan animosity” (AAPA). Once scored, the posts were reordered to make that content either more or less visible — without any collaboration from the platform or reliance on its algorithm.
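To make the mechanism concrete, here is a minimal sketch of how a client-side reranker of this kind could work. It is not the authors’ actual extension: the post selector, the scoring endpoint, and the response format below are invented placeholders.

```typescript
// Minimal sketch of client-side feed reranking, not the study's real extension.
// Assumptions (hypothetical, not from the paper): the feed selector, the
// LLM-backed scoring endpoint, and its JSON response shape.

// Ask a scoring service how strongly a post expresses "antidemocratic
// attitudes and partisan animosity" (AAPA); 0 = none, 1 = extreme.
async function scoreAapa(text: string): Promise<number> {
  const res = await fetch("https://example.invalid/aapa-score", { // hypothetical endpoint
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ text }),
  });
  const { score } = (await res.json()) as { score: number };
  return score;
}

// Reorder the visible feed so low-AAPA posts appear first (a "downranking" condition).
async function rerankFeed(): Promise<void> {
  const container = document.querySelector('[data-testid="feed"]'); // hypothetical selector
  if (!container) return;

  const posts = Array.from(container.children) as HTMLElement[];
  const scored = await Promise.all(
    posts.map(async (el) => ({ el, score: await scoreAapa(el.innerText) }))
  );

  // Sort ascending by AAPA score; appendChild moves existing nodes, so the
  // platform's own ordering is overridden entirely in the user's browser.
  scored.sort((a, b) => a.score - b.score);
  for (const { el } of scored) container.appendChild(el);
}

void rerankFeed();
```

Because the reordering happens entirely in the participant’s browser, the platform’s ranking pipeline never has to be touched, which is what lets researchers run such experiments without the company’s cooperation.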

The experiment involved 1,256 participants, all of whom gave informed consent. The study focused on X, as it is the social network most used in the U.S. for expressing political opinions, and it was conducted during the weeks leading up to the 2024 presidential election to ensure a high circulation of political messages.

Over the past decade, social media has become deeply entwined with American political discourse. Platforms like Facebook, Twitter (now X), YouTube, and TikTok use opaque algorithms to determine what content users see, with profound effects on civic engagement and opinion formation.

In the early days of social networking, many were optimistic – “by connecting people and giving them a voice, social media had become a global force for plurality, democracy and progress,” as The Economist put it. However, as social media’s influence grew, concerns mounted that these platforms were “hijacking democracy” by amplifying extreme voices, disinformation, and populist rhetoric (Social Media Effects: Hijacking Democracy and Civility in Civic Engagement). Today, scholars and experts are examining how algorithm-driven feeds may contribute to political polarization and the rise of populism in the United States. A central factor is engagement-driven algorithms: most social media platforms use recommendation algorithms designed to maximize user engagement (likes, shares, view time).

These algorithms learn from each user’s behavior and prioritize content likely to keep them hooked. While this personalization can make feeds more relevant, it also tends to favor provocative or emotionally charged material that generates stronger reactions. Researchers note that “social media technology employs popularity-based algorithms that tailor content to maximize user engagement,” and that maximizing engagement “increases polarization, especially within networks of like-minded users” (How tech platforms fuel U.S. political polarization and what government can do about it). In other words, the more a post incites outrage or passion, the more the algorithms will spread it, potentially skewing the political discourse toward extremes.
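As a toy illustration of why engagement maximization favors provocative material, consider this hedged sketch of an engagement-based ranker. The signal names and weights are invented for illustration; no platform publishes its actual formula.

```typescript
// Illustrative sketch of engagement-based ranking, not any platform's real formula.
// Weights below are arbitrary assumptions chosen only to show the dynamic.

interface Post {
  id: string;
  comments: number;
  shares: number;
  reactions: number; // all reaction types pooled together
}

// A ranker that optimizes predicted engagement boosts whatever provokes the
// most interaction, regardless of whether that interaction is agreement,
// outrage, or argument.
function engagementScore(p: Post): number {
  return 1.0 * p.reactions + 2.0 * p.comments + 3.0 * p.shares; // hypothetical weights
}

function rankFeed(posts: Post[]): Post[] {
  return [...posts].sort((a, b) => engagementScore(b) - engagementScore(a));
}

// Example: a divisive post that draws heavy comment threads outranks a calmer
// post with more reactions.
const feed = rankFeed([
  { id: "calm", comments: 5, shares: 10, reactions: 200 },
  { id: "divisive", comments: 400, shares: 80, reactions: 180 },
]);
console.log(feed.map((p) => p.id)); // ["divisive", "calm"]
```

The point of the sketch is only that nothing in such a score distinguishes enthusiasm from anger; any signal that predicts interaction gets rewarded.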

Facebook: On Facebook, the News Feed algorithm ranks posts based on metrics like comments, shares, and reactions. Internal studies at Facebook found that this system “exploit[s] the human brain’s attraction to divisiveness” (Facebook Knew Its Algorithms Divided Users, Execs Killed Fixes: Report – Business Insider). In fact, a leaked 2016 Facebook report concluded that “64% of all extremist group joins are due to our recommendation tools”, notably the “Groups You Should Join” and “Discover” algorithms that suggested communities to users. This means the platform’s own automated suggestions were steering a majority of users who joined extremist or hyper-partisan groups, dramatically widening those groups’ reach. Facebook’s algorithm changes have also been linked to heightened partisan content. For example, in 2018 Facebook adjusted its feed to emphasize “meaningful social interactions,” but this inadvertently boosted posts that sparked argument and anger, leading to more divisive political content appearing in people’s feeds. Although top Facebook executives have publicly downplayed the platform’s role (“some people say… social networks are polarizing us, but that’s not at all clear from the evidence,” CEO Mark Zuckerberg argued (How tech platforms fuel U.S. political polarization and what government can do about it)), the company’s own documents and actions suggest otherwise. Facebook has occasionally tweaked its algorithms to suppress incendiary posts – such as during the tense period right after the 2020 U.S. election – acknowledging that its automated ranking can fuel extremism (How tech platforms fuel U.S. political polarization and what government can do about it). However, these interventions tend to be temporary, since permanently tamping down divisive content would reduce user engagement (How tech platforms fuel U.S. political polarization and what government can do about it), and thus advertising revenue.

Twitter (X): Twitter initially showed users an unfiltered chronological timeline, but it introduced an algorithmic “Home” timeline and trending topic algorithms that highlight popular tweets. These systems can accelerate the viral spread of polarizing hashtags or sensational political takes. Twitter’s own internal research in 2021 revealed a concerning bias: the algorithm was found to amplify tweets from right-wing politicians and news sources more than those from left-wing sources (Twitter admits bias in algorithm…). In other words, Twitter admitted its recommendation system disproportionately boosted certain political content on the right. This kind of amplification can skew the platform’s discourse, making extreme or populist right-wing narratives more visible. (Notably, Twitter’s trending topics have often been dominated by partisan campaigns or outrage-fueled discussions, illustrating how the algorithm magnifies whatever draws engagement, for better or worse.)

YouTube: YouTube’s recommendation algorithm is engineered to maximize watch time by suggesting videos a viewer is likely to click next. In practice, critics have long accused YouTube of leading users down a “rabbit hole” of increasingly extreme content to keep them watching. For instance, a user who starts with an innocuous political video might be recommended slightly more provocative videos, and over time these can escalate to fringe conspiracy theories or hyper-partisan channels. A Northwestern University analysis noted that these algorithms can “amplify inherent human biases” and interfere with normal social learning by over-rewarding sensational content (Social-Media Algorithms Have Hijacked “Social Learning”). Recent studies offer a mixed picture; some find that already-polarized viewers drive the consumption of extremist videos more than the algorithm radicalizing casual viewers (Study Finds Extremist YouTube Content Mainly Viewed…). Still, in 2019 the platform adjusted its algorithm to reduce recommendations of content that “comes close to” violating policies (e.g., conspiracy theories or disinformation), a response to evidence that its automated suggestions were promoting such content. Nevertheless, anecdotes of users being “radicalized” via YouTube abound, and the site has hosted influential populist firebrands who built large followings through algorithmic promotion.

Once upon a time, newly minted graduates dreamt of creating online social media that would bring people closer together.

That dream is now all but a distant memory. In 2024, there aren’t many ills social networks don’t stand accused of: the platforms are singled out for spreading “fake news”, for serving as Russian and Chinese vehicles to destabilise democracies, and more. The popular success of documentaries and essays on the allegedly huge social costs of social media illustrates this. In the lead-up to the 2024 U.S. presidential election, social media has become the big tent where everyone, from serious political commentators to your uncle who shares cat memes, comes to discuss politics. But here’s the kicker: a lot of the political content you see on social platforms isn’t just from people you follow.

In fact, about half of the tweets on the "For You" timeline come from accounts you don’t even know exist! So, what political thoughts are sneaking into your feed, and how does this play into our democratic conversations? Social media algorithms are like magicians constantly at work behind the curtains. They decide which tweets pop up on our screens, all while we sip our coffee and swipe away. It’s no wonder that figuring out how these algorithms work feels like trying to guess a magician's next trick. Previous studies show that certain political voices get amplified within the tweets of accounts we do follow.

But what about those mysterious tweets from accounts we don’t follow? That’s where the real curiosity lies. As we gear up for the elections, we need to dig into how these algorithms influence the political content we consume and what that means for our points of view. To get to the bottom of this algorithmic mess, we created 120 "sock-puppet" accounts (basically, fake accounts with different political leanings). We set them loose in the wilds of social media to see what political content they were exposed to over three weeks. Our goal?

To find out if users with different political beliefs were being fed different kinds of political content, and whether there was any bias in the recommendations they received. Spoiler alert: we found some intriguing results. It turns out that the algorithm tends to favor a few popular accounts across the board, but right-leaning users got the short end of the stick when it came to exposure inequality. Both left and right-leaning users saw more of what they agreed with, while encountering less of what they didn’t like.
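For readers curious what “exposure inequality” could mean operationally, below is a hedged sketch of one way such a measure might be computed from a sock-puppet audit. The metric (a Gini coefficient over per-account impression shares) and all the numbers are illustrative assumptions, not the study’s actual method or data.

```typescript
// Hedged sketch: quantify how concentrated a sock-puppet's feed is on a few
// accounts. The Gini coefficient is a common concentration measure; the
// original audit may have used a different definition entirely.

// Fraction of a puppet's impressions coming from each recommended account.
function exposureShares(impressionsByAccount: Map<string, number>): number[] {
  const counts = [...impressionsByAccount.values()];
  const total = counts.reduce((a, b) => a + b, 0);
  return counts.map((n) => n / total);
}

// Gini coefficient: 0 = impressions spread evenly, 1 = one account dominates.
function gini(shares: number[]): number {
  const n = shares.length;
  const mean = shares.reduce((a, b) => a + b, 0) / n;
  let sumAbsDiff = 0;
  for (const x of shares) for (const y of shares) sumAbsDiff += Math.abs(x - y);
  return sumAbsDiff / (2 * n * n * mean);
}

// Made-up example: a puppet whose feed is dominated by one large account
// shows higher exposure inequality than one with a spread-out feed.
const concentratedFeed = new Map([["bigAcct", 900], ["a", 40], ["b", 30], ["c", 30]]);
const spreadOutFeed = new Map([["x", 300], ["y", 280], ["z", 250], ["w", 170]]);
console.log(gini(exposureShares(concentratedFeed)) > gini(exposureShares(spreadOutFeed))); // true
```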

Is Facebook exacerbating America's political divide? Are viral posts, algorithmically ranked feeds, and partisan echo chambers driving us apart? Do conservatives and liberals exist in ideological bubbles online? New research published Thursday attempts to shed light on these questions. Four peer-reviewed studies, appearing in the journals Science and Nature, are the first results of a long-awaited, repeatedly delayed collaboration between Facebook and Instagram parent Meta and 17 outside researchers. They investigated social media's role in the 2020 election by examining Facebook and Instagram before, during, and after Election Day.

While the researchers were able to tap large swaths of Facebook's tightly held user data, they had little direct insight into the inner workings of its algorithms. The design of the social media giant's algorithms — a complex set of systems that determine whether you're shown your friend's vacation snapshots or a reshared political meme — has come under increasing scrutiny, amid fears that it amplifies divisive content. Those fears crystallized in the aftermath of the 2020 election, when "Stop the Steal" groups on Facebook helped facilitate the Jan. 6 Capitol insurrection.
