Algorithms Can Pull Us Apart. This Tool Shows They Can Bring Us Back

Bonisiwe Shabane

A small tweak to your social media feed can make your opponents feel a little less like enemies. In a new study published in Science, a Stanford-led team used a browser extension and a large language model to rerank posts on X during the 2024 U.S. presidential campaign, showing that changing the visibility of the most hostile political content can measurably dial down partisan heat without deleting a single post or asking the platform for permission. The experiment, run with 1,256 Democrats and Republicans who used X in the weeks after an attempted assassination of Donald Trump and the withdrawal of Joe Biden from the race, targeted a particular kind of content. The researchers focused on posts that expressed antidemocratic attitudes and partisan animosity, such as cheering political violence, rejecting bipartisan cooperation, or suggesting that democratic rules are expendable when they get in the way. To reach inside a platform they did not control, first author Tiziano Piccardi and colleagues built a browser extension that quietly intercepted the web version of the X timeline.

Every time a participant opened the For you feed, the extension captured the posts, sent them to a remote backend, and had a large language model score each political post on eight dimensions of antidemocratic attitudes and partisan animosity. If a post hit at least four of those eight factors, it was tagged as the kind of content most likely to inflame. The tool then reordered the feed for consenting users in real time. In one experiment, it pushed those posts down the feed so participants would need to scroll further to hit the worst material. In a parallel experiment, it did the opposite and pulled that content higher. “Social media algorithms directly impact our lives, but until now, only the platforms had the ability to understand and shape them,” said Michael Bernstein, a professor of computer science in Stanford’s School of Engineering.
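The study's scoring prompt and reranking code are not reproduced here, but the logic described above — score each post, flag it if it hits at least four of eight dimensions, then demote or promote flagged posts — can be sketched roughly as follows. The function and field names are illustrative, not the study's actual implementation, and `score_post` merely stands in for the remote LLM call:

```python
# Illustrative sketch of the feed-reranking step described above.
# Assumes each post has already been scored on how many of the eight
# dimensions it hits; the threshold of four matches the article's description.

FLAG_THRESHOLD = 4  # a post is flagged if it hits >= 4 of the 8 dimensions

def score_post(text: str) -> int:
    """Placeholder for the backend LLM call that counts how many of the
    eight antidemocratic-attitude / partisan-animosity dimensions apply."""
    raise NotImplementedError  # hypothetical; the real scorer is an LLM

def rerank(posts, scores, demote=True):
    """Stable reordering of the timeline: flagged posts sink to the bottom
    (demote=True, the first experiment) or rise to the top (demote=False,
    the parallel experiment), while relative order within each group is kept."""
    flagged = [s >= FLAG_THRESHOLD for s in scores]
    unflagged_posts = [p for p, f in zip(posts, flagged) if not f]
    flagged_posts = [p for p, f in zip(posts, flagged) if f]
    if demote:
        return unflagged_posts + flagged_posts
    return flagged_posts + unflagged_posts

# Example with precomputed scores: posts B and D exceed the threshold.
posts = ["A", "B", "C", "D"]
scores = [1, 5, 0, 6]
print(rerank(posts, scores))                # ['A', 'C', 'B', 'D']
print(rerank(posts, scores, demote=False))  # ['B', 'D', 'A', 'C']
```

The key design point the article implies is that the intervention only reorders: no post is removed, so the same content remains reachable by scrolling further.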

“We have demonstrated an approach that lets researchers and end users have that power.”

Moody College researcher leads unprecedented study with Meta exploring the role of social media in elections

In the aftermath of the 2016 election, politicians, the media and everyday people raised numerous concerns about the effects of social media on democracy and how platforms like Facebook and Instagram influence people’s political views. What role do these powerful social networks and the algorithms that run them have in how people view candidates or feel about important issues? Over the past several years, a multi-university academic team has been working alongside Meta to answer these questions as part of an unprecedented research project co-led by a Moody College Communication Studies professor. As part of the project, the team had access to Meta data that had never before been made available to researchers and was given the ability to alter the Facebook and Instagram feeds of study participants.

In the summer of 2023, researchers released their first findings from the project in a series of papers published in the journals Nature and Science. And while they found that algorithms have a tremendous effect on what people see in their feeds, changing these algorithms to change what people see doesn’t necessarily affect people’s political attitudes. Also, when the researchers looked at platform-wide data from U.S. adults, they found that many political news URLs were seen, and engaged with, primarily by conservatives or liberals, but not both.

A team of some of the world’s leading social media researchers has published four studies looking at the relationship between the algorithms used by Facebook and Instagram and America’s widening political divide. WASHINGTON (AP) — The powerful algorithms used by Facebook and Instagram to deliver content to users have increasingly been blamed for amplifying misinformation and political polarization. But a series of groundbreaking studies published Thursday suggests addressing these challenges is not as simple as tweaking the platforms’ software. The four research papers, published in Science and Nature, also reveal the extent of political echo chambers on Facebook, where conservatives and liberals rely on divergent sources of information and interact largely within their own groups. Algorithms are the automated systems that social media platforms use to suggest content for users by making assumptions based on the groups, friends, topics and headlines a user has clicked on in the past.

While they excel at keeping users engaged, algorithms have been criticized for amplifying misinformation and ideological content that has worsened the country’s political divisions. Proposals to regulate these systems are among the most discussed ideas for addressing social media’s role in spreading misinformation and encouraging polarization. But when the researchers changed the algorithms for some users during the 2020 election, they saw little difference. Is Facebook exacerbating America's political divide?

Are viral posts, algorithmically ranked feeds, and partisan echo chambers driving us apart? Do conservatives and liberals exist in ideological bubbles online? New research published Thursday attempts to shed light on these questions. Four peer-reviewed studies, appearing in the journals Science and Nature, are the first results of a long-awaited, repeatedly delayed collaboration between Facebook and Instagram parent Meta and 17 outside researchers. They investigated social media's role in the 2020 election by examining Facebook and Instagram before, during, and after Election Day. While the researchers were able to tap large swaths of Facebook's tightly held user data, they had little direct insight about the inner workings of its algorithms.

The design of the social media giant's algorithms, the complex systems that determine whether you're shown your friend's vacation snapshots or a reshared political meme, has come under increasing scrutiny. Those fears crystallized in the aftermath of the 2020 election, when "Stop the Steal" groups on Facebook helped facilitate the Jan. 6 Capitol insurrection. How have social media algorithms changed the way we interact? Social media algorithms, in their commonly known form, are now 15 years old. They were born with Facebook’s introduction of ranked, personalised news feeds in 2009 and have transformed how we interact online.

And like many teenagers, they pose a challenge to grown-ups who hope to curb their excesses. It’s not for want of trying. This year alone, governments around the world have attempted to limit the impacts of harmful content and disinformation on social media – effects that are amplified by algorithms. "Do you remember the 1990s? It felt like we were building a digital utopia. The internet was going to connect the world, democratize information, and maybe even bring peace.

But what if I told you that the same algorithms we designed to make life easier are now pulling us apart? Let’s start at the beginning." "Imagine you invent a knife to cut bread, and years later it’s used as a weapon. That’s how it felt when we realized the algorithms we created were being misused to manipulate people, polarize societies, and even influence elections." "The problem wasn’t the technology itself but how it was used – and by whom. Algorithms were meant to help us navigate the chaos of the internet.

But now, they’re fueling a different kind of chaos." "Have you ever Googled something trivial, like sneakers, and then been bombarded with ads for weeks? That’s algorithms using your data to predict your behavior. It’s not magic; it’s math. But this isn’t the real problem." "The same algorithms deciding your ads are deciding your news feed.

They’re amplifying anger, fear, and outrage because those emotions keep you glued to the screen. Take the last U.S. elections: social media platforms deliberately pushed polarizing content – conspiracy theories, fake news – to maximize engagement. And it worked. Today, 52% of Trump voters and 40% of Biden voters support splitting the U.S. into two nations.

We’re not just divided; we’re isolated."

Posted November 28, 2024 | Reviewed by Jessica Schrader | Co-authored by Nigel Bairstow, Ph.D., and Jeremy Neofytos, Research Assistant

Over the last seven years, the customer experience on social media has shifted dramatically, moving away from the early ideals of user-led connection toward platforms that increasingly control users' digital lives. Originally, social media was designed to allow people to stay connected with friends, share their experiences, and engage in discussions they cared about. However, recent changes (the rise of algorithm-driven feeds and the integration of short-form video content) have transformed the platforms.

Users have gradually been stripped of their agency, and these platforms have turned into data-driven tools where algorithms decide what users see and when. One major shift has been the algorithmic feed, which has replaced the simple chronological timeline many platforms once used. Initially, users could control their feeds by choosing whom to follow, with posts from these connections appearing in real-time order. This empowered users, allowing them to shape their social media experience based on personal preferences and relationships. Now most platforms have moved to popularity-based algorithms that prioritise content based on its likelihood of generating engagement. Through complex metrics analysing user demographics, behaviours, and trends, these algorithms organise content to maximise time spent on the platform.
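The contrast between the two ranking regimes can be shown with a toy example. The posts, timestamps, and single "predicted engagement" number below are invented for illustration; real platform rankers combine many more signals (clicks, watch time, shares, and so on):

```python
from datetime import datetime

# Toy posts: (author, timestamp, predicted engagement score).
# The engagement score is a made-up stand-in for a ranker's output.
posts = [
    ("close_friend",   datetime(2024, 6, 1, 12, 0), 0.10),
    ("viral_creator",  datetime(2024, 6, 1, 9, 0),  0.95),
    ("family_member",  datetime(2024, 6, 1, 11, 0), 0.05),
    ("trending_topic", datetime(2024, 6, 1, 8, 0),  0.80),
]

# Chronological feed: newest first, so whom the user follows (and when
# those accounts post) fully determines what appears at the top.
chronological = sorted(posts, key=lambda p: p[1], reverse=True)

# Engagement-ranked feed: highest predicted engagement first, regardless
# of recency or of the user's chosen connections.
engagement_ranked = sorted(posts, key=lambda p: p[2], reverse=True)

print([p[0] for p in chronological])
# ['close_friend', 'family_member', 'viral_creator', 'trending_topic']
print([p[0] for p in engagement_ranked])
# ['viral_creator', 'trending_topic', 'close_friend', 'family_member']
```

The same four posts produce opposite feeds: under engagement ranking, the friends the user deliberately followed drop below whatever the model predicts will hold attention, which is exactly the loss of agency described above.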

The result? Users often see content from popular creators or trending topics instead of updates from friends and selected accounts. This change has rendered the “follow” button almost worthless, as users no longer control what they view: content is dictated by the platform's engagement goals rather than by individual choice.

