Meta's Algorithms Show That America's Political Polarisation Has No Easy Fix

Bonisiwe Shabane

FILE - This photo shows the mobile phone app logos for, from left, Facebook and Instagram in New York, Oct. 5, 2021. A team of some of the world's leading social media researchers has published four studies looking at the relationship between the algorithms used by Facebook and Instagram and America's widening political divide. (AP Photo/Richard Drew, file)

WASHINGTON (AP) — The powerful algorithms used by Facebook and Instagram to deliver content to users have increasingly been blamed for amplifying misinformation and political polarization. But a series of groundbreaking studies published Thursday suggest addressing these challenges is not as simple as tweaking the platforms' software.

The four research papers, published in Science and Nature, also reveal the extent of political echo chambers on Facebook, where conservatives and liberals rely on divergent sources of information, interact with opposing groups and...

Algorithms are the automated systems that social media platforms use to suggest content for users by making assumptions based on the groups, friends, topics and headlines a user has clicked on in the past. While they excel at keeping users engaged, algorithms have been criticized for amplifying misinformation and ideological content that has worsened the country's political divisions.

Proposals to regulate these systems are among the most discussed ideas for addressing social media's role in spreading misinformation and encouraging polarization. But when the researchers changed the algorithms for some users during the 2020 election, they saw little difference.

The former president warned that "citizens no longer know what to believe" thanks to false information spreading online. New research shows the impact that social media algorithms can have on partisan political feelings, using a new tool that hijacks the way platforms rank content. How much does someone's social media algorithm really affect how they feel about a political party, whether it's one they identify with or one they feel negatively about? Until now, the answer has eluded researchers because they have had to rely on the cooperation of social media platforms. New multi-institution research published Nov. 27 in Science, co-led by Northeastern University researcher Chenyan Jia, sidesteps this issue by installing an extension on consenting participants' browsers that automatically reranks the posts those users see, in real time and still...

Jia and her team discovered that after one week, users' feelings toward the opposing party shifted by about two points, an effect normally seen over three years, revealing algorithms' strong influence on... The study shows that the order in which platforms like X display content to their users affects their animosity toward other ideological groups. A team of U.S. researchers has shown that the order in which political messages are displayed on social media platforms does affect polarization, one of the most debated issues since the rise of social media and the... The phenomenon is equally strong regardless of the user's political orientation, the academics note in an article published on Thursday in Science. Social media is an important source of political information.

For hundreds of millions of people worldwide, it is even the main channel for political engagement: they receive political content, share it, and express their opinions through these platforms. Given the relevance of social media in this sphere, understanding how the algorithms that operate on these platforms work is crucial, but opacity is the norm in the industry. That makes it extremely difficult to estimate the extent to which the selection of highlighted content shapes users' political views. How did the researchers overcome algorithmic opacity to alter the order of posts that social media users see? Tiziano Piccardi from Stanford University and his colleagues developed a browser extension that intercepts and reorders the feed (the chronological timeline of posts) of certain social networks in real time. The tool uses a large language model (LLM) to assign a score to each piece of content, measuring the extent to which it contains "antidemocratic attitudes and partisan animosity" (AAPA).
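To make that scoring step concrete, here is a minimal sketch of how an LLM could be asked to rate each post for AAPA. Everything in it — the prompt wording, the 0-to-1 scale, and the `score_aapa` and `llm_call` names — is a hypothetical illustration, not the study's actual implementation.

```python
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    text: str

def score_aapa(post: Post, llm_call) -> float:
    """Ask an LLM to rate a post for antidemocratic attitudes and
    partisan animosity (AAPA), returning a score in [0, 1]."""
    prompt = (
        "Rate the following social media post from 0 (none) to 1 (extreme) "
        "for antidemocratic attitudes and partisan animosity. "
        "Reply with only the number.\n\n" + post.text
    )
    reply = llm_call(prompt)  # llm_call: any text-completion client supplied by the caller
    try:
        return min(max(float(reply.strip()), 0.0), 1.0)  # clamp to [0, 1]
    except ValueError:
        return 0.0  # unparseable reply: treat the post as benign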

Once scored, the posts were reordered one way or another, without any collaboration from the platform or reliance on its algorithm. The experiment involved 1,256 participants, all of whom had been duly informed. The study focused on X, as it is the social network most used in the U.S. for expressing political opinions, and it was conducted during the weeks leading up to the 2024 presidential election to ensure a high circulation of political messages. The web-based method was shown to mitigate political polarization on X by nudging antidemocratic and extremely negative partisan posts lower in a user's feed.
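Continuing the hypothetical sketch above, the reordering step could be as simple as a stable partition: flagged posts drop below everything else while each group keeps its original order, so nothing is removed from the feed. The 0.5 threshold and the `rerank_feed` name are assumptions for illustration; the published tool's actual ranking rule may differ.

```python
def rerank_feed(posts: list[Post], scores: dict[str, float],
                threshold: float = 0.5) -> list[Post]:
    """Stable partition of the feed: posts below the AAPA threshold keep
    their original order at the top; flagged posts are demoted below them,
    also in their original order. Nothing is deleted, only moved."""
    calm = [p for p in posts if scores[p.post_id] < threshold]
    flagged = [p for p in posts if scores[p.post_id] >= threshold]
    return calm + flagged

# Example usage with the scoring helper from the previous sketch:
# feed = rerank_feed(feed, {p.post_id: score_aapa(p, llm) for p in feed})
```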

The tool, which is independent of the platform, has the potential to give users more say over what they see on social media. It demonstrates that it is possible to turn down the partisan rancor in an X feed without removing political posts and without the direct cooperation of the platform. The study, from researchers at the University of Washington, Stanford University and Northeastern University, also indicates that it may one day be possible to let users take control of their social media algorithms. The researchers created a seamless, web-based tool that reorders content to move posts lower in a user's feed when they contain antidemocratic attitudes and partisan animosity, such as advocating for violence or jailing supporters... Researchers published their findings Nov. 27 in Science.

In the papers, researchers from the University of Texas, New York University, Princeton and other institutions found that removing some key functions of the social platforms' algorithms had "no measurable effects" on people's political... In one experiment on Facebook's algorithm, people's knowledge of political news declined when their ability to reshare posts was removed, the researchers said.

A new Stanford-led study is challenging the idea that political toxicity is simply an unavoidable element of online culture. Instead, the research suggests that the political toxicity many users encounter on social media is a design choice that can be reversed. Researchers have unveiled a browser-based tool that can cool the political temperature of an X feed by quietly downranking hostile or antidemocratic posts. Remarkably, this can occur without requiring any deletions, bans, or cooperation from X itself. The study offers the takeaway that algorithmic interventions can meaningfully reduce partisan animosity while still preserving political speech.

It also advances a growing movement advocating user control over platform ranking systems and the algorithms that shape what users see, which were traditionally guarded as proprietary, opaque, and mainly optimized for engagement rather... The research tool was built by a multidisciplinary team across Stanford, Northeastern University, and the University of Washington, composed of computer scientists, psychologists, communication scholars, and information scientists. Their goal in the experiment was to counter the engagement-driven amplification of divisive content that tends to reward outrage, conflict, and emotionally charged posts, without silencing political speech. Using a large language model, the tool analyzes posts in real time and identifies several categories of harmful political subject matter, including calls for political violence, attacks on democratic norms, and extreme hostility toward... When the system flags such content, it simply pushes those posts lower in the feed so they are less noticeable, like seating your argumentative uncle at the far end of the table during the...
