Platform-Independent Social Media Experiments: A Science Approach
Changing algorithms with artificial intelligence tools can influence partisan animosity. Political polarization is a defining feature of modern society, and research increasingly points to a surprising culprit: the algorithms powering our online experiences. Artificial intelligence (AI) tools, designed to personalize content and maximize engagement, are inadvertently exacerbating partisan animosity by creating echo chambers and reinforcing existing biases. This is not a deliberate attempt to divide, but an unintended outcome of optimizing for metrics such as clicks and time spent on platforms.

What: AI algorithms are contributing to increased political polarization.
Where: Primarily online, across social media platforms and search engines.
When: The effect has become increasingly pronounced in the last decade, coinciding with the widespread adoption of AI-driven personalization.
Why it matters: Increased polarization undermines democratic discourse and can lead to political instability.

As social media platforms become ever more ubiquitous, policymakers are grappling with how to counter their effects on political attitudes and electoral outcomes, on addiction and mental health, and on misinformation and toxic content. This column suggests practical ways that academic researchers can help guide the design of government regulations aiming to address these issues. The authors explain how to run experiments that use social media platforms to recruit subjects, how to harness platform features and technologies to collect data and generate variation, and what limitations to consider when conducting...

Social media platforms have become ubiquitous in modern economies.
As of 2023, there were more than five billion active social media users worldwide, representing over 60% of the world population. In the US, the average user spent 10.5% of their lives on these services (Kemp 2024). Partially due to the increasing share of time that users spend on social media, policymakers have raised concerns that these platforms can influence political attitudes and electoral outcomes (Fujiwara et al. 2020), lead to significant mental health and addiction challenges (Braghieri et al. 2022), and expose consumers to misinformation and toxic content (Jiménez-Durán et al. 2022).
In addition, the dominant social media platforms have considerable market power; as such, it is not clear that market competition can help resolve these policy concerns. Regulators in the EU have implemented several policies to deal with these issues – such as the Digital Markets Act (DMA), the General Data Protection Regulation (GDPR), and the Digital Services Act (DSA) –... How can the research community provide evidence to help guide the design of such regulations? One option is to empirically evaluate policies after they have been implemented – as has been the case for the EU's GDPR and DMA, Apple's ATT, and the German NetzDG Law – which can... 2022, Aridor et al. 2024, Johnson 2024, Pape et al. 2025). This provides policymakers with meaningful evidence only after years of implementation, and only for policies that were actually implemented, not for counterfactual policies that were considered. Another option is to have platforms explicitly conduct experiments simulating the effects of proposed policy interventions (Guess et al. 2023, Nyhan et al. 2023, Wernerfelt et al. 2025).
This option comes with its own set of challenges: it gives platforms outsized influence over the types of questions and interventions that can be studied, since the platforms are not impartial agents (... et al. 2023a, b). In a forthcoming chapter in the Handbook of Experimental Methods in the Social Sciences (Aridor et al. 2025), we provide a practical guide to a third option that exploits how third-party technologies and platform features can be used for researcher-generated experimental variation. This approach combines the best of the two aforementioned options: it is accessible to researchers without requiring explicit platform cooperation, and it allows counterfactual policies to be evaluated before a chosen policy is implemented. Our paper provides detailed documentation for running such experiments: from using social media platforms to recruit experimental subjects, to using a combination of platform features and technologies such as Chrome...
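To make this concrete, below is a minimal sketch (not the chapter's actual implementation) of how a browser extension can generate researcher-controlled experimental variation without platform cooperation. It is written as a TypeScript content script for a Chrome extension; the post selector, the "hide sponsored posts" intervention, and the 50/50 random assignment are illustrative assumptions only.

```typescript
// content-script.ts -- a minimal sketch of researcher-generated variation via a
// browser extension. The selector and intervention below are hypothetical
// placeholders, not the design described in Aridor et al. (2025).

const POST_SELECTOR = "[data-testid='feed-post']"; // hypothetical feed-post selector

// Draw and persist a random treatment assignment once per participant.
async function getTreatmentArm(): Promise<"treatment" | "control"> {
  const stored = await chrome.storage.local.get("arm");
  if (stored.arm === "treatment" || stored.arm === "control") return stored.arm;
  const arm = Math.random() < 0.5 ? "treatment" : "control";
  await chrome.storage.local.set({ arm });
  return arm;
}

// Example intervention: hide posts labelled as sponsored for the treatment group.
function applyIntervention(root: ParentNode): void {
  root.querySelectorAll<HTMLElement>(POST_SELECTOR).forEach((post) => {
    if (post.textContent?.includes("Sponsored")) {
      post.style.display = "none";
    }
  });
}

async function main(): Promise<void> {
  const arm = await getTreatmentArm();
  if (arm !== "treatment") return; // the control group sees the unmodified feed
  applyIntervention(document);
  // Feeds load posts dynamically, so re-apply the intervention as new nodes arrive.
  new MutationObserver(() => applyIntervention(document)).observe(document.body, {
    childList: true,
    subtree: true,
  });
}

void main();
```

Because the extension observes and can rewrite whatever the participant's browser renders, the same pattern extends to other interventions, such as reordering, annotating, or down-weighting content.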
Overall, this methodology serves as a powerful toolkit for studying policy issues not only on social media platforms, but also on platforms such as Amazon (Farronato et al. 2024), Google Search (Allcott et al. 2025), and YouTube (Aridor forthcoming). We document several experiments that we conducted and explain how they relate to policy challenges.

Two of our core faculty, Joshua Tucker and Jenny Allen, recently published a Perspective piece in Science in response to the recent article "Reranking partisan animosity in algorithmic social media feeds alters affective..."

Social media is an important source of political information, yet there is little external oversight of platforms' ever-changing algorithms and policies.
This opacity presents a major problem: Conducting a real-world experiment on the causal effects of platform features generally requires the collaboration of the platform being studied, which rarely happens, and even when it does,... On page 903 of this issue, Piccardi et al. report one possible solution to this challenge. The authors introduce a methodological paradigm for testing the effect of social media on partisan animosity without platform collaboration by reranking users' existing feeds using large language models (LLMs) and a browser extension. They find that changing the visibility of polarizing content can influence people's feelings about opposing partisans.
Science. 2025 Nov 27;390(6776):883-884. doi: 10.1126/science.aec7388. Epub 2025 Nov 27. PMID: 41308160.
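The reranking paradigm can be pictured with a short sketch in the same spirit. This is not Piccardi et al.'s implementation: the feed and post selectors are hypothetical, and the LLM classifier is abstracted behind a hypothetical scoring endpoint (SCORER_URL) assumed to return a polarization score between 0 and 1 for each post's text.

```typescript
// rerank.ts -- an illustrative sketch of LLM-assisted feed reranking from a
// browser extension. All selectors and the scoring endpoint are assumptions.

const FEED_SELECTOR = "[role='feed']";          // hypothetical feed container
const POST_SELECTOR = "[data-testid='post']";   // hypothetical post nodes
const SCORER_URL = "https://example.org/score"; // hypothetical LLM scoring service

// Ask the (assumed) LLM service how polarizing a post's text is, on a 0..1 scale.
async function scorePolarization(text: string): Promise<number> {
  const res = await fetch(SCORER_URL, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ text }),
  });
  const { score } = (await res.json()) as { score: number };
  return score;
}

// Reorder posts in place so the least polarizing content appears first,
// demoting (rather than removing) highly polarizing posts.
async function rerankFeed(): Promise<void> {
  const feed = document.querySelector(FEED_SELECTOR);
  if (!feed) return;
  const posts = Array.from(feed.querySelectorAll<HTMLElement>(POST_SELECTOR));
  const scored = await Promise.all(
    posts.map(async (post) => ({ post, score: await scorePolarization(post.innerText) }))
  );
  scored.sort((a, b) => a.score - b.score);
  scored.forEach(({ post }) => feed.appendChild(post)); // re-append in the new order
}

void rerankFeed();
```

Demoting rather than deleting posts changes the visibility of polarizing content while leaving it accessible, which is the kind of intervention whose effect on partisan feelings the study measures.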