What Interventions Can Be Used To Counter Misinformation Effectively

Bonisiwe Shabane

A high-level, evidence-informed guide to some of the major proposals for how democratic governments, platforms, and others can counter disinformation. The goal of Carnegie's Partnership for Countering Influence Operations (PCIO) is to foster evidence-based policymaking to counter threats in the information environment. Key roadblocks found in its work include the lack of transparency reporting to inform what data is available for research purposes, and the lack of rules guiding how data can be shared with researchers and for what... Carnegie's Information Environment Project is a multistakeholder effort to help policymakers understand the information environment, think through the impact of efforts to govern it, and identify promising interventions to foster democracy.

Disinformation is widely seen as a pressing challenge for democracies worldwide. Many policymakers are grasping for quick, effective ways to dissuade people from adopting and spreading false beliefs that degrade democratic discourse and can inspire violent or dangerous actions. Yet disinformation has proven difficult to define, understand, and measure, let alone address. Received 2023 May 26; Accepted 2024 Apr 10; Issue date 2024.

Current interventions to combat misinformation, including fact-checking, media literacy tips and media coverage of misinformation, may have unintended consequences for democracy. We propose that these interventions may increase scepticism towards all information, including accurate information. Across three online survey experiments in three diverse countries (the United States, Poland and Hong Kong; total n = 6,127), we tested the negative spillover effects of existing strategies and compared them with three... We examined how exposure to fact-checking, media literacy tips and media coverage of misinformation affects individuals' perception of both factual and false information, as well as their trust in key democratic institutions.

Our results show that while all interventions successfully reduce belief in false information, they also damage the credibility of factual information. This highlights the need for improved strategies that minimize the harms and maximize the benefits of interventions against misinformation. The study reveals that current interventions against misinformation can erode belief in accurate information; the authors argue that future strategies should shift their focus from only fighting falsehoods to also nurturing trust in reliable news.

The Persistent Threat of Misinformation: A Deep Dive into Strategies for Mitigation

In an era defined by instantaneous information dissemination, the proliferation of misinformation poses a significant threat to individual well-being, public health, and the very fabric of democratic societies. The ease with which false or misleading information can be created and shared across digital platforms calls for a multi-pronged response involving individuals, communities, tech companies, educational institutions, and governments. This article explores the multifaceted nature of misinformation and delves into specific strategies for mitigating its harmful effects.

Understanding the Landscape of Misinformation

Misinformation, often confused with disinformation (which is intentionally deceptive), encompasses false or inaccurate information regardless of the intent behind its creation or dissemination. Its rapid spread is fueled by several factors, including the algorithmic amplification of sensational content on social media, the erosion of trust in traditional media outlets, and the increasing sophistication of techniques...

These factors contribute to echo chambers and filter bubbles, where individuals are primarily exposed to information that confirms their pre-existing beliefs, further reinforcing the acceptance of misinformation.

Empowering Individuals: Critical Thinking and Media Literacy

AI makes it easier to create disinformation, including false or decontextualized content, and to spread it quickly through existing channels. In an information ecosystem where misinformation circulates faster than fact-checkers can respond, increasingly precise and efficient tools are needed to verify content, detect hoaxes, and understand how false narratives spread. The following list brings together five tools that media outlets and fact-checking organizations use for tasks ranging from tracking disinformation and analyzing its dissemination patterns to recovering deleted content and analyzing audiovisual material.

Fact Check Explorer allows users to insert a phrase, piece of data, or link to check whether someone has already verified it. Google has developed an ecosystem of fact-checking tools, some for fact-checkers specifically and others for the general public. The flagship tool is Fact Check Explorer, a specialized search engine that compiles claim reviews from multiple fact-checking organizations worldwide, including Chequeado (Argentina), Bolivia Verifica (Bolivia), El Sabueso (Mexico) and Cotejo.info (Venezuela).
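The claim-review data behind Fact Check Explorer is also exposed programmatically through Google's Fact Check Tools API (`claims:search` endpoint). As a minimal sketch of that workflow: the snippet below builds a search request URL for a claim; the query text and API key are placeholders, and actually fetching results requires a valid key from Google Cloud.

```python
# Sketch: querying the Fact Check Tools API behind Fact Check Explorer.
# The API key below is a placeholder; obtain a real one via Google Cloud.
from urllib.parse import urlencode

BASE_URL = "https://factchecktools.googleapis.com/v1alpha1/claims:search"

def build_claim_search_url(query: str, api_key: str, language: str = "en") -> str:
    """Build a claims:search request URL for a given claim text."""
    params = urlencode({
        "query": query,         # the claim or phrase to look up
        "languageCode": language,
        "key": api_key,
    })
    return f"{BASE_URL}?{params}"

# Example: check whether a claim has already been reviewed.
url = build_claim_search_url("explosion near the Pentagon", "YOUR_API_KEY")
# Fetching this URL (e.g. with urllib.request.urlopen) returns JSON whose
# `claims` list carries claimReview entries with publisher and textualRating.
```

The response aggregates reviews from the same organizations listed above, so a newsroom can script a first-pass check before committing fact-checker time to a claim.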

A fake photo of an explosion near the Pentagon once rattled the stock market. A tearful video of a frightened young "Ukrainian conscript" went viral until it was exposed as staged. We may be approaching a "synthetic media tipping point", where AI-generated images and videos become so realistic that traditional markers of authenticity, such as visual flaws, rapidly disappear. In 2025, 70% of people report struggling to trust online information, and 64% fear that AI-generated content could influence elections. We are entering an era where seeing is no longer believing.

Still, many open questions about inoculation research in the context of misinformation remain, including the lack of cross-cultural research, conceptual confusion around the term "prebunking", and potentially undesirable side effects of the inoculation treatment, as... The overwhelming majority of inoculation and misinformation research (Badrinathan & Chauchard, 2023) has been conducted in predominantly Western, Educated, Industrialized, Rich, and Democratic (WEIRD) societies. In the context of misinformation, cross-cultural work seems especially prudent given that citizens in other parts of the world likely have very different relationships and histories with (state-controlled) media, propaganda, and censorship, suggesting there could...

Posted November 23, 2025 | Reviewed by Gary Drevitch How many Trump administration executive orders, policy announcements, or social media blasts have you heard about this week? Can you even begin to name them all? U.S. President Donald Trump and his administration have been said to engage in a strategy called “flooding the zone”—releasing a great deal of information with the goal of distracting the media and the public. (Almost certainly, they are not the only politicians to do this.

For example, Boris Johnson’s London mayoral campaigns were said to use the “dead cat strategy,” shocking the public with an announcement to distract them from news they preferred they not see.) The U.S. political application of this term, which was borrowed from the name of a tactic used in American football, can be traced to former Trump strategist Steve Bannon, who said, “All we have to do... They'll bite on one, and we'll get all of our stuff done, bang, bang, bang.” Flooding the zone might work as a political strategy, but it takes a psychological toll on media consumers. For example, polls show that 65% of U.S. adults have felt the need to reduce their media consumption because of information overload and ensuing feelings of fatigue.

Moreover, experimental research has found that a habit of closely following political news is a chronic stressor, often leading to negative emotions (Ford et al., 2023). But information overload doesn't just undermine our psychological well-being; it can also undermine democracy. In a recent article, "Critical ignoring when information abundance is detrimental to democracy," psychology researchers Stephan Lewandowsky and Ralph Hertwig (2025) outlined why information overload harms democracy and provided a strategy on how we... First, the authors share findings that information abundance fuels misinformation because our ability to differentiate truth from falsehood decreases when we are overwhelmed and in a hurry. Indeed, the research shows that overwhelmed people are more likely to share "things that are partially or completely untrue." Why? Essentially, we are more likely to share splashy findings, which are, in turn, more likely to be misinformation.

As the researchers explain, this information abundance harms democracy via several mechanisms, ranging from "triggering misinformation cascades to generating coping strategies that result in reduced political accountability."

(Image: Kwaku Krobea Asante presents on fact-checking during Ghana's recent election at a GIJC25 panel on fighting electoral disinformation campaigns.) Two AI-powered news anchors introduced themselves to reporters at the 14th Global Investigative Journalism Conference (GIJC25) in Malaysia. They were created not to spread disinformation, but to fight it. "They (our editors) wanted to leverage the trust that the media outlets were willing to place in two avatars to disseminate the content," one of the AI anchors said in a Venezuelan accent, explaining...

Created by CONNECTAS, a Latin American investigative journalism network, AI-generated avatars La Chama and El Pana presented news to combat President Nicolás Maduro’s media crackdown during the 2024 presidential election. It was a response to the extreme hostility and censorship that journalists faced in the country. And it was one of the three projects showcased at the GIJC25 panel. At a time when AI-powered disinformation is spreading like wildfire on the internet, Carlos Eduardo Huertas, director of CONNECTAS, was showing ways to use the same technology to combat it. And the stakes in this battle are very real. The World Economic Forum’s “Global Risks Report 2025” identifies misinformation and disinformation as the top short-term risk over the next two years.
