Combating Misinformation: Strategies for Resistance and Mitigation
The Contagion of Falsehoods: How Misinformation Spreads Like a Virus and What We Can Do About It

In today’s interconnected world, where information flows at an unprecedented rate, the spread of misinformation poses a significant threat to individuals and society as a whole. Shaon Lahiri, an assistant professor in the College of Charleston Department of Public Health Sciences and Administration, likens the spread of misinformation to a viral contagion that exploits social connections and emotional vulnerabilities to propagate. Just as a virus can infect and spread through a population, misinformation can insidiously infiltrate our minds, shaping our beliefs, behaviors, and ultimately, the very fabric of our social reality. Lahiri’s research delves into the intricate dynamics of social influence, exploring how individual behaviors are shaped by prevailing social norms and network processes. His work draws striking parallels between the spread of misinformation and the phenomenon of mass psychogenic illness, in which physical symptoms spread through social networks in the absence of an organic cause.
Although distinct phenomena, both misinformation and mass psychogenic illness highlight the powerful influence of social connections in disseminating beliefs and behaviors, whether accurate or unfounded. This interconnectedness creates pathways for rapid transmission, allowing ideas, both true and false, to quickly permeate communities. The analogy to viral transmission provides a powerful framework for understanding the mechanisms by which misinformation spreads. Just as viruses exploit vulnerabilities in our immune systems, misinformation preys on our cognitive biases, emotional vulnerabilities, and the inherent trust we place in our social networks. Lahiri emphasizes that our susceptibility to misinformation isn’t merely a product of individual gullibility; rather, it’s a consequence of the complex interplay between individual psychology and the social environment. We are wired to trust information shared by those within our social circles, making us particularly vulnerable to misinformation propagated by friends, family, and trusted influencers.
The insidious nature of misinformation lies in its ability to mimic truth, often cloaking fabricated narratives in the guise of credible sources or emotionally charged appeals. Sensationalized headlines, manipulated images, and selective presentation of facts can all contribute to the spread of false information. This “infodemic” poses a serious threat to public health, influencing everything from vaccine hesitancy and harmful health practices to political polarization and social unrest. The consequences of misinformation can be far-reaching, eroding trust in institutions, fueling conflict, and hindering our ability to address critical societal challenges. It is no secret that the internet, a hub for innovation and connection, has also become a fertile ground for misinformation. From cleverly disguised clickbait to weaponized social media campaigns, untruths spread faster than a virtual wildfire.
Every day, countless misleading posts flood our social feeds, challenging the fabric of democracy and directly affecting fields and industries, most notably public health. Fighting misinformation is not an easy task: ever-increasing polarization and the growing sophistication of bad actors make it a formidable foe.1,2 These bad actors within the disinformation network range from state-sponsored troll farms spreading propaganda to political operatives manipulating public opinion. It’s no surprise that the rise of generative AI has escalated the spread of disinformation and propaganda as never before.4 But all hope is not lost. We must arm ourselves—not with pitchforks and torches, but with a far more potent weapon: evidence-based interventions.
A study by a global team of 30 misinformation researchers led by Dr. Anastasia Kozyreva offers a much-needed "toolbox" of strategies empowering individuals to cut through the noise.5 Here's your guide to understanding and utilizing these tools effectively, categorized by their primary aim: influencing behaviors (nudges), boosting competencies (boosts), or directly targeting beliefs (refutation).

Targeted, local engagement with communities, coupled with civic education, is an effective strategy for strengthening information ecosystems, alongside national and international efforts focused on laws and regulation.

By Gabriel Marmentini & Jeanine Abrams McLean, Sep. 16, 2024
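The three-way categorization above (nudges, boosts, refutation) can be sketched as a small lookup structure. This is an illustrative sketch only: the category names and aims come from the text, but the example interventions listed are commonly discussed instances added here for illustration, not quoted from the Kozyreva et al. toolbox itself.

```python
# Illustrative sketch of the toolbox's three intervention categories.
# Category names and aims follow the text; the example interventions
# are hypothetical illustrations, not an authoritative list.

TOOLBOX = {
    "nudge": {
        "aim": "influence behaviors",
        "examples": ["accuracy prompts", "friction before sharing"],
    },
    "boost": {
        "aim": "build lasting competencies",
        "examples": ["lateral reading", "media literacy training"],
    },
    "refutation": {
        "aim": "directly target false beliefs",
        "examples": ["debunking", "prebunking / inoculation"],
    },
}

def categories_for(aim_keyword: str) -> list[str]:
    """Return the category names whose stated aim mentions a keyword."""
    return [
        name
        for name, info in TOOLBOX.items()
        if aim_keyword.lower() in info["aim"]
    ]
```

For example, `categories_for("behaviors")` would return only `["nudge"]`, reflecting that nudges act on behavior while boosts and refutations act on skills and beliefs.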
Over the last decade, democracies around the world have seen the steady decay of civic trust, the rise of hyperpolarization, the growth of cooperation among authoritarian powers, and a host of ever-evolving threats. Misinformation (misleading information) and disinformation (deliberately false information) not only impede the informed decision-making of voters but also undermine and erode trust in the media, government, and electoral processes. As countries navigate these challenges, grassroots strategies for combating the spread and influence of harmful and inaccurate information have proven to be a vital and effective complement to policy strategies and interventions. Effective grassroots organizations are trusted voices uniquely positioned to identify misinformation that impacts the communities they serve, to implement strategies to combat it, and to build communities that are resilient to future misinformation. These strategies include: (1) using education (civic, democratic, and media) to strengthen information ecosystems; (2) developing long-term civil society coalitions for fact-checking and community building; and (3) conducting localized community engagement and amplification activities. By leveraging direct connections with communities, grassroots efforts complement and enhance the impact and efficacy of policy-based tactics designed to strengthen information ecosystems, both during and between election cycles.
A high-level, evidence-informed guide to some of the major proposals for how democratic governments, platforms, and others can counter disinformation. The Technology and International Affairs Program develops insights to address the governance challenges and large-scale risks of new technologies. Our experts identify actionable best practices and incentives for industry and government leaders on artificial intelligence, cyber threats, cloud security, countering influence operations, reducing the risks of biotechnologies, and ensuring global digital inclusion. The goal of the Partnership for Countering Influence Operations (PCIO) is to foster evidence-based policymaking to counter threats in the information environment. Key roadblocks identified in our work include the lack of transparency reporting to show what data is available for research purposes, and the lack of rules guiding how data can be shared with researchers and for what purposes. Carnegie’s Information Environment Project is a multistakeholder effort to help policymakers understand the information environment, think through the impact of efforts to govern it, and identify promising interventions to foster democracy.
Disinformation is widely seen as a pressing challenge for democracies worldwide. Many policymakers are grasping for quick, effective ways to dissuade people from adopting and spreading false beliefs that degrade democratic discourse and can inspire violent or dangerous actions. Yet disinformation has proven difficult to define, understand, and measure, let alone address. Insights from PR Daily’s Media Relations Conference. Patrice Smith is a lecturer in the Department of Journalism & Public Relations at California State University, Long Beach.
As lies and rumors spread across the internet, the terms “misinformation” and “disinformation” have become part of communicators’ lexicon. Although many professionals and consumers attribute the issue to advances in technology such as social media and AI tools, the Public Relations Society of America (PRSA) examines it in depth in its special report, “Tackling Misinformation: The Communications Industry Unites.” The report distinguishes among the terms misinformation, disinformation, and malinformation. As public relations and communications professionals, we must uphold high ethical standards to combat misinformation effectively by adhering to the PRSA Code of Ethics.

Author affiliations: 1. Division of Human Nutrition Unit, Department of Food and Drugs, University of Parma, Parma, Italy; 2. Laboratory for Industrial and Applied Mathematics, Department of Mathematics and Statistics, York University, Toronto, ON, Canada; 3. Department of Neuroscience, Rehabilitation, Ophthalmology, Genetics and Maternal/Child Sciences, University of Genoa, Genoa, Italy

Misinformation represents an evolutionary paradox: despite its harmful impact on society, it persists and evolves, thriving in the information-rich environment of the digital age. This paradox challenges the conventional expectation that detrimental entities should diminish over time. The persistence of misinformation, despite advancements in fact-checking and verification tools, suggests that it possesses adaptive qualities that enable it to survive and propagate. This paper explores how misinformation, as a blend of truth and fiction, continues to resonate with audiences. The role of narratives in human history, particularly in the evolution of Homo narrans, underscores the enduring influence of storytelling on cultural and social cohesion.
Despite individuals’ increasing ability to verify the accuracy of sources, misinformation remains a significant challenge, often spreading rapidly through digital platforms. Current behavioral research tends to treat pieces of misinformation as completely irrational, static, finite entities that can be definitively debunked, overlooking their dynamic and evolving nature. This approach limits our understanding of the behavioral and societal factors driving the transformation of misinformation over time. The persistence of misinformation can be attributed to several factors, including its role in fostering social cohesion, its perceived short-term benefits, and its use in strategic deception. Techniques such as extrapolation, intrapolation, deformation, cherry-picking, and fabrication contribute to the production and spread of misinformation. Understanding these processes and the evolutionary advantages they confer is crucial for developing effective strategies to counter misinformation.
By promoting transparency, critical thinking, and accurate information, society can begin to address the root causes of misinformation and create a more resilient information environment. Some stories are unbelievable, yet they can still convince people because, “in substance,” they are true, even though only some details are real and the rest is false. This is the conclusion of “Emma Zunz,” a short story by Argentine writer Jorge Luis Borges (1899-1986) [1], which illustrates how truth and falsehood can blend (“true lies” and “false truths”) [2]. “Emma Zunz” demonstrates how narratives, even when not entirely factual or even completely fabricated, can be perceived as “essentially true,” remaining powerful, persuasive, and impactful as long as they resonate with the audience [3].

The framework to combat online misinformation provides a strategic roadmap for addressing the growing challenge of false and misleading information in an interconnected world. Misinformation threatens public health, democracy, social harmony, and economic stability, requiring proactive and coordinated efforts.
The framework integrates global imperatives with localized solutions to build a resilient information ecosystem that fosters trust and informed decision-making. While achieving zero misinformation is the ultimate goal, challenges such as evolving tactics, digital literacy gaps, and content regulation complexities necessitate continuous adaptation and collaboration. At its core, the framework is built on seven pillars, including (1) clear definitions and scope, (2) cultural context and sensitivity, (3) legal framework and ethical balance, (4) education and empowerment, and (5) technological innovation. These pillars reflect the dynamic interplay between global standards and local adaptations, providing a structured approach to countering misinformation effectively. Each pillar is underpinned by carefully defined dimensions that offer actionable guidance tailored to specific challenges and unique opportunities. The dimensions delve deeper into the pillars, addressing critical components such as digital literacy, technological advancements, legal safeguards, and cultural adaptations.
To operationalize the framework, a set of concrete actions is proposed, including fact-checking tools, education initiatives, public-private partnerships, and rapid response mechanisms. By leveraging diverse perspectives and measurable benchmarks, the framework equips stakeholders with an adaptive toolkit to combat misinformation and strengthen the integrity of the digital information landscape.

A correction to this article was published on 31 May 2025. This article addresses the critical issue of societal resilience in the face of disinformation, particularly in highly digitized democratic societies.
Recognizing the escalating impact of disinformation as a significant threat to societal security, the study conducts a scoping review of the literature from 2018 to 2022 to explore current understanding of, and approaches to, societal resilience against disinformation. The core contribution of the article is the development of a preliminary typological framework that addresses key elements and issue areas relevant to societal resilience to disinformation. This framework spans multiple dimensions, including legal/regulatory, educational, political/governance, psychological/social-psychological, and technological domains. By synthesizing existing knowledge and filling identified gaps, the framework aims to serve as a foundational tool for empirical analyses and the enhancement of resilience strategies. One of the innovative aspects of the proposed framework is its potential to be transformed into a computable and customizable tool. This tool would measure the maturity level of various countermeasures against disinformation, thereby providing a practical methodology for planning and implementing effective democratic responses to disinformation.
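The article describes the computable maturity tool only conceptually. As a minimal sketch of what such a tool might look like, the following scores countermeasure maturity across the five dimensions named in the text. The 0–5 rating scale, the simple averaging rule, and all function names are assumptions for illustration; the article does not specify a scoring methodology.

```python
# Hypothetical maturity scoring across the framework's five dimensions
# (dimension names from the text; the 0-5 scale and averaging rule are
# illustrative assumptions, not the article's methodology).

DIMENSIONS = (
    "legal_regulatory",
    "educational",
    "political_governance",
    "psychological_social",
    "technological",
)

def maturity_score(ratings: dict[str, int]) -> float:
    """Average 0-5 countermeasure ratings across all five dimensions.

    Dimensions not rated default to 0 (no countermeasures in place),
    so partial coverage lowers the overall score.
    """
    for dim, value in ratings.items():
        if dim not in DIMENSIONS:
            raise ValueError(f"unknown dimension: {dim}")
        if not 0 <= value <= 5:
            raise ValueError(f"rating out of range: {value}")
    return sum(ratings.get(d, 0) for d in DIMENSIONS) / len(DIMENSIONS)
```

Under these assumptions, a country with strong educational and technological programs (5 each) but nothing in the other three domains would score 2.0 of 5, making uneven coverage visible at a glance.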
The article emphasizes the importance of this framework as both a conceptual and practical guide. It offers valuable insights for a wide range of civil society actors, including policymakers, educators, and technologists, in their efforts to protect information integrity and bolster societal resilience. By laying the groundwork for a more comprehensive understanding of societal resilience to disinformation, the article contributes to the broader discourse on information protection and provides actionable guidance for addressing the evolving challenges posed by disinformation. In highly digitized democratic societies, there are growing concerns about the impact of disinformation. European countries have faced significant disinformation challenges related to elections, fundamental democratic values, pandemics, and migration, among other issues.
This has prompted the European Commission (2018a) to outline four pillars in its action plan on tackling online disinformation: improving institutional capabilities, fostering coordinated responses, engaging the private sector, and raising awareness. The Digital Services Act (European Union 2022) has binding regulatory powers to address large social media platforms, for instance if they are deemed to be promoting and disseminating disinformation. However, the effectiveness of these measures remains uncertain. The Global Risks Report 2024 by the World Economic Forum (2024) clearly illustrates the stakes, ranking disinformation as the most severe short-term (two-year) global risk and placing societal polarization third.