The Misinformation Crisis Lifecycle

Bonisiwe Shabane

The Looming Threat of AI-Powered Misinformation and Disinformation

The digital age has ushered in an era of unprecedented information access, but this accessibility has also opened the floodgates to a torrent of misinformation and disinformation, posing a significant threat to individuals and society alike. The World Economic Forum’s Global Risk Perception Survey highlights this concern, ranking misinformation and disinformation as the top risk facing people over the next two years. This escalating crisis demands immediate attention from communication professionals, who must now navigate an increasingly complex and treacherous information landscape.

Dave Fleet, managing director and head of global digital crisis at Edelman, addressed this challenge at Ragan’s AI Horizons conference, emphasizing the unique nature of misinformation-driven crises. Unlike typical corporate crises, which often follow a predictable trajectory of emergence, escalation, peak, and resolution, misinformation crises can be more insidious and persistent.

The deliberate spread of false or misleading information can rapidly proliferate online, fueled by social media algorithms and the echo chambers they create. This can lead to protracted periods of uncertainty, confusion, and erosion of public trust, making effective crisis management significantly more challenging. A key factor contributing to the current surge in misinformation and disinformation is the increasing use of artificial intelligence (AI). While humans have always been capable of spreading falsehoods, AI tools have amplified this capability to an alarming degree. AI can automate the creation and dissemination of misinformation at scale, generating convincing fake text, images, and videos that can easily deceive unsuspecting audiences. This automated process bypasses the traditional constraints of human-generated misinformation, allowing for the rapid and widespread propagation of false narratives.

The lifecycle of a misinformation crisis, as outlined by Fleet, typically involves several distinct stages. It begins with the creation of a false narrative, often designed to exploit existing societal anxieties or biases. This narrative is then amplified through various channels, including social media, online forums, and even mainstream media outlets. As the misinformation gains traction, it begins to influence public opinion and behavior, potentially leading to real-world consequences such as political polarization, social unrest, and even violence. Finally, the crisis may reach a point of critical mass, where the misinformation becomes so pervasive that it is difficult to contain or counteract.

“We’re facing a real crisis now as communicators in our own information ecosystem. How can we actually convey truth to people when truth is often in the eye of the beholder nowadays?” said Fleet in a session at the conference.

By Dan Stoneking and Ed Conley for Homeland Security

In today’s tumultuous landscape, the rise of misinformation and disinformation during disasters poses a significant challenge to effective crisis communication. The recent response to Hurricane Helene has starkly illustrated how false narratives can exploit the chaos, undermining trust in relief agencies and governmental institutions. As crisis communicators, our role is to deliver timely and accurate information and confront misinformation head-on. We are integral to the process of combating false narratives that jeopardize recovery efforts. The advent of social media has transformed the dissemination of information, allowing misinformation to spread at an alarming rate. To navigate this landscape, crisis communicators must adopt forward-thinking strategies that are not merely reactive but proactive.

These strategies empower us to be prepared and in control. Here are key tactics informed by historical and contemporary examples, including Taiwan’s 2024 presidential elections, the 2023 Maui wildfires, the U.S. Coast Guard’s Deepwater Horizon response, and lessons from FEMA’s rapid response team model. The first step in combating misinformation is anticipating its occurrence. Crisis communicators should recognize that misinformation always arises in disasters, and always has throughout history; it is the rule, not the exception. By anticipating its emergence, we can prepare forward-leaning strategies, including plans for dedicated teams and for dissemination, monitoring, and response protocols.

A dedicated misinformation monitoring team is essential to identify and counter false narratives effectively. This team should comprise experts in social media analysis, crisis communication, and community engagement, and it should be responsible for continuously monitoring various platforms for emerging misinformation trends. For example, Taiwan’s proactive measures during its 2024 elections included a dedicated real-time monitoring team for identifying and debunking false claims (Taiwan Digital Diplomacy Association, 2024).


Once misinformation is detected, it is crucial to analyze its origins, types, and potential impacts on the community. By triaging the most misleading narratives, we can prioritize responses. Not all misinformation poses an equal risk; focusing resources on narratives that could lead to significant public harm is essential.

Combating the Infodemic: How Misinformation Hinders Crisis Response and What Communicators Can Do

In today’s interconnected world, crises often unfold simultaneously in the physical realm and the digital landscape. While emergency response teams grapple with the immediate dangers of natural disasters, pandemics, or other emergencies, a parallel battle is waged against the rapid spread of misinformation and disinformation.

This "infodemic," as it has been termed, can exacerbate the real-world consequences of a crisis, impeding relief efforts and sowing confusion among affected populations. Justin Ángel Knighten, former associate administrator in the Office of External Affairs at FEMA, brings firsthand experience to this critical issue, having witnessed the detrimental impact of misinformation during responses to events like Hurricanes... He emphasizes the urgent need for proactive communication strategies to counter the spread of false information and ensure that accurate, life-saving guidance reaches those who need it most. Knighten’s experiences illustrate the challenges faced by emergency management agencies in the digital age. During the hurricanes, FEMA struggled to disseminate vital evacuation information amidst a torrent of false and misleading content circulating online. The speed and scale of misinformation dissemination, often fueled by automated bots and malicious actors, overwhelmed traditional communication channels.

The resulting confusion and distrust hampered evacuation efforts and delayed the delivery of essential aid. Knighten stresses that the proliferation of AI-driven content poses a significant threat, enabling the rapid creation and dissemination of fabricated stories, manipulated images, and deceptive narratives. The ability of these AI tools to mimic human communication makes it increasingly difficult to distinguish between credible sources and malicious actors. One crucial lesson learned from these experiences is the necessity of proactive communication strategies. Waiting for misinformation to spread before responding puts organizations on the defensive and makes it harder to regain control of the narrative. Instead, communicators need to anticipate potential sources of misinformation and develop preemptive strategies to address them.

This includes building strong relationships with trusted media outlets, engaging with communities through social media platforms, and establishing clear channels for disseminating verified information. Proactive communication also involves educating the public about how to identify misinformation and encouraging critical thinking skills. By empowering individuals to discern fact from fiction, we can collectively build resilience against the spread of harmful narratives. Another crucial element in combating misinformation is effective monitoring. Organizations need to actively track online conversations, social media trends, and emerging narratives related to their area of expertise. This allows them to identify potential sources of misinformation early on and develop targeted responses.
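As a rough illustration of this kind of tracking, here is a minimal Python sketch of keyword- and hashtag-based monitoring. The post records, narrative watchlist, and hashtags below are hypothetical examples invented for illustration; a real deployment would pull live data from platform APIs or commercial social listening tools and use far richer matching than substring search.

```python
from collections import Counter
from dataclasses import dataclass

# Hypothetical post record; real social listening tools would supply
# richer metadata such as author, reach, timestamps, and engagement.
@dataclass
class Post:
    text: str
    hashtags: list

# Illustrative watchlist: narrative labels mapped to phrases worth flagging
# during a disaster response. These examples are invented for this sketch.
WATCHLIST = {
    "aid_diverted": ["aid is being diverted", "relief funds stolen"],
    "fake_evacuation": ["evacuation cancelled", "do not evacuate"],
}
TRACKED_TAGS = {"#HurricaneHelene", "#FEMA"}

def flag_posts(posts):
    """Count how often each watched narrative or tracked hashtag appears,
    so emerging trends surface before they reach critical mass."""
    counts = Counter()
    for post in posts:
        text = post.text.lower()
        for narrative, phrases in WATCHLIST.items():
            if any(phrase in text for phrase in phrases):
                counts[narrative] += 1
        for tag in post.hashtags:
            if tag in TRACKED_TAGS:
                counts[tag] += 1
    return counts

# Tiny usage example with fabricated posts.
posts = [
    Post("Heard that aid is being diverted from shelters", ["#HurricaneHelene"]),
    Post("Evacuation cancelled for zone B?? Source unclear", []),
]
print(flag_posts(posts))
```

Even a toy counter like this shows the design point: monitoring is a continuous triage feed, not a one-off search, and the output (which narratives are spiking) is what lets a team prioritize responses.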

Monitoring can involve using social listening tools, tracking relevant hashtags, and engaging with online communities. By staying attuned to the information landscape, organizations can anticipate potential crises and prepare effective communication strategies in advance.

Information pollution is affecting citizens’ capacity to make informed decisions. Disinformation, misinformation, and mal-information, together with the growth of hate speech and propaganda, especially online, are inciting social divisions and creating mistrust in public institutions. In the past decade, significant resources have been invested by international development partners in tackling this growing global phenomenon, which is also negatively affecting social cohesion in the region.

Through numerous examples of government-led and independent responses to information pollution, societies in the region are showing that they have recognized disinformation as a serious threat to their countries’ social, political... To raise awareness and understanding of information pollution as an important contributory factor to the growing security threats and development challenges in the Europe and Central Asia region, UNDP's Istanbul Regional Hub in collaboration...

- Political Narratives
- Hate Speech
- Gendered Disinformation

The United States National Security Strategy recognizes the need to combat misinformation and disinformation to employ integrated deterrence successfully. The Air Force Culture and Language Center addresses that priority through an educational video series on its Culture Guide app, focused on helping total force Airmen and the Department of Defense develop resilience to...

“Strategic competitors like Russia and China, as well as Violent Extremist Organizations and non-political disrupters, use misinformation and disinformation campaigns to recruit members to their cause, divide our society domestically, and create rifts between...” said Dr. Elizabeth Peifer, AFCLC’s Associate Professor of Regional and Cultural Studies (Europe). “We are less able to put up a strong defense if we are divided socially and if our alliances and partnerships are torn.” Dr. Peifer’s academic interests include radicalism and extremism, public memory and narrative, and European security issues. These diverse research areas converge on the problem of disinformation and its impact on military operations.

Her study of the factors that make disinformation effective led her to develop this series as a constructive, practical approach to the problem. The new four-part video series discusses vulnerabilities to misinformation and disinformation in the military setting, along with innovative tools and techniques for service members to detect, evaluate, and combat manipulative information and make informed decisions... Part one of the series provides an overview of the problem of disinformation and misinformation. Part two promotes greater self-awareness by analyzing aspects of cognitive behavior and social psychology that make individuals more susceptible to manipulation through disinformation. Part three helps viewers gain situational awareness with an understanding of the Internet and patterns of disinformation in the digital landscape. Finally, part four concludes the series with practical ways to protect against misinformation and disinformation, offering tools and techniques for evaluating online sources.
