Misunderstanding the Harms of Online Misinformation

Bonisiwe Shabane

Ceren Budak, Brendan Nyhan, David M. Rothschild, Emily Thorson & Duncan J. Watts, Nature volume 630, pages 45–53 (2024)

The controversy over online misinformation and social media has opened a gap between public discourse and scientific research. Public intellectuals and journalists frequently make sweeping claims about the effects of exposure to false content online that are inconsistent with much of the current empirical evidence. Here we identify three common misperceptions: that average exposure to problematic content is high, that algorithms are largely responsible for this exposure and that social media is a primary cause of broader social problems such as polarization. In our review of behavioural science research on online misinformation, we document a pattern of low exposure to false and inflammatory content that is concentrated among a narrow fringe with strong motivations to seek out such information. In response, we recommend holding platforms accountable for facilitating exposure to false and extreme content in the tails of the distribution, where consumption is highest and the risk of real-world harm is greatest. We also call for increased platform transparency, including collaborations with outside researchers, to better evaluate the effects of online misinformation and the most effective responses to it. Taking these steps is especially important outside the USA and Western Europe, where research and data are scant and harms may be more severe.

In 2006, Facebook launched its News Feed feature, sparking seemingly endless contentious public discourse on the power of the “social media algorithm” in shaping what people see online.

Nearly two decades and many recommendation-algorithm tweaks later, this discourse continues, now laser-focused on whether social media recommendation algorithms are primarily responsible for exposure to online misinformation and extremist content. Researchers at the Computational Social Science Lab (CSSLab) at the University of Pennsylvania, led by Duncan Watts, Stevens University Professor in Computer and Information Science (CIS), study Americans’ news consumption. In a new article in Nature, Watts, along with David Rothschild of Microsoft Research (Wharton Ph.D. ‘11 and PI in the CSSLab), Ceren Budak of the University of Michigan, Brendan Nyhan of Dartmouth College and Annenberg alumna Emily Thorson (Ph.D. ’13) of Syracuse University, reviews years of behavioral science research on exposure to false and radical content online and finds that exposure to harmful and false information on social media is minimal for all but a narrow fringe of users who actively seek it out.

“The research shows that only a small fraction of people are exposed to false and radical content online,” says Rothschild, “and that it’s personal preferences, not algorithms, that lead people to this content. The people who are exposed to false and radical content are those who seek it out.”
