Misunderstanding the harms of online misinformation
Nature volume 630, pages 45–53 (2024). The controversy over online misinformation and social media has opened a gap between public discourse and scientific research. Public intellectuals and journalists frequently make sweeping claims about the effects of exposure to false content online that are inconsistent with much of the current empirical evidence. Here we identify three common misperceptions: that average exposure to problematic content is high, that algorithms are largely responsible for this exposure and that social media is a primary cause of broader social problems. In our review of behavioural science research on online misinformation, we document a pattern of low exposure to false and inflammatory content that is concentrated among a narrow fringe with strong motivations to seek it out. In response, we recommend holding platforms accountable for facilitating exposure to false and extreme content in the tails of the distribution, where consumption is highest and the risk of real-world harm is greatest.
We also call for increased platform transparency, including collaborations with outside researchers, to better evaluate the effects of online misinformation and the most effective responses to it. Taking these steps is especially important outside the USA and Western Europe, where research and data are scant and harms may be more severe.

In 2006, Facebook launched its News Feed feature, sparking seemingly endless contentious public discourse on the power of the "social media algorithm" in shaping what people see online.
Nearly two decades and many recommendation-algorithm tweaks later, this discourse continues, now focused on whether social media recommendation algorithms are primarily responsible for exposure to online misinformation and extremist content. Researchers at the Computational Social Science Lab (CSSLab) at the University of Pennsylvania, led by Duncan Watts, Stevens University Professor in Computer and Information Science (CIS), study Americans' news consumption. In a new article in Nature, Watts, along with David Rothschild of Microsoft Research (Wharton Ph.D. '11 and PI in the CSSLab), Ceren Budak of the University of Michigan, Brendan Nyhan of Dartmouth College and Annenberg alumnus Emily Thorson (Ph.D. '13) of Syracuse University, reviews years of behavioral science research on exposure to false and radical content online and finds that such exposure is minimal for all but a small minority of users. "The research shows that only a small fraction of people are exposed to false and radical content online," says Rothschild, "and that it's personal preferences, not algorithms, that lead people to this content.
The people who are exposed to false and radical content are those who seek it out.”