Social Media Algorithms Warp How People Learn From Each Other
William Brady, Assistant Professor of Management and Organizations at Northwestern University, does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

People’s daily interactions with online algorithms affect how they learn from others, with negative consequences including social misperceptions, conflict and the spread of misinformation, my colleagues and I have found. People increasingly interact with others in social media environments where algorithms control the flow of social information they see. Algorithms determine, in part, which messages, which people and which ideas social media users see. On social media platforms, algorithms are mainly designed to amplify information that sustains engagement, meaning they keep people clicking on content and coming back to the platforms.
I’m a social psychologist, and my colleagues and I have found evidence suggesting that a side effect of this design is that algorithms amplify information people are strongly biased to learn from. We call this information “PRIME,” for prestigious, in-group, moral and emotional information. (This essay is reprinted with permission from The Conversation, an online publication covering the latest research.)
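To make the amplification mechanism concrete, here is a minimal Python sketch, not any platform’s actual ranker: a toy feed is ordered purely by a predicted-engagement score, and because the invented weights reward PRIME features (prestige, in-group membership, moral-emotional language) while ignoring accuracy, PRIME content rises to the top. All weights, fields and posts here are hypothetical.

```python
# Toy illustration (not any platform's real ranker): a feed sorted purely by
# predicted engagement amplifies PRIME content, because PRIME features
# correlate with engagement while accuracy plays no role in the score.
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    is_prestigious_author: bool = False  # e.g. high-follower or verified account
    is_ingroup: bool = False             # from the viewer's own community
    moral_emotional_words: int = 0       # count of moral-emotional language
    accuracy: float = 1.0                # shown for contrast; the ranker ignores it

def predicted_engagement(post: Post) -> float:
    """Hypothetical engagement score; the weights are invented for illustration."""
    score = 1.0
    score += 2.0 * post.is_prestigious_author   # P: prestige
    score += 1.5 * post.is_ingroup              # I: in-group
    score += 0.8 * post.moral_emotional_words   # M/E: moral-emotional language
    return score                                # note: accuracy contributes nothing

feed = [
    Post("Measured take on local policy", accuracy=0.9),
    Post("Outrage at the out-group!", is_ingroup=True,
         moral_emotional_words=4, accuracy=0.4),
    Post("Celebrity hot take", is_prestigious_author=True,
         moral_emotional_words=2, accuracy=0.5),
]

# The accurate but low-arousal post sinks to the bottom of the ranked feed.
for post in sorted(feed, key=predicted_engagement, reverse=True):
    print(f"{predicted_engagement(post):4.1f}  {post.text}")
```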
In prehistoric societies, humans tended to learn from members of their ingroup or from more prestigious individuals, because such information was more likely to be reliable and to result in group success. With the advent of diverse and complex modern communities, however, and especially on social media, these biases become less effective: a person we are connected to online is not necessarily trustworthy, and people can easily feign prestige on social media.

In a review published August 3 in the journal Trends in Cognitive Sciences, a group of social scientists describe how the functions of social media algorithms are misaligned with human social instincts meant to foster cooperation. “Several user surveys, both on Twitter and Facebook, now suggest most users are exhausted by the political content they see.
A lot of users are unhappy, and there’s a lot of reputational components that Twitter and Facebook must face when it comes to elections and the spread of misinformation,” says first author William Brady of Northwestern University. “We wanted to put out a systematic review that’s trying to help understand how human psychology and algorithms interact in ways that can have these consequences,” says Brady. “One of the things that this review brings to the table is a social learning perspective. As social psychologists, we’re constantly studying how we can learn from others. This framework is fundamentally important if we want to understand how algorithms influence our social interactions.”

Humans are biased to learn from others in a way that typically promotes cooperation and collective problem-solving, which is why they tend to learn more from individuals they perceive as part of their ingroup and from those they perceive as prestigious.
In addition, when learning biases were first evolving, morally and emotionally charged information was important to prioritize, as this information was more likely to be relevant to enforcing group norms and ensuring collective survival. In contrast, algorithms usually select information that boosts user engagement in order to increase advertising revenue. This means algorithms amplify the very information humans are biased to learn from, and they can oversaturate social media feeds with what the researchers call prestigious, ingroup, moral and emotional (PRIME) information, regardless of the content’s accuracy or representativeness. As a result, extreme political content or controversial topics are more likely to be amplified, and if users are not exposed to outside opinions, they might find themselves with a false understanding of the majority opinion of different groups.

A web-based method was shown to mitigate political polarization on X by nudging antidemocratic and extremely negative partisan posts lower in a user’s feed. The tool, which is independent of the platform, has the potential to give users more say over what they see on social media.
A new tool shows it is possible to turn down the partisan rancor in an X feed without removing political posts and without the direct cooperation of the platform. The study, from researchers at the University of Washington, Stanford University and Northeastern University, also indicates that it may one day be possible to let users take control of their social media algorithms. The researchers created a seamless, web-based tool that reorders content, moving posts lower in a user’s feed when they contain antidemocratic attitudes and partisan animosity, such as advocating for violence or jailing supporters of the opposing party. The researchers published their findings Nov. 27 in Science.
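The study’s code is not reproduced here, but the core reranking idea is simple to sketch. Below is a minimal, hypothetical Python illustration: a stub keyword check stands in for the study’s actual classifier of antidemocratic attitudes and partisan animosity, and a stable sort pushes flagged posts lower in the feed without removing any of them. The flagging rule, names and example posts are all invented.

```python
# Hypothetical sketch of platform-independent feed reranking: flagged posts
# are demoted, not deleted. The keyword stub below merely stands in for the
# study's real classifier; terms and names are invented for illustration.
from typing import List, NamedTuple

class Post(NamedTuple):
    author: str
    text: str

FLAG_TERMS = ("jail them", "destroy the other side", "violence")  # toy stand-in

def is_flagged(post: Post) -> bool:
    """Stub classifier: flags posts containing toy 'animosity' phrases."""
    text = post.text.lower()
    return any(term in text for term in FLAG_TERMS)

def rerank(feed: List[Post]) -> List[Post]:
    """Stable sort by flag status: unflagged posts keep their relative order
    and rise above flagged ones. Nothing is removed from the feed."""
    return sorted(feed, key=is_flagged)  # False (0) sorts before True (1)

feed = [
    Post("@a", "We should jail them all after the election."),
    Post("@b", "Local turnout numbers are out; here's the breakdown."),
    Post("@c", "The other party has some fair points on this bill."),
]

for post in rerank(feed):
    print(post.author, "-", post.text)
```

Because the reordering happens client-side on an already-fetched feed, a tool like this needs no cooperation from the platform, which is the property the study highlights.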