Issue 112: Odds and Ends in Social Media Research
Social media saturates nearly every facet of our lives, and understanding its effects on society has never been more critical. This week's DLTJ Thursday Threads delves into recent studies and discussions of why misinformation is spread on platforms and ways to counteract it. As platforms continue to shape the way we communicate and process information, they also spark moral outrage and other intense emotions that can lead to the further spread of false content. Researchers are exploring how these dynamics unfold, as well as the roles of opportunists who exploit these platforms for personal or political gain.
As we navigate these challenges, there are steps individuals can take, and steps we should expect platforms to take, to reduce the impact of misinformation. Individuals can adopt practices that keep them from contributing to the spread of false content, and platforms are being called on to refine their moderation strategies, such as combining professional fact-checking with community-driven initiatives. Amidst these discussions, the potential impact of social media on adolescent wellbeing remains a concern, with experts debating its true role in rising mental health issues among young people.
- Did you really read that article? Moral outrage fuels the spread of misinformation online.
- Maybe that outrageous article wasn't pushed to you because of moral outrage. It could be opportunists exploiting online conspiracy theories for influence and profit.
- We can clean up social media from the ground up: strategies to avoid becoming a 'misinformation superspreader'.
- For a more top-down approach, we could insist that platforms combine fact-checking and community notes for better social media content moderation.
- On the other hand, research showed that the community notes system fails to curb misinformation on social media.
- Exploring the complex impact of social media on teen mental health.
- This Week I Learned: most plastic in the ocean isn't from littering, and recycling will not save us.
- This week's cat
Feel free to send this newsletter to others you think might be interested in the topics. If you are not already subscribed to DLTJ's Thursday Threads, visit the sign-up page. If you would like a more raw and immediate version of these types of stories, follow me on Mastodon where I post the bookmarks I save. Comments and tips, as always, are welcome.
Moral Outrage Fuels Spread of Misinformation Online
“The vast majority of misinformation studies assume people want to be accurate, but certain things distract them,” says William J. Brady, a researcher at Northwestern University. “Maybe it’s the social media environment. Maybe they’re not understanding the news, or the sources are confusing them. But what we found is that when content evokes outrage, people are consistently sharing it without even clicking into the article.” Brady co-authored a study on how misinformation exploits outrage to spread online. When we get outraged, the study suggests, we simply care way less if what’s got us outraged is even real.
The article discusses the phenomenon of misinformation spreading online, particularly when it evokes "moral outrage." It starts with a fabricated quote attributed to Rob Bauer, chair of the NATO Military Committee, suggesting that NATO should preemptively strike Russia; the claim garnered significant attention despite being untrue. The misinformation received nearly 250,000 views on social media, amplified by figures like Alex Jones. This research challenges the common assumption that misinformation is primarily shared by mistake; rather, it suggests that outrage drives people to share content without verifying its accuracy. That helps explain why traditional solutions aimed at promoting accuracy in sharing have proven ineffective.
Opportunists Exploit Online Conspiracy Theories for Influence and Profit
There has been a lot of research on the types of people who believe conspiracy theories, and their reasons for doing so. But there’s a wrinkle: My colleagues and I have found that there are a number of people sharing conspiracies online who don’t believe their own content. They are opportunists. These people share conspiracy theories to promote conflict, cause chaos, recruit and radicalize potential followers, make money, harass, or even just to get attention.
The article explores the phenomenon of individuals who share conspiracy theories online without genuinely believing in them. The study identified various types of conspiracy-spreaders, including extremist groups that use conspiracies as a gateway for radicalization, governments that manipulate narratives for political gain, and others who do it for profit or to gain influence. Many everyday users even share conspiracies for attention or social validation, often without verifying the information. The article warns that these opportunists can eventually convince themselves of their own lies.
Strategies to Avoid Becoming a 'Misinformation Superspreader' on Social Media
Emerging psychology research has revealed some tactics that can help protect our society from misinformation. Here are seven strategies you can use to avoid being misled, and to prevent yourself – and others – from spreading inaccuracies.
Whether your social media feed is filled with content driven by moral outrage, attempts to influence you, or schemes to profit from you, there are ways that you can break the cycle of spreading this misinformation. The seven strategies are:

- educate yourself about common disinformation tactics,
- recognize your own vulnerabilities and biases,
- carefully evaluate the credibility of information sources,
- pause before sharing content,
- be aware of how emotions can influence the spread of misinformation,
- gently challenge the misinformation you see, and
- stand with others when you see someone else challenge it.

The article emphasizes that while there is no perfect solution, these steps can help protect individuals and their social networks from the harmful effects of false and misleading information.
Combine Fact-Checking and Community Notes for Better Social Media Content Moderation
Even before Meta’s announcement that it was ending fact-checking in favor of a Twitter-style community notes approach or Musk’s tweet in relation to the war in Ukraine, researchers at CITP were looking at the pluses and minuses of both systems.... [Princeton Computer Science Ph.D. student Madelyne] Xiao thinks that the political turmoil surrounding Meta’s announcement hardened people into two camps. The debate has “…unhelpfully positioned (human) fact-checking as a content moderation strategy contra notes-like systems.” They should be used in conjunction: “Neither system is flawless! Both have much room for improvement!”
This is a brief article from the Center for Information Technology Policy (CITP), but it is a good starting point for links to related research and researchers. It was published around the same time as the next article, which highlights deficiencies in the community notes approach. Researchers from CITP suggest that fact-checking and community notes can be complementary rather than mutually exclusive. Ultimately, the discussion reflects a need for a more nuanced understanding of these moderation strategies in combating disinformation effectively.
Community Notes System Fails to Curb Misinformation on Social Media
The billionaire leaders of social media giants have long been under pressure to quell the spread of mis- and disinformation. No system to date, from human fact-checkers to automation, has satisfied critics on the left or the right.
One novel approach winning plaudits recently has been Community Notes. The crowdsourced method, first introduced by Twitter before Elon Musk acquired it and rebranded it as X, allows regular users to submit additional context to posts, offering up supporting evidence to set the record straight.... The system has advantages over the alternatives, but its limits as an antidote to misinformation are clear. So are its benefits for executives who have been dogged by intense scrutiny over misinformation and censorship for the better part of a decade. It allows them to outsource responsibility for what happens on their platforms to their users. And also the blame.
This article is marked as an opinion piece in Bloomberg, but it is well researched with lots of hyperlinks to source material. It discusses the limitations of the Community Notes system implemented by social media companies like Ex-Twitter and Meta to combat misinformation. Community Notes allows users to add context and evidence to posts, promoting a sense of democratized information sharing. However, the article argues that this approach falls short of addressing the deeper issues inherent in social media's structure. The authors suggest that without significant changes to how these platforms operate, even innovative systems like Community Notes cannot resolve the ongoing challenges of misinformation. Ultimately, the piece highlights the tension between user-generated content and the need for reliable information in digital spaces.
Exploring the Complex Impact of Social Media on Teen Mental Health
Two things need to be said after reading The Anxious Generation. First, this book is going to sell a lot of copies, because Jonathan Haidt is telling a scary story about children’s development that many parents are primed to believe. Second, the book’s repeated suggestion that digital technologies are rewiring our children’s brains and causing an epidemic of mental illness is not supported by science. Worse, the bold proposal that social media is to blame might distract us from effectively responding to the real causes of the current mental-health crisis in young people.
This is a review of The Anxious Generation: How the Great Rewiring of Childhood Is Causing an Epidemic of Mental Illness, which was published a year ago. The review suggests that while social media can foster connections, it often fails to provide genuine social interaction, potentially leading to feelings of isolation and mental distress. Various studies are cited to examine the relationship between social media usage and mental health outcomes, indicating that the effects are complex and multifaceted. The discussion also addresses the need for further research to understand the nuances of this relationship.
This Week I Learned: Most plastic in the ocean isn't from littering, and recycling will not save us
Littering is responsible for a very small percentage of the overall plastic in the environment. Based on this graph from the OECD, you can see littering is this teeny-tiny blue bar here, and mismanaged waste, not including littering, is this massive one at the bottom. Mismanaged waste includes all the things that end up either in illegal dump sites or burned in the open or in the rivers or oceans or wherever. The focus on littering specifically, it's an easy answer because obviously there's nothing wrong with discouraging people from littering, but it focuses on individual people's bad choices rather than systemic forces that are basically flushing plastic into the ocean every minute. Mismanaged waste includes everything that escapes formal waste systems. So they might end up dumped, they might end up burned, they might end up in the environment.
Contrary to popular belief, most plastic in the Great Pacific Garbage Patch stems from the fishing industry, with only a small fraction linked to consumer waste. The video highlights that mismanaged waste, rather than individual littering, is the primary contributor to plastic pollution, with 82% of macroplastic leakage resulting from this issue. It emphasizes the ineffectiveness of recycling as a solution, noting that less than 10% of plastics are currently recycled, and the industry has perpetuated the myth that recycling can resolve the plastic crisis. Microplastics, which are increasingly recognized as a major problem, originate from various sources, including tires and paint, with new data suggesting that paint is a significant contributor. The video stresses the need for systemic changes, including reducing plastic production and simplifying the chemicals used in plastics, as current efforts and pledges from corporations and governments have not effectively curbed plastic pollution.
What did you learn this week? Let me know on Mastodon or Bluesky.