
Social media can in fact be made better: Research shows it is possible to reward users for sharing accurate information instead of misinformation

A trio of researchers explored whether it's possible to incentivize sharing reliable content. Their research, they write, offers key insights.

We feel rewarded by reactions to information we share, and that can lead to good and bad habits. (Linka A Odom/DigitalVision via Getty Images)

Is social media designed to reward people for acting badly?

The answer is clearly yes, given that the reward structure on social media platforms relies on popularity, as indicated by the number of responses — likes and comments — a post receives from other users. Black-box algorithms then further amplify the spread of posts that have attracted attention.

Sharing widely read content, by itself, isn’t a problem. But it becomes a problem when attention-getting, controversial content is prioritized by design. Given the design of social media sites, users form habits of automatically sharing the most engaging information regardless of its accuracy or potential harm. Offensive statements, attacks on out-groups and false news are amplified, and misinformation often spreads further and faster than the truth.

We are two social psychologists and a marketing scholar. Our research, presented at the 2023 Nobel Prize Summit, shows that social media can instead be designed to build user habits of sharing high-quality content. After a few tweaks to the reward structure of social media platforms, users begin to share information that is accurate and fact-based.

The problem with habit-driven misinformation-sharing is significant. Facebook’s own research shows that being able to share already shared content with a single click drives misinformation. Thirty-eight percent of views of text misinformation and 65% of views of photographic misinformation come from content that has been reshared twice, meaning a share of a share of a share of an original post. The biggest sources of misinformation, such as Steve Bannon’s War Room, exploit social media’s popularity optimization to promote controversy and misinformation beyond their immediate audience.

Re-targeting rewards

To investigate the effect of a new reward structure, we gave financial rewards to some users for sharing accurate content and not sharing misinformation. These financial rewards simulated the positive social feedback, such as likes, that users typically receive when they share content on platforms. In essence, we created a new reward structure based on accuracy instead of attention.

As on popular social media platforms, participants in our research learned what got rewarded by sharing information and observing the outcome, without being explicitly informed of the rewards beforehand. This means that the intervention did not change the users’ goals, just their online experiences. After the change in reward structure, participants shared significantly more content that was accurate. More remarkably, users continued to share accurate content even after we removed rewards for accuracy in a subsequent round of testing. These results show that users can be given incentives to share accurate information as a matter of habit.

A different group of users received rewards for sharing misinformation and for not sharing accurate content. Surprisingly, their sharing most resembled that of users who shared news as they normally would, without any financial reward. The striking similarity between these groups reveals that social media platforms encourage users to share attention-getting content that engages others at the expense of accuracy and safety.

Engagement and the bottom line

Maintaining high levels of user engagement is crucial for the financial model of social media platforms. Attention-getting content keeps users active on the platforms. This activity provides social media companies with valuable user data for their primary revenue source: targeted advertising.

In practice, social media companies might be concerned that changing user habits could reduce users’ engagement with their platforms. However, our experiments demonstrate that modifying users’ rewards does not reduce overall sharing. Thus, social media companies can build user habits of sharing accurate content without compromising their user base.

Platforms that give incentives for spreading accurate content can foster trust and maintain or potentially increase engagement with social media. In our studies, users expressed concerns about the prevalence of fake content, leading some to reduce their sharing on social platforms. An accuracy-based reward structure could help restore waning user confidence.

Doing right and doing well

Our approach, using the existing rewards on social media to create incentives for accuracy, tackles misinformation spread without significantly disrupting the sites’ business model. This has the additional advantage of altering rewards instead of introducing content restrictions, which are often controversial and costly in financial and human terms.

Implementing our proposed reward system for news sharing carries minimal costs and can be easily integrated into existing platforms. The key idea is to provide users with rewards in the form of social recognition when they share accurate news content. This can be achieved by introducing response buttons to indicate trust and accuracy. By incorporating social recognition for accurate content, algorithms that amplify popular content can leverage crowdsourcing to identify and amplify truthful information.
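To make this concrete, here is a minimal sketch, in Python, of how a feed-ranking score could fold a crowdsourced accuracy signal into the usual engagement counts. Everything in it, including the Post fields, the "trust" button and the weights, is a hypothetical illustration of the idea described above, not the study's materials or any platform's actual ranking algorithm.

```python
from dataclasses import dataclass


@dataclass
class Post:
    """Signals a feed-ranking algorithm might see for one post.

    The field names, the "trust" button and the weights below are
    illustrative assumptions, not the researchers' design or any
    platform's real ranking system.
    """
    likes: int
    comments: int
    reshares: int
    trust_votes: int   # clicks on a hypothetical "accurate/trustworthy" button
    impressions: int   # how many users saw the post


def ranking_score(post: Post, accuracy_weight: float = 2.0) -> float:
    """Blend raw engagement with a crowdsourced accuracy signal.

    A popularity-only score rewards whatever gets reactions; this sketch
    multiplies that score by a bonus that grows with the share of viewers
    who marked the post as accurate, so accuracy feedback competes with
    likes and comments in deciding what gets amplified.
    """
    engagement = post.likes + post.comments + 2 * post.reshares
    trust_rate = post.trust_votes / post.impressions if post.impressions else 0.0
    return engagement * (1.0 + accuracy_weight * trust_rate)


# Two posts with identical engagement: the one more viewers marked as
# accurate ends up ranked higher.
viral_rumor = Post(likes=500, comments=200, reshares=150, trust_votes=10, impressions=10_000)
accurate_story = Post(likes=500, comments=200, reshares=150, trust_votes=2_500, impressions=10_000)

print(ranking_score(viral_rumor))     # 1002.0
print(ranking_score(accurate_story))  # 1500.0
```

In this toy example, two posts with identical likes, comments and reshares diverge once viewers' accuracy votes are counted, which is the kind of crowdsourced signal the trust and accuracy buttons described above would supply.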

Both sides of the political aisle now agree that social media has challenges, and our data pinpoints the root of the problem: the design of social media platforms.

This is a guest post by Ian Anderson, a Ph.D. student in social psychology at the University of Southern California (USC) Dornsife College of Letters, Arts and Sciences; Gizem Ceylan, a postdoctoral research associate at the Yale School of Management; and Wendy Wood, a provost professor emerita of psychology and business at USC Dornsife. A version of this article is republished from The Conversation under a Creative Commons license.
