
How to fight disinformation online? Get ahead of the lie before it gets out of control

Beth Schwanke, executive director of the University of Pittsburgh Institute for Cyber Law, Policy and Security, says a systemic approach is needed.

See you online. (Photo by Pexels user Ocko Geserick, used via a Creative Commons license)

This editorial article is a part of Big Tech + You Month 2023 in Technical.ly’s editorial calendar.

From COVID-19 denial to election result conspiracy theories, a lie can make its way around the internet with just a few clicks. But what role do large social media websites play in disinformation being shared? And what could they be doing in the future to prevent damaging myths from being taken as fact?

Beth Schwanke, executive director of the University of Pittsburgh Institute for Cyber Law, Policy and Security, told Technical.ly that because of the impact companies like Facebook and Twitter can have, there’s a lot they should do to address the problem.

“They play a pretty big role,” Schwanke said. “And that’s not to say that disinformation is primarily a problem because of or only on social media platforms. But I think because of the speed and velocity and low barriers to entry, … there’s been just this extraordinary explosion of disinformation with very real-life, on-the-ground impacts for people.”

Two-thirds of adults now say they get their news from social media platforms. But in the early aughts, when most of these platforms sprang into being, their creators and users likely didn’t anticipate how prevalent fake news and internet hoaxes would become. Now that these problems have come to a head, with sometimes dangerous consequences, Schwanke said the platforms have a responsibility to mitigate the damage, and moreover not to profit from the outrage clicks that spreading disinformation can generate.

“At this point in time, it is entirely foreseeable,” Schwanke said. “Platforms need to be responsible about that. And it’s not to say that it is only the platform’s responsibility at all, but not foreseeing it is certainly not an excuse.”

Schwanke isn’t alone in feeling this way. Within the 1996 Communications Decency Act, Section 230 declares: “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” Yet as the internet has changed rapidly over the past few decades, many have questioned whether it’s tenable to keep granting platforms immunity when their users spread hate speech and the like. The scope of Section 230 is currently being weighed by the US Supreme Court.

“We are so far from home free for 2024. I think we should all be concerned and be preparing now.” — Beth Schwanke, Pitt Cyber

Much to the relief of politicos and even the typically apathetic around the country, the 2022 midterms didn’t see the kind of chaos that came from election result denial after the 2020 presidential election. Schwanke attributed that to the prebunking that journalists, elected officials, and organizations like Pitt Cyber did months in advance to stop conspiracy theories before they could gain traction.

“It was harder for there to be a coherent narrative about fraud,” Schwanke said. “Candidates did concede relatively quickly, compared to 2020. Also, President Trump had a diminished online presence. Perhaps that made a difference compared to 2020. But we are so far from home free for 2024. I think we should all be concerned and be preparing now.”

Because the spread of disinformation online is an ongoing problem, she added, it demands vigilance and sustained solutions. What can be done? For starters, Schwanke said that employing more human moderators, particularly in languages beyond English (since the US isn’t the only country with important elections), would be a step in the right direction. She also suggested that platforms privilege reputable news sources so that reliable information is easily accessible.

It also could be useful for platforms — Meta, for instance — to “explore how they can support and train some of those moderators who are doing that work for them,” Schwanke said.

Moreover, since there can be such large-scale consequences for disinformation being spread, Schwanke said that this might be an area where the federal government could be of assistance. (Pew Research Center reports that 48% of Americans agree with her, as of 2021.) Regulation and providing more support for local journalists, she said, could be measures that go a long way.

In conversations about combating the spread of disinformation, media literacy is often presented as an inoculation against conspiracy theories and bad actors. Although Schwanke agrees that media literacy would benefit many people, she said it can’t be the only solution: like any societal problem, disinformation must be tackled systemically, not left to individuals alone.

“The thing that I think is concerning about that is that it puts the onus on the individual to fix the problem,” Schwanke said. “It’s a systemic problem and so it requires far more systemic solutions.”

Atiya Irvin-Mitchell is a 2022-2024 corps member for Report for America, an initiative of The Groundtruth Project that pairs young journalists with local newsrooms. This position is supported by the Heinz Endowments.
Companies: University of Pittsburgh
Series: Big Tech + You Month 2023