
On misinformation and the 2020 election, ‘the call is coming from inside the house’

At the Technical.ly Developers Conference, The New York Times opinion writer-at-large Charlie Warzel talked about how misinformation spreading on social media is exploiting America's polarization and 2020's cascading crises.

Technical.ly's Paige Gross and The New York Times' Charlie Warzel. (Screenshot)
Imagine this: It’s getting late on Election Night on Nov. 3, 2020, and the networks are saying the presidential race is still too close to call.

Yet President Donald Trump sees himself in the lead, as millions of mail-in ballots remain to be counted in the days ahead. So he decides to declare victory by turning directly to his followers on social media. And, in Trumpian fashion, he tells them that the media and Democrats are going to try to steal the election, even though models show that the mail-in ballots will push Biden over the top.

This “red mirage” scenario is the stuff of nightmares for Democratic strategists. Yet as Charlie Warzel, The New York Times’ opinion writer-at-large covering technology, said Wednesday during a keynote at Technical.ly’s Developers Conference, it’s also important to consider how the leaders of the companies running the platforms on which Trump would deliver this message might respond.

Social media companies, under fire after the rash of fake news that spread in 2016, might be moved to take action in such a case. They might leave the post up but slap a warning label on it. Or, in the hypothetical Warzel laid out, Facebook might decide to delete the message, only to prompt more Republican lawmakers to come out with their own posts bolstering Trump’s claim. By then, the message has spread via Facebook, even if the initial post is gone.

“It becomes not only newsworthy but part of the party line,” Warzel told Technical.ly Philly lead reporter Paige Gross at the virtual event, which was part of Philly Tech Week presented by Comcast.

Whether they’d take those actions or others, the fact that the decision lies with Facebook gets at the platforms’ power. As Warzel wrote in a recent column, the company’s CEO Mark Zuckerberg has become the “most powerful unelected man in America,” with a power to shape democracy that should lie with the many.

“This moment is not about being seen as biased or not biased,” Warzel said. Rather, it’s the “major stress test” of the rules and values at the heart of the platforms, and “we can’t just leave it in the hands of one or two people.”

It’s tempting to think there’s some precedent here. After all, the fact that this alarm is being sounded now, before the election, owes something to four years of studying the role misinformation played in 2016. Researchers have examined how social media campaigns traced to Russia fomented division in a way the U.S. intelligence community believes was designed to tip that election toward Trump, and plenty of analysis has shed light on how the platforms’ models were leveraged to make it happen. But that learning is still underway, and Russia is coming again.

Yet it’s also true that there are new challenges to confront. No one had heard of QAnon in 2016; now that homegrown conspiracy theory has been embraced by a congressional candidate. Warzel detailed how misinformation has become more sophisticated since 2016 and has found its way into all facets of America’s public life.

Take 2020, for example.

There’s a global pandemic, an economic recession, a generational reckoning over race and policing, wildfires in the West, and now a consequential presidential election. Amid these overlapping crises, misinformation has exploited kernels of truth to spread false claims. Warzel pointed to the debunked claim that antifa supporters were starting the wildfires.

And at this point, much of the false content that spreads isn’t the wholly fabricated “fake news” that became a buzzword in 2016.

“It’s more kernels of truth,” Warzel said, packaged in a way that carefully manipulates the facts and spins them toward a particular narrative. That misinformation enters a cauldron of polarization that doesn’t need a foreign power or an election to light the match.

“The real issue in my opinion is that its sophistication over the last four years has allowed people to become better vectors for … false news … than Russia could ever hope to be,” Warzel said.

So while we’ve learned from 2016, the environment is different now and there are more variables. That leaves Warzel adamant that Election Night is truly uncertain, and that scenario planning is underway for many.

After a surprise in 2016, surely no one wants to predict an outcome now. But it also speaks to the challenge when the source of misinformation that could destabilize democracy is a nation’s own people — and the very people who are supposed to shape and uphold it.

“The call is coming from inside the house in this election,” Warzel said.

Companies: Facebook / New York Times
Series: Civic Tech Month 2020