
Is government-moderated social media near?

Jennifer Lambe, associate professor of communication at the University of Delaware, talks about the constitutionality of the recently leaked "censorship executive order."

How constitutional would FCC moderation of social media be? (Photo by adiruch na chiangmai via stock.adobe.com)

This one sort of got lost in the perpetually roiling news cycle: On Aug. 7, a month after the White House held a Social Media Summit specifically for online supporters of the current administration, Politico broke the news that the White House was drafting an executive order “that would address allegations of anti-conservative bias by social media.”

A couple of days later, CNN confirmed it after obtaining a summary of the draft. Since CNN tweeted the story on Aug. 9, it’s gotten 135 likes and 88 comments — and on social media, that’s eerily silent.

The title of the executive order draft is “Protecting Americans from Online Censorship.” According to Jennifer Lambe, associate professor of communication at the University of Delaware and an expert on the First Amendment, free speech and free press, allowing partisan politics to influence content moderation is entirely unconstitutional.

To understand how the internet is regulated by the government, you have to go back to the 1990s.

“When the internet was first taking off as a public thing, the rule was that internet service providers and platforms wouldn’t be held legally responsible for content put on their services unless and until they started monitoring the content,” Lambe told Technical.ly. “But what that turned into was that anything could be put up by anybody and the service providers weren’t going to do anything about it, because if they did they would face legal responsibility for what was on their sites.”

And so the landmark Section 230 of the Communications Decency Act was passed in 1996.

“Section 230 basically says as long as they’re acting in good faith about their content moderation policy, then even if they are moderating content, they can’t be held legally responsible if they missed something, they can’t be held legally responsible for removing something,” Lambe said. “Basically it involves the concerns they had about getting involved in content moderation.”

To be sure, Section 230 has been criticized over the years across the political spectrum. The activist group Change the Terms, for example, calls for social media platforms to take responsibility for hosting extremist content. In 2018, it was amended by the bipartisan Stop Enabling Sex Traffickers Act.

“What the White House is saying now is that they want the Federal Communications Commission to issue some opinions and regulations about what ‘acting in good faith’ means,” Lambe said. “So it would basically give the FCC power to get involved in the content moderation decision.”

That’s problematic for a few reasons, she said. First, the government would be involved in content moderation decisions at all, “which seems against the First Amendment entirely.” Second, the FCC is controlled by the White House, which is itself partisan, so there’s potential for bias in its rulemaking.

“There’s no evidence, other than anecdotal, that there is an anti-conservative bias on social media,” Lambe said. “So before there would be any reason to take any action like this, I would think that you would want more facts about the content moderation decisions that are made.”

So, what would happen if this executive order is issued in its current state?

“I think it would be challenged immediately and I think it would be struck down in the court,” Lambe said. “The only way that I think something like this could get passed is if there’s congressional action, and even that would be subject to judicial review and would have a hard time getting passed.

“I do think that there are some legitimate concerns about the role that these platforms have in shaping public debate,” she added. “In particular I think that if there’s a bias it’s towards wanting to keep people on their platforms. So they tend to show people things they already agree with that evoke emotions. Their goal is to make money. I think that there are things that can be done that could be useful. But this is not it.”

