Civic

Aug. 14, 2019 3:38 pm

Is government-moderated social media near?

Jennifer Lambe, associate professor of communication at the University of Delaware, talks about the constitutionality of the recently leaked "censorship executive order."

How constitutional would FCC moderation of social media be?

(Photo by adiruch na chiangmai via stock.adobe.com)

This one sort of got lost in the perpetually roiling news cycle: On Aug. 7, a month after the White House held a Social Media Summit specifically for online supporters of the current administration, Politico broke the news that the White House was drafting an executive order “that would address allegations of anti-conservative bias by social media.”

A couple of days later, CNN confirmed it after obtaining a summary of the draft. Since CNN tweeted the story on Aug. 9, it’s gotten 135 likes and 88 comments — and on social media, that’s eerily silent.

The title of the executive order draft is “Protecting Americans from Online Censorship.” According to Jennifer Lambe, associate professor of communication at the University of Delaware and an expert on the First Amendment, free speech and free press, allowing partisan politics to influence content moderation is entirely unconstitutional.

To understand how the internet is regulated by the government, you have to go back to the 1990s.

“When the internet was first taking off as a public thing, the rule was that internet service providers and platforms wouldn’t be held legally responsible for content put on their services unless and until they started monitoring the content,” Lambe told Technical.ly. “But what that turned into was that anything could be put up by anybody and the service providers weren’t going to do anything about it, because if they did they would face legal responsibility for what was on their sites.”

And so the landmark Section 230 of the Communications Decency Act was passed in 1996.

“Section 230 basically says as long as they’re acting in good faith about their content moderation policy, then even if they are moderating content, they can’t be held legally responsible if they missed something, they can’t be held legally responsible for removing something,” Lambe said. “Basically it involves the concerns they had about getting involved in content moderation.”

To be sure, Section 230 has been criticized over the years across the political spectrum. The activist group Change the Terms, for example, calls for social media platforms to take responsibility for hosting extremist content. In 2018, the law was amended by the bipartisan Stop Enabling Sex Traffickers Act.

“What the White House is saying now is that they want the Federal Communications Commission to issue some opinions and regulations about what ‘acting in good faith’ means,” Lambe said. “So it would basically give the FCC power to get involved in the content moderation decision.”

That’s problematic for a few reasons, she said: first, that the government would be involved in content moderation decisions at all, “which seems against the First Amendment entirely”; and second, that the FCC is controlled by the White House, which is itself partisan, so there’s potential for bias in its rulemaking.

“There’s no evidence, other than anecdotal, that there is an anti-conservative bias on social media,” Lambe said. “So before there would be any reason to take any action like this, I would think that you would want more facts about the content moderation decisions that are made.”

So, what would happen if this executive order is issued in its current state?

“I think it would be challenged immediately and I think it would be struck down in the court,” Lambe said. “The only way that I think something like this could get passed is if there’s congressional action, and even that would be subject to judicial review and would have a hard time getting passed.

“I do think that there are some legitimate concerns about the role that these platforms have in shaping public debate,” she added. “In particular I think that if there’s a bias it’s towards wanting to keep people on their platforms. So they tend to show people things they already agree with that evoke emotions. Their goal is to make money. I think that there are things that can be done that could be useful. But this is not it.”

-30-