This one sort of got lost in the perpetually roiling news cycle: On Aug. 7, a month after the White House held a Social Media Summit specifically for online supporters of the current administration, Politico broke the news that the White House was drafting an executive order “that would address allegations of anti-conservative bias by social media.”
A couple of days later, CNN confirmed it after obtaining a summary of the draft. Since CNN tweeted the story on Aug. 9, it’s gotten 135 likes and 88 comments — and on social media, that’s eerily quiet.
A draft executive order from the White House could put the FCC in charge of shaping how Facebook, Twitter and other large tech companies curate what appears on their websites, according to multiple people familiar with the matter https://t.co/xMTychzYxr
— CNN (@CNN) August 9, 2019
The title of the executive order draft is “Protecting Americans from Online Censorship.” According to Jennifer Lambe, associate professor of communication at the University of Delaware and an expert on the First Amendment, free speech and free press, allowing partisan politics to influence content moderation is entirely unconstitutional.
To understand how the internet is regulated by the government, you have to go back to the 1990s.
“When the internet was first taking off as a public thing, the rule was that internet service providers and platforms wouldn’t be held legally responsible for content put on their services unless and until they started monitoring the content,” Lambe told Technical.ly. “But what that turned into was that anything could be put up by anybody and the service providers weren’t going to do anything about it, because if they did they would face legal responsibility for what was on their sites.”
And so the landmark Section 230 of the Communications Decency Act was passed in 1996.
“Section 230 basically says as long as they’re acting in good faith about their content moderation policy, then even if they are moderating content, they can’t be held legally responsible if they missed something, they can’t be held legally responsible for removing something,” Lambe said. “Basically it involves the concerns they had about getting involved in content moderation.”
To be sure, Section 230 has been criticized over the years across the political spectrum. The activist group Change the Terms, for example, calls for social media platforms to take responsibility for hosting extremist content. In 2018, it was amended by the bipartisan Stop Enabling Sex Traffickers Act.
“What the White House is saying now is that they want the Federal Communications Commission to issue some opinions and regulations about what ‘acting in good faith’ means,” Lambe said. “So it would basically give the FCC power to get involved in the content moderation decision.”
That’s problematic for a few reasons, she said: first, the government would be involved in content moderation decisions at all — “which seems against the First Amendment entirely” — and second, the FCC is controlled by the White House, which is itself partisan, so there’s potential for bias in its rulemaking.
“There’s no evidence, other than anecdotal, that there is an anti-conservative bias on social media,” Lambe said. “So before there would be any reason to take any action like this, I would think that you would want more facts about the content moderation decisions that are made.”
So, what would happen if this executive order is issued in its current state?
“I think it would be challenged immediately and I think it would be struck down in the court,” Lambe said. “The only way that I think something like this could get passed is if there’s congressional action, and even that would be subject to judicial review and would have a hard time getting passed.
“I do think that there are some legitimate concerns about the role that these platforms have in shaping public debate,” she added. “In particular I think that if there’s a bias it’s towards wanting to keep people on their platforms. So they tend to show people things they already agree with that evoke emotions. Their goal is to make money. I think that there are things that can be done that could be useful. But this is not it.”