
Calls for change to Section 230 are growing. Here’s how the law shapes social media and speech

Here's a look at what the nearly 25-year-old liability shield has meant for social media companies, and how changes might affect startups.

Social media. (Photo by dole777 on Unsplash)
The fight for the future of the internet is increasingly playing out in the halls of the federal government.

For a few years dating to the early days of the Trump administration, proposals by the Federal Communications Commission to change “net neutrality” rules were the equivalent of the polar ice caps melting for those interested in preserving an open internet. But in true 2020 fashion, the spotlight shifted more recently to a potential change or repeal of a different provision that shapes the internet, known as Section 230.

That’s Section 230 of the Communications Decency Act of 1996. It grants internet service providers immunity from civil liability for what’s posted on their sites. This law also applies to social media companies like Facebook and Twitter.

The act states: “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”

It’s a law that absolves the social media platforms of legal responsibility for what their users say and do. Without it, there might not have been any social media to begin with — or, at least, the platforms would have looked a lot different. Companies might have seen too great a risk in assuming responsibility for what their users posted. Nearly 25 years later, there’s a growing chorus calling to change this law. And although each side has different reasons, both are in agreement that it needs to change.

“It wasn’t a bad thing to do because it encouraged the development and growth of the internet as we know it,” Eric Easton, a professor of media law at the University of Baltimore School of Law, said of Section 230. “But after so many years, it’s time to take another look at that blanket immunity.”

In internet years, 1996 was a long time ago. Many users had AOL email addresses. It was the year after the release of Windows 95, which brought Internet Explorer front and center on PCs. It was a time when businesses were still experimenting with whether and how to make their offerings available online. Section 230 helped this grow, as the protections it offered led to an explosion of tools allowing users to post freely.


A lot has changed since then: The internet is more essential in our lives now. With the social media boom, sharing and distributing content is increasingly done through central platforms, rather than individual websites. And in recent years, the increasingly prominent role of these platforms created tensions at the intersection of technology, democracy and speech.

Publishers vs. platforms

When it comes to Section 230, the issues at hand in part hark back to legal precedents that have shaped media law over decades, with a new call to update them for the internet age. Publishers have long navigated this terrain, and cases continue to illustrate the potential stakes. One recent example came in 2016, when blog darling Gawker Media was held liable by a jury for publishing a sex tape showing professional wrestling personality Hulk Hogan.

But, thanks to Section 230, social media companies have never faced such risk. If someone wanted to challenge a post on Facebook, “You couldn’t sue them,” said Easton. “Couldn’t sue them for libel, emotional distress, invasion of privacy. You simply couldn’t sue them. You could sue the speaker that posted the message to their site, but you couldn’t sue the platforms. And that’s contrary to the way we treat newspapers and legacy media in the past.”

So, if changes come to Section 230, imagine a world where, like a news publication with journalists, Facebook would potentially be responsible for the content its users post. Maybe that content is cyberbullying that resulted from rumors spread about a user, which in turn amounts to libel.

Facebook came to Baltimore in 2018. (File photo by Stephen Babcock)

Along with the big platforms, this has implications for the businesses that are building new tech products where users post content.

Erik Feig, an attorney with the Towson-based law firm Nemphos Braue, has worked with technology companies and businesses for 25 years. He sees a change in Section 230 as something that startups should be aware of.

“Section 230 has been one of the building blocks in terms of being able to assess risk,” said Feig. “For new entrants or entrepreneurial companies, it can affect decisions of what their trajectory might be.”

Imagine how that shift in what’s allowed might affect the startups that are looking to build the next TikTok or Snapchat. Gawker had editors who okayed the article, and lawyers to defend it. But the company lost the $115 million lawsuit that resulted from the Hogan case. Now there’s no more Gawker.

The hope from lawmakers is that if the Facebooks and Twitters of the world faced more liability, the platforms would moderate content better. But how does the new kid on the block build that infrastructure before even turning a profit, when it’s still just an idea?

A $115 million payout might not seem like a lot for a wealthy, publicly traded company. The companies probably pay more than that for snacks at the company campus. But before TikTok was the next big thing, could it have afforded a $100 million lawsuit? How many startups, before they break through and get acquired by a bigger company or go public, can afford even a $100,000 lawsuit?

‘Handwriting on the wall’

With a focus that’s more on the big companies, D.C. appears headed for action. Despite legislative gridlock on a lot of issues, altering the immunity from liability for tech companies like Facebook and Twitter is one where both parties agree — though they have different reasons.

Republicans feel Section 230 allows social media platforms to censor conservative viewpoints, and want it repealed or changed so a fairness doctrine can be put into effect. President Donald Trump went so far as to issue an executive order asking the FCC to establish regulation clarifying the provision in Section 230 that requires online companies to act in good faith when deciding to delete or modify content.

Democrats, on the other hand, believe Section 230 allows for too much objectionable content, like hate speech, harassment and misinformation. As a candidate, former VP Joe Biden signaled he was in favor of revoking Section 230. Now that he is president-elect, an FCC that is more of a regulatory force is likely, though the question of whether the FCC has the power to make policy around Section 230 would likely be decided in the courts.

Another signal is coming from the companies.

“These huge platforms have huge lobbying arms, no question, but they’re not opposing revision of Section 230 in a reasonable manner,” said Easton. “They’ve seen the handwriting on the wall.”

It seems change is coming to Section 230, for better or worse.

Donte Kirby is a 2020-2022 corps member for Report for America, an initiative of The Groundtruth Project that pairs young journalists with local newsrooms. This position is supported by the Robert W. Deutsch Foundation.