On the one hand, they say that they are just platforms for people who post content and, as mere platform providers, are not responsible for what appears. On the other hand, they actively determine what shows up on their platforms, much like newspapers decide which stories to run.
Can social media platforms really claim with a straight face that they are not responsible for what appears on their platforms while simultaneously determining what constitutes appropriate content?
Internet social media platforms enjoy extensive safe harbor protection from legal liability for content that users post to their platforms. The argument these platforms put forward to avoid legal liability rests on a single sentence in Section 230 of the Communications Decency Act of 1996: "No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider." Essentially, Section 230 gives websites immunity from liability for what their users post.
As Congress considers amending or repealing Section 230, perhaps an immediate step should be to give the Federal Communications Commission oversight of the platforms’ content decisions.
The Communications Decency Act was passed in 1996 when the Internet was in its infancy and Congress feared that subjecting hosting platforms to the same civil liability as all other businesses would stunt their growth. It was written before Facebook and Google existed.
Indeed, large technology companies benefit from a federal law that specifically protects them. The same deal is not available to media companies and traditional publishers. And when platforms enjoy full immunity for content posted by their users, their incentive to remove content that causes social harm is diminished.
Congress's expectation in enacting Section 230 was at least twofold. First, it hoped that the protection from civil suits would inspire websites to create a family-friendly online environment that would protect children, hence the section's "Good Samaritan" title. Second, Congress hoped it would help the burgeoning Internet economy grow by providing it with partial protection from federal and state regulations.
Fast forward 25 years and things look very different from 1996. Section 230's protections are now hopelessly obsolete. The biggest and most powerful companies today are big tech firms with enormous resources and sophisticated algorithms for moderating content. It is time to rethink and revise these protections.
There is a growing consensus for updating Section 230. Democrats and Republicans apparently agree that these companies should not receive this government grant of immunity without any accountability, and that they should moderate content in a politically neutral way so as to provide "a forum for a true diversity of political discourse." During his presidential campaign, President Biden said Section 230 should be "revoked, immediately." Senator Lindsey Graham (R-SC) said: "Section 230 as it exists today has got to give."
Before amending Section 230, Congress should ensure that changing it will do no more harm than good. As lawmakers grapple with whether Section 230 should be amended or repealed, a simple and immediate step to making big tech companies more transparent would be to require them to submit to an external audit by the Federal Communications Commission.
Such an approach is not perfect, of course, but it would require network platform companies to prove that their algorithms and content removal practices moderate content in a politically neutral manner rather than serving as partisan instruments, and that they prioritize veracity and accuracy over user engagement.
This would be consistent with one of Congress's findings when it enacted Section 230: "The Internet and other interactive computer services offer a forum for a true diversity of political discourse, unique opportunities for cultural development, and myriad avenues for intellectual activity."
Joseph M. Giglio is a professor of strategic management at Northeastern University's D'Amore-McKim School of Business.
This article originally appeared on The Patriot Ledger: Hold Google, Facebook, Twitter accountable for what happens on the platforms