
Section 230: A Complicated Issue


If you are pro-Trump, you may be fully behind his Executive Order on Preventing Online Censorship (EOPOC); if you are anti-Trump, you likely believe the EOPOC is an attack on the First Amendment's protection of freedom of speech. In reality, the debate is much more complex.

To understand the President’s position on Section 230 of the Communications Decency Act (CDA) of 1996 (hereafter referred to as Section 230), we must first understand what Section 230 is and why it came to be.


The Origin Story:

Section 230 was inspired by a 1995 court ruling against Prodigy Services Company. An anonymous user posted a comment to the "Money Talk" bulletin board hosted by Prodigy. The post "indicat[ed] that Stratton Oakmont, Inc., a Long Island securities brokerage firm, and its president, Daniel Porush, had committed criminal and fraudulent acts" (Digital Media Law Project). Stratton Oakmont argued that the allegations made in the anonymous post were false and filed a defamation suit against Prodigy.

Prodigy contended that it was merely a distribution platform where anyone could post comments. If Prodigy were found to be a "distributor, it could not be held liable unless it knew or had reason to know about the allegedly defamatory statements" (Digital Media Law Project).

Stratton Oakmont argued, however, that Prodigy was more than a distributor; Prodigy was a "publisher." Prodigy had "content guidelines, which stated rules that users were expected to abide by, a software screening program which filtered out offensive language, and the employment of moderators or 'Board Leaders' who were responsible for enforcing the content guidelines," all of which made Prodigy much more than a distributor (Digital Media Law Project).

The initial court ruling held that Prodigy was more than a distributor, pointing "to Prodigy's creation of an editorial staff of Board Leaders who have the ability to continually monitor incoming transmissions." The court was careful to note, however, that "bulletin boards should normally be considered distributors when they do not exercise significant editorial control, as Prodigy had done" (Digital Media Law Project).

This ruling had deep implications for internet communications. In essence, the court held that only services behaving like online libraries were exempt from civil liability. Once a service provider began to moderate or editorialize content, it ceased to be a distributor and became liable for the content on its platform. Congress, at the time, had a number of concerns about the ruling.

If moderating content meant that platforms would lose their immunity from civil liability, there were only two likely outcomes: platforms would simply shut down, or content would go entirely unmoderated. Enter Senator Ron Wyden (D-OR) and then-Representative Chris Cox (R-CA) with Section 230.


The Statute:

Section 230 sought to do two very important things: differentiate between service providers and content providers, and give service providers some content-moderation rights without turning them into content providers.

Section 230 states, "No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider" (47 U.S.C. Section 230(c)(1)).

Section 230 goes on to establish the legal differences between an Interactive Computer Service (hereafter Service) and an Information Content Provider (hereafter Content Provider).

Service: "The term “interactive computer service” means any information service, system, or access software provider that provides or enables computer access by multiple users to a computer server, including specifically a service or system that provides access to the Internet and such systems operated or services offered by libraries or educational institutions," (47 U.S.C. Section 203(f)(2)).


Content Provider: "The term “information content provider” means any person or entity that is responsible, in whole or in part, for the creation or development of information provided through the Internet or any other interactive computer service," (47 U.S.C. Section 203(f)(3)).

In simpler terms, the Service is the database and the underlying support infrastructure to which the database owner grants access; Content Providers then populate that database with content. This mirrors the court's ruling in the Prodigy case: by moderating content, Prodigy became a de facto Content Provider rather than a mere Service. To provide latitude to Services, Section 230 added the following language.

"Civil liability: No provider or user of an interactive computer service shall be held liable on account of - any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected," (47 U.S.C. Section 203(c)(2)(a)).

This is the second paragraph of the "Good Samaritan" clause, which protects a Service's ability to moderate objectionable content. It is important to note that the adjectives describing the content to which access may be restricted are harsh. Having reviewed the text of the law and the reporting surrounding it, it appears to me that the scope of moderation afforded to a Service was intended to be narrow. The legislation carved out protections for Services to remove pornographic material, terrorist propaganda, and beheading videos, to censor threats and harassment of users, and to restrict other objectionable material of the like.


What's Happening Now:

Arguments are being made that the social media giants (Twitter, Facebook, Instagram, YouTube, and others) are acting outside the bounds of Section 230's "Good Samaritan" clause. President Trump's EOPOC stated it this way:

"Section 230 was not intended to allow a handful of companies to grow into titans controlling vital avenues for our national discourse under the guise of promoting open forums for debate, and then to provide those behemoths blanket immunity when they use their power to censor content and silence viewpoints that they dislike.  When an interactive computer service provider removes or restricts access to content and its actions do not meet the criteria of subparagraph (c)(2)(A), it is engaged in editorial conduct.  It is the policy of the United States that such a provider should properly lose the limited liability shield of subparagraph (c)(2)(A) and be exposed to liability like any traditional editor and publisher that is not an online provider," (EOPOC).

It is an interesting point, and one worthy of debate. What are the upper and lower limits of the "Good Samaritan" clause? Does "otherwise objectionable" material mean anything the Service finds objectionable, or only highly objectionable material? The difference is stark: it's 16th-century nude paintings versus cinema-quality pornographic material. Where do we draw the line on what moderation is permitted, and who gets to determine where it is drawn?

The Service providers claim that they are merely Service providers acting within the scope of Section 230(c) protections. Whether they are or not seems to be the question the White House is asking. The construct of the EOPOC, however, leaves no doubt where the White House stands on the issue. The EOPOC does not take any legal action against any of the social media companies; it does call for "executive departments and agencies … [to] ensure that their application of section 230(c) properly reflects the narrow purpose of the section and take all appropriate actions in this regard." At most, this should mean that if a company is found to be moderating content outside the scope of Section 230(c), its immunity from civil liability would be stripped.

Contrary to almost everyone else, I don't see this as a free speech issue. There are only two things you can be with respect to Section 230(c): a Service provider or a Content Provider. Section 230(c)(2) outlines what "Good Samaritan" content restrictions are limited to. If a Service provider operates within the Section 230(c)(2) guidelines, it remains a Service provider; if not, it becomes a Content Provider and assumes a Content Provider's liability. In neither case is the US Government bringing suit against a private company for its decisions on content or content moderation, and as such the government cannot be infringing on First Amendment rights.
