When the President issues an Executive Order calling for an examination of Section 230 of the Communications Decency Act, the law that permitted the growth of so many Internet companies, broadcasters and other media companies naturally ask what effect the action may have on their operations. On an initial reading, the impact of the order is very uncertain, as much of it simply calls on other government agencies to review the actions of online platforms. But, given its focus on “online platforms” subject to the immunity from liability afforded by Section 230, and given the broad reach of those protections, which the courts have interpreted to cover any website or web platform that hosts content produced by others, the ultimate implications of any change in policy affecting these protections could be profound. A change in policy could affect not only the huge online platforms that the order appears to target, but also media companies that allow public comments on their stories, contests that call for the posting of content developed by third parties to be judged for purposes of awarding prizes, and the sites of content aggregators who post content developed by others (e.g., podcast hosting platforms).
Today, we will look at what Section 230 is and at the practical implications the loss of its protections would have for online services. Those implications include the potential for even greater censorship by these platforms of what is posted online – seemingly the opposite of the intent of the Executive Order, which was triggered by the perceived limitations imposed on tweets of the President and on the social media posts of other conservative commentators. In a later post, we’ll look at some of the other provisions of the Executive Order and the actions that it asks other government agencies (including the FCC and the FTC) to take.
Section 230 provides broad protections to providers or users of an “interactive computer service” who, according to the Act, shall not be treated as the “publisher or speaker of any information provided by another information content provider.” An interactive computer service is defined broadly as well:
any information service, system, or access software provider that provides or enables computer access by multiple users to a computer server, including specifically a service or system that provides access to the Internet and such systems operated or services offered by libraries or educational institutions.
The courts have interpreted that language to cover virtually all websites and any other electronic platform that makes content accessible to the public. An information content provider is essentially anyone who develops content that is posted on one of these interactive computer services:
any person or entity that is responsible, in whole or in part, for the creation or development of information provided through the Internet or any other interactive computer service.
These provisions, read together, allow the development and operation of online sites where users can post content without subjecting the host to liability for what is contained in the user-generated content on its site. That is the import of the language providing that hosts are not treated as the “publisher or speaker” of content created by others.
There are exceptions to Section 230’s insulation from liability. Intellectual property issues are not covered by these protections; instead, content hosts are protected by a “safe harbor” created under Section 512 of the Digital Millennium Copyright Act that operates similarly to Section 230, though it imposes some additional obligations on the platform operator. See our articles here and here on the DMCA safe harbor, and the recent report on those protections issued by the Copyright Office, which we hope to discuss in a future post. Criminal conduct is also excluded from Section 230’s protections, as is content related to sex trafficking.
But Section 230 does protect website hosts from civil liability for third-party content under both federal and state laws. This protection most often arises in the application of defamation law – the website owner is not liable for libel or slander committed by those who post content on its site. But it also arises under many other state and federal laws, giving sites broad immunity for material posted on the site. Some courts have imposed minimal limitations, such as the potential loss of protections if the site effectively solicited the illegal content (as in the case of a website developed to facilitate finding a roommate, which was found potentially liable for violating housing discrimination laws when it specifically solicited information about the race of those looking for rooms).
Even though sites cannot solicit illegal content, they are permitted to edit or curate the content themselves. The idea was that sites should not be penalized for acting as “Good Samaritans” by trying to eliminate content that could be potentially damaging to their viewers. One concern of Congress was the protection of children, though the statutory language is broader:
No provider or user of an interactive computer service shall be held liable on account of…any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected.
Stated simply, the heart of Section 230 was to facilitate the development of websites and online platforms where third parties can post information. As stated in the Act setting out the policy on which it is premised:
Policy. It is the policy of the United States—
- to promote the continued development of the Internet and other interactive computer services and other interactive media;
- to preserve the vibrant and competitive free market that presently exists for the Internet and other interactive computer services, unfettered by Federal or State regulation;
The application of Section 230 allows websites and other Internet platforms to host content created by others without having to review everything posted to determine whether it raises the potential for civil liability. For the big online platforms, one can imagine the massive resources that would be required to make that assessment for all content posted on their platforms. But even for the site of a small regional newspaper or a local television station, having to review every letter to the editor or shared news post from a third party would impose a burden that might well lead the site owner to decide that the value of the content is not worth the trouble of policing it.
This is particularly evident when we get into political speech. Broadcast stations and cable systems, as we wrote here, currently have protections akin to those afforded by Section 230 for political advertisements sponsored by candidates and their authorized campaign committees. Stations are not allowed to censor that content and therefore have no liability for what is said in a candidate ad. By contrast, when they air ads that are not sponsored by candidates, stations do have the right to censor the content. As such, when a station is put on notice that the content of a third-party attack ad is potentially defamatory, it must review the ad and decide whether or not to run it, as once on notice of the ad’s alleged falsity, the station has potential liability for its contents. See our articles here and here on the elaborate steps that stations need to go through in making these decisions. And if they guess wrong, or even arguably wrong, they can be subject to lawsuits – as is currently the case for a Wisconsin TV station that aired an ad that the President’s campaign committee alleges is defamatory (see our article here).
The requirement for notice or knowledge of the potential falsity arises from the “actual malice” standard that applies to defamation of public figures. But for defamation of individuals who are not in the public eye, or for the application of many other causes of action that could impose civil liability for something said or portrayed in an online third-party post, notice of the violation may not be a necessary precondition to liability for someone who publishes the material. Thus, everything would need to be reviewed. And because anything posted could pose a risk, sites without Section 230 protections are much more likely to be aggressive in taking down anything close to the line where potential liability could arise.
Thus, if the Executive Order did result in fewer sites having Section 230 protections, it could well result in even less speech being posted online than is currently available, and more content being “censored” – seemingly the opposite of the effect the order seeks. Of course, many other actions would be necessary before this result would occur – including the request that the FCC commence a rulemaking to review the application of Section 230. Those issues will be discussed in a future post.
Courtesy Broadcast Law Blog