States and Congress have enacted or considered a variety of approaches to regulating the moderation of online content on social media and other internet platforms. California’s “Content Moderation Requirements for Internet Terms of Service” bill (“AB 587”) takes effect on January 1, 2024. In short, AB 587 requires social media companies to disclose their processes for removing or managing content and users on their platforms. AB 587 takes a somewhat different approach to regulating social media content than previously enacted laws in Texas and Florida. Those laws also address the content management practices of social media companies, but go beyond disclosure requirements and prohibit certain conduct in order to limit alleged viewpoint discrimination. The Eleventh Circuit blocked enforcement of the Florida law in part because its content moderation requirements infringe social media companies’ right to exercise editorial judgment on their platforms. The Fifth Circuit, by contrast, upheld a similar Texas law, reasoning that content moderation based on viewpoint constitutes censorship and that a platform’s content moderation activity is not protected by the First Amendment.
Section 230 of the Communications Decency Act has been read to give social media companies broad immunity for managing third-party posts and content on their platforms. Importantly, Section 230 shields social media companies from liability for “any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected.”
Florida and Texas attempted to regulate social media companies’ management of third-party posts and content by passing “content moderation” or “transparency” laws. Both laws focus on (1) content moderation; (2) disclosure requirements concerning internal policies on how content moderation and censorship decisions are made; and (3) the banning, deplatforming, or removal of users and their content from a platform.
As mentioned above, both laws have been challenged on constitutional grounds. In May 2022, the Eleventh Circuit affirmed a district court’s entry of an injunction barring enforcement of certain content moderation provisions of the Florida law. The Eleventh Circuit’s central holding was that the platforms’ content moderation is protected speech, akin to the editorial discretion exercised by traditional media companies. In contrast, in September 2022, the Fifth Circuit held that a district court erred in enjoining the Texas law, reasoning that the law does not violate the technology platforms’ First Amendment rights. The Fifth Circuit’s lengthy opinion is not easily summarized, but its gist is that the court “reject[ed] the idea that corporations have a freewheeling First Amendment right to censor what people say.” The court also found that the Texas law “does not chill speech; if anything, it chills censorship,” and that it “does not regulate the platforms’ speech at all; it protects other people’s speech and regulates the platforms’ conduct.” The Fifth Circuit further compared social media companies to common carriers and drew on that analogy to conclude that the government may broadly limit their content moderation activities. The Fifth Circuit therefore rejected the challengers’ claim that the Texas law conflicts with the First Amendment. Notwithstanding this ruling, the Texas law has not yet taken effect: in mid-October, the Fifth Circuit stayed its mandate and agreed to keep enforcement of the law on hold pending further review by the Supreme Court.
While the new California law is designed to require transparency about content moderation practices rather than to restrict editorial decisions, it may nonetheless face challenges under the First Amendment and potentially Section 230.
Who is covered?
AB 587 is intended to cover social media companies, a term broadly defined to mean “any person or entity that owns or operates one or more social media platforms.” A social media platform, in turn, is defined as “a public or semi-public internet-based service or application with users located in California” whose substantial function is to connect users within the service or application, allowing them to create profiles, interact with other users, and engage with user-generated content.
This definition appears to cover a wide range of companies, from message board operators to platforms such as Instagram, TikTok, and Meta. However, the legislative history suggests that AB 587 should be interpreted more narrowly. In particular, it indicates that the legislature intended to exclude any internet-based service or application for which interactions between users are limited to direct messages, commercial transactions, consumer reviews of a product, vendor, service, event, or location, or any combination thereof. This could mean that even prominent companies such as Amazon or Yelp would not be considered social media companies that own social media platforms.
AB 587 requirements
Under AB 587, a social media company must post its terms of service, which must include (1) contact information that allows users to ask questions about the terms of service; (2) a description of the company’s content moderation policies; and (3) a list of potential actions the company may take against content that violates its terms. Users of a platform have meaningful rights under AB 587, including the right to flag content they believe violates the terms of service and to receive an explanation of the decision the social media company makes in response. Notably, there is no private right of action under AB 587; provisions that would have created one were removed during the legislative process. No regulations or guidance implementing the law currently appear to be in place, but nothing indicates that such guidance or regulations are precluded or will not be issued at a later date.
Violations of AB 587 carry a fine of up to $15,000 per violation. A social media company is deemed to be in violation of the law for each day on which it fails to comply with the statute’s requirements.
State legislatures and the federal courts are taking different approaches to regulating social media. The litigation in the Fifth and Eleventh Circuits has produced a circuit split over “content moderation” laws that could prompt the Supreme Court to take up these issues soon. In the meantime, other courts and state legislatures, and possibly the US Congress, may weigh in with additional laws and decisions. Each of these developments will shape the legal landscape and inform how social media content moderation laws are applied and analyzed under the First Amendment and Section 230 of the Communications Decency Act. Social media companies should continue to monitor developments in this area and prepare to respond to a variety of content moderation laws, including the possibility that those laws may impose differing, and potentially even conflicting, legal obligations.