17 September 2024

The State Law Landscape After Justices' Social Media Ruling

Crowell & Moring LLP


On July 1, the U.S. Supreme Court remanded Moody v. NetChoice and NetChoice v. Paxton, challenges to Florida and Texas laws restricting social media companies' ability to curate, amend or edit online user content, but in so doing noted that "to the extent that social media platforms create expressive products, they receive the First Amendment's protection."1

The NetChoice decisions likely mean that online operators have some First Amendment right to regulate and moderate content — for at least some of their features and products — even though the court expressly declined to say how far that right extends or whether the Texas and Florida laws are constitutional.

The result is that we still don't know whether Florida, Texas and other states' attempts to regulate content moderation are constitutional, and all the while more states are getting into the business of regulating it.

Social media companies and platforms are left with a patchwork of regulation, limited precedent and an uncertain future. This may cause disruption and extra costs for online businesses that, while operating nationally, have to comply with each state's bespoke laws and definitions of "social media company" and "user-generated content," as well as with the NetChoice decision itself.

The Debate

State legislatures are stepping up to regulate content moderation in an effort to protect children, privacy and political discourse, but tech companies say it's expensive, business-killing censorship.

For decades, online actors were expected to self-regulate their content moderation practices.

The NetChoice cases throw into doubt whether state regulation will replace self-regulation. In addition to Florida and Texas, New York and California have enacted content moderation laws under the banner of protecting children online or protecting individuals' rights to political speech. The legislatures argue that tech companies should not be able to censor speech they do not agree with.

Challengers respond that the laws are virtue-signaling efforts that do not actually accomplish these goals at best, and quintessential censorship at worst. The laws expose a disconnect: Tech companies believe the legislatures are requiring them to publish speech they find loathsome, inappropriate, hostile, violent or simply contrary to their terms and policies, while the legislatures believe that Big Tech can influence elections and societal trends through its editorial and curatorial policies.

Notably, the Texas and Florida laws were passed after the 2020 election and viewed as a response to X and other platforms' banning of right-wing speech or speakers.

Despite these purported aims, the laws are overbroad, sweeping all kinds of online players into their ambit, including e-commerce sites, dating apps, vacation rental apps and more, hardly the type of providers that threaten child safety or freedom of political speech.

States define "social media company," "user-generated content" and "restricted activities" differently, which was one of the very concerns the justices raised in NetChoice. An unintended consequence of legislatures' zealous efforts to regulate Big Tech? Perhaps. But the consequences for e-commerce businesses and their customers could be significant.

For example, user-generated content abounds on e-commerce sites and even the most benign of the state laws would require significant overhauls to how those companies do business, adding time and cost to daily operations. For any company with an online presence, this presents an opportunity to educate legislators about the risks posed to their business by overbroad regulation of internet platforms.

Does the court's decision to remand NetChoice just send the issue to those equally ill-equipped to decide the matter? We are not necessarily convinced that the states are best equipped to regulate, nor do we think a patchwork of varying state-level regulations makes sense. Perhaps Congress will wade in, but until then we are trending away from self-regulation: States are clearly focused on regulating, which promises even more confusion and chaos than the era of self-regulation wrought.

Notably, the states that have passed content moderation laws are the four most populous — Texas, Florida, California and New York. New Jersey is considering its own content moderation law as well. Where these states legislate, others are bound to follow.

Because the state laws differ in their focus and particulars, complying with each of these regimes will be nearly impossible — perhaps even forcing companies to make a choice as to which laws they will follow, or expending more money and time trying to comply with all.

Here is a look at what has passed at the state level.

Florida

In 2021, Florida enacted S.B. 7072, restricting content moderation regarding candidates for political office and political speech.

The law broadly defines "social media platform" to cover online providers that do business in Florida and have annual gross revenues in excess of $100 million or at least 100 million monthly individual platform participants globally. The law also prohibits the "willful deplatforming" of candidates for political office and authorizes the Florida Elections Commission to issue fines ranging from $25,000 up to $250,000 per day.

According to the bill, "deplatform" means "to permanently delete or ban a user or to temporarily delete or ban a user from the social media platform for more than 14 days."

The law further authorizes the attorney general to temporarily place platforms on an antitrust violator vendor list upon probable cause to believe a platform has violated the law, and it allows users to bring their own claims, with statutory damages of up to $100,000 per claim.

In NetChoice, the court implied that the Florida law is likely unconstitutional as applied to the Facebook news feed and YouTube homepage, but declined to determine if it was facially unconstitutional.2

Texas

In 2021, Texas enacted H.B. 20, which establishes complaint procedures and disclosure requirements for social media platforms' content moderation practices.

Texas defines "social media platform" as an "internet website or application that is open to the public, allows a user to create an account, and enables users to communicate with other users for the primary purpose of posting information, comments, messages or images," subject to enumerated exceptions.

The law prohibits social media companies from taking content moderation actions based on "the viewpoint of the user or another person ... the viewpoint represented in the user's expression ... or a user's geographic location" in Texas.

H.B. 20 authorizes the Texas attorney general to seek injunctions against platforms, and it authorizes Texans who have been wrongfully censored because of their political ideology to sue for declaratory and injunctive relief. As with the Florida law, the NetChoice court expressed doubt that the law was constitutional as applied to Facebook and YouTube, but also remanded the case for further proceedings.3

New York

New York initiated content moderation legislation following the 2022 mass shooting in Buffalo, New York.

Section 394-ccc of New York's General Business Law Code requires social media companies to provide and maintain mechanisms for users to report hateful conduct on their platforms.

This law defines "social media network" to mean for-profit operators that host content shared with the public. New York's attorney general can seek civil penalties up to $1,000 per violation per day. Free speech advocates challenged the law on constitutional grounds, securing an injunction in 2023 in Volokh v. James in the U.S. District Court for the Southern District of New York. An appeal of that decision is pending.4

In June, New York enacted S.B. S7694A, or the Stop Addictive Feeds Exploitation for Kids Act, which prohibits social media platforms from: (1) providing an addictive feed, or algorithmically curated feed, to children younger than 18 without parental consent, and (2) withholding nonaddictive feed products or services where that consent is not obtained.

This bill also authorizes the New York attorney general "to obtain relief, including damages" for violations, but does not specify the penalties or relief available. The law has not yet been challenged, though commenters anticipate that it will be contested on First Amendment grounds.

While Attorney General Letitia James continues to defend Section 394-ccc on appeal before the U.S. Court of Appeals for the Second Circuit, she has not yet initiated any enforcement actions under S.B. S7694A.

California

Are NetChoice's reverberations being felt in California?

California's A.B. 587 requires social media companies that generated more than $100 million in gross revenue in the preceding calendar year to publish terms of service for their platforms and submit reports to the California attorney general detailing their content moderation practices and data related to violations of their terms of service.

Targets of the law are internet-based services or applications with users in California that allow users to connect with one another or otherwise host content. Only the attorney general or city attorneys can enforce the law; either may seek civil penalties of up to $15,000 per violation per day, as well as injunctive relief.

Unlike the Texas and Florida laws, A.B. 587 does not restrict or prescribe content moderation actions. It only imposes transparency requirements and expressly declines to regulate platforms' management of user-generated content. But the law is under attack.

On Sept. 4, the U.S. Court of Appeals for the Ninth Circuit enjoined much of the law on First Amendment grounds in X Corp. v. Bonta.5 The court held that requiring social media companies to make disclosures about their content moderation practices amounts to compelled speech, which is in turn subject to strict scrutiny.

While the court raised NetChoice in the opinion, it did not base its ruling on that precedent or reasoning.

California also recently enacted A.B. 2273, requiring companies that provide online services or products likely to be accessed by children to configure default privacy settings to the highest possible level of privacy and provide privacy information and other policies prominently in terms that children can understand. Before unveiling any new online services or products, companies must complete a data protection impact assessment and provide the findings to the attorney general.

On Aug. 16, the Ninth Circuit partially affirmed a September 2023 injunction of the law in NetChoice v. Bonta, on the grounds that the data protection impact assessment requirement likely facially violated the First Amendment by compelling speech from the platforms. The three-judge appellate panel vacated the remainder of the district court's preliminary injunction and remanded for further proceedings.

States Versus US?

State attorneys general are standing behind their legislatures and litigating the validity of these laws.

Twenty-two attorneys general, led by New York Attorney General Letitia James, filed an amicus brief in NetChoice urging the Supreme Court to make clear that states have the authority to regulate social media platforms, arguing that their laws do not impinge on the First Amendment and that social media companies' content moderation decisions are not immunized by the First Amendment.6

And in Gonzalez v. Google, which the Supreme Court decided in May 2023, a bipartisan coalition of more than two dozen state attorneys general filed an amicus brief asserting that Section 230's far-reaching scope of immunity, as interpreted by the Ninth Circuit and other courts, prevents states from allocating "losses for internet-related wrongs,"7 going so far as to argue that Section 230 of the Communications Decency Act should not protect social media platforms' editorial decisions.

Notably, these positions often make state attorneys general adverse to the U.S. Indeed, in NetChoice, the government sided with NetChoice.

NetChoice arms social media companies with another tool to combat encroachment upon content moderation. While NetChoice did not define the bounds of First Amendment protections for expressive compilations, it is clear that at least some products and platforms are protected.

Even if Congress does away with Section 230, or abridges it in the future — as some conservative members of Congress have threatened — some online companies should continue to have some First Amendment protections in their arsenal against regulatory or legislative incursions.

Conclusion

For now, the states have taken the mantle in legislating in this area.

In sending NetChoice back to the lower courts, the justices may have thought they had washed their hands of this issue — at least for a while. We don't think this will last for long: With varying state regulations on the books, we expect this issue to make its way back to the Supreme Court soon.

So, the Supreme Court may ultimately decide whether state regulation, as opposed to self-regulation or federal regulation, reigns supreme.

The opinions expressed are those of the author(s) and do not necessarily reflect the views of their employer, its clients, or Portfolio Media Inc., or any of its or their respective affiliates. This article is for general information purposes and is not intended to be and should not be taken as legal advice.

Footnotes

1. NetChoice, 603 U.S. ___ (2024), slip op. at 2.

2. NetChoice, 603 U.S. ___, slip op. at 3.

3. NetChoice, 603 U.S. ___, slip op. at 10.

4. https://www.thefire.org/sites/default/files/2022/12/Complaint%20%20Volokh%20v.%20James.pdf.

5. X Corp. v. Bonta, No. 24-271 (9th Cir. Sept. 4, 2024).

6. Attorney General James Urges U.S. Supreme Court to Ensure States Can Regulate Social Media Platforms (ny.gov): https://ag.ny.gov/press-release/2023/attorney-general-james-urges-us-supreme-court-ensure-states-can-regulate-social.

7. Brief for the States of Tennessee et al. as Amici Curiae in Support of Petitioners at 9, Gonzalez v. Google LLC, 143 S. Ct. 762 (2023) (No. 21-1333).

