3 April 2025

Federal Court Dismisses Products Liability Challenge To Social Media Platforms' Content Moderation Tools

Dechert

Dechert is a global law firm that advises asset managers, financial institutions and corporations on issues critical to managing their business and their capital – from high-stakes litigation to complex transactions and regulatory matters. We answer questions that seem unsolvable, develop deal structures that are new to the market and protect clients' rights in extreme situations. Our nearly 1,000 lawyers across 19 offices globally focus on the financial services, private equity, private credit, real estate, life sciences and technology sectors.

Key Takeaways

A recent federal court decision properly scrutinized, and then dismissed, a products liability claim against social media platforms, ruling that the plaintiffs did not clearly identify the product or design defect in question and that the defendants' content moderation processes are not subject to products liability laws.

On February 24, 2025, Magistrate Judge Virginia DeMarchi of the United States District Court for the Northern District of California dismissed products liability class action claims brought by a group of parents against YouTube and TikTok, holding that the plaintiffs failed to state a claim. Bogard v. TikTok, No. 24-3131, 2025 WL 604972 (N.D. Cal. Feb. 25, 2025).

The plaintiffs brought claims against YouTube, TikTok, and their respective parent companies, alleging that the platforms' reporting features were defectively designed, allowing videos the plaintiffs claimed were harmful to remain on the platforms after being reported. See id. at *6. The companies argued that their reporting features were not "products" as that term is traditionally defined in products liability regimes. Id. The court agreed that the claims should be dismissed but differed as to the reasoning.

The court noted that while "a product" is traditionally defined in products liability law as "tangible personal property distributed commercially for use or consumption," courts have also considered whether the context of the item is analogous to that of tangible personal property. Id. at *11 (internal citations and quotations omitted). Looking to the recent social media multidistrict litigation decision, In re Social Media Adolescent Addiction/Personal Injury Products Liability Litigation, the court determined that "in certain circumstances a reporting tool could be a defective product." Id. at *6-7 (internal citations and quotations omitted). Yet, while a reporting tool could be a product under this evolving theory of products liability, what the plaintiffs challenged was not the tool itself but rather the decisions the social media sites made once a video was reported to them. Thus, the court held that the plaintiffs were not alleging a design defect but rather expressing disagreement with the companies' decision-making after videos were reported. Id. As the court reasoned:

[T]his is an objection to Defendants' decisions, after receiving Plaintiffs' reports, to remove or not remove certain videos; it is not an objection to the functionality of the reporting tool itself. . . . Such allegations fail to state a claim under products liability law.

Id. at *12. In sum, the dismissal of the plaintiffs' products liability claims underscores the nuanced and evolving interpretation of what constitutes a "product" under products liability law.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.
