ARTICLE
4 June 2025

Can Providers Be Sued For Mistaken CSAM Reports? Maybe, Says New Ruling

Perkins Coie LLP

Contributor


Can a communications provider be held liable when it reports to the National Center for Missing and Exploited Children (NCMEC) an image the provider believes to be child sexual abuse material based on signals provided by NCMEC?

If a recent decision in the Middle District of Florida stands, "yes" appears to be the answer. The opinion may have broad implications for providers that use image hashes to identify potentially reportable child sexual abuse material (CSAM) and for the risk of liability for mistaken or unfounded CyberTipline Reports. The case is Lawshe v. Verizon Commc'ns, Inc. and Synchronoss Techs., Inc., No. 24-137, 2025 WL 660778 (M.D. Fla. Feb. 28, 2025).

Backing up a bit, federal law requires online communications and storage providers to submit what's known as a CyberTipline report to NCMEC when the provider has "actual knowledge" of an apparent violation of federal CSAM laws. NCMEC then reviews that report, adds more information, and forwards it to law enforcement. Providers often rely on hashes (unique digital representations of a file that can be used to identify exact copies of that file) to identify known CSAM, that is, CSAM images and videos for which a hash has already been created. Some providers create and use their own hashes; others use hashes provided by third parties; and some use hashes from a database hosted by NCMEC. That database is at issue in Lawshe.
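As a rough illustration, exact-match hashing of the kind described above can be sketched in a few lines of Python. The digest values, tags, and function names below are purely illustrative, and the sketch assumes a simplified SHA-256 exact-match scheme; real provider systems often use perceptual hashing (such as Microsoft's PhotoDNA), which can match visually similar images and is not implemented here:

```python
import hashlib

# Hypothetical hash list mapping known-image digests to tags, loosely
# modeled on the "apparent"/"unconfirmed" tags described in Lawshe.
# These digests are the SHA-256 of b"foo" and b"bar", for illustration only.
KNOWN_HASHES = {
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae": "apparent",
    "fcde2b2edba56bf408601fb721fe9b5c338d10ee429ea04fae5511b68fbf8fb9": "unconfirmed",
}

def sha256_hex(data: bytes) -> str:
    """Digest of the file's exact bytes; any change to the file changes the hash."""
    return hashlib.sha256(data).hexdigest()

def check_file(data: bytes):
    """Return the hash list's tag ('apparent' or 'unconfirmed') on a match, else None."""
    return KNOWN_HASHES.get(sha256_hex(data))
```

Note that an exact-match scheme like this can only flag byte-identical copies of files already in the hash list; it says nothing about files the list has never seen.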

In Lawshe, a federal district court held that:

(1) if a provider uses the NCMEC hash database to identify potentially reportable CSAM,

(2) does not confirm that an "unconfirmed" hash match is actually reportable CSAM, and

(3) that CSAM ends up not actually being child pornography under federal law, then

(4) the provider can face claims and potential liability for wrongful disclosure under the Stored Communications Act if it reports the "unconfirmed" content to NCMEC without first confirming that it is actually reportable.

It's a potential shift in liability risk associated with hash matching and auto-reporting.

In the case, Verizon reported to NCMEC images that plaintiff stored on Verizon's cloud because those images matched NCMEC hashes. Plaintiff alleged that he stored only "legal pornographic pictures depicting consenting adult models on Verizon's cloud." Id. at *2. But Synchronoss, Verizon's cloud service contractor, used hash matching to determine whether any materials on the cloud were CSAM, relying on the NCMEC hash list (which includes tags such as "apparent" and "unconfirmed"). Plaintiff alleged that on a hash match, defendants would send a CyberTipline Report to NCMEC "instantly, without human review or collecting any information about the subject images other than what the hash match itself provides." Id. at *2 (internal quotations omitted). Here, two of plaintiff's images were allegedly flagged incorrectly as CSAM: one categorized as "apparent" CSAM and the other as "unconfirmed" CSAM. Both were reported to NCMEC, ultimately resulting in plaintiff's arrest. Notably, plaintiff alleged that in the CyberTipline Reports, defendants claimed to have reviewed the images even though they had not.

Plaintiff sued Verizon for defamation and unlawful disclosure under Section 2702 of the SCA. Defendants argued that they had immunity under 18 USC 2258B because the report arose from their reporting and preservation responsibilities under Section 2258A, and that the disclosure was permitted by Section 2702(b)(6) of the SCA (which allows providers to disclose content to NCMEC in connection with reports required by 18 USC 2258A).

The Court held that defendants had 2258B immunity only for reporting the "apparent" CSAM image but not the "unconfirmed" one. Under the court's logic, Section 2258B confers immunity only when a provider relies reasonably on information indicating that the image is CSAM (i.e., a hash match with an "apparent CSAM" tag), but not when the provider relies on uncertain indications that the image could, potentially, be CSAM (i.e., a hash match with an "unconfirmed" tag). In the latter case, the court suggests that providers should review the images first to confirm whether they are CSAM. The court also held that defendants lacked immunity under 2258B because they acted with "actual malice" by saying they reviewed the image when they didn't (suggesting that 2258B immunity may hinge in part on whether the provider's CyberTipline Report contains material inaccuracies). The court further held that the NCMEC disclosure exception to the SCA, 18 USC 2702(b)(6), is "coextensive with providers' reporting responsibilities under § 2258A," and doesn't apply if a provider discloses to NCMEC content that turns out not to be CSAM based on an "unconfirmed" tag. Finally, the Court held that the "good faith" provision in 2707(e) is an affirmative defense not available on a 12(b)(6) motion unless it clearly appears on the face of the complaint, which it did not here.

This opinion suggests that providers risk SCA liability if they disclose to NCMEC without reviewing the image first, at least when such image is "unconfirmed" as potential CSAM. It also raises the question of whether, under this ruling, Section 2258A imposes a tacit obligation on providers that learn of an "unconfirmed tag" to investigate further and review the image, or whether the provider can simply ignore it because it's not "actual knowledge."
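To make the court's distinction concrete, the following is a purely illustrative sketch of the decision logic the opinion seems to imply. The function and return values are invented for this example; this is not a description of any provider's actual system, and nothing here is legal advice:

```python
def handle_match(tag: str, image_reviewed: bool) -> str:
    """Hypothetical handling of a hash match, reflecting the court's
    reasoning in Lawshe: 'apparent' matches were held to support
    2258B immunity, while 'unconfirmed' matches reported without
    human review were not."""
    if tag == "apparent":
        # Reasonable reliance on information indicating the image is CSAM
        return "report"
    if tag == "unconfirmed":
        # The court suggests providers should confirm these before reporting
        return "report" if image_reviewed else "review-first"
    # No match: no apparent violation, no reporting trigger
    return "no-action"
```

The open question the paragraph above raises maps onto the `"review-first"` branch: whether an "unconfirmed" tag tacitly obligates the provider to review, or whether it can be ignored for lack of "actual knowledge."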

Verizon filed a Motion for Reconsideration following this decision, arguing that the Order was "based on an erroneous assumption" that NCMEC tells Verizon when images are "CSAM-unconfirmed" (Verizon claims it does not), and that, from a policy perspective, taking the Order to its logical conclusion could risk "chilling provider efforts to prevent, curtail, or stop" CSAM reporting, including the practice of matching to hash values from NCMEC. Lawshe disagreed, filing a Response to Verizon's Motion for Reconsideration arguing that a motion for reconsideration was an inappropriate vehicle for these arguments, that dismissal is inappropriate given the Court based its findings on "actual malice," and that public policy considerations support Lawshe's position, too: sending false leads to NCMEC not only injures people like Lawshe but also burdens law enforcement resources, taking time away from investigating actual exploitation.

The case has seen little movement since early last month, but any further decision by the Court could affect the future use of hash-based reporting, auto-reporting, and any practice of reporting without human review.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.
