Last week, in United States v. Wesley Guard, the U.S. Court of Appeals for the Second Circuit held that the mobile messaging app Kik's review of a user's electronic accounts and data did not trigger Fourth Amendment protections. In the decision, the Circuit clarified that electronic communication service providers (ECSPs) that independently screen their accounts for illegal content and then report it to government actors are not acting as agents of the government. This distinction is an important one because it means that ECSPs like Kik are not subject to Fourth Amendment scrutiny when they take affirmative steps on their own to root out child sexual abuse material (CSAM) from their platforms.
In this case, defendant Wesley Guard appealed his conviction on multiple child exploitation offenses. Before Guard's arrest, Kik reported images containing CSAM to the National Center for Missing and Exploited Children (NCMEC). NCMEC shared Kik's reports with local authorities, who opened a criminal investigation and eventually linked the CSAM images to Guard.
In the district court, Guard moved to suppress the electronic communications and files that Kik had seized and transmitted to NCMEC. He argued that when Kik scanned his accounts for contraband, it was acting as an agent of NCMEC, which Guard claimed was a government actor, and was therefore conducting an illegal search. The district court denied the motion, and Guard was later convicted at trial of multiple counts. On appeal, Guard renewed his argument for suppression of the evidence seized by Kik.
While rejecting Guard's argument for suppression, the Second Circuit agreed that NCMEC is a governmental entity for Fourth Amendment purposes, a finding that may invite more legal challenges to NCMEC's role as an intermediary between ECSPs and law enforcement. And although Guard was unsuccessful, account users charged with crimes may similarly argue that ECSPs act as NCMEC's agents when searching for illegal content and are likewise subject to Fourth Amendment claims.

Under federal law, ECSPs are required to report any files containing CSAM of which they have "actual knowledge" and face steep financial penalties for failing to do so. ECSPs are not required by statute to affirmatively screen or search for CSAM, but many do so to protect their platforms from illegal content. Kik, like other ECSPs, uses a software program called PhotoDNA to identify known CSAM images through hashing technology. When CSAM images are located, Kik removes them from the public platform and preserves them for evidentiary purposes. Then, in compliance with federal law, Kik reports the images to NCMEC along with subscriber and activity data for the associated user account. NCMEC is then statutorily required to make that report available to law enforcement for review and potential criminal investigation.
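To make the screening mechanics concrete, the sketch below shows the general shape of hash-based image matching in Python. It is an illustration only, not PhotoDNA itself: PhotoDNA is a proprietary perceptual-hashing system designed to match known images even after resizing or minor edits, whereas this sketch substitutes an ordinary SHA-256 digest, and the hash list and function names are hypothetical.

```python
import hashlib

# Hypothetical list of hash values for known illegal images.
# In practice, providers match against curated hash lists (e.g.,
# PhotoDNA hashes); the value below is a placeholder for illustration.
KNOWN_IMAGE_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def file_hash(path: str) -> str:
    """Return the SHA-256 hex digest of a file, read in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def screen_upload(path: str) -> bool:
    """Flag an uploaded file if its hash matches the known-image list."""
    return file_hash(path) in KNOWN_IMAGE_HASHES
```

The legally significant point is who runs this matching step and why: under the Second Circuit's reasoning, a provider that builds and operates its own screen, without government direction, is not conducting a Fourth Amendment search.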
For ECSPs, the TL;DR from the Second Circuit's decision is that so long as they act independently of NCMEC and other government entities, the calculus for taking proactive steps to protect their platforms from CSAM and other illegal content has not meaningfully changed. To start with, the financial penalties for failing to report CSAM are a stark reminder of the high risk, to users and companies alike, that inadequate controls pose. And the Second Circuit rejected Guard's argument that Kik was acting as NCMEC's agent, finding that Kik was permitted to screen Guard's accounts for CSAM and then turn over the illegal content to NCMEC without implicating the Fourth Amendment. The court reasoned that Guard had made no showing that NCMEC was involved in Kik's decision to use PhotoDNA or had provided "significant encouragement" to Kik to use it. Where ECSPs act in concert with or at the direction of government actors, they may face civil litigation for damages under Bivens or 42 U.S.C. § 1983. But where ECSPs like Kik act of their own accord, the decision makes clear that they can continue to independently develop and implement their own protocols, including hashing technology like PhotoDNA, to screen for and eliminate CSAM from their platforms without running afoul of the Fourth Amendment.
The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.