A recent federal appeals court decision may lead online platforms that post moderator-screened, user-generated content to think twice before posting copyrighted material. In Mavrix Photographs, LLC v. LiveJournal, Inc., the Ninth Circuit Court of Appeals held that the Digital Millennium Copyright Act's ("DMCA") safe harbor for "infringement of copyright by reason of the storage [of material] at the direction of a user" may not protect moderated online platforms.

This suit arose out of allegations of copyright infringement by Mavrix Photographs, LLC ("Mavrix"), "a celebrity photography company specializing in candid photographs of celebrities in tropical locations," against LiveJournal, Inc. ("LiveJournal"), a curated social media platform. Mavrix alleged that "Oh No They Didn't!" ("ONTD"), a popular LiveJournal community dedicated to celebrity news, posted twenty Mavrix-copyrighted photographs between 2010 and 2014 without authorization. ONTD generates content by posting submissions from its users. When a user submission is received, ONTD's team of volunteer moderators, who are overseen by a full-time LiveJournal employee, reviews the submission and decides whether it should be publicly posted on ONTD.

In this case, the Ninth Circuit considered whether the district court properly granted summary judgment to LiveJournal on the basis that the DMCA's safe harbor for posting infringing material "at the direction of a user" applied to ONTD. The Ninth Circuit reversed, holding that the district court's analysis of whether ONTD's content was generated "at the direction of a user" improperly focused on the submission of infringing material rather than on the process for posting the material. The court further held that the district court improperly rejected Mavrix's argument that LiveJournal was responsible for the acts of its moderators under the common law of agency. As a threshold matter, the court found that genuine issues of material fact existed regarding whether the moderators acted as agents of LiveJournal.

Should the district court find that LiveJournal's moderators are its agents, which would open the possibility that LiveJournal could be liable for the posting of infringing material on ONTD, the court explained that a fact-finder must then determine "whether Mavrix's photographs were indeed posted at the direction of the users in light of the moderators' role in screening and posting the photographs." If the moderators' conduct in screening ONTD user submissions for public posting is found to be more than "merely accessibility-enhancing activities," the "at the direction of a user" safe harbor cannot apply.

Despite finding two issues of material fact (the agency status of the ONTD moderators and whether ONTD's content is posted "at the direction of a user"), the court went on to discuss the legal standards governing the remaining elements of the safe harbor. If the threshold requirement that the infringing content is posted "at the direction of a user" is met, LiveJournal will still need to show that: (1) it lacked actual and "red flag" knowledge of the infringements and (2) it "did not financially benefit from infringements that it had the right and ability to control." Regarding the element of knowledge, the court noted that showing a lack of "red flag" knowledge requires LiveJournal to demonstrate that "it would [not] be objectively obvious to a reasonable person that material bearing a generic watermark or a watermark referring to a service provider's website was infringing."

Takeaways: This decision presents two potentially serious threats to online platforms that depend upon user submissions to generate content, including sponsors of user-generated content contests. First, the Ninth Circuit's determination that LiveJournal may be liable for the posting of infringing user-submitted content because its moderators may be deemed its agents creates risk for online platforms (and advertisers) that thrive on moderated user-submitted content.

Second, and more importantly, the Ninth Circuit's guidance on the "at the direction of a user" DMCA safe harbor narrows the previously understood scope of this provision and should lead moderated platforms to think seriously about the level of oversight currently exercised by their moderators, or to reconsider whether to use moderators at all. In particular, this decision suggests that moderation of user-submitted content that goes beyond performing "accessibility-enhancing activities," such as manually reviewing submissions or conducting a substantive review of their content, may take a platform outside the DMCA safe harbor and expose it to liability for posting copyrighted material submitted by users. This narrow reading of the DMCA safe harbor may also pose risks for advertisers who sponsor contests and marketing promotions that involve producing a moderated selection of user-generated content.

This article is presented for informational purposes only and is not intended to constitute legal advice.