In Advance Local Media LLC v. Cohere Inc., Judge Colleen McMahon of the Southern District of New York denied Cohere's motion to dismiss claims brought by a consortium of major news publishers. While the case includes robust copyright allegations, its most novel aspect lies in the Court's refusal to dismiss the plaintiffs' trademark infringement and false designation of origin claims under the Lanham Act.
The plaintiffs—fourteen major news organizations—alleged that Cohere's large language model, Command, generates hallucinated news articles that are falsely attributed to real publishers using those publishers' registered trademarks. These hallucinated articles, according to the complaint, often include publisher names in headlines or bylines, replicating the look and feel of legitimate journalism. The plaintiffs contend that this practice misleads users into believing that the fabricated content originates from, or is endorsed by, reputable news outlets, thereby causing consumer confusion and reputational harm.
Judge McMahon agreed that these allegations were sufficient to state a claim under both § 1114(1) and § 1125(a)(1)(A) of the Lanham Act. She rejected Cohere's argument that hallucinated outputs were not "use in commerce," emphasizing that the Command platform is offered as a commercial product, including paid versions designed to drive revenue. The Court further held that the unauthorized reproduction of publishers' marks in hallucinated outputs plausibly created a likelihood of consumer confusion, particularly where those outputs mimic the structure and tone of real journalistic content and are "likely to divert traffic, sales, and subscriptions from" Plaintiffs.
In doing so, the Court extended false designation doctrines, typically applied in contexts like mislabeled goods, false endorsements, or spoofed domains, into the realm of AI-generated text. The decision reframes hallucinated content not merely as a copyright issue, but also as a potential act of commercial misrepresentation, particularly when it bears the imprimatur of a known and trusted source. This significantly expands the potential liability landscape for AI developers, especially those whose models attribute generated content to real-world entities.
The Court also found that the nominative fair use doctrine does not apply. Cohere argued that the Lanham Act claims must fail because use of the marks was necessary to provide attribution for the news articles. The Court found that the defense missed the mark because Plaintiffs allege false attribution of hallucinated articles, implying a false affiliation, which is actionable under the Lanham Act. "The nominative fair use doctrine does not allow a defendant to use a plaintiff's trademark to false attribute its own goods to the plaintiff. This is precisely the type of conduct the Lanham Act seeks to prevent."
The Court also denied Cohere's motion to dismiss the copyright claims, including claims for direct and secondary infringement as well as a novel "substitutive summary" theory.
Advance Local v. Cohere may be the first decision to hold that misattributed hallucinations can give rise to Lanham Act liability, signaling that generative AI outputs implicating brand identity or source attribution are not immune from traditional trademark scrutiny. It also reaffirms that AI developers cannot necessarily shield themselves from commercial liability by treating hallucinations as technical accidents.
The case is Advance Local Media LLC, et al. v. Cohere Inc., Case No. 25-cv-1305 (CM) (S.D.N.Y. 2025).