From Nov. 3 through Nov. 5, the Windy City welcomed the ANA Masters of Advertising Law Conference. While the conference covered all things advertising and marketing, one topic dominated nearly every panel: AI. Nearly every presentation touched on how those two letters have changed the game over the past year. Both the states and the federal government are finding ways to regulate how AI interacts with advertising and digital media, and in turn how it interacts with consumers.
States
States continue to lead the charge in passing legislation to regulate consumer-facing AI. For example, in September 2024, California passed a pair of bills, AB 2602 and AB 1836, that regulate the use of generative AI in the creative process. CA AB 2602 renders unenforceable contractual provisions governing the use of a digital replica in a new performance, fixed on or after Jan. 1, 2025, that replaces a live performance, unless (a) the contract contains a specific description of the intended use (or the use is consistent with the terms of the contract and the "fundamental character" of the photography or soundtrack as recorded or performed) or (b) the individual was represented by legal counsel or a union. The law defines "digital replica" to mean "a computer-generated, highly realistic electronic representation that is readily identifiable as the voice or visual likeness of an individual that is embodied in a sound recording, image, audiovisual work, or transmission in which the actual individual either did not actually perform or appear, or the actual individual did perform or appear, but the fundamental character of the performance or appearance has been materially altered." CA AB 1836 borrows the definition of "digital replica" from CA AB 2602 but applies it to digital replicas of deceased individuals: under CA AB 1836, the deceased's estate must consent to the creation of the digital replica.
In New York, the State Assembly and Senate passed NY AB A8887B, which requires advertisements featuring synthetic performers to conspicuously disclose that fact in the advertisement. As used in the bill, a "synthetic performer" is a digitally created asset, created, reproduced or modified by computer using generative AI or a software algorithm, that is intended to create the impression that the asset is engaging in an audiovisual and/or visual performance of a human performer who is not recognizable as any identifiable natural performer. As of the posting of this blog, NY AB A8887B is awaiting signature by Gov. Kathy Hochul.
Federal
Not to be outdone, the federal government has also taken small steps to regulate and investigate AI. The TAKE IT DOWN Act was signed into law on May 19, 2025, and is aimed at curbing the publication of nonconsensual intimate imagery (often called "revenge porn"). We previously released a podcast episode that discusses the TAKE IT DOWN Act in great detail. The law prohibits any person from knowingly publishing a digital forgery created without the consent of the depicted individual, with the intention of causing harm. It defines "digital forgery" as any intimate visual depiction of an identifiable individual created through the use of software, machine learning, AI, or any other computer-generated or technological means, including by adapting, modifying, manipulating or altering an authentic visual depiction, that, when viewed as a whole by a reasonable person, is indistinguishable from an authentic visual depiction of the individual. Platforms covered by the law must establish notice-and-takedown mechanisms for users to report offending depictions by May 19, 2026. Covered platforms then have just 48 hours to remove the offending material or face penalties, as the Federal Trade Commission (FTC or the Commission) is endowed with enforcement power.
Speaking of the FTC, Commissioner Melissa Holyoak attended ANA virtually and presented with BakerHostetler partners Amy Mudge and Daniel Kaufman. The Commissioner spoke on a range of topics but specifically highlighted the recently announced 6(b) study the Commission is undertaking. Under its 6(b) authority, the FTC can conduct studies of consumer practices without a specific law enforcement purpose. In announcing this study, the Commission explained that it is seeking information from seven companies about what steps, if any, they have taken to evaluate the safety of chatbots and to limit any potential negative effects on children and teens.
It is clear that AI is no passing phase; it is here to stay. While we are all still learning the ways it can make our lives easier, states are eager to get into the mix and regulate it. New York and California are leading the way with bold laws aimed at protecting performers and informing consumers. At the same time, the federal government is taking aim at bad actors who use AI to create and publish intimate imagery without consent, and the FTC has joined the fray with its recently announced 6(b) study. If you were unable to attend ANA this year and hear about these developments firsthand, please consider attending our upcoming webinar on Nov. 20, where we will dive into these AI developments and more.
The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.