12 September 2024

U.S. Copyright Office Calls For A New Digital Replica Law


In today's digital age, artificial intelligence (AI) has rapidly advanced, bringing with it unprecedented opportunities — and challenges. One of the most pressing issues is the rise of AI-generated digital replicas, commonly referred to as “deepfakes.”

These realistic but fabricated images, videos, and audio recordings can mimic individuals with alarming accuracy, raising serious concerns about privacy, identity, and the potential for misuse. (No, that wasn't Taylor Swift trying to give you a free set of Le Creuset kitchenware, and Snoop Dogg isn't offering advice on a psychic hotline.)

On July 31, 2024, the United States Copyright Office took a significant step by releasing a report that calls for new federal legislation to protect individuals against unauthorized deepfakes. This report is the first in a planned series exploring the intersection of AI and copyright law, following the Copyright Office's Notice of Inquiry (NOI) issued in August 2023. The NOI sought public input on various issues related to AI, including the unauthorized use of a person's voice, image, and likeness.

Understanding digital replicas

The Copyright Office defines a digital replica as any video, image, or audio recording that has been digitally manipulated to falsely depict an individual. While these replicas are often associated with harmful uses — such as fraud or creating misleading content — they also have potential benefits. For instance, individuals can license their likenesses for commercial use, or digital replicas can assist those with physical disabilities by allowing them to create or control content in ways that might not otherwise be possible.

The Copyright Office's report underscores that the current legal framework does not adequately protect individuals from the risks posed by unauthorized digital replicas. Federal and state laws offer some protection, particularly concerning privacy, intellectual property, and consumer rights. However, these laws are inconsistent across jurisdictions and often insufficient, particularly when it comes to non-commercial uses of deepfakes.

The case for a new federal law covering deepfakes

The Copyright Office's report makes a strong case for the creation of a new federal law specifically designed to address the challenges posed by deepfakes. The report offers several key recommendations:

  1. Scope of protection: The proposed law should focus on digital replicas that convincingly appear to be the actual individual being depicted rather than broadly covering all uses of a person's name, image, or likeness.
  2. Who is protected: Unlike some existing laws that only protect public figures or celebrities, the new law should extend protection to everyone, acknowledging that deepfakes can harm anyone, regardless of their public status.
  3. Duration of protection: The report discusses whether protections should extend beyond a person's lifetime, suggesting that if postmortem rights are included, they should be limited to a specific term, such as 20 years.
  4. Infringement and liability: Creating a digital replica should not automatically incur liability. Instead, liability should arise when a deepfake is distributed with knowledge that it is unauthorized. The report also emphasizes that the law should apply to both commercial and non-commercial uses of digital replicas.
  5. Licensing and assignment: The Copyright Office suggests that individuals should not be allowed to permanently assign away their rights to digital replicas. Instead, these rights should be subject to limited-term licensing agreements, ensuring that control over one's likeness remains protected.
  6. Balancing First Amendment rights: The report recognizes the delicate balance between prohibiting harmful uses of digital replicas and protecting free speech. It proposes that any new law should include a balancing test to address this tension, allowing for flexibility as the technology and its implications continue to evolve.
  7. Enforcement: To deter the misuse of deepfakes, the report suggests that a new law should provide for injunctive relief, monetary damages, and possibly statutory damages. The law should also include a notice-and-takedown mechanism, similar to the Digital Millennium Copyright Act (DMCA), to enable swift action against unauthorized digital replicas.

Looking ahead toward new legislation

The report from the Copyright Office is a critical first step toward comprehensive federal regulation of AI-generated digital replicas. It lays the groundwork for future legislation that could significantly impact how digital replicas are created, used, and regulated. Moreover, the report is expected to influence ongoing discussions in Congress, where several proposals for digital replica laws, including the No Artificial Intelligence Fake Replicas And Unauthorized Duplications (No AI FRAUD) Act and the NO FAKES Act of 2023, are already under consideration.

As AI technology continues to evolve, so too will the legal landscape. Staying aware of these developments is essential. The issues raised by deepfakes and digital replicas will likely play an increasingly important role in various areas of law, including privacy, intellectual property, and digital rights.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.
