Overview of U.S. Copyright Office Report Regarding Artificial Intelligence and Digital Replicas

Venable LLP
The U.S. Copyright Office published Part 1 of its report on copyright and artificial intelligence (AI), focusing on digital replicas. Digital replicas are "a video, image, or audio recording that has been digitally created or manipulated to realistically but falsely depict an individual." These replicas are often created using AI technologies to modify or generate content and can produce increasingly realistic simulations of someone's voice, appearance, or likeness. They can be so convincing that they are difficult to distinguish from authentic depictions of the individual, raising concerns about fraud, impersonation, and the creation of misinformation. The Copyright Office report focuses on deepfakes and on protecting an individual's ability to control and benefit from their voice, image, or likeness.

It is important to note that digital replicas can serve both beneficial and harmful purposes. They can drive accessibility tools for individuals with limited speech and enable authorized performances by deceased artists or other creative works. However, as the report notes, they can also be used to closely simulate a person's voice, image, or likeness in misleading or fraudulent ways. These deceptive simulations, known as "deepfakes," purport to show actions a person never took or seek to exploit someone's fame or reputation.

Overview of the Report

The Copyright Office has been conducting a study examining copyright law and policy issues raised by AI, and the 2023 Executive Order on the Safe, Secure, and Trustworthy Development and Use of Artificial Intelligence (EO 14110) called for the Copyright Office to provide recommendations for executive action on copyright and AI after concluding that study. This report is only the first of several topical reports.

The report considers the legal and policy issues surrounding digital replicas created by AI. It surveys existing state laws granting long-standing rights of privacy and publicity, concludes that those laws do not provide adequate protection or redress, and calls for a federal law that protects all individuals, not just celebrities, and gives them the right to license their likeness.

The report highlights three key areas of harm connected with the unauthorized use of digital replicas: sexually explicit deepfake imagery, the perpetuation of fraudulent activities, and the use of impossible-to-discern misinformation to undermine our political system and news platforms.

It also discusses existing state and federal laws, including the Copyright Act, the Federal Trade Commission Act, the Lanham Act, and the Communications Act, that limit the ability to create or use digital replicas in some circumstances. 

Digital replicas pose challenges in a number of areas, including intellectual property, consumer privacy protection, unfair competition, and fraud. Analyzing state law, the report identifies the right to privacy and the right of publicity, which in many states are too narrow to cover all types of digital replica misuse. While some states, such as Louisiana, New York, and Tennessee, have passed laws specifically targeting digital replicas, each has adopted different rules and exemptions, leaving no uniform standard in this area.

The report also points out that existing federal laws are deficient in fully protecting individuals from the potential harms of digital replicas. For example, under U.S. copyright law, one’s voice is not copyrightable because the sounds are not fixed in a tangible medium of expression. The report concludes that existing state and federal laws are insufficient to address fraud, harassment, reputational damage, emotional harm or distress, and potential loss of income created by digital replicas.

Proposed Right 

The report calls for action to address issues around digital replicas, including the creation of a “digital replica right.” This new federal right would protect all individuals, regardless of whether they are well known, during their lifetime and allow the licensing of digital replicas by the individuals depicted. It would cover both commercial and non-commercial uses, potentially addressing harms well beyond loss of income or ownership. 

The report calls for various remedies, including monetary damages, statutory damages, attorney fees, and injunctive relief. It also calls for a safe harbor for online service providers that remove infringing content under a notice-and-takedown system, similar to Section 512 of the DMCA.

The report clarifies that Section 114(b) of the Copyright Act, which allows imitation of sounds in sound recordings, does not conflict with state laws on unauthorized digital replicas. It states that concerns are “misplaced,” since Section 114(b) has a different policy aim and does not preempt state laws.

What Happens Next

Whether Congress takes up this call for a federal right around digital replicas remains to be seen. Still, significant congressional attention has been paid to digital replicas and deepfakes over the last year. The Nurture Originals, Foster Art, and Keep Entertainment Safe Act (the NO FAKES Act, S. 4875) was recently introduced in the Senate by Sens. Coons, Blackburn, Klobuchar, and Tillis to establish property rights around an individual’s voice and likeness, a step further than the proposal from the Copyright Office. Other proposals have emerged in the House, such as the No Artificial Intelligence Fake Replicas and Unauthorized Duplications Act of 2024 (NO AI FRAUD Act, HR 6943). Additionally, there are numerous proposals in state legislatures, including new and established laws in Tennessee, providing ownership rights over voice and likeness.

The report does not address some of the topics the AI EO asked for, including the use of copyrighted data in training data sets and the scope of copyright protection for AI-created works. We can expect additional reports examining those and other topics from the copyright study Notice of Inquiry (NOI), such as the potential liability for infringing works generated using AI systems.

DISCLAIMER: Because of the generality of this update, the information provided herein may not be applicable in all situations and should not be acted upon without specific legal advice based on particular situations. Attorney Advertising.

© Venable LLP

