The NO FAKES Act: Bipartisan Group of Senators Introduce Bill to Protect against Unauthorized Uses of Digital Replicas

McDonnell Boehnen Hulbert & Berghoff LLP

After floating a discussion draft last fall, a bipartisan group of Senators formally introduced the Nurture Originals, Foster Art, and Keep Entertainment Safe Act of 2024 ("the NO FAKES Act" or "the Act") on July 31, 2024. The Act is remarkable not only because its sponsors span the ideological spectrum -- it was introduced by Senators Coons, Blackburn, Klobuchar, and Tillis -- but also because it has received the backing of groups such as the Motion Picture Association, the Recording Industry Association of America, the Independent Film & Television Alliance, the Screen Actors Guild-American Federation of Television and Radio Artists (SAG-AFTRA), and major artist representation agencies.[1] In a Congress that is more often than not too polarized to legislate, the bipartisan nature of, and widespread support for, the Act may indicate that it has a fighting chance of becoming law.

To date, any regulation of the use of individuals' names, images, or likenesses has been governed by copyright law, the Lanham Act, or state statutes and common law. Those laws have generally been sufficient to protect individuals' rights, but there is a mounting concern that generative AI replications fall through the cracks of existing law. For example, the song "Heart On My Sleeve" -- purportedly a collaboration between musicians Drake and The Weeknd -- went viral last year on social media and streaming platforms.[2] Earlier this year, OpenAI quickly shuttered a digital voice assistant that sounded remarkably similar to actress Scarlett Johansson.[3] A recent lawsuit alleges that an AI company used deceptive practices to obtain samples from voice actors, which were later used to produce content that was not actually spoken by the actors.[4] Even more recently, voice actors and motion capture artists in the video game industry voted to strike over potentially exploitative uses of their images and likenesses by AI.[5] More generally, the public is slowly becoming aware of menacing uses of "deepfakes" -- highly realistic, AI-enabled audio, image, or video representations of individuals appearing to say or do something they never actually did.

The NO FAKES Act is an attempt to cure those deficiencies by providing civil remedies that protect people against misuse of their images and voices. In particular, the Act creates a federal cause of action through which actors, artists, creators, and other individuals can seek relief for certain types of uses of their digital replicas. A digital replica is defined by the Act as "a newly-created, computer-generated highly realistic electronic representation that is readily identifiable as the voice or visual likeness of an individual." Critically, the digital replica must be:

embodied in a sound recording, image, audiovisual work, including an audio-visual work that does not have any accompanying sounds, or transmission (i) in which the actual individual did not actually perform or appear; or (ii) that is a version of a sound recording, image, or audiovisual work in which the actual individual did perform or appear, in which the fundamental character of the performance or appearance has been materially altered.

Of course, the NO FAKES Act creates liability only for unauthorized use of the digital replica; it therefore includes provisions related to identifying the holders of rights (that is, the individual or "any other person that has acquired, through a license, inheritance, or otherwise, the right to authorize the use of such voice or visual likeness in a digital replica").

The Act's foundation is the precept that "each individual or right holder shall have the right to authorize the use of the voice or visual likeness of the individual in a digital replica." The right is "(I) a property right; (II) not assignable during the life of the individual; and (III) licensable, in whole or in part, exclusively or non-exclusively, by the right holder." This right continues past the death of the individual whose voice or visual likeness is at issue, becoming transferable to heirs or assignable to other parties. One who acquires a postmortem right in this fashion must periodically (first after ten years, and every five years thereafter) demonstrate public use of the voice or visual likeness in order to prevent the right from expiring. Regardless of public use and renewals, the right expires 70 years after the death of the individual, aligning its expiration with the expiration of rights under the Copyright Act.
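For concreteness, the renewal schedule described above can be sketched in a few lines of code. This is purely an illustration of the timeline as summarized here -- the function name and the exact treatment of the renewal windows are assumptions, not the statutory text:

```python
def renewal_deadlines(year_of_death: int) -> list[int]:
    """Illustrative sketch: years in which a postmortem rights holder
    must demonstrate public use to keep the right alive.

    First renewal falls 10 years after death, subsequent renewals every
    5 years, and the right expires outright 70 years after death
    regardless of renewals (mirroring copyright terms).
    """
    deadlines = []
    year = year_of_death + 10        # first renewal window
    expiry = year_of_death + 70      # hard cap on the postmortem right
    while year < expiry:
        deadlines.append(year)
        year += 5                    # every 5 years thereafter
    return deadlines

# For an individual who died in 2030, renewals would fall in
# 2040, 2045, ..., 2095, with the right expiring in 2100.
```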

Although the right of publicity is most analogous to privacy or trademark rights, the NO FAKES Act aligns more closely with other federal laws, including the Copyright Act and the Communications Decency Act (CDA), than with the Lanham Act. The Act also leverages the existing administrative framework established by the Copyright Act, instructing the Copyright Office to set forth a procedure for rights holders to carry out the renewals, to maintain an online directory of current postmortem digital replication rights, and to maintain a directory of representatives of websites and other potential hosts of media incorporating digital replicas (which enables protections analogous to Section 230 of the CDA).

The Act creates a civil cause of action that can be brought based on: "(A) The production of a digital replica without consent of the applicable right holder[, or] (B) The publication, reproduction, display, distribution, transmission of, or otherwise making available to the public, a digital replica without consent of the applicable right holder." Like the Copyright Act (but unlike the Lanham Act), liability requires that a person engaging in either activity have actual knowledge of the unauthorized use of a digital replica or have willfully avoided such knowledge. Also like the Copyright Act but unlike the Lanham Act, a three-year statute of limitations applies, starting from the date when "the party seeking to bring the civil action discovered, or with due diligence should have discovered, the applicable violation."

The remedies also align most closely with copyright law, allowing a plaintiff to receive statutory damages or actual damages (including both the harm to the individual and disgorgement of the defendant's profits). Specifically, the remedies available are the greater of: (a) $5,000 per work embodying the applicable unauthorized digital replica, $5,000 per violation by an entity that is an online service, and $25,000 per work embodying the applicable unauthorized digital replica by an entity that is not an online service, or (b) "any actual damages suffered by the injured party as a result of the activity, plus any profits from the unauthorized use that are attributable to such use and are not taken into account in computing the actual damages." The Act makes clear that a plaintiff can seek injunctive or other equitable relief, and that willful violations can give rise to punitive damages, including attorney's fees.
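The "greater of" computation can be illustrated with a short sketch. This is purely illustrative -- the function names and defendant categories are assumptions drawn from the summary above, not the Act's language, and nothing here is legal advice:

```python
def statutory_damages(defendant: str, count: int) -> int:
    """Per-work (or, for online services, per-violation) statutory
    amounts as summarized above: $5,000 for individuals and online
    services, $25,000 for entities that are not online services."""
    rates = {"individual": 5_000, "online_service": 5_000, "other_entity": 25_000}
    return rates[defendant] * count

def recoverable(defendant: str, count: int, actual: float, profits: float) -> float:
    """Greater of statutory damages or actual damages plus the
    defendant's profits attributable to the unauthorized use."""
    return max(statutory_damages(defendant, count), actual + profits)

# A non-online-service entity distributing 3 unauthorized works, with
# $40,000 in actual harm, would face statutory damages of $75,000
# (3 x $25,000), since that exceeds the actual damages.
```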

Analogous to fair use defenses in copyright law, the Act carves out certain exclusions to ensure that it does not conflict with the First Amendment. For example, the NO FAKES Act does not protect against the use of a digital replica (even without authorization) in bona fide news accounts; documentary, historical, or biographical uses; or commentary, criticism, scholarship, satire, or parody. Further exclusions include de minimis uses as well as advertising or commercial uses in conjunction with any of the aforementioned exceptions. However, the Act makes clear that the exclusions do not apply -- that is, it creates an exclusion to the exclusions -- "where the applicable digital replica is used to depict sexually explicit conduct."

Likely in order to attain the support of AI, tech, and media companies, the Act sets forth a number of safe harbors similar to those of the Communications Decency Act. For products and services capable of producing digital replicas, there is no secondary liability unless those products and services are designed primarily to produce unauthorized digital replicas and have limited commercial purpose beyond that.[6] For online services, referring or linking to an unauthorized digital replica also does not incur liability. As a quid pro quo, online services hosting user-uploaded material have a takedown obligation upon receiving notice of violations of the Act. That is, they will not be liable for violating any rights if the online service:

(i) removes, or disables access to, all instances of the material (or an activity using the material) that is claimed to be an unauthorized digital replica as soon as is technically and practically feasible for that online service; and

(ii) having done so, takes reasonable steps to promptly notify the third-party that provided the material that the online service has removed or disabled access to the material.

However, these safe harbors are not available unless the online service designates, with the Copyright Office, an agent to receive notice of such violations. The designation must include at least the name, address, telephone number, and email address of the agent. The Copyright Office must maintain an online public registry of such agents.

Those provisions are generally not very contentious. There is controversy, however, over whether state-law causes of action should be preempted, and over the balance between the clarity and uniformity of a single federal standard and the ability of States to provide greater protection.[7] The Act explicitly preempts any state laws that protect "an individual's voice and visual likeness rights in connection with a digital replica, as defined in this Act, in an expressive work." But there are exceptions to this preemption, including pre-existing state laws (defined as "statutes or common law in existence as of January 2, 2025, regarding a digital replica"), state laws "regulating a digital replica depicting sexually explicit conduct," and "causes of action under State statutes or common law for the manufacturing, importing, offering to the public, providing, making available, or otherwise distributing a product or service capable of producing 1 or more digital replicas." In other words, state right of publicity and privacy laws protecting image and likeness rights in general would not be preempted, but any state law specifically addressing the protection of digital replicas would be nullified if enacted after January 2, 2025.

The NO FAKES Act appears to be necessary because tech companies producing generative AI tools or providing them as services seem unwilling or unable to effectively self-regulate how they exploit the labors of others to train their models. Copyright law is unlikely to provide sufficient protection for the training of a generative AI model with the voices and visual likenesses of performing artists, even if those works are copyrighted. Further, it is not clear how similar the output of such models needs to be when compared to the original work for there to be copyright infringement.

If passed, the NO FAKES Act would plug a significant part of this gap. How a generative AI model is trained -- or even whether generative AI is used at all -- does not matter: the digital replica need only be "computer-generated" and "readily identifiable as the voice or visual likeness of an individual." The Act would also be the first federal law in the U.S. to protect publicity rights.

Deepfakes, by their very nature, undermine public trust and can cause significant harm by distorting perceptions of reality and spreading misinformation. The appeal of this legislation lies in its commitment to limiting the malicious misuse of technology while still allowing authorized uses. However, while the bill has garnered bipartisan support, it will require a concerted effort and extensive backing in both the Senate and the House to pass into law, not to mention the support and signature of the president (whoever that might be if and when the Act is passed by Congress).

Even if the NO FAKES Act is enacted, there remains a question as to whether it is constitutional.[8] Unlike patents and copyrights, Article I of the Constitution provides no explicit basis for a federal right of privacy or publicity. Rather, the cause of action created by the Act is limited to violations that affect interstate commerce or use the means or facilities of interstate commerce, tying the Act to the Commerce Clause. But there remains a tension with the Act's express statement that it is establishing a new property right in name, image, or likeness. So, ultimately, the courts may have the last word on whether constitutionality should be judged based on the nature of the right or the scope of the remedy.

[1] https://www.coons.senate.gov/news/press-releases/senators-coons-blackburn-klobuchar-tillis-introduce-bill-to-protect-individuals-voices-and-likenesses-from-ai-generated-replicas. Notably absent from this list are Microsoft, Alphabet, Meta, xAI (affiliated with Twitter/X), Apple, and several other AI tech players.

[2] https://en.wikipedia.org/wiki/Heart_on_My_Sleeve_(Ghostwriter977_song).

[3] https://www.patentdocs.org/2024/05/black-widow-versus-openai-and-what-it-means-for-singers-and-voice-actors.html.

[4] https://www.cbsnews.com/news/two-voice-actors-sue-ai-company-lovo/.

[5] https://www.mbhb.com/intelligence/snippets/interplay-between-image-and-likeness-rights-and-ai-central-to-gaming-actor-strike-and-newly-proposed-legislation/.

[6] This provision was likely included to protect the developers of generative AI large language models, such as OpenAI, Microsoft, and Google.

[7] For example, at an August 5, 2024 USPTO "Public roundtable on AI protections for use of an individual's name, image, and likeness," the representative of the Motion Picture Association spoke in favor of complete preemption and was immediately followed by the representative of the Recording Industry Association of America, who spoke against preemption.

[8] This issue was raised by one of the speakers during the virtual portion of the USPTO roundtable, but it came too late for most other speakers to comment.


DISCLAIMER: Because of the generality of this update, the information provided herein may not be applicable in all situations and should not be acted upon without specific legal advice based on particular situations. Attorney Advertising.

© McDonnell Boehnen Hulbert & Berghoff LLP
