NO FAKES Here: A New Tool to Protect Against the Misuse of AI?

Faegre Drinker Biddle & Reath LLP
What do Scarlett Johansson, Drake, The Weeknd, and Taylor Swift have in common (besides being among this millennial’s fav celebs)?  They all have the distinct displeasure of becoming a target of deepfake technology – a type of AI that creates fake, but highly realistic-looking audio, images, or videos using the likeness and/or voice of its victim.

While there are certainly positive uses of deepfake technology (Val Kilmer’s AI-generated voice in his reprise of “Iceman” in Top Gun: Maverick comes to mind),1 the widespread potential for abuse and malicious use – from fake musical performances, to fake political robocalls, to pornographic content – is top of mind for the entertainment industry, politicians, and ordinary folk alike.

But soon there may be a new tool to combat abuses of this technology: the “Nurture Originals, Foster Art, and Keep Entertainment Safe Act of 2024” (the NO FAKES Act), introduced in the Senate in July and in the House of Representatives just this week.

What Does the Bill Do?

The NO FAKES Act gives an individual (or other rights holder) the exclusive right to authorize the use of his or her voice or visual likeness in a digital replica – defined as “a newly-created, computer-generated, highly realistic electronic representation that is readily identifiable as the voice or visual likeness of an individual.”

The act imposes liability on any person who, with knowledge that the material is a digital replica and not authorized, produces, publishes, reproduces, displays, distributes, transmits, or otherwise makes available a digital replica without consent.  The bill further creates a private right of action allowing a victim of a deepfake to bring a civil action to enforce his or her right, obtain injunctive relief, and recover damages, including statutory damages of $5,000 to $25,000 per violation, punitive damages, and attorneys’ fees.

What Are the Exceptions?

There are numerous, arguably vague and ill-defined, exceptions intended to balance protection against deepfakes with First Amendment concerns, including carve-outs for:

  1. Use in “bona fide news, public affairs, or sports broadcast or account;”
  2. Use “in a documentary or in a historical or biographical manner” provided that use of the digital replica does not create the false impression that the replica is authentic;
  3. Use consistent with “the public interest in bona fide commentary, criticism, scholarship, satire, or parody;”
  4. Use that is “fleeting or negligible;” and
  5. Use in an advertisement or commercial announcement for purposes of promoting works under any of these exceptions.

These exceptions have sparked criticism, with some arguing that the carve-outs do not go far enough because they do not fully embody the concept of fair use, a First Amendment safeguard built into copyright law.2  If the bill passes, these exceptions (like the fair use doctrine) are bound to spur mountains of litigation.

Key Implications – Takedown Procedures

Because the NO FAKES Act imposes liability on both the production and “the publication, reproduction, display, distribution, transmission of, or otherwise making available to the public” of a digital replica, online services that host user-uploaded material are at risk of being held liable under the act. But the act creates a safe harbor for these online services by establishing a notice-and-takedown framework.  An online service is exempt from liability under the draft act if it removes or disables access to the deepfake as soon as technically and practically feasible after receiving notice of a violation.

Arguably, this is the most important aspect of the act: it incentivizes online services to take these notices seriously and act quickly to remove potentially harmful deepfake content, and it gives individuals a mechanism to have content removed short of obtaining injunctive relief, which is often a protracted and expensive process.

It remains to be seen whether the bill will pass, but it’s certainly one to watch.


  1. https://www.washingtonpost.com/technology/2021/08/18/val-kilmer-ai-voice-cloning/
  2. See, e.g., Katherine Klosek, No Frauds, No Fakes . . . No Fair Use, ARL Views (Mar. 1, 2024), available at https://www.arl.org/blog/nofraudsnofakes/.

DISCLAIMER: Because of the generality of this update, the information provided herein may not be applicable in all situations and should not be acted upon without specific legal advice based on particular situations. Attorney Advertising.

© Faegre Drinker Biddle & Reath LLP
