From Scarlett Johansson to Tupac: AI is Sparking a Performer Rights Revolution

With artificial intelligence (AI) taking the world by storm and generative AI making content creation easier than ever, legal problems regarding intellectual property and the right of publicity have inevitably begun to surface, most notably in Hollywood in disputes involving Scarlett Johansson and the late rapper Tupac Shakur.

Earlier this year, OpenAI debuted new ChatGPT voices, and one of them, Sky, sounded eerily similar to Scarlett Johansson’s voice from the 2013 movie “Her.” Johansson has not yet pursued legal action against OpenAI, but her legal team sent letters to the AI developer asking how it selected its voices, and soon after, OpenAI dropped the Sky voice “out of respect for Ms. Johansson.” OpenAI denied any connection between the Sky voice and Johansson.

Similarly, amid the rap battle between Drake and Kendrick Lamar this past April, Drake released a song with a verse featuring an AI-generated imitation of the late Tupac Shakur’s voice. Tupac’s estate immediately sent Drake a cease-and-desist letter demanding the song be taken down, and Drake complied.

These two examples of unauthorized voice cloning are not isolated instances. On May 16, 2024, a class of voice-over actors (including several members of the Screen Actors Guild-American Federation of Television and Radio Artists (SAG-AFTRA)) filed a lawsuit in the Southern District of New York against LOVO, Inc., an AI voice generator and text-to-speech platform. The complaint alleges that LOVO stole their voices and identities without permission or compensation and states eight causes of action: (i) violation of New York Civil Rights Law Sections 50 and 51; (ii) deceptive acts and practices in violation of the New York Deceptive Practices Act, N.Y. GBL Section 349; (iii) false advertising in violation of the New York False Advertising Act, N.Y. GBL Section 350; (iv) unfair competition and false affiliation in violation of Section 43 of the Lanham Act, 15 U.S.C. Section 1125(a); (v) false advertising in violation of Section 43 of the Lanham Act, 15 U.S.C. Section 1125(a); (vi) unjust enrichment; (vii) tortious interference with advantageous business relationships; and (viii) fraud. Plaintiffs seek compensatory and punitive damages exceeding $5 million, injunctive relief to prevent further misuse of their voices, and recovery of profits from the alleged scheme.

The two named plaintiffs, Paul Lehrman and Linnea Sage, assert that clients approached them through the freelance platform Fiverr to record voice-over scripts for speech-synthesis research. Plaintiffs were assured that their voices would be used for “research purposes only.” Years later, Plaintiffs learned that their voice-over samples had been used to train LOVO’s AI generator, to promote LOVO’s services, and to market AI voices based on Plaintiffs’ voices, all without Plaintiffs’ permission or compensation.

The lawsuit is still in the early stages of litigation, and LOVO has yet to file an answer to the complaint.

Although this is a very new area of litigation and there is no federal right of publicity, one precedent Plaintiffs could rely on is Waits v. Frito-Lay. There, the Ninth Circuit held that the right of publicity protects against imitation of an actor’s voice for commercial purposes without the actor’s consent “when a voice is sufficient indicia of a celebrity’s identity.” The court further clarified that to be actionable, the voice must be distinctive, widely known, and deliberately imitated for commercial purposes.

A similar lawsuit was filed earlier this year in Los Angeles against the media company Dudesy for using generative AI to impersonate the late comedian George Carlin’s voice and comedic style in an unauthorized comedy special posted on YouTube. In the complaint, the estate of George Carlin alleged three causes of action: (i) deprivation of the right of publicity under Cal. Civ. Code Section 3344.1; (ii) common-law violation of the right of publicity; and (iii) copyright infringement. The case settled in April, with the parties agreeing to a permanent injunction barring further use of the video and of Carlin’s voice or likeness.

Recent California Legislation

Given the growing capability of AI to produce realistic digital replicas of an individual’s likeness, voice, or bodily movements, there are concerns that this technology could be used to create content without a performer’s knowledge or consent.

California lawmakers, with the support of SAG-AFTRA and the California Labor Federation, seek to provide greater protections to performers and their heirs by enacting new legislation regulating the use of AI.

California Assembly Bill 2602

Many performers across the entertainment industry have inadvertently signed away their rights to their digital selves through clauses buried in performance agreements. Under these agreements, performers may, for example, authorize studios to use their voice and likeness “in any and all media and by all technologies and processes now known or hereafter developed throughout the universe and in perpetuity.”

To address this issue, in February 2024, California Assemblymember Ash Kalra introduced Assembly Bill 2602, which seeks to regulate the use of generative artificial intelligence in performance agreements in the entertainment industry.

The bill would make a digital replica provision in a performance agreement unenforceable if:

  • The provision allows for the creation and use of a digital replica of the individual’s voice or likeness in place of work the individual would otherwise have performed in person;
  • The provision does not include a reasonably specific description of the intended uses of the digital replica (provided that failure to include such a description does not render the provision unenforceable if the uses are consistent with the terms of the contract and the fundamental character of the photography or soundtrack as recorded or performed); and
  • The individual was not represented by legal counsel or a labor union representative in connection with the negotiation of the use of the individual’s digital replica.

The bill aims to prevent situations where there is an imbalance of power in negotiations and to ensure that performers fully understand the risks before transferring rights to their digital likenesses.

California Assembly Bill 1836

Under California law, heirs of a deceased celebrity hold a right of publicity, making it a tort to use the celebrity’s name, voice, signature, photograph, or likeness for unauthorized commercial purposes within 70 years of the celebrity’s death. Excluded from this right, however, are “expressive works” such as plays, books, magazines, newspapers, musical compositions, audiovisual works, radio or television programming, works of art, works of political or newsworthy value, and more. With the rise of AI, digital replicas of deceased celebrities have been created without the consent of, or compensation to, their heirs.

To address this issue, in January 2024, California Assemblymember Rebecca Bauer-Kahan introduced Assembly Bill 1836, which seeks to provide greater protections to heirs of deceased celebrities by narrowing the scope of the “expressive works” exemption and establishing specific causes of action against unauthorized uses of digital replicas in audiovisual works and sound recordings.

The bill would make a person who produces, distributes, or makes available a digital replica of a deceased celebrity’s voice or likeness in an audiovisual work or sound recording without prior consent liable to any injured party for the greater of $10,000 or the actual damages suffered by the person who controls the rights to the deceased celebrity’s likeness.

The bill intends to give the beneficiaries of the deceased celebrity greater control over the celebrity’s likeness.

Hollywood Has Also Made Strides in Regulating AI Within the Film and Music Industries

After months on strike, SAG-AFTRA eventually reached an agreement with the studios that requires producers to clearly stipulate in contracts how they plan to use AI. Producers must also obtain informed and explicit consent from an actor or singer before using their digital replica or voice.

Further, in April SAG-AFTRA reached a deal with several major record labels, including Warner Music Group, Sony Music Entertainment, Universal Music Group, and Disney Music Group, that includes protections against the use of AI. The contract demands “clear and conspicuous” consent, defines the terms “artist,” “singer,” and “royalty artist” to include only humans, and requires companies to compensate artists for the use of a digital replica. Sony Music Group additionally declared that it would opt out of having its content used in the training, development, or commercialization of AI systems.

Conclusion

As the technology of modern society continues to advance, so do the legal issues that follow. The recent efforts to implement protections for performers in the entertainment industry demonstrate that generative AI is not exempt from the ramifications of intellectual property law and the right of publicity, and that lines must be drawn.

With contributions from Summer Associate Quincie Cross.

DISCLAIMER: Because of the generality of this update, the information provided herein may not be applicable in all situations and should not be acted upon without specific legal advice based on particular situations.

© Benesch | Attorney Advertising
