That Robot Took My Voice: Scarlett Johansson And The Right Of Publicity

Dunlap Bennett & Ludwig PLLC

“A voice is as distinctive and personal as a face. The human voice is one of the most palpable ways identity is manifested. We are all aware that a friend is at once known by a few words on the phone. . . The singer manifests herself in the song. To impersonate her voice is to pirate her identity.”[1]

Of all the early 21st century woes, the specter of Artificial Intelligence replacing human labor is quite possibly one of the most disturbing. We worry that AI will take our jobs and replace us in the workforce. We imagine ourselves in the position of John Henry racing against the machine. But rarely do we consider the possibility that AI could take something as personal as our identities.

This startling idea was recently brought to the forefront by a dispute between Scarlett Johansson and OpenAI. In May of 2024, OpenAI announced a new voice for ChatGPT named Sky. The problem, however, was that Sky’s voice bore a striking resemblance to Johansson’s own. Johansson had been approached by OpenAI earlier in the year to provide a voice for the AI but declined. Nevertheless, the announcement of Sky came complete with a voice right out of Her – the 2013 film in which Johansson voiced an AI. Worse, the resemblance seemed intentional, given not only that OpenAI had approached Johansson to voice Sky, but also that on May 13, 2024, OpenAI’s CEO, Sam Altman, tweeted the single word “her.”

Although OpenAI has announced that it has paused the use of the Sky voice and insisted that Sky’s voice actually came from a different, unnamed actor, this story raises serious questions over the use of AI to imitate someone’s likeness.

How is this Illegal?

One may ask: if OpenAI used a different actress to create Sky’s voice and did not actually train Sky on Johansson’s voice, where is the legal problem?

The answer comes in the form of a privacy tort commonly referred to as the right of publicity. This legal claim is a creature of state law designed to give individuals the right to protect against the exploitation of their identity – such as their name, image, or likeness. Not all states have right of publicity statutes on the books, but many arrive at the same result by recognizing common law claims against the unauthorized use of a person’s likeness.

Does the Use of a Different Actor Save OpenAI?

The answer, as with so many legal questions, is not clear-cut. But there is certainly precedent for holding a party liable for using a sound-alike voice actor to impersonate a well-known figure.

In fact, this issue came up even before the age of AI. In Midler v. Ford Motor Co.,[2] singer Bette Midler sued Ford Motor Company for allegedly appropriating her voice for use in a commercial. Midler had declined to sing in the commercial, but Ford nonetheless used a sound-alike voice actor to recreate her voice, telling the actor to “sound as much as possible like . . . Bette Midler.”[3] When friends began telling Midler that they thought it was her singing in the commercial, she brought suit against Ford. The Ninth Circuit Court of Appeals held that “when a distinctive voice of a professional singer is widely known and is deliberately imitated in order to sell a product, the sellers have appropriated what is not theirs and have committed a tort in California.”[4]

Does AI Change Anything?

At the end of the day, stealing someone’s voice or likeness is nothing new. AI has just made it easier.

The most likely effect of the use of AI to create this sound-alike voice is that it has garnered national attention and may motivate change at the federal level. There has been a growing swell of publicity surrounding deepfakes and AI imitations, such as when the estate of comedian George Carlin sued podcasters over their AI imitation of his voice and stand-up.

As mentioned above, privacy torts of this kind currently stem from state law. However, a bipartisan bill has recently been introduced in the Senate – the “Nurture Originals, Foster Art, and Keep Entertainment Safe Act,” or, simply, the NO FAKES Act. The bill would give individuals a property right in the use of their image, voice, or visual likeness in a digital replica and provide a federal cause of action when that right is violated.

It is unclear where this bill will ultimately land. The drafters will need to be careful to account for First Amendment concerns and more innocent uses of digital replicas, such as the fan-made AI version of comedian, actor, and podcaster Henry Zebrowski singing Frank Sinatra’s “That’s Life.” Moreover, the bill will need to account for any interaction with federal copyright law. If nothing else, the NO FAKES Act is certainly indicative of a renewed interest in protecting individuals’ likenesses in the age of AI.

[1] Midler v. Ford Motor Co., 849 F.2d 460, 463 (9th Cir. 1988).

[2] Id.

[3] Id. at 461.

[4] Id. at 463.


DISCLAIMER: Because of the generality of this update, the information provided herein may not be applicable in all situations and should not be acted upon without specific legal advice based on particular situations.

© Dunlap Bennett & Ludwig PLLC | Attorney Advertising

