What Can Scarlett Johansson Do About the Similarity of ChatGPT’s Sky Voice to Her Own?

Brett Trout

Background
According to Scarlett Johansson’s agent, OpenAI, the owner of ChatGPT, was engaged in discussions with the Black Widow actress to use her voice as the voice of its latest AI project, Sky. This may or may not have been influenced by Johansson’s voice acting in the movie “Her,” in which she played an increasingly sentient AI girlfriend. Johansson eventually passed on the project, and OpenAI hired a different actress to train Sky. The problem, according to Johansson’s team, is that the new Sky voice sounds like Johansson’s. Johansson’s team has now asked OpenAI to “slow down” the launch of Sky with the new voice.

Voice Misappropriation
What are your options if someone uses your voice without your permission? Whereas your freedom of speech is protected by the Constitution, your right of publicity is protected by a patchwork of state laws. Depending on the state in which you bring your right of publicity lawsuit, you may have a lot of protection or only a little. Even though state laws vary, some general tenets underlying the right of publicity emerge across them. One area of right of publicity protection covers your name, image, and likeness. Certain states, like California, have specific statutes that allow you to sue anyone who knowingly uses another person’s voice to market their goods or services.

Imitating Your Voice
What if, instead of using your actual voice, someone uses an imitation of your voice? Since they are not using your voice, state statutes that require use of your actual voice no longer apply. So can you avoid liability by simply hiring an actor to imitate your favorite actor’s voice to sell your widgets? Possibly, but it depends. In California, if someone uses an imitation of your voice to sell things, you have to be widely known and have a distinctive voice to successfully sue.

This was the case when the Ford Motor Company used an imitation of Bette Midler’s voice, singing a song from Midler’s own album, to sell cars. There, the court, applying California law, held that if you deliberately imitate the distinctive voice of a widely known professional singer to sell your product, that singer may recover damages from you for appropriating their right in their own voice.

This type of imitation is one species of “right of publicity” violation: in this case, the right of a person whose identity has commercial value, typically a celebrity, to control the commercial use of that identity. In California, if a voice is a sufficient identifier of a celebrity’s identity, the right of publicity provides a cause of action against anyone who imitates the voice for commercial purposes without the celebrity’s consent.

Voice vs. Style
Even if you are famous and have a distinctive voice, you only have rights in your voice, not your style of speaking. If someone imitates the prosody of your style (rhythm, speed, pitch, emphasis, etc.) without imitating your voice to the point that someone would mistake it for your own, you do not, without more, have a cause of action against them.

False Endorsement
In addition to claims of voice misappropriation, you may have a claim for false endorsement. Section 43(a) of the Lanham Act, 15 U.S.C. § 1125(a), prohibits the use of false designations of origin, false descriptions, and false representations in the advertising and sale of goods and services. The Lanham Act expressly prohibits the use of any symbol or device that is likely to deceive consumers as to the association, sponsorship, or approval of goods or services by another person. This was the case when singer Tom Waits sued Frito-Lay for imitating his distinctive voice, in an admitted parody of a Tom Waits song, to sell SalsaRio Doritos. Waits argued that Frito-Lay was misrepresenting to consumers that Waits endorsed SalsaRio Doritos. The question was whether “ordinary consumers . . . would be confused as to whether Tom Waits sang on the commercial . . . and whether he sponsors or endorses SalsaRio Doritos.” The jury found that Frito-Lay had indeed falsely implied Waits’ endorsement of the chips, in direct violation of the Lanham Act.

Damages
So, assuming you do successfully sue someone for using or imitating your voice, what kind of damages can you recover? If they use your actual voice in California, you are allowed to recover the greater of $750 or the total damages you sustained as a result of the use, plus any profits the other party made from the use. To show profits, you only have to submit gross revenue numbers; the burden then shifts to the person who used your voice to prove any deductible expenses. California’s statute also allows for punitive damages, attorney’s fees, and costs.

What if they merely imitated your voice, so the state statutory damages do not apply? In the Waits case, where his voice was imitated, Waits was awarded $375,000 in compensatory damages ($100,000 for the fair market value of his services, $200,000 for injury to his peace, happiness, and feelings, and $75,000 for injury to his goodwill, professional standing, and future publicity value), $2 million in punitive damages, $100,000 in damages for violation of the Lanham Act, and attorney’s fees under the Lanham Act. The appellate court ultimately eliminated the $100,000 in Lanham Act damages as duplicative, but kept the attorney’s fees award intact.

So is the award in the Waits case typical? No. The large punitive damage award was the result of Waits’ well-known, strong stance against ever using his voice to sell products. He felt it would be selling out and undermining his artistic integrity. As a result, commercial use of his voice was particularly offensive to Waits, allowing the jury to award damages for mental distress. While merely taking offense is an insufficient basis for awarding mental distress damages, in California you may recover mental distress damages for shame, humiliation, embarrassment, and anger.

Because Waits’ character, personality, and image were that he did not endorse products, the Doritos commercial humiliated Waits by making him an apparent hypocrite. As a result, the jury was allowed to include in its award damages for injury to Waits’ goodwill and future publicity value (advertisers would likely pay him less if they believed he was a hypocritical “sell out”).


Waits also recovered punitive damages of $1.5 million against the advertising firm that created the ad, Tracy-Locke, and $500,000 against Frito-Lay. In California, these damages are recoverable if the plaintiff can prove, by clear and convincing evidence, that the defendant has been guilty of oppression, fraud, or malice. Cal. Civ. Code § 3294(a). Malice, in this context, is despicable conduct carried on by the defendant with a willful and conscious disregard of the rights or safety of others. Cal. Civ. Code § 3294(c)(1).

In upholding the punitive damages award, the appellate court found that, in going forward with the commercial using Waits’ imitated voice, the defendants knowingly undertook the risk, consciously disregarding the effect of their actions on Waits’ legally recognized rights. One of Waits’ star witnesses at trial was the Waits impersonator used in the radio ad. The impersonator had told the defendants, before the ad aired, that Waits had a policy against doing commercials and would not like having an imitation of his voice used in this manner. This damning testimony likely led to the large punitive damage awards.

Finally, Waits recovered his attorney’s fees. Although the appellate court struck the damages Waits recovered under the Lanham Act (as duplicative of his other recovery), it upheld the award of attorney’s fees under the Lanham Act.


Conclusion
So how does the foregoing apply to the Johansson case? Given that OpenAI did not use Johansson’s voice, the California statute likely does not apply. The tougher question is whether OpenAI owes Johansson damages for imitating her voice. OpenAI’s position is that the actress it used to train Sky was not told to imitate Johansson and that the voice used was the actual speaking voice of the actress it hired. While Scarlett Johansson’s voice is much less distinctive than Bette Midler’s or Tom Waits’, it is debatable whether it lacks sufficient distinctiveness to merit protection. If OpenAI can show that many people have a voice similar to Johansson’s, however, it may be able to avoid liability for imitating her voice.

Even if OpenAI can avoid liability for imitating Johansson’s voice, it may still be liable under the Lanham Act for deceiving consumers into thinking Scarlett Johansson endorses the new Sky AI. If Johansson can prove false endorsement, she may be able to recover her losses, OpenAI’s profits, treble damages, and attorney’s fees. Unlike Waits, however, Johansson does not have a policy against endorsing products. Indeed, Johansson was in talks with OpenAI to accept this exact job. As a result, the large damages recovered in the Waits case would likely not be available to Johansson. A more likely award would be in the neighborhood of what Johansson was demanding to accept the job during the negotiations.

A trial on this dispute would certainly be interesting, especially the aspects pertaining to the use of someone’s likeness in AI. However, the likely damages are small enough, and the optics of a trial loss bad enough, that keeping this matter in the limelight a little longer for publicity’s sake, then quietly and confidentially settling, is probably in the best interests of both parties.
