It has been reported that actors Matthew McConaughey and Sir Michael Caine have each signed a deal with AI audio company ElevenLabs, allowing for the creation of AI-generated versions of their voices. These agreements follow a recent Hollywood trend of celebrity partnerships with generative-AI companies, which now includes the likes of Dame Judi Dench and Kristen Bell, who both partnered with Meta in 2024 for the use of their voices.
These “if you can’t beat ‘em, join ‘em” partnerships with AI companies will allow McConaughey and Caine to reap financial rewards from AI-generated imitations of their famous voices. This proactive approach gives them some control over the use of “their” voices. However, without such a licence, how can actors (or anyone, for that matter) control the use of aspects of their personality and voice by AI?
A few years before his death, actor Val Kilmer signed on to appear in a movie, “As Deep as the Grave”. The film is now going ahead, and it is reported that Kilmer’s estate has agreed to an AI-generated Val Kilmer appearing in the movie. His estate is being compensated for his “appearance”, and this is at least a project the estate knew he was interested in. But it raises interesting questions as to where the boundaries lie for AI-generated actors being used in films.
Australia
Copyright law in Australia does not protect an individual’s voice as such; it protects only an original sound recording of a voice. This means there is no protection available under the Copyright Act 1968 (Cth) against unauthorised “soundalikes” or recreations of a person’s voice by AI models.
Instead, where there is an unauthorised use of an individual’s voice (or a voice that sounds like someone), individuals must resort to bringing an action for the common law tort of passing off, or for breach of the Australian Consumer Law (ACL) (Schedule 2, Competition and Consumer Act 2010 (Cth)) prohibition on misleading or deceptive conduct, each of which has its challenges.
For an action in passing off, protection against voice imitations is difficult because the tort is made out only by establishing the ‘classical trinity’ of elements: reputation, misrepresentation, and injury or damage to goodwill. For misleading or deceptive conduct, reputation is also important in establishing that someone has been, or is likely to be, misled or deceived.
While the likes of Matthew McConaughey and Sir Michael Caine may have no difficulty establishing their reputations, for the average Australian whose voice is not instantly recognisable, this first hurdle will likely prove difficult.
Australia’s lack of copyright (or other more specific) protection against AI voice imitations stands in contrast to other jurisdictions, which offer more streamlined protection against unauthorised use of a person’s natural voice.
US
Although the US, McConaughey’s home country, does not grant copyright protection to a person’s voice, many states, including New York and California, recognise ‘publicity rights’, which reflect the commercial value of personal traits, including name, likeness and voice.
The potential to infringe “voice” rights seemingly informed Disney’s new partnership with, and investment in, OpenAI’s Sora. While the mega-deal between the two companies allows Disney characters to be licensed for use on the AI platform, the specific carve-out for the use of characters’ voices was perhaps designed to protect against claims by the actors who provide those voices.
China
China has also recently ruled on two landmark AI voice copyright infringement cases, each of which held that unauthorised use of a celebrity’s voice breached the ‘personality rights’ protections that extend to a person’s natural voice.1
India
India has also taken the path of protecting a person’s voice through personality rights, with Bollywood singer and actor Arijit Singh succeeding in a personality-rights claim against AI cloning of his voice.2
Trade marks and voices
Another avenue yet to be fully explored is the use of the trade mark system to protect against voice likeness. On top of his licensing deals, McConaughey recently succeeded in protecting his famous ‘Alright, Alright, Alright’ catchphrase, with the sensory mark achieving registration on the United States Patent and Trademark Office database, presumably another example of the actor attempting to protect his voice and catchphrase from AI misuse.
In the current Australian climate, trade mark registration could offer a useful means of protecting particular phrases, though it won’t extend to protection of the general sound of someone’s voice.
There are certainly uncharted waters in AI voice imitation and legal protection. Despite Australia’s lack of specific legal protection against AI-generated voice copies, businesses that seek to imitate an individual’s voice using generative AI nevertheless do so at the risk of adverse consequences.
To mitigate that risk, it is important when creating AI imitations of a person’s voice (or other characteristics) to seek the individual’s consent to that use and, ideally, to have an appropriate licence in place. The use of AI-generated likenesses of individuals will continue to be an evolving space to watch around the world.
Footnotes
1. (2023) Jing 0491 Min Chu No. 12142
2. Arijit Singh v. Codible Ventures LLP
The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.