Replica Studios is a company specializing in voice and speech technology, providing AI voice synthesis for a range of applications. The technology can generate and integrate lifelike voices and dialogue for characters in video games, virtual assistants, and other interactive elements in digital media. AI-based voice generation involves machine learning algorithms trained on a substantial body of audio recordings from many human speakers, covering a wide range of accents, languages, and speaking styles. Features such as tone, pitch, rhythm, and pronunciation are extracted from these recordings so the AI model can learn and mimic the nuances of human speech.
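To make the feature-extraction step concrete: production systems use deep neural networks trained on huge corpora, but a toy sketch of one such feature, pitch (fundamental frequency), can be written with a simple autocorrelation method. This is purely illustrative and is not Replica Studios' actual pipeline; all names and parameters below are invented for the example.

```python
import numpy as np

def estimate_pitch(frame, sample_rate, fmin=50.0, fmax=500.0):
    """Estimate the fundamental frequency (Hz) of a voiced audio frame
    by finding the strongest autocorrelation peak within the plausible
    human pitch range [fmin, fmax]."""
    frame = frame - frame.mean()
    corr = np.correlate(frame, frame, mode="full")[len(frame) - 1:]
    lag_min = int(sample_rate / fmax)  # shortest lag = highest pitch
    lag_max = int(sample_rate / fmin)  # longest lag = lowest pitch
    best_lag = lag_min + np.argmax(corr[lag_min:lag_max])
    return sample_rate / best_lag

# Synthetic "voice": a 220 Hz tone plus one harmonic, a 30 ms frame at 16 kHz.
sr = 16000
t = np.arange(int(0.03 * sr)) / sr
frame = np.sin(2 * np.pi * 220 * t) + 0.5 * np.sin(2 * np.pi * 440 * t)

pitch = estimate_pitch(frame, sr)          # should land near 220 Hz
rms = np.sqrt(np.mean(frame ** 2))         # crude loudness/intensity feature
print(f"pitch ≈ {pitch:.1f} Hz, rms = {rms:.3f}")
```

A real voice-cloning model extracts hundreds of such features (spectral envelope, prosody contours, phoneme timing) per frame rather than two scalars, but the principle is the same: the recording is reduced to measurable characteristics the model can learn to reproduce.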
This collaboration between SAG-AFTRA and Replica Studios immediately drew negative feedback from voice actors and other performers. In its statement, SAG-AFTRA said that the deal with Replica had been “approved by the affected members of the union’s voiceover performer community.” However, prominent performers, including Steve Blum, once credited by Guinness as the most prolific voice actor in video games (and the voice of Spike Spiegel in Cowboy Bebop), Chris Hackney (voice of Dimitri in Fire Emblem: Three Houses and Ayato in Genshin Impact, among other projects), and Sunil Malhotra (a Fallout and Mortal Kombat voice actor), along with many others, have pointed out that, as affected members, they received no notice or information about the deal until SAG-AFTRA’s public announcement.
Given that AI encroachment is ongoing and seemingly inevitable, the deal between SAG-AFTRA and Replica Studios may help establish parameters in a landscape where AI platforms have scraped data without permission from the human performers it came from. Rather than rejecting AI outright, it may be wiser to address the legal questions and concerns about AI’s potential impact on actors’ rights, privacy, and the broader ethical landscape. This agreement, if handled correctly, can establish protections for performers while allowing for the integration of AI into performers’ work. Here are a few specific terms to look for once the agreement is released:
- Rights and Compensation for Actors – The agreement needs to contain clear contractual terms regarding fair compensation for actors for the use of their voices in AI machine learning. Consideration should be tied to the duration of the use of a performer’s voice data, and to whether it is used for a single purpose or project or across a series of works.
- Privacy Issues – The use of AI technology to replicate voices will involve collecting and storing extensive voice data for training purposes. Replica Studios needs to provide details of their security measures and address any potential privacy issues related to the voice data storage and usage.
- Intellectual Property Challenges – The agreement will need to address ownership of the digital replicas. Who will own the rights to the replicated voices – the actors, the developers who purchase the voice data, or Replica Studios as the AI technology provider?
Existing law protects an individual’s name, likeness, and other recognizable aspects of a person’s persona, including voice (the Right of Publicity), and gives an individual the exclusive right to license the use of their identity for commercial purposes. A prominent case relevant to the copying of a person’s voice is Midler v. Ford Motor Co., in which world-renowned singer Bette Midler sued Ford Motor Company over a commercial that used a former Midler back-up singer to imitate Midler’s voice after Midler declined to participate. The US Court of Appeals for the Ninth Circuit held that “a voice is as distinctive and personal as a face,” and found that Ford’s use of an impersonator of Midler’s voice was the equivalent of taking her identity for commercial use.
Although the AI voice replicas here would be used in connection with video games, and might therefore be considered an artistic endeavor rather than pure commercial use (the sale of goods or services), the best protection for performers would be an agreement between SAG-AFTRA and Replica Studios that granted performers ownership of their AI voice replicas and allowed them to determine the scope and type of projects in which their replicas may participate.
- Withdrawal of Consent – The agreement needs to include provisions allowing actors to withdraw consent, and Replica Studios will need to provide verifiable assurances that, once consent is withdrawn, the performer’s digital data will be destroyed and no longer be accessible for the creation of digital voice replicas. There should also be standardized minimum penalties for any use without express consent.
Ideally, the agreement should establish baseline requirements for video game studios to use any AI digital voice replicas, but leave room for each individual performer to negotiate better or stronger terms. There is ongoing fear that AI will erode the value of performers’ crafts, and justifiably so. However, if the agreement provides clear and comprehensive terms, robust privacy measures, and fair compensation structures, it may establish a good basis for navigating this uncharted territory of AI voice replicas in gaming and in entertainment as a whole.