Scarlett Johansson Accuses OpenAI of Using a Voice Similar to Hers, Calls for Protection of Individual Rights
Scarlett Johansson is making headlines for taking a stand against the unauthorized use of her voice in AI technology. The actress accused OpenAI of featuring a voice that closely resembled her own in its latest ChatGPT assistant, despite her having explicitly declined to lend her voice to the system. Johansson said she was shocked and frustrated by the situation and has hired legal counsel to address the issue.
OpenAI has denied that the voice in question, named Sky, was intended to mimic Johansson's. Johansson noted, however, that the company's CEO, Sam Altman, had previously approached her about voicing the system, an offer she declined for personal reasons.
In response to Johansson's allegations, OpenAI announced that it would pause use of the Sky voice while it addressed the concerns raised. The company said the voice belonged to a different actress using her own natural speaking voice, but that it could not disclose the voice talent's identity for privacy reasons.
Johansson called for new laws addressing AI technology and individual rights, emphasizing the need for transparency and legislation to protect people against deepfakes and the unauthorized use of their likenesses. Her statement was backed by SAG-AFTRA, the labor union representing actors and performers, which echoed her call for protections against the unauthorized digital replication of voices and likenesses.
The use of AI in the entertainment industry has been contentious, especially as advancing technology raises new questions about privacy and consent. Johansson's case highlights the importance of clarity and transparency in how AI systems are developed and deployed, as well as the need for legal protections to safeguard individual rights in the digital age.