Hollywood Fights Voice Cloning with No AI Fraud Act
By Movieguide® Contributor
At present, there’s no federal law that prohibits the unauthorized use of someone’s voice or likeness. That may change soon with the introduction of the No AI Fraud Act.
“A bipartisan coalition of House lawmakers have introduced a long-awaited bill to prohibit the publication and distribution of unauthorized digital replicas, including deepfakes and voice clones,” the Hollywood Reporter wrote on Wednesday.
“The legislation, proposed on Wednesday, is intended to give individuals the exclusive right to greenlight the use of their image, voice and visual likeness by conferring intellectual property rights at the federal level. Under the bill, unauthorized uses would be subject to stiff penalties and lawsuits would be able to be brought by any person or group whose exclusive rights were impacted,” said the Hollywood Reporter.
Hollywood trade groups and unions have been lobbying to regulate AI voice replication, and SAG-AFTRA backs the act.
SAG-AFTRA President Fran Drescher said, “Technology should exist to help humans, not replace them or rip them off. The minute we cross that line, we enter dystopia. SAG-AFTRA is committed to protecting individuals via all available means, and influencing this sort of much-needed public policy is one of the many ways we can ensure people and their intellectual property rights are protected from exploitation. The NO AI FRAUDS Act is an important step in the right direction.”
Billboard explained a key issue with AI voice cloning: “AI voice synthesis technology poses a new problem and opportunity for recording artists. While some laud it as a novel marketing, creative or fan engagement tool, it also leaves artists vulnerable to uncanny impersonations that could confuse, scam or mislead the public.”
Movieguide® previously reported why a song that cloned famous singers’ voices would not be eligible for a Grammy:
“Heart on My Sleeve,” created by an anonymous artist called Ghostwriter, clones Drake’s and The Weeknd’s vocals.
[Grammy CEO Harvey Mason Jr. said], “Let me be extra, extra clear: even though it was written by a human creator, the vocals were not legally obtained, the vocals were not cleared by the label or the artists, and the song is not commercially available, and because of that, it’s not eligible.”
“I take this [AI] stuff very seriously,” Mason continued. “It’s all complicated, and it’s moving really, really quickly. I’m sure things are going to continue to have to evolve and change. But please, please, do not be confused. The Academy is here to support and advocate for, protect, and represent human artists and human creators, period.”
The issue doesn’t just concern music. Celebrities’ voices have also been used in ads without their consent. The proposal cited a real ad in which an AI clone of Tom Hanks’ voice promoted a dental plan.
Violations of the act would carry a hefty penalty of $50,000 per use, or more if the damages suffered by the victim are greater. That figure doesn’t include possible profits and punitive damages.