
By Gavin Boyle
While acknowledging that AI will be used for good, Kelsey Grammer shared his concerns over the rise of the technology, which allows users to create deepfakes and fabricate a person’s identity in seconds.
“What I’m a little sad about is our prevalence these days to come up with so many, as they try to say deepfakes,” the actor told Fox News Digital. “You know, the ones who say it are usually the ones who are actually doing it. It’s a very, very strange game out there.”
“I recognize the validity and the potential in AI,” Grammer added. “Especially in medicine and a number of other things.”
Grammer is not the only celebrity to raise concerns over deepfake videos in recent years, as AI has made them incredibly easy to create. Sadly, these videos often target women, depicting them in sexual situations. At the beginning of 2024, X had to block searches for Taylor Swift after explicit deepfakes of the pop star were widely shared.
“This is not a new phenomenon: deepfakes have been around for years. However, the rise of generative AI has made it easier than ever to create deepfake pornography and sexually harass people using AI-generated images and videos,” MIT Technology Review explained after the incident. “Taylor Swift’s deepfakes have put new momentum behind efforts to clamp down on deepfake porn.”
Since entering the White House this January, first lady Melania Trump has lobbied for bills that would protect women against these acts. In May, the Take It Down Act was signed into law, making it illegal to publish deepfake pornography and requiring social media platforms to take down such content within 48 hours of being notified of its existence.
“I think all of us, 100 percent of us, support the principle behind [the bill], but you’ve got to get this one right,” Speaker of the House Mike Johnson said as the bill was being debated on the House floor. “When you’re dealing with the regulation of free speech you can’t go too far and have it be overbroad, but you want to achieve those objectives. So, it’s essential that we get this issue right.”
Pornography is not, however, the only way deepfake technology is being abused. Celebrities’ images are also being used to trick people into believing certain things. In May 2024, for example, a deepfake image of Katy Perry attending the Met Gala circulated that was so realistic it fooled even her own mother.
Grammer’s warning comes at a crucial time, as the technology approaches a point where even experts struggle to discern whether an image is real or AI-generated. Though that point has not yet fully arrived, it is fast approaching, and protections against deepfakes need to be in place before it does.