
Major Credit Card Companies Enable Purchase of Explicit Deepfakes
By Movieguide® Contributor
Sites selling sexually explicit “deepfake” videos of celebrities are still accepting Visa and Mastercard, despite both credit card companies’ claims that transactions of that nature are prohibited.
“Fan-Topia, the largest subscription website for nonconsensual sexually explicit deepfakes of celebrities, advertises the ability for subscribers to pay creators for the material with Visa and Mastercard credit cards or cryptocurrency,” NBC reported.
The outlet investigated the process, reporting that it was able to use a Visa credit card and Fan-Topia’s “hidden links” system to view profiles selling deepfakes. The “hidden links” conceal the deepfake images from viewers until they have paid.
In a 2022 statement about the purchase of sexually explicit material using Visa cards, Visa CEO and chairman Al Kelly said, “Visa condemns sex trafficking, sexual exploitation, and child sexual abuse. It is illegal, and Visa does not permit the use of our network for illegal activity. Our rules explicitly and unequivocally prohibit the use of our products to pay for content that depicts nonconsensual sexual behavior or child sexual abuse. We are vigilant in our efforts to deter this, and other illegal activity on our network.”
Mastercard took a similar stance, posting on its website that it is taking “an even more active stance against the potential for unauthorized and illegal adult content,” including prohibiting the purchase of content that depicts non-consensual sexual activity.
In April, the U.K. announced that it would criminalize the creation of deepfakes and prosecute their creators. The U.S. has not yet passed similar legislation, but many senators and representatives have introduced bills that would criminalize deepfakes.
Sens. Dick Durbin, D-Ill.; Lindsey Graham, R-S.C.; and Josh Hawley, R-Mo. introduced the Disrupt Explicit Forged Images and Non-Consensual Edits Act, which would allow victims of deepfakes to sue the creators, with a 10-year statute of limitations.
Movieguide® previously reported on the spread of deepfake images:
Despite what some viral photos had many internet users thinking, Katy Perry did not attend this year’s Met Gala, highlighting the larger problem of misinformation spread through AI deepfakes.
While Perry has had numerous memorable outfits at the Met Gala over the years, she didn’t go to the event this year due to work in the studio. However, images of her attending the Gala in beautiful dresses were still circulated, tricking many into thinking she was there.
The images were so convincing and widespread that Perry made a post on Instagram dispelling the rumors and warning people about the dangers of AI deepfakes. Even Perry’s mom believed she had decided to attend the event.
While this instance of deepfakes was not particularly malicious, it highlights the growing danger AI poses when it comes to the spread of misinformation. While AI-generated images were relatively easy to recognize less than a year ago, the technology has developed to the point that, when done well, its output is nearly indistinguishable from reality.