Will Meta’s Legal Action Against Nudify Apps Save Lives?

Photo from Dima Solomin via Unsplash

By Mallory Mattingly

Last week, Meta took action against “nudify” apps being promoted on its platforms.

According to Meta, nudify apps use “AI to create fake non-consensual nude or sexually explicit images.”

“Meta has longstanding rules against non-consensual intimate imagery, and over a year ago we updated these policies to make it even clearer that we don’t allow the promotion of nudify apps or similar services,” the company said in a blog post.

Meta shared that it removes ads, Facebook pages and Instagram accounts that promote these apps. The company will also “block links to websites hosting them so they can’t be accessed from Meta platforms, and restrict search terms like ‘nudify,’ ‘undress’ and ‘delete clothing’ on Facebook and Instagram so they don’t show results.”

Meta also filed a lawsuit against Joy Timeline HK Limited, the parent company behind CrushAI apps, “which allow people to create AI-generated nude or sexually explicit images of individuals without their consent.”

These are important steps to protect users, especially young people, as sextortion scams increase.

Earlier this year, 16-year-old Elijah Heacock died by suicide after he was blackmailed with AI-generated nudes of himself.

According to CBS News, Heacock received “a threatening text with an AI-generated nude photo of himself demanding he pay $3,000 to keep it from being sent to friends and family.” He died on Feb. 28.


Related: How to Protect Your Child From Sexual Exploitation

“The people that are after our children are well organized,” John Burnett, Elijah’s father, said. “They are well financed, and they are relentless. They don’t need the photos to be real, they can generate whatever they want, and then they use it to blackmail the child.”

According to the National Center for Missing and Exploited Children, teen boys have been specifically targeted with generative AI scams.

“They started asking Eli for money,” Shannon Heacock, his mom, told KFDA News. “This person was asking for $3,000. $3,000 from a child, and now we’re looking at $30,000 to bury our son and medical bills.”

She doesn’t want anyone else to endure the pain and suffering her family is dealing with.

“I don’t want another mother to ever face this, another sibling, another father to face this,” Shannon said. “I don’t want another school district to face this like we have.”

The government has joined the fight against sextortion.

The “Take It Down” Act, advocated for by First Lady Melania Trump and signed into law by President Donald Trump earlier this year, “makes it a federal crime to post real and fake sexually explicit images of someone online without their consent. The law also requires social media companies and other websites to remove such images within 48 hours of a victim’s request.”

Hopefully, Meta’s stance against the nudify apps that make sextortion possible helps keep children and adults safe from bad actors who want to exploit them.

Read Next: Sextortion Scams ‘Exploding’ Online: What Parents Need To Know

Questions or comments? Please write to us here.
