
Report: TikTok Must Address Child Pornography Accounts On App
By Movieguide® Staff
After a recent review of the popular social media platform TikTok, Forbes reported that the app is home to child pornography.
Forbes writer Alexandra Levine reported that the criminal activity, known in legal terms as child sexual abuse material (CSAM), was not an isolated instance but was running rampant on the app.
Levine wrote that the posts connected to the child sexual abuse material appear unalarming at first glance.
“But often,” she said, “they’re portals to illegal child sexual abuse material quite literally hidden in plain sight — posted in private accounts using a setting that makes it visible only to the person logged in.”
According to the report, the accounts sharing CSAM use “post-in-private” settings to ensure the content bypasses the app’s standard algorithms and is only visible to people with the necessary account information.
Seara Adair, a survivor of child sexual abuse and an advocate for children’s safety, said that she contacted TikTok about the problem, noting that it could lead to widespread exploitation.
“There’s quite literally accounts that are full of child abuse and exploitation material on their platform,” she told Forbes. “Not only does it happen on their platform, but quite often, it leads to other platforms — where it becomes even more dangerous.”
The predators running the accounts reportedly try to recruit girls as young as 13 years old.
Haley McNamara, director of the International Centre on Sexual Exploitation, said that the issue extends beyond TikTok.
“There is this trend of either closed spaces or semi-closed spaces that become easy avenues for networking of child abusers, people wanting to trade child sexual abuse materials,” she told Forbes. “Those kinds of spaces have also historically been used for grooming and even selling or advertising people for sex trafficking.”
TikTok spokesperson Mahsau Cullinane said the social media app has “zero tolerance for child sexual abuse material and this abhorrent behavior, which is strictly prohibited on our platform.”
“When we become aware of any content, we immediately remove it, ban accounts, and make reports to [the National Center for Missing & Exploited Children],” Cullinane added.
Movieguide® previously reported:
Several state attorneys general recently launched an investigation into the video-sharing app TikTok and its alleged connection to users’ poor mental health.
California, Florida, Kentucky, Massachusetts, Nebraska, New Jersey, Tennessee, and Vermont led the investigation amid growing concern over the harmful effects of social media on young users and a lack of accountability from major tech companies.
“Our children are growing up in the age of social media — and many feel like they need to measure up to the filtered versions of reality that they see on their screens,” California Attorney General Rob Bonta said in a news release. “We know this takes a devastating toll on children’s mental health and well-being.”
Read the Forbes report: These TikTok Accounts Are Hiding Child Sexual Abuse Material In Plain Sight