Former TikTok Moderators Sue App Over ‘Extremely Disturbing’ Content

By Movieguide® Contributor

Two former TikTok moderators are suing the video-sharing app, claiming they suffered emotional trauma from viewing “highly toxic and extremely disturbing” videos every day.

TikTok moderators review videos posted on the app and determine whether they break any of the app’s content rules and guidelines.

“We would see death and graphic, graphic pornography. I would see nude underage children every day,” Ashley Velez said. “I would see people get shot in the face, and another video of a kid getting beaten made me cry for two hours straight.”

Velez and another moderator, Reece Young, have filed a federal lawsuit seeking class action status against TikTok and its parent company, ByteDance. 

“You see TikTok challenges, and things that seem fun and light, but most don’t know about this other dark side of TikTok that these folks are helping the rest of us never see,” said lawyer Steve Williams of the Joseph Saveri Law Firm, which filed the case.

According to the suit, Young and Velez worked in an unsafe environment because TikTok failed to provide adequate mental health support for the anxiety, depression, and post-traumatic stress they developed from watching such disturbing content.

TikTok has not officially commented on the suit, but a spokesperson for the company said that it “strives to promote a caring working environment for our employees and contractors.”

Though it’s the No. 1 app, TikTok continually proves to do more harm than good for users, serving up videos through damaging algorithms.

Movieguide® has previously reported on the harmful content to which TikTok’s typically underage users are exposed:

The Wall Street Journal recently published a report that shed light on how the social media site TikTok exposes minors to pornographic content and drug usage through search algorithms.

For the report, titled “How TikTok Serves Up Sex and Drug Videos to Minors,” the Journal created fake accounts representing users between 13 and 15 years old to observe what content the app emphasized.

“TikTok served one account registered as a 13-year-old at least 569 videos about drug use, references to cocaine and meth addiction, and promotional videos for online sales of drug products and paraphernalia. Hundreds of similar videos appeared in the feeds of the Journal’s other minor accounts,” the report found. “TikTok also showed the Journal’s teenage users more than 100 videos from accounts recommending paid pornography sites and sex shops. Thousands of others were from creators who labeled their content as for adults only.”

The Wall Street Journal sent 974 videos to TikTok, noting that adult videos and drug content were “served to the minor accounts — including hundreds shown to single accounts in quick succession.”

The Christian Post reported that of the 974 videos in question, 169 of them were removed before the newspaper sent the content to TikTok. TikTok removed an additional 255 videos after the report was published.