How TikTok’s Algorithms Show Users Potentially Harmful Content

Photo by Kon Karampelas on Unsplash

By Movieguide® Contributor

A new podcast exposé from the Wall Street Journal examines how TikTok’s algorithms expose users to harmful content.

This harmful content includes videos about self-harm, extreme dieting, and suicide. What makes it even worse is that these videos show up on TikTok users’ pages even when users never sought them out. In an episode of the Tech News Briefing podcast, the Wall Street Journal explores the impact the app is having on its users.

The episode focuses on Perri Kornbluh, an aspiring nurse from Rockland County, New York, who joined TikTok in March 2020 when she was 19.

“When I first got on TikTok, it was because I was bored in quarantine. I had no idea what I was getting into. In the beginning I did see a lot of the Tiger King Dance, Carole Baskin,” Kornbluh said.

However, the content soon took a darker turn. 

About a month after she joined the app, Kornbluh started seeing videos about eating disorders. Instead of scrolling past them, she stopped and watched – and the app’s algorithm noticed. 

“I saw a lot of videos of people saying, ‘What I eat in a day,’ and they were severely under-eating. They were showing three pieces of this or three pieces of that,” Kornbluh said.

The algorithm zeroed in on a struggle Kornbluh had kept secret: she had dealt with disordered eating in the past, starting a diet before her wedding that she never really stopped. The TikToks she saw amplified her inner turmoil.

“I’d never admitted that I had that problem,” Kornbluh confessed. “As soon as I got on TikTok and I started seeing all these other people looking sicker than me or doing stuff, and I was like, ‘Oh, I’m not that bad. I don’t really think I have a problem. I don’t need to get help. Look at them. They’re the ones who are really sick.’” 

Kornbluh wasn’t the only one being impacted by what she saw on TikTok.

Beth Hartman, a high school freshman in Manitowoc, Wisconsin, joined the app to watch dance videos and participate in the viral dance challenges herself. She, too, was soon bombarded with videos from users where they detailed what they ate each day – and she stopped to watch them. 

“The videos were only a hundred calories a day, and then they’re saying, like, ‘Oh, this is way too much. Now I got to go exercise for an hour and a half to get rid of it,’” Hartman said.

Like Kornbluh, she was also already struggling with an eating disorder. 

“I was already in my eating disorder,” Hartman shared. “It made me feel really bad about myself, and they made me think that maybe I should start picking up these behaviors because they have an eating disorder and they have these behaviors.”

So, how does this happen? Why is TikTok showing users this type of content? 

It all comes down to the TikTok algorithm. 

In order to keep showing users content that will make them want to return to the app, sites like TikTok, Netflix, and Amazon use algorithms that compare a user’s habits to those of similar users.

For example, when you finish watching a movie on Netflix, a menu pops up at the end with a list of movies that the streaming service recommends. These picks are based on the idea that people who watched this movie also watched that one, so you might like it, too. 
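The “people who watched this also watched that” idea can be sketched in a few lines of code. This is a minimal, illustrative co-occurrence recommender; the user names, video IDs, and data structure are invented for the example and have nothing to do with any real platform’s system.

```python
from collections import Counter

# Hypothetical watch histories: user ID -> list of video IDs they watched.
histories = {
    "u1": ["tiger_king", "dance_a", "dance_b"],
    "u2": ["tiger_king", "dance_a", "cooking"],
    "u3": ["dance_a", "dance_b", "cooking"],
}

def also_watched(video, histories, top_n=2):
    """Recommend videos that most often co-occur with `video`
    in other users' watch histories."""
    counts = Counter()
    for watched in histories.values():
        if video in watched:
            for other in watched:
                if other != video:
                    counts[other] += 1
    return [v for v, _ in counts.most_common(top_n)]

# Viewers of "tiger_king" also tended to watch the dance videos.
print(also_watched("tiger_king", histories))  # → ['dance_a', 'dance_b']
```

Real recommenders use far richer signals and much larger data, but the core logic — overlap between your history and other people’s — is the same.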

Munmun De Choudhury, an associate professor at the School of Interactive Computing at Georgia Institute of Technology, said, “They have created all these trajectories of people’s behavior. Sometimes it can be the person themselves, other times they have millions of other people who are … they’re finding out that this person seems to be like you because they live in the same neighborhood or they have the same gender identity or they have the same age group, or whatever other cue that they might be having.”

However, TikTok is different. 

“Being suggested what to buy is probably helpful,” De Choudhury said. “But it becomes problematic when similar approaches are transplanted on platforms like TikTok or Facebook or Instagram, because it’s not just about what content quote unquote sells, but also about how people interpret that content, and how is that content affecting others.”

According to the Wall Street Journal:

Earlier this year, the Wall Street Journal’s investigative team set up more than a hundred automated TikTok accounts, or bots, to try and find out how TikTok’s algorithm worked. The reporters filled out a TikTok profile for each bot with an age and a location. Some were registered as 13-year-olds. Separately, the Wall Street Journal team assigned the bots a short set of interests — topics like politics or astrology — and the bots were set loose to watch videos on TikTok for several days to see how quickly TikTok could learn their interests. At first, just like Beth and (Perri), the bots were shown a range of content, but then the algorithm started zeroing in on what would keep them watching. The team learned that unlike other platforms, TikTok’s algorithm only seemed to need one metric to figure its users out: what they stopped and watched. The bots that lingered over weight loss and exercise videos were quickly served more, until these topics made up more than half of the bots’ feed. One-third of the weight loss videos were about eating disorders.

When users stop to watch a video, content like it automatically gets sent to their page. Users like Kornbluh and Hartman stopped on videos about eating and weight loss, which led to the flood of similar clips. 
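The dynamic the Journal describes — one topic snowballing until it dominates a feed — can be sketched with a simple watch-time weighting loop. Everything here is an assumption for illustration (the topics, the starting weights, the update rule); it is not TikTok’s actual system, only a toy model of how a single “did they linger?” signal can crowd out everything else.

```python
# Toy model: a feed that boosts a topic in proportion to watch time.
# All names and numbers are hypothetical, not TikTok's real parameters.

def update_interest(weights, topic, seconds_watched, rate=0.3):
    """Nudge a topic's weight up in proportion to how long the user lingered."""
    weights[topic] = weights.get(topic, 0.0) + rate * seconds_watched
    return weights

def feed_share(weights, topic):
    """Fraction of the feed the topic would occupy if served proportionally."""
    total = sum(weights.values()) or 1.0
    return weights[topic] / total

# Start with equal interest in three topics.
interests = {"dance": 5.0, "cooking": 5.0, "weight_loss": 5.0}

# The user pauses on ten weight-loss videos; the others get scrolled past.
for _ in range(10):
    update_interest(interests, "weight_loss", seconds_watched=15)

print(round(feed_share(interests, "weight_loss"), 2))  # → 0.83
```

After just ten lingered-on videos, the toy feed is more than 80 percent one topic — the same runaway pattern the bot experiment observed.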

“TikTok has said WSJ’s bot experiment doesn’t reflect real user behavior, because humans have more diverse sets of interests,” said Zoe Thomas, the podcast host. “But they said even one person having that experience is one too many. TikTok users are aware this is how the system works, but Beth Hartman says she prefers her TikTok feed over other social media platforms she’s on.” 

“TikTok says, if users don’t like what they see, they can click a not interested button, which blocks the videos from a specific creator or ones with the same audio,” Thomas continued. “But (Perri) and Beth say the button didn’t change their feed. Beth found herself idolizing the people in the harmful content she saw on TikTok.”

Both Kornbluh and Hartman said that their eating disorders had gotten much worse after months on TikTok, and thus far, there doesn’t seem to be a solution.

However, some of these devastating problems could be addressed by parents teaching their children and teens media wisdom to discern what kind of content they’re consuming, even on social media.

Movieguide® Founder and Publisher Ted Baehr suggests teaching children and teens the following questions in order to properly discern the media:

  1. What kind of role models, positive and negative, are the main characters?
  2. Who is the hero? And, who is the villain? And, how do their character traits agree with a biblical hero or villain?
  3. Do the moral statements and themes agree with a biblical worldview?
  4. Are real consequences to sin exposed and rebuked?
  5. How are relationships and love portrayed?
  6. How are Christians, religion, the church, the Bible, and God portrayed?
  7. Does the language honor God and people?
  8. If violence is included, how is it presented?
  9. If physical romantic activity is included, how is it presented?
  10. How appropriate is this material for my family and me?

Furthermore, Baehr offers these five keys to media wisdom:

Key 1:  Understand the influence of the media on your children. In the wake of the Columbine High School massacre, CBS President Leslie Moonves put it quite bluntly: “Anyone who thinks the media has nothing to do with this is an idiot.” The major medical associations have concluded that there is absolutely no doubt that those who are heavy viewers of violence demonstrate increased acceptance of aggressive attitudes and aggressive behavior. Of course, media is only one part of the problem – a problem that could be summed up with the sage biblical injunction, “Do not be misled: ‘Bad company corrupts good character’” (1 Cor. 15:33). As the results of thousands of studies on youth violence prove, watching media violence causes violence among children. Bad company corrupts good character – whether that bad company is gangs, peer pressure or violent movies, video games and television programs.

Key 2:  Ascertain your children’s susceptibility at each stage of cognitive development. Not only do children see the media differently at each stage of development, but also different children are susceptible to different stimuli. As the research of the National Institute of Mental Health revealed many years ago, some children want to copy media violence, some are susceptible to other media influences, some become afraid, and many become desensitized. Just as an alcoholic would be inordinately tempted by a beer commercial, so certain types of media may tempt or influence your child at his or her specific stage of development.

Key 3:  Teach your children how the media communicates its message. Just as children spend the first 14 years of their lives learning grammar with respect to the written word, they also need to be taught the grammar of twenty-first-century mass media so that they can think critically about the messages being programmed for them.

Key 4:  Help your children know the fundamentals of Christian faith. Children need to be taught the fundamentals of Christian faith so that they can apply their beliefs and moral values to the culture and to the mass media of entertainment. Of course, parents typically have an easier time than teachers with this Key because they can freely discuss their personal beliefs. Yet, even so, it is interesting to note that cultural and media literacy and values education are two of the fastest growing areas in the academic community – a trend most likely due to the fact that educators are beginning to realize that something is amiss.

Key 5:  Help your children learn how to ask the right questions. When children know the right questions to ask, they can arrive at the right answers to the problems presented by the mass media of entertainment. For instance, if the hero in the movie your child is watching wins by murdering and mutilating his victims, will your children be able to question this hero’s behavior, no matter how likable that character may be?

Now more than ever we’re bombarded by darkness in media, movies, and TV. Movieguide® has fought back for almost 40 years, working within Hollywood to propel uplifting and positive content. We’re proud to say we’ve collaborated with some of the top industry players to influence and redeem entertainment for Jesus. Still, the most influential person in Hollywood is you. The viewer.

What you listen to, watch, and read has power. Movieguide® wants to give you the resources to empower the good and the beautiful. But we can’t do it alone. We need your support.

You can make a difference with as little as $7. It takes only a moment. If you can, consider supporting our ministry with a monthly gift. Thank you.

Movieguide® is a 501c3 and all donations are tax deductible.

