
By India McCarty
A new analysis of TikTok finds that the video app pushes mental health content to users at a higher rate than almost any other topic.
“TikTok’s algorithm favors mental health content over many other topics, including politics, cats and Taylor Swift,” an analysis conducted by The Washington Post stated, adding that mental health content is “stickier” than other videos.
This “stickiness” means the algorithm more readily surfaces additional mental health-related content on your feed, and that such content is harder to shake, even if you stop watching videos on the topic.
Kailey Stephen-Lane told the outlet she had to stop using the app for exactly this reason; she has OCD, and TikTok bombarded her with videos about the condition, worsening her symptoms.
“The TikToks that I’ve been getting are not helpful to my recovery,” she said. “They lead me down a lot of spirals, and me just clicking ‘not interested’ doesn’t seem to work anymore.”
Complicating matters? The videos are not always accurate, leaving viewers with a distorted picture of conditions like depression and autism.
“The algorithm says, ‘Well, you like this video about ADHD, even though it’s misleading, let’s give you another video,’” Anthony Yeung, a psychiatrist and University of British Columbia researcher, said. “And it becomes this very vicious feedback loop of misinformation.”
This practice of pushing mental health content can have darker side effects. New Amnesty International research alleges the app promotes content to young users that pushes them towards depression, suicidal ideation, and self-harm.
“Our technical research shows how quickly teenagers who express an interest in mental health-related content can be drawn into toxic rabbit holes. Within just three to four hours of engaging with TikTok’s ‘For You’ feed, teenage test accounts were exposed to videos that romanticized suicide or showed young people expressing intentions to end their lives, including information on suicide methods,” said Lisa Dittmer, Amnesty International’s Researcher on Children and Young People’s Digital Rights.
TikTok is currently being sued by 14 attorneys general who allege the company falsely advertised that its For You page algorithm isn’t addictive.
Newly unsealed documents detailing TikTok’s own internal research revealed the company was aware that “users only need to watch 260 videos before they could become addicted to the app,” and that “compulsive usage correlates with a slew of negative mental health effects like loss of analytical skills, memory formation, contextual thinking, conversational depth, empathy, and increased anxiety.”
The documents also acknowledged that the algorithm had “better engagement” with young people.
While TikTok’s mental health content might seem helpful to people attempting to navigate their own diagnoses, it’s clear the app pushes these videos to its users, no matter how harmful they might be.