New Study Finds That Almost 20% Of TikToks Contain Misinformation

Photo by Kon Karampelas on Unsplash

By Movieguide® Contributor

According to a new study, almost 20% of the videos presented to TikTok users contain misinformation. 

Researchers at Newsguard, a journalism and technology tool that tracks information online, searched TikTok and Google for information about topics like COVID-19 vaccinations, school shootings, and the 2020 election. 

They found that, compared to Google, TikTok “repeatedly delivered videos containing false claims in the first 20 results, often within the first five. Google, by comparison, provided higher-quality and less-polarizing results, with far less misinformation.”

Searching for terms like “mRNA vaccine” and “2022 election,” Newsguard found that 105 out of 540 videos, or 19.4%, contained misinformation and conspiracy theories. 

The report also claimed that TikTok “is consistently feeding millions of young users health misinformation, including some claims that could be dangerous to users’ health.”

These health claims include videos about herbal abortion methods that could seriously harm users’ health. 

In response to Newsguard’s findings, a TikTok spokesperson said that their community guidelines “make clear that we do not allow harmful misinformation, including medical misinformation, and we will remove it from the platform. We partner with credible voices to elevate authoritative content on topics related to public health, and partner with independent fact-checkers who help us to assess the accuracy of content.”

Movieguide® previously reported on how TikTok’s algorithm also shows potentially harmful content to users:

A new podcast exposé from the Wall Street Journal reveals how TikTok’s algorithm exposes users to harmful content.

This harmful content includes videos about self-harm, extremely harmful dieting, and suicide. What makes it even worse is that these videos show up on TikTok users’ pages even when users didn’t seek them out. In an episode of the Tech News Briefing podcast, the Wall Street Journal explores the impact this app is having on its users. 

So, how does this happen? Why is TikTok showing users this type of content? 

It all comes down to the TikTok algorithm. 

In order to keep showing users content that will make them want to return to the app, sites like TikTok, Netflix, and Amazon use an algorithm that compares their habits to those of similar users. 

For example, when you finish watching a movie on Netflix, a menu pops up at the end with a list of movies that the streaming service recommends. These picks are based on the idea that people who watched this movie also watched that one, so you might like it, too. 
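The “people who watched this also watched that” idea described above is a form of collaborative filtering. As a rough illustration of the concept, here is a minimal sketch in Python; the watch histories, user names, and the `recommend` function are all hypothetical, and real systems like Netflix’s or TikTok’s are far more sophisticated:

```python
# A toy sketch of "users who watched X also watched Y" recommendation.
# All data and names here are made up for illustration.
from collections import Counter

watch_history = {
    "alice": {"MovieA", "MovieB", "MovieC"},
    "bob":   {"MovieA", "MovieB"},
    "carol": {"MovieB", "MovieC"},
}

def recommend(user, history):
    """Rank titles watched by users whose history overlaps with `user`."""
    seen = history[user]
    scores = Counter()
    for other, titles in history.items():
        if other == user:
            continue
        overlap = len(seen & titles)  # shared titles act as a similarity signal
        if overlap:
            for title in titles - seen:  # only suggest titles the user hasn't seen
                scores[title] += overlap
    return [title for title, _ in scores.most_common()]

print(recommend("bob", watch_history))  # → ['MovieC']
```

Here “bob” is recommended MovieC because the users most similar to him (those who also watched MovieA and MovieB) watched it. The same basic logic, applied at enormous scale and to engagement signals rather than explicit watch lists, is what can steer a user toward whatever content similar users lingered on, for better or worse.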

Munmun DeChoudhury, an associate professor at the School of Interactive Computing at Georgia Institute of Technology, said, “They have created all these trajectories of people’s behavior. Sometimes it can be the person themselves, other times they have millions of other people who are … they’re finding out that, this person seems to be like you because they live in the same neighborhood or they have the same gender identity or they have the same age group, or whatever other cue that they might be having.”

However, TikTok is different. 

“Being suggested what to buy is probably helpful,” DeChoudhury said. “But it becomes problematic when similar approaches are transplanted on platforms like TikTok or Facebook or Instagram, because it’s not just about what content quote unquote sells, but also about how people interpret those content, and how is that content affecting others.”