
Are Social Media Companies Actually Protecting Kids from Deadly Content?

Photo from Muhammad Asyfaul via Unsplash


By Movieguide® Contributor

Meta is collaborating with Snap and TikTok to find new ways to prevent the distribution of self-harm content on their apps.

“We’ve worked with the Mental Health Coalition to establish Thrive, a new program that allows tech companies to share signals about violating suicide or self-harm content and stop it spreading across different platforms,” Meta announced.

The company stated that although it has previously set up protections internally, it believes a multi-app partnership will be more effective.

“To be truly effective in responding to this content, tech companies need to work together,” they said. “That’s why we’ve worked with the Mental Health Coalition to establish Thrive, the first signal sharing program to share signals about violating suicide and self-harm content.”

When violating self-harm or suicide content is identified on one app, Thrive will share a signal with the other participating apps so they can take action to prevent the same content from spreading there as well.

“Meta said it is using technology it created and uses in conjunction with the Tech Coalition’s Lantern program — which aims to make technology safe for children and includes companies like Amazon, Apple, Google, Discord, OpenAI and more — to ensure that data is shared in Thrive securely,” NBC reported.

Meta says that between April and June of this year, it took action on more than 12 million pieces of suicide and self-harm content.

“Between April and June this year, we took action on over 12 million pieces of suicide and self-harm content on Facebook and Instagram,” the blog post explained. “While we allow people to discuss their experiences with suicide and self-harm — as long as it’s not graphic or promotional — this year we’ve taken important steps to make this content harder to find in Search and to hide it completely from teens, even if it’s shared by someone they follow.”

“The integration of signal sharing, coupled with cross-industry collaboration and moderated by an independent and neutral intermediary, represents a major breakthrough in industry collaboration and public protection on the global, public health crisis of suicide and ultimately save lives,” said Thrive’s director, Dr. Dan Reidenberg.

Social media companies have faced backlash over inadequate protections for teens. Most recently, Snapchat drew criticism for its lack of measures to prevent bullying and was sued in New Mexico. Movieguide® previously reported:

The lawsuit claims that certain Snapchat features enable predators to gain access to children. The lawsuit reads, “Snap’s features, including its algorithms, which mine patterns of consumption by users to recommend content and other users aligned with their interests to operate to match children with adult predators and drug dealers and deliver a string of sexualized, drug-related, or other dangerous content to children, predators, and others.”

The lawsuit continues by stating that Snapchat failed to implement the correct safety measures to protect users.

The lawsuit also cited the rape of an 11-year-old girl whom a predator found through Snapchat.

While it is encouraging that companies are working together to improve the safety of social media, these platforms still come with risks. It is ultimately up to parents to protect their children and monitor their activity online.