
By Gavin Boyle
Newly unsealed documents from a years-long court case accusing social media companies of making their products addictive reveal that Meta’s internal researchers found its platforms to be not only highly addictive but also harmful to users’ mental well-being.
“Meta has designed social media products and platforms that are addictive to kids, and they’re aware those addictions lead to a whole host of serious mental health issues,” said Previn Warren, the co-lead attorney for the plaintiffs in the case. “Like tobacco, this is a situation where there are dangerous products that were marketed to kids. They did it anyway, because more usage meant more profits for the company.”
While social media’s spread across the country, particularly among minors, has been compared to tobacco products before, this court case has strengthened the parallel by revealing that Meta has known its products are harmful since at least 2017.
Related: Surgeon General Recommends Warning Labels for Social Media
“[O]h my gosh y’all, [Instagram] is a drug,” one internal researcher wrote.
“Teens talk of Instagram in terms of an ‘addicts narrative’ spending too much time indulging in a compulsive behavior that they know is negative but they feel powerless to resist,” reads a statement from another internal researcher.
Furthermore, the court case revealed multiple studies run by Meta in which users were asked to stop using their social media accounts for a week. Initial studies found these users reported improved well-being during that week. The studies were then discontinued, and the preliminary results were never published.
These unsealed internal documents could be extremely damaging to Meta’s reputation and could help drive nationwide social media legislation – momentum that has been building since the U.S. Surgeon General called for a warning label to be placed on social media in 2024. Furthermore, Meta’s safety features, such as teen accounts, may also be seen in a new light: as a way to appear protective of young users without changing the features the company knows are actually harmful.
“We strongly disagree with these allegations, which rely on cherry-picked quotes and misinformed opinions in an attempt to present a deliberately misleading picture,” said Meta spokesperson Andy Stone, who claimed the company has been cautious in how it treats its younger users.
“The full record will show that for over a decade, we have listened to parents, researched issues that matter most, and made real changes to protect teens – like introducing Teen Accounts with built-in protections and providing parents with controls to manage their teens’ experiences,” Stone continued. “We’re proud of the progress we’ve made and we stand by our record.”
More than ever, parents should be wary of Meta and the steps it has taken to protect young users. It is clear that the company has known its products are destructive for years and has done little to make significant changes. Until the company can explain why it ignored multiple studies that found its products had a negative effect on mental well-being, users should not trust that Meta has their best interests in mind.
Read Next: FTC Can Reopen Child Privacy Investigation Against Meta
