Recent Lawsuit Proves Once Again That Roblox Isn’t Safe for Kids

Photo from Oberon Copeland via Unsplash

By Kayla DeKraker

Roblox is undergoing a public legal battle after one dad accused the game of exploiting his then 13-year-old son.

According to the boy’s father, Steve, his son met someone on the platform posing as a 16-year-old. The perpetrator convinced the boy to move the conversation to Discord, a messaging app popular with gamers, where he then offered gift cards for Robux, Roblox’s in-game currency, in exchange for explicit content from the minor.

The predator also threatened the boy after he failed to show up for an in-person meeting.

“He had our home address, what school he went to, his phone number, everything,” Steve said.

A Roblox spokesperson told ABC News of the lawsuit, “We are deeply troubled by any allegations about harms to children online and are committed to setting the industry standard for safety. To protect our users, we have rigorous, industry-leading policies, including limiting chat for younger users and employing advanced filters designed to block the sharing of personal information. Roblox also does not allow users to share images or videos. We also collaborate closely with law enforcement.”

Related: Roblox Isn’t Safe for Your Kids. In Fact, It’s a ‘Pedophile Hellscape’

Roblox attempted to keep the lawsuit, which is taking place in San Mateo, California, private. But last week, California Superior Court Judge Nina Shapirshteyn ruled that the ongoing case will remain public.

This case is just one of many exposing the predatory practices that take place on the popular gaming platform.

One law firm has filed over 15 lawsuits across the nation against Roblox for families in similar situations. The Law Firm Chronicle explained, “The firm asserts that these dispute clauses, hidden within user agreements, are legally unenforceable against children who lack the capacity to enter such contracts, and represent a significant barrier to justice for some of society’s most vulnerable online users.”

“It is morally indefensible and legally questionable for a company like Roblox to hide behind an arbitration clause when children have been sexually exploited and harmed through activities facilitated on its platform,” said Matthew Dolman of Dolman Law Group of the many cases against the platform. “Children cannot legally consent to contracts in the same way adults can, and these clauses are designed to silence survivors, obscure corporate accountability, and keep these heinous acts out of the public eye.”

He added, “We are fighting to ensure these young survivors get their day in court and that justice is served transparently.”

Discord, the other app involved, has also come under scrutiny for its potential lack of child safety measures.

Earlier this year, the platform made it onto the National Center on Sexual Exploitation’s “Dirty Dozen” social media list for the fourth year in a row.

NCOSE said of Discord, “Predators flock to Discord to coerce children into sending them sexually explicit images, also known as child sexual abuse material (CSAM), which they then trade amongst each other on different servers. They also frequently use artificial intelligence to create CSAM and image-based sexual abuse (IBSA, a form of sexual violence that includes the non-consensual creation or distribution of sexually explicit images).”

As online platforms prove time and time again that they are not safe, parents should stop relying on these companies to supervise their children. Instead, they can set firm boundaries themselves or make the wise choice to say no to dangerous gaming and social media use altogether.

Read Next: Is Discord Safe for Your Child? Here’s What Parents Need to Know

