
By Gavin Boyle
A family is suing Roblox and Discord after their 10-year-old daughter was abducted by a man who posed as a peer on Roblox and conversed with her on Discord to learn her address.
“No child should ever be placed in harm’s way to begin with. Roblox and Discord didn’t just fail to protect her; they built and sustained the very ecosystem that allowed this predator to thrive,” said Kristen Gibbons Feden, a shareholder at Anapol Weiss, the law firm representing the family. “Roblox gave him direct access to a child. Discord gave him the tools to isolate and manipulate her. Both companies had countless chances to intervene, and they chose not to.”
The lawsuit argues that, despite Roblox and Discord's insistence that they prioritize user safety, the companies in fact place millions of children at risk by putting profits first.
“This case is horrifying, but it’s the natural consequence of how these platforms choose to operate,” said Alexandra Walsh, a partner at Anapol Weiss. “Roblox and Discord know predators are exploiting children on their platforms. They’ve known for years, and for years, refused to implement simple and essential protections. This wasn’t a loophole. It was the system working exactly as they designed it.”
For years, Roblox has been accused of enabling predators to exploit children through weak safety features. Last fall, the National Center on Sexual Exploitation (NCOSE) conducted a study that found over 100,000 users sharing child sexual abuse material across the site. Roblox also hosts games that allow predators to act out their perverted fantasies.
Related: Roblox’s New Feature Could Put Users’ Safety in Jeopardy
Meanwhile, Discord has come under similar fire from the organization, landing on the NCOSE Dirty Dozen list four years in a row because of policies that allow rampant child sexual exploitation.
“Even though Discord claims to have made changes to prevent exploitation, these policies are merely performative,” NCOSE reported, saying that it and other child safety experts “have proven these safety changes to be defective.”
These platforms, like most other big tech creations, are focused primarily on profit. Any changes they make to improve safety stem from a desire to stay out of the headlines and keep investors happy rather than from genuine care for their users. Time and again, this business model has been shown to produce negligible change that is easily circumvented.
Hopefully this lawsuit will be decided in the family’s favor and these companies will finally be held accountable, rather than shrugging off their failings as they have in the past.
Read Next: This Chat App Your Child Probably Uses Exposes Them to Predators
Questions or comments? Please write to us here.