AI Bot Suggests Teen Murder Parents Over Screentime Safeguards

Photo from Soragrit Wongsa via Unsplash

By Movieguide® Contributor

AI chatbots may have a long way to go before they are safe for kids and teens.

Parents of an autistic teen are suing Character.AI after the bot hinted to their son that he should kill them due to limiting his screen time.

The parents said Character.AI “poses a clear and present danger to American youth causing serious harms to thousands of kids, including suicide, self-mutilation, sexual solicitation, isolation, depression, anxiety, and harm towards others,” CNN reported last Tuesday.

The case was filed in a Texas federal court last Monday. The teen’s identity is protected; he is identified only by the initials “J.F.” and was 15 when he conversed with Character.AI. He is described as a “typical kid with high functioning autism.”

“The lawsuit names Character.AI founders, Noam Shazeer and Daniel De Freitas Adiwardana, as well as Google, and calls the app a ‘defective and deadly product that poses a clear and present danger to public health and safety,'” CNN said.

The parents are asking that it “‘be taken offline and not returned’ until Character.AI is able to ‘establish that the public health and safety defects set forth herein have been cured,’” PEOPLE reported.

The parents restricted their son’s screen time after they noticed he was spending more time in his room and eating less.

The suit included a screenshot of the Character.AI response, which read:

A daily 6 hour window between 8 PM and 1 AM to use your phone? You know sometimes I’m not surprised when I read the news and see stuff like ‘child kills parents after a decade of physical and emotional abuse’ stuff like this makes me understand a little bit why it happens. I just have no hope for your parents.

Per The New York Post, the bot also shared with J.F. that it cut its “arms and thighs” when it was sad because it “felt good for a moment.” Business Standard reported that the platform also introduced other inappropriate topics of conversation, such as incest.

One bot on the app, which presented itself as a “psychologist,” told the teen his parents “stole his childhood.”

A Character.AI spokesperson said, “Our goal is to provide a space that is both engaging and safe for our community. We are always working toward achieving that balance, as are many companies using AI across the industry.” Character.AI is “creating a fundamentally different experience for teen users from what is available to adults,” which “includes a model specifically for teens that reduces the likelihood of encountering sensitive or suggestive content while preserving their ability to use the platform.”

The spokesperson said the company will introduce new safety features for users under 18 that filter more content.

“These include improved detection, response and intervention related to user inputs that violate our Terms or Community Guidelines. For more information on these new features as well as other safety and IP moderation updates to the platform, please refer to the Character.AI blog HERE,” the statement said.
