Parents Give Kids Chatbot for Mental Health Advice. Is That Safe?

Photo by Gertruda Valaseviciute via Unsplash

By Movieguide® Contributor

Some parents are turning to AI chatbots to coach their kids through mental health problems.

“Taylee Johnson, a 14-year-old near Nashville, Tenn., recently began talking to Troodi. She confided her worries about moving to a new neighborhood and leaving her friends behind, and fretting about a coming science test,” the Wall Street Journal reported.

“It sounds like you’ve got a lot on your plate at the moment, Taylee,” the bot said. “It’s understandable that these changes and responsibilities could cause stress.”

Troodi, an AI-powered mental health coach for kids, promises to be “a child-friendly companion for kids to express their emotions while providing parents with real-time insights — within a safe, secure digital environment on the Troomi Phone and Troomi Phone Pro.” But Taylee revealed, “Sometimes I forget she’s not a real person.”

The teen’s admission raises concerns about the role a chatbot like this could play in a child’s life. Within families especially, the bot could displace meaningful conversations between parents and children when a child is struggling.

Bill Brady, CEO of Troomi, the company behind the kid-friendly phones Troodi runs on, believes “online safety is inextricably linked to positive mental health. The goal with Troodi is to help kids work through any negative mental-health issues they’re having before they fester.”

Additionally, Troodi’s features keep conversations private, give parents control over their child’s interactions with the bot and direct kids to parents when “sensitive topics” arise.

But while the intentions behind Troodi seem sincere and the product reasonably safe, parents should still pause before letting a bot counsel their child on mental health, as it could set a precedent for turning to other bots later on, which can have dangerous consequences.

READ MORE: ARE AI FEARS BECOMING REALITY? CHATBOT LIES TO TESTERS

Many mental-health professionals “are concerned about children turning to general-use chatbots for mental-health support,” the Journal reported, and chatbots from platforms like Character.ai can raise red flags.

“The research shows that chatbots can aid in lessening feelings of depression, anxiety, and even stress,” Dr. Kelly Merrill Jr., an assistant professor at the University of Cincinnati, said. “But it’s important to note that many of these chatbots have not been around for long periods of time, and they are limited in what they can do. Right now, they still get a lot of things wrong. Those that don’t have the AI literacy to understand the limitations of these systems will ultimately pay the price.”

A “boy in Florida killed himself last year after confiding in a chatbot on Character AI, which isn’t built for therapeutic assistance,” the WSJ said.

READ MORE: MOM BELIEVES AI CHATBOT LED SON TO SUICIDE. WHAT PARENTS NEED TO KNOW.

Stephen Schueller, a professor of psychological science and informatics at the University of California, Irvine, said, “My biggest concern is that kids may disclose things to the chatbot but not open up to anyone else.”

While something like Troodi could be a useful tool, parents should keep lines of communication open with their children so kids feel comfortable coming to those who love them most for help.

