
By Mallory Mattingly
As people continue to battle anxiety, depression and other mental health issues, many are relying on AI chatbots for faux therapy sessions.
In a survey of 1,060 teens, Common Sense Media found that 72% of teenagers have turned to an AI companion such as Character.ai, Nomi or Replika.
Thirty-three percent of respondents said they use AI companions for social interaction and relationships, while 30% use them for entertainment. More troubling, one-third “of users choose AI companions over humans for serious conversations.”
AI is “never going to replace human connection,” therapist, psychologist and researcher Vaile Wright said on the “Speaking of Psychology” podcast. “That’s just not what it’s good at.”
Yet according to the Harvard Business Review, companionship and therapy rank as the No. 1 reason people turn to these chatbots.
But that’s a really bad idea.
Wright explained that these AI chatbots were built “to keep you on the platform for as long as possible because that’s how they make their money.”
Because engagement drives revenue, the bots “basically tell people exactly what they want to hear,” Wright explained. “So if you are a person that, in that particular moment, is struggling and is typing in potentially harmful or unhealthy behaviors and thoughts, these types of chatbots are built to reinforce those harmful thoughts and behaviors.”
She noted that while AI chatbots can pull from vast amounts of data, they don’t understand human emotions.
“An AI chatbot, unfortunately, knows that some legal drug use makes people feel better,” she said. “It gives you a high, and if somebody is saying I’m low and depressed, that might be advice it gives. But it doesn’t understand that you don’t give that advice to people in recovery from illegal drug use.”
Omri Gillath, professor of psychology at the University of Kansas, echoed Wright’s sentiments, saying that AI can provide “momentary advantages and benefits. It’s always going to be polite and always going to say the right things.”
But AI cannot, and likely never will, fulfill the need for deep relationships.
“AI cannot introduce you to their network,” Gillath told CNBC’s Make It. “A hug would be so much more meaningful and helpful and beneficial.”
As teens turn to AI for connection and therapy, parents should remind them that real-world friendships will mean so much more in the long run than any virtual “friend.”