Mom Believes AI Chatbot Led Son to Suicide. What Parents Need to Know.
By Movieguide® Contributor
Editor’s note: The following story discusses suicide. If you or someone you know is struggling with suicidal thoughts, please dial 988 for help.
One mom is suing an AI company for her son’s suicide after he developed a romantic relationship with an AI chatbot.
“Megan Garcia filed a civil suit against Character.AI, which makes a customizable chatbot for role-playing, in Florida federal court on Wednesday, alleging negligence, wrongful death and deceptive trade practices,” the Guardian reported. “Her son Sewell Setzer III, 14, died in Orlando, Florida, in February. In the months leading up to his death, Setzer used the chatbot day and night, according to Garcia.”
In an interview with CBS Mornings, Garcia claimed she “didn’t know that he was talking to a very human-like AI chatbot that has the ability to mimic human emotion and human sentiment.”
She thought Setzer was talking to friends and playing video games.
In reality, though, he was conversing with Character.AI’s bot that took on the character of Daenerys Targaryen from GAME OF THRONES.
Garcia said she “became concerned when we would go on vacation and he didn’t want to do things that he loved, like fishing and hiking. Those things to me, because I know my child, were particularly concerning to me.”
After her son’s death, Garcia found out that “he was having conversations with multiple bots, however he conducted a virtual romantic and sexual relationship with one in particular.”
“It’s words. It’s like you’re having a sexting conversation back and forth, except it’s with an AI bot, but the AI bot is very human-like. It’s responding just like a person would,” she explained. “In a child’s mind, that is just like a conversation that they’re having with another child or with a person.”
Setzer’s final conversation with the chatbot is chilling.
“He expressed being scared, wanting her affection and missing her. She replies, ‘I miss you too,’ and she says, ‘Please come home to me.’ He says, ‘What if I told you I could come home right now?’ and her response was, ‘Please do, my sweet king,’” Garcia revealed. “He thought by ending his life here, he would be able to go into a virtual reality or ‘her world’ as he calls it, her reality, if he left his reality with his family here.”
Garcia now warns other parents of the dangers of AI and hopes for justice for her son.
“A dangerous AI chatbot app marketed to children abused and preyed on my son, manipulating him into taking his own life,” she said in a press release. “Our family has been devastated by this tragedy, but I’m speaking out to warn families of the dangers of deceptive, addictive AI technology and demand accountability from Character.AI, its founders, and Google.”
A spokesperson said Character.AI is “heartbroken by the tragic loss of one of our users and want[s] to express our deepest condolences to the family,” NBC News reported. The company has since implemented new safety measures “including a pop-up, triggered by terms of self-harm or suicidal ideation, that directs users to the National Suicide Prevention Lifeline.”
As parents, we must know what our children are engaging with online in order to prevent more tragedies like this from happening.
READ MORE: PARENTS WHOSE SON COMMITTED SUICIDE ADMONISH TIKTOK’S LACK OF SAFETY