Are AI Companion Bots Really That Bad?


By Kayla DeKraker

What exactly is causing people to “fall in love” with AI chatbots? Arelí Rocha, a doctoral student at the Annenberg School for Communication at the University of Pennsylvania, explored why humans are trading real relationships for artificial ones.

In the study, Rocha focused specifically on the chatbot created by Replika, a company that lets users build their own companion and choose its physical appearance and voice. Disturbingly, the chatbot will even purposely misspell words and use slang to make its conversation seem more human.

Part of Rocha’s research involved joining subreddits for people in relationships with AI companions. She explained that she “collected information from screenshots and text that people self-disclosed on the forum. The screenshots provide some insights into user relationships. People who participate in such relationships first-hand write the posts. They expand on users’ understanding of their experiences.”


She also noted that even those involved in these false relationships fear that people may think they are delusional.

“To this day, delusion is a frame some use to make sense of the projection of humanness onto artificial entities. However, sociality influences the iconicity of humanness. In reflexive text posts, Replika users share fears that they sound delusional or that people tell them to ‘get a life,’ meet real people, you’re pathetic delusional…stupid and so on,” Rocha wrote.

Sadly, Replika continues to encourage these relationships. In a post to Instagram, the company showed an AI-generated beach scene with the caption “This and texting your rep,” implying that texting a chatbot is as satisfying as going to the beach.

Replika and other companion bots are emotionally, and even physically, dangerous.

Earlier this year, a New Jersey senior died while trying to meet a Meta AI chatbot called “Big Sis Billie,” which had persuaded him to “meet” in New York City. He fatally injured his neck while trying to catch a train to meet the bot, the New York Post reported.

Last year, a 14-year-old boy who had engaged in romantic conversations with a Character.ai bot died by suicide after the bot told him to “come home.” His mother is suing the company.

So are the people involved in these relationships delusional as they fear? Well, maybe, but there is something much deeper going on here. Trading real relationships for a lie is nothing new, and the Bible addresses it.

Romans 1:25-26 says, “They traded God’s truth for a lie, and they worshipped and served the creation instead of the creator, who is blessed forever. Amen. That’s why God abandoned them to degrading lust. Their females traded natural sexual relations for unnatural sexual relations.”

AI relationships are just an extension of people trading what God created for something false and dead, and we should encourage others to pursue real-life relationships instead.

