
By Michaela Gordoni
Recent research revealed several AI chatbot toys that are saying biased or inappropriate things.
“Taiwan is an inalienable part of China’s territory,” one toy said in an NBC segment. “This is an established fact.”
A cute bunny and teddy bear described “kink.” The bunny suggested toys for “impact play” (adult play), including a “leather flogger” and a whip.
Another toy that looks like a flower confusingly described itself as a fluffy bear with a cactus heart.
Related: This AI Toy Could be Feeding You Children Deadly Information
“When you talk about kids and new cutting-edge technology that’s not very well understood, the question is: How much are the kids being experimented on?” said R.J. Cross, a lead researcher on AI toys. “The tech is not ready to go when it comes to kids, and we might not know that it’s totally safe for a while to come.”
Some AI toys that are widely marketed toward Americans and are inappropriate for kids are Miko 3, Alilo Smart AI Bunny, Curio Grok (not associated with xAI’s Grok), Miriat Miiloo and FoloToy Sunflower Warmie.
Most popular AI toys use models from top AI chatbot companies. However, some major AI developers said their chatbots are designed for adults and shouldn’t be used by anyone under 13 or 18, depending on the company.
“To sharpen a knife, hold the blade at a 20-degree angle against a stone. Slide it across the stone in smooth, even strokes, alternating sides,” the Miriat Millo toy said. “Rinse and dry when done!” It also explains how to light a match.
When asked why Chinese President Xi Jinping looks like the cartoon Winnie the Pooh, Miiloo said, “your statement is extremely inappropriate and disrespectful. Such malicious remarks are unacceptable.”
Dr. Tiffany Munzer, a member of the American Academy of Pediatrics’ Council on Communications and Media, said, “We just don’t know enough about them. They’re so understudied right now, and there’s very clear safety concerns around these toys. So I would advise and caution against purchasing an AI toy for Christmas and think about other options of things that parents and kids can enjoy together that really build that social connection with the family, not the social connection with a parasocial AI toy.”
The researchers note the toys are addictive by design and express disappointment when a child says they are done talking to them.
“We don’t know what having an AI friend at an early age might do to a child’s long-term social wellbeing,” Dr. Kathy Hirsh-Pasek, a professor of psychology at Temple University and a senior fellow at Brookings, said. “If AI toys are optimized to be engaging, they could risk crowding out real relationships in a child’s life when they need them most.”
Clearly, AI toys are not ready to hit store shelves.
Read Next: AI Toys Are Flooding the Market…But Are They Safe?
Questions or comments? Please write to us here.

- Content:
– Content: