
By Michaela Gordoni
Google Gemini will soon be available to kids under the age of 13.
The AI tool will be available to kids only through Family Link, Google's parental control system, which lets parents decide how long their kids spend on certain apps and control what they can access, Tom's Guide reported this week.
Family Link also allows parents to see their child’s activity on Gemini and receive notifications when their child uses it. Educational institutions that use Google Workspace will have controls for Gemini student usage, ET Edge Insights reported.
Google hasn’t stated what rules will be in place for kids on the platform, but it has said it won’t use their activity to train its AI tools. In the past couple of years, Google has made it a priority to identify how AI tools could be unsafe for kids and add safety guidelines.
“Gemini Apps will soon be available for your child,” Google said this week. “That means your child will be able to use Gemini to ask questions, get homework help, and make up stories.”
The company admitted that the AI tool can make mistakes and encouraged parents to assist their children when they use it.
Related: Google’s AI Gemini Can Now ‘Reason’ … But What Does That Mean?
Kids will be able to use Gemini apps on the web and via mobile app stores, use Gemini as a mobile assistant on Android devices, chat with the GenAI chatbot on Android devices (even when the device is locked), and access Google Assistant's help features, Inside Halton reported. Parents will be notified when a child starts to use the tool.
Google warned that “Gemini isn’t human” and “Even though it sometimes talks like one, it can’t think for itself, or feel emotions.”
Gemini's FAQ page explains that the AI tool can "hallucinate," meaning the model presents false information in an authoritative, convincing way. This false content can appear in answers about a person, a topic, or any other query a user submits.
Chatbots like Gemini are intended to be helpful assistants. Companion bots, like those on Character.ai, are more dangerous: they pose as friends or lovers, and they can offer advice and say things that are persuasive and extremely harmful to young people. Character.ai and other companion bot makers are currently facing lawsuits alleging harm to users.
Google appears intent on making Gemini safe for kids. Hopefully, it will tread slowly and keep examining whether the tool is ultimately helpful or harmful for children going forward.
Read Next: Google Releases ‘Most Capable’ AI Model to the Public