
Children Creating Child Abuse Material With AI, Experts Say
By Movieguide® Contributor
As young people’s understanding of and skill with AI increases, some teens are using the technology to create potentially exploitative content of other children.
A study found that 74% of 16- to 24-year-olds in the UK have used OpenAI’s ChatGPT, Snapchat AI or Google Bard.
“Students using AI regularly is now commonplace. In fact, their understanding of AI is more advanced than most teachers’, creating a knowledge gap. This makes keeping pupils safe online and preventing misuse increasingly difficult,” said Tasha Gibson, online safety manager at RM Technology.
“With AI set to grow in popularity, closing this knowledge gap must become a top priority,” she added.
Although most British teens incorporate AI into their daily lives, 58% of UK internet users are “concerned about [AI’s] future impact on society.”
One of the biggest concerns is that “children are making indecent images of other children using artificial intelligence (AI) image generators,” the BBC reported.
The report said children might need help to understand that what they were making was considered child abuse material.
“[We] need to see steps being taken now, before schools become overwhelmed and the problem grows,” said David Wright, director of the UK Safer Internet Centre.
“Young people are not always aware of the seriousness of what they are doing, yet these types of harmful behaviors should be anticipated when new technologies, like AI generators, become more accessible to the public,” he continued. “An increase in criminal content being made in schools is something we never want to see, and interventions must be made urgently to prevent this from spreading further.”
Victoria Green, CEO of the Marie Collins Foundation, a charity that helps sexually abused children, explained that the real danger lies in who could get hold of the AI-generated material.
“The imagery may not have been created by children to cause harm but, once shared, this material could get into the wrong hands and end up on dedicated abuse sites,” Green stated.
“There is a real risk that the images could be further used by sex offenders to shame and silence victims,” she concluded.