EU Lawmakers to Hold Big Tech Accountable, Combat Child Pornography

Photo by Guillaume Perigois via Unsplash

By Movieguide® Staff

European Union lawmakers recently recognized that COVID-19 was not the only pandemic plaguing their most vulnerable citizens.

EuroNews reported a steep increase in Child Sexual Abuse Material (CSAM) in 2020.

“I will propose legislation in the coming months that will require companies to detect, report, and remove child sexual abuse [content],” EU Home Affairs Commissioner Ylva Johansson told a German newspaper, Welt am Sonntag. “A voluntary report will then no longer be sufficient.”

The outlet reported:

The problem of child sexual abuse imagery is a growing one. In June 2020, the EU law enforcement agency Europol published a report stating that the COVID-19 pandemic had led to a “surge” in the distribution of CSAM online, with some EU member states seeing as much as a 25 per cent rise in reported cases.

In 2020, internet service providers and social media platforms in the EU filed 22 million reports of child sexual abuse, Johansson told Welt am Sonntag. This is thought to be just a fraction of the actual number of incidents.

In her comments to the paper, Johansson said the fight against child abuse should be better coordinated and called for the establishment of a specialist European centre to improve prevention, law enforcement, and victim support.

Over the next few months, the EU wants to implement laws to hold social media platforms accountable and fight against child pornography.

So far, Facebook and Instagram have declined to report CSAM, citing user privacy concerns.

Apple, known for its user security and privacy protections, shelved its plan to scan users’ iCloud photo libraries for CSAM.

This poses a unique challenge for lawmakers who want to combat CSAM while following privacy laws.

“As with any crime, the fight against online CSAM needs to be tackled in a way that is proportionate and lawful, meaning that interventions should be targeted against individuals or servers where there is reasonable suspicion,” European Digital Rights policy adviser Ella Jakubowska told TNW. “In contrast, the EU’s strategy seems to be to cast a dangerously wide net, proposing measures which might force service providers to scan each and every person’s private messages.”

Movieguide® previously reported:

A new report titled “Self-Generated Child Sexual Abuse Material: Youth Attitudes and Experiences in 2020,” found that the number of nude images shared amongst minors ages 9-12 doubled during the pandemic.

The share of minors ages 9-12 who are sharing self-generated nude images online more than doubled in 2020, and advocates involved in combatting online child sex abuse are worried about the trend, a new study shows.

“Self-generated child sexual abuse material has become a vital area of concern for those combating online child sexual exploitation. … [It] presents distinct risks for kids and unique challenges for the communities committed to protecting them,” researchers noted. “The interventions we pursue must be uniquely tailored to the experiences of young people and the offenders who may target them for victimization.”

One conclusion Thorn draws from the study’s numbers is that “minors may be operating with less supervision in online spaces, particularly among 9-12-year-olds, compared to 2019 numbers.”

“Use of secondary accounts … intended to keep content private from some groups like caregivers or friends, was up most significantly among this group and 9-12-year-olds reported the most significant drop in their frequencies for following set online safety rules,” the report explained.

YouTube, TikTok, Snapchat, Facebook, Instagram, and other social media platforms are popular sites noted by the study where SG-CSAM is shared.

While SG-CSAM encompasses coercive and consensual forms of sharing, Thorn and other child advocacy organizations such as Exodus Cry are worried about the trends.

“Young people continue to engage with SG-CSAM both through exploratory and higher risk coercive pathways. The findings from our 2020 survey underscore the persistence of demographic differences in kids’ attitudes and behaviors related to SG-CSAM,” Thorn said. “Continued data collection and analysis related to this topic, along with the impact of COVID, remains a vital need to deliver successful interventions that safeguard and support young people as they navigate their digital experiences.”