Bumble and TikTok Join Industry Partnership to Stop Virtual Sexual Abuse
By Movieguide® Contributor
The National Center on Sexual Exploitation (NCOSE) applauds Bumble and TikTok for their efforts to confront image-based sexual abuse.
Bumble and TikTok joined Twitter and Facebook in an industry partnership to stop the spread of nonconsensual sexual or sexualized material. This includes material depicting child sexual abuse, rape, sex trafficking, and prostitution, as well as videos made by hidden cameras, deepfake images, and leaked photos.
“We are thankful that Bumble and TikTok are joining efforts led by StopNCII.org to confront the growing epidemic of image-based sexual abuse, which can victimize anyone in an instant and cause lifelong trauma,” said Dawn Hawkins, CEO of NCOSE.
There are currently no federal laws requiring companies to remove sexual material that was uploaded without consent. Whether such content comes down depends on each company and its own policies, and victims of virtual sexual abuse are often ignored when they ask to have their images removed.
“We are hopeful that when tech leaders work together, that image-based sexual abuse will be stopped. More must be done, and Congress should prioritize passing the PROTECT Act, which would ensure that federal law protects victims of image-based sexual abuse from websites monetizing and distributing their abuse,” Hawkins said.
According to the NCOSE, a 2017 survey conducted on Facebook found that of the 3,044 participants, 1 in 8 had been targets of the distribution, or threat of distribution, of sexually graphic images without their consent. The study also found that women were 1.7 times more likely to be targeted than men.
Bumble was recognized by NCOSE in June 2022 for adding safety features to proactively block sexually explicit material, and for helping push through legislation in several states to criminalize “cyberflashing” – when people send unsolicited pictures of their genitals.
Movieguide® previously reported on the NCOSE:
The National Center on Sexual Exploitation (NCOSE) Law Center recently condemned Twitter for its policies, or lack thereof, regarding pornographic content on its site.
With over 200M active users, Twitter is the only mainstream social media platform not to moderate sexual content on its site.
In a recent lawsuit, the NCOSE Law Center is representing two high school-aged boys whom Twitter refused to help despite their being harassed and sexually exploited on its site.
The NCOSE wrote: “Law Center is presently representing two young men who were sexually abused and trafficked on Twitter in a groundbreaking lawsuit against the Big Tech giant. In the case of one of the young men, when he was 13 years old, he thought he was chatting online with an adolescent girl and exchanged nude photos.”
“It quickly became apparent that the ‘girl’ was a sex trafficker posing as a child and who then tried to blackmail the young boy by telling him to send him more pornographic videos. Out of fear, he complied, but stopped communicating with the traffickers and thought he was free. But he wasn’t.”
Now, the boy is 16, and the images have resurfaced. However, when the boy and family reached out to Twitter, the social media company did nothing.
“When the boy and his family found out, they reached out to Twitter, asked to have this removed and provided proof of age. But instead of removing the videos, Twitter did nothing, even reporting that the obvious child sexual abuse material (CSAM) did not violate any of their policies! As a result, more than 167,000 views were made of the video before the direct involvement of a law enforcement officer finally got Twitter to take it down.”
NCOSE added that despite Twitter’s lackluster response, its lawsuit is moving forward in federal court.
“With these two boys, the NCOSE Law Center is taking on Goliath and winning! The influential technology-news website The Verge recently discussed this case in a long article about Twitter’s poor record of allowing child sex abuse material on its platform,” the NCOSE wrote.