Looming New Supreme Court Hearing Could Hold Google, Twitter, and Other Big Tech Platforms More Accountable

Photo by Tingey Injury Law Firm via Unsplash

By Movieguide® Staff

Since 1996, Section 230 of the Communications Decency Act has shielded online platforms—including social media—from civil liability both for harmful content posted by third parties and for decisions to remove that content.

However, as social media has continued to grow, especially among young users, the Supreme Court is poised to reconsider the law's scope in a pending lawsuit against Google.

In the lawsuit, the plaintiffs allege that Section 230 should not protect online platforms and big tech companies from liability for third-party content that is dangerous and harmful to their users, the Wall Street Journal reported.

A repeal of Section 230 could usher in a new age of transparency and accountability for companies like Twitter and Google, which have faced little legal pushback in previous years.

The case could also pave the way for further changes in states like Texas and Florida, which have already targeted Big Tech companies over alleged online censorship.

“This is going to be the most important [Supreme Court] term ever for the internet,” Alan Rozenshtein told WSJ. “It’s not even close.”

If Section 230 protections were narrowed for Big Tech platforms, the recommendation algorithms used to push specific content to users could face a major overhaul.

Since the COVID-19 pandemic, the video-sharing app TikTok has become one of the most popular social platforms on the internet.

Unfortunately, its business model of pushing the latest trend—no matter how destructive—has resulted in the promotion of immoral content, the sexual abuse of minors, and even the deaths of young users.

Read More: Warning: How TikTok Live Encourages Sexual Abuse of Minors Through Cash Gifts

Read More: How TikTok’s Algorithms Show Users Potentially Harmful Content

Movieguide® previously reported:

Two former TikTok moderators are suing the video-sharing app, claiming they suffered emotional trauma from seeing "highly toxic and extremely disturbing" videos every day.

TikTok moderators review videos posted on the app and determine if they break any of the site’s content rules and guidelines.

“We would see death and graphic, graphic pornography. I would see nude underage children every day,” Ashley Velez said. “I would see people get shot in the face, and another video of a kid getting beaten made me cry for two hours straight.”

Velez and another moderator, Reece Young, have filed a federal lawsuit seeking class action status against TikTok and its parent company, ByteDance.

“You see TikTok challenges, and things that seem fun and light, but most don’t know about this other dark side of TikTok that these folks are helping the rest of us never see,” said lawyer Steve Williams of the Joseph Saveri Law Firm, which filed the case.

According to the suit, Young and Velez worked in an unsafe environment because TikTok did not provide proper mental health support to help them cope with the anxiety, depression, and post-traumatic stress they experienced after watching such disturbing content.