Google Introduces Blur Feature That Blocks Explicit Images from Search Results

Photo courtesy of Charles Deluvio via Unsplash


By Movieguide® Staff

Google recently unveiled a new online safety feature that will help users avoid graphic violence and pornographic images while using the search engine.

The company made the announcement during their Safer Internet Day event on Tuesday, Feb. 7.

At launch, the new feature will be enabled by default, blurring explicit images in search results. It works automatically whether or not SafeSearch is active.

“Unless your account is supervised by a parent, school, or administrator, you will be able to change your SafeSearch setting at any time…” Google spokesperson Charity Mhende told The Verge.

For parents, Google will allow supervision of accounts used by children under the age of 18, with options to block content or monitor user activity.

The Verge said:

SafeSearch is already the default for signed-in users under the age of 18, as it helps to filter out explicit content such as pornography, violence, and gore when using Google to search for images, videos, and websites. When the blur feature launches, it will appear as a new item within the SafeSearch menu, alongside the option to disable SafeSearch entirely and a filter option to additionally hide explicit text and links. Disabling SafeSearch entirely provides the most relevant results without hiding any content.

You can modify your SafeSearch filter by following Google’s instructions, but you’ll have to wait a while before the blur option is rolled out.

The blur filter will be a welcome feature for parents concerned with their children’s online safety.

With the rise of social media and digital use at home and in public schools, legislators and parents agree that media discernment and online safety are primary concerns today.

Movieguide® previously reported:

In the U.K., legislators are proposing the Online Safety Bill to protect children from harmful and inappropriate material.

Chris Philp, Minister for Tech and the Digital Economy, said that online tech companies would need stricter guidelines to truly protect child users.

“If platforms want children to use their services, they will need to protect them from accessing content that is harmful or inappropriate. If their services are meant for adults, they will need to prevent underage access,” he told BBC News. “Those who fail to comply will face massive fines and risk their services being blocked from access in the U.K.”

Similar laws are in the works in the United States, as legislators from both parties band together to protect the younger generation.
