Google Implements New Policy to Obscure Explicit Images in Search Results
Google introduced new online safety tools on Safer Internet Day in February to address concerns about children's safety online. One of the notable features is expanded SafeSearch, which aims to make the internet a safer space for exploration.
Google has now rolled out expanded SafeSearch for everyone. Here's what we know about it.
What is SafeSearch?
According to Google, SafeSearch is a content-filtering tool that protects users from potentially explicit, inappropriate, or offensive material in Google Search and Google Images. It is automatically enabled for users under the age of 18 to keep them safer while browsing the web. It works only for searches made through Google Search, not other search engines.
Despite these measures, Google reports that around 61 percent of children will come across inappropriate content online at least once this year, up from 54 percent last year.
Google's Expanded SafeSearch
Google has now strengthened SafeSearch with additional functions to protect children online. The new expanded SafeSearch setting automatically blurs explicit images, such as graphic, violent, or adult content, when they appear in search results. It is designed to act as a safeguard even for users who do not have SafeSearch filtering enabled, obscuring inappropriate content by default.
Parents or school network administrators can lock this setting so that children cannot disable it.
Norman Ng, Google Asia-Pacific’s regional lead for trust and safety, said in a blog post: “In today’s digital parenting, it’s essential to adopt technology with guardrails to ensure safe and trusted content. Together, we can help our children stay safe and thrive in the online world.”
In addition to expanded SafeSearch, parents can use Family Link to activate age-specific content restrictions on Google Play and Search.