Google has announced new features to protect underage users, including the ability for parents to remove images of their children from search results.
Announced on 10 August, the features aim to limit access to young people’s information and imagery while also protecting them from age-sensitive material.
“We’re committed to building products that are secure by default, private by design, and that put people in control,” Mindy Brooks, Google’s General Manager for Kids and Family, wrote in a blog post.
How does Google plan to protect child users?
In the coming weeks, Google will introduce the option for under-18 users, as well as their parents or guardians, to request that Google remove their images from search results.
The option will sit alongside Google’s existing removal request tools, such as those for non-consensual imagery or sensitive personal information.
At the same time, Google will introduce other changes to Google Accounts for under-18 users.
On YouTube, video uploads by teenagers aged 13 to 17 will be set to private by default.
In Search, Google will switch on its SafeSearch filter for signed-in users under 18 and make it the default setting for new underage accounts.
It will also disable Location History for these accounts, meaning under-18 users worldwide will not be able to turn it on.
Meanwhile, Google will expand safeguards on its advertising so that age-sensitive ad categories cannot be shown to underage users. It will also block ad targeting based on the age, gender, or interests of users under 18.
Google is not the only tech giant to roll out new features designed to protect underage users. Last month, Instagram announced it would set accounts of under-16 users to private by default.
It also tightened its ad policies to restrict which advertisements can reach those users.