Google on Tuesday unveiled a series of online safety measures for children, including a private setting for videos uploaded by teens and safeguards for ads shown to users under 18.
The new features, which come amid heightened concerns about online child exploitation and safety at a time of growing internet usage during the global pandemic, affect Google’s YouTube video platform as well as its online services such as search and Google Assistant.
“As kids and teens spend more time online, parents, educators, child safety and privacy experts, and policy makers are rightly concerned about how to keep them safe,” said Google product and user experience director Mindy Brooks.
“We engage with these groups regularly, and share these concerns.”
Google’s “safe search” – which excludes sensitive or mature content – will become the default setting for users under 18; until now it had been the default only for users under 13.
On the massively popular YouTube platform, content from 13- to 17-year-olds will be private by default, the tech giant said.
“With private uploads, content can only be seen by the user and whomever they choose,” said a blog post by James Beser, head of product management for YouTube Kids and Family.
“We want to help younger users make informed decisions about their online footprint and digital privacy . . . If the user would like to make their content public, they can change the default upload visibility setting and we’ll provide reminders indicating who can see their video.”
Google will also make it easier for families to request the removal of a child’s photos from image search results.
“Of course, removing an image from search doesn’t remove it from the web, but we believe this change will help give young people more control of their images online,” Brooks said.
In another safety move, Google will turn off location history for all users under 18 globally, without an option to turn it back on. This is already in place for those under 13.
Google will also change how it shows ads to minors, blocking “age-sensitive” ad categories and banning targeting based on the age, gender or interests of people under 18.