Google, whose YouTube platform is frequently criticized over child protection, has announced that it will strengthen online safety for minors through new measures covering personal data and potentially inappropriate content.
Mindy Brooks, general manager of Google's Kids and Families division, said that as young people spend more time online, more and more people, including parents, teachers, privacy experts, and policymakers, are seeking to ensure safe browsing for children and teenagers. "And they are right. We are in constant contact with them so we can regularly adapt our products and the parental controls aimed at young audiences."
For example, videos uploaded to YouTube by teenagers between the ages of thirteen and seventeen will automatically be set to private. Unless this default is changed, only users selected by the uploader will be able to view the video.
Minors, or their parents, can also request that their photos be removed from Google's image search results. The removal of problematic content in general, from false information to offensive images, at the request of authorities or individuals remains a point of contention for the platform.
As for location data, Location History will be disabled for all users under the age of 18 worldwide, with no option to turn it back on. To shield minors from inappropriate content, Google will also enable its SafeSearch filter on its search engine for all of them.
Advertisers will no longer be allowed to target minors with ads based on their age, gender, or interests.
Despite all the efforts made to combat child sexual abuse material, applications popular with young people, such as Snapchat, TikTok, YouTube, and Instagram, are often criticized for not doing enough to protect young users from predators.