Google has introduced a new safety feature that lets anyone under the age of 18 request that photographs of themselves be removed from the company’s search results. The feature was first announced in August (alongside new limits on advertising to children), but it is now generally available.
Anybody can begin the removal process from this website. Applicants must include the URLs of the photos they want removed from search results, the search terms that surface those images, the minor’s name and age, and the name and relationship of anyone acting on their behalf, such as a parent or guardian.
As is often the case with these types of removal requests, it’s hard to predict exactly what factors Google will weigh in its decisions. The company says it will remove images of minors “with the exception of situations of compelling public interest or newsworthiness.” As problematic cases under the EU’s “right to be forgotten” rules have shown, interpreting how such phrases apply in varied scenarios is tricky.
Google’s phrasing also suggests that it won’t act on requests unless the person in the photograph is currently under the age of 18. So if you’re 30, you can’t apply to have photos removed that were taken when you were 15. That limits the tool’s usefulness against misuse or harassment, but it presumably simplifies verification: confirming someone’s current age is easier than establishing how old they were when a photo was taken.
Google also stresses that removing a picture from its search results does not, of course, remove it from the internet. Applicants are encouraged to contact the site’s webmaster directly; when that isn’t possible, removing the image from Google’s index is the next best option.
Alongside this new option for requesting the removal of images of minors, Google already offers ways to request the removal of certain other categories of harmful content. These include non-consensual explicit images, fake pornography, financial or medical information, and “doxxing” content such as home addresses and phone numbers.