With the rise of AI-generated content and deepfakes, Google has introduced a new tool to help users remove non-consensual explicit images from Search results. The feature applies to both manipulated and real images shared without consent.
How the Removal Tool Works
Users can remove images by clicking the three-dot menu next to a Search result and selecting “remove result.” Options include reporting images that:
- Show sexual content of the user
- Depict someone under 18
- Contain personal information
For sexual images, users can specify whether the content is a deepfake or a real photo. The tool also allows multiple images to be submitted in a single request.
After submitting a request, Google provides immediate access to legal and emotional support resources. Users can also enable safeguards to filter out similar images in future Search results. However, images not reported may still remain visible to others.
Expanded “Results About You” Hub
The tool integrates with Google’s “Results About You” hub, which lets users track the status of removal requests. The hub now also monitors sensitive personal information, such as Social Security numbers, driver’s license details, and passport information. Users are notified if such data appears in Search results and can request its removal.
To use the tool, individuals must provide personal contact information and verify their identity with a government-issued ID. The feature is rolling out in most countries over the coming days, starting with the United States.
Dark Web Reporting Discontinued
This update comes as Google phases out its previous dark web reporting feature, which alerted users when their personal details surfaced online. Google cited the feature’s limited effectiveness and said the new tools offer clearer, more actionable options for removing sensitive or harmful content.