Microsoft Takes Action Against Deepfake Porn: Bing Search Tool Empowers Victims

With the rise of generative AI comes a surge in deepfake porn and synthetic nude images. To combat this, Microsoft has introduced a tool for victims to remove explicit content from Bing search results. By partnering with StopNCII, Microsoft is using digital fingerprints, or “hashes,” to prevent the spread of revenge porn.

Microsoft’s Collaboration with StopNCII: How It Works

Through this partnership, victims can create digital fingerprints of explicit images directly on their own devices, so the images themselves never have to be uploaded anywhere. These hashes are then used to stop the spread of the images across Bing and other participating platforms such as Facebook and Instagram. In a pilot program, Bing acted on over 268,000 explicit images, demonstrating the approach's effectiveness.
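The core idea of hash-based matching can be sketched in a few lines. Note the simplifying assumption: this example uses a cryptographic SHA-256 hash, which only matches byte-identical copies, whereas services like StopNCII rely on perceptual hashes that also catch re-encoded or lightly edited versions of an image. The function names here are illustrative, not part of any real API.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    # Simplified stand-in: SHA-256 flags only exact byte-for-byte copies.
    # Production systems use perceptual hashing to tolerate resizing,
    # re-compression, and small edits.
    return hashlib.sha256(image_bytes).hexdigest()

# The victim hashes the image locally; only the hash leaves the device.
private_image = b"...raw image bytes..."
blocklist = {fingerprint(private_image)}

# A platform can check content against the blocklist without ever
# receiving or storing the original image.
def should_block(candidate: bytes) -> bool:
    return fingerprint(candidate) in blocklist

print(should_block(private_image))       # True: exact copy is caught
print(should_block(b"unrelated image"))  # False: no match
```

This is why the scheme is privacy-preserving: the shared hash is a one-way fingerprint, and the explicit image itself never needs to be transmitted to Microsoft or any other platform.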

Google’s Missed Opportunity

While Google Search offers tools for reporting explicit content, it has not partnered with StopNCII, drawing criticism of its approach. Victims of deepfake revenge porn continue to call on Google to adopt similar hash-matching technology.

The Legal and Social Impacts of Deepfake Porn

The absence of a U.S. federal law on AI-generated deepfake porn complicates the issue further, especially as minors are increasingly becoming targets. Twenty-three states have passed their own laws, but more comprehensive action is needed.