The advancement of generative artificial intelligence tools has brought with it a new problem on the internet: the proliferation of synthetic nude images that resemble real people. In response, Microsoft has taken a significant step to support victims of this abuse, announcing a new tool that allows explicit images to be removed from its Bing search engine. The announcement, made last Thursday, is the result of a collaboration with the organization StopNCII.
Collaboration with StopNCII
Microsoft has partnered with StopNCII (Stop Non-Consensual Intimate Images), an organization dedicated to helping victims of non-consensual pornography. StopNCII offers a tool that lets affected individuals create a "digital fingerprint" of explicit images, whether real or synthetic. This fingerprint, technically known as a "hash," is used by partner platforms to detect the images and prevent them from being shared or surfaced on their services.
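To give a rough sense of how this hash-matching works, here is a minimal Python sketch using the open-source Pillow and imagehash libraries. It is only an illustration, not StopNCII's actual system: according to StopNCII, hashing happens on the victim's own device and only the hash is shared, and the production pipeline relies on dedicated perceptual-hashing technology. The function names, threshold, and file paths below are hypothetical.

```python
# Illustrative sketch of perceptual hash-matching; not StopNCII's real pipeline.
from PIL import Image        # pip install pillow
import imagehash             # pip install imagehash


def fingerprint(path: str) -> imagehash.ImageHash:
    """Compute a perceptual hash ("digital fingerprint") of an image.

    Unlike a cryptographic hash, a perceptual hash stays similar when the
    image is resized, re-compressed, or lightly edited, which is what makes
    hash-matching useful for spotting re-uploads of the same picture.
    """
    return imagehash.phash(Image.open(path))


def matches_blocklist(path: str,
                      blocklist: list[imagehash.ImageHash],
                      max_distance: int = 8) -> bool:
    """Return True if the image is a near-duplicate of any blocked hash.

    Subtracting two hashes gives their Hamming distance; a small threshold
    catches minor edits, and the platform only ever handles hashes, never
    the original image.
    """
    h = fingerprint(path)
    return any(h - blocked <= max_distance for blocked in blocklist)


# Hypothetical usage: the victim hashes the image locally, and only the hash
# is submitted to the shared database that platforms check against.
# blocklist = [fingerprint("private_photo.jpg")]
# print(matches_blocklist("candidate_upload.jpg", blocklist))
```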
Through this collaboration, Bing joins other tech giants such as Facebook, Instagram, TikTok, Reddit, and Pornhub, which already use this technology to combat the spread of non-consensual intimate images.
Results and Remaining Challenges
In a blog post, Microsoft shared that, as of August, it had already taken action on over 268,000 explicit images found in Bing's search results, thanks to a pilot phase with StopNCII's database. While Microsoft previously offered a direct reporting tool, the company acknowledged that it was not sufficient to tackle the issue effectively. According to Microsoft, "user reporting alone does not scale enough to prevent these images from being accessible through search."
The problem of deepfake porn is not exclusive to Bing. Google, the most popular search engine, also faces criticism for not partnering with StopNCII, even though it offers its own tools to report and remove such content. In countries like South Korea, for example, users have flagged thousands of links to unwanted sexual content, underscoring the scale of the problem.
This phenomenon, fueled by AI, continues to spread rapidly, and although StopNCII's tools are available only to adults, teenagers are also being affected. The United States currently lacks a federal law regulating this issue, leaving the response to state and local authorities.