Microsoft Is Helping Deepfake Porn Victims Wipe Content From Bing
By Mikelle Leow, 05 Sep 2024
Microsoft has launched a new tool to help victims remove non-consensual intimate imagery (NCII) from Bing search results. Announced on September 5, 2024, this initiative partners with StopNCII to combat the growing issue of deepfake porn.
The rise of generative AI has made deepfake porn increasingly realistic, causing significant harm to victims. Microsoft’s approach allows individuals to create a digital fingerprint, or “hash,” of their explicit images through StopNCII. This hash is then used to detect and remove matching images from Bing’s search results.
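The hash-and-match flow described above can be sketched in a few lines of Python. The function names here are illustrative assumptions, and a cryptographic SHA-256 digest stands in for the real fingerprint; StopNCII reportedly relies on perceptual hashing, which also matches re-encoded or slightly altered copies, whereas SHA-256 only matches byte-identical files:

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    # Simplified stand-in: a cryptographic hash of the raw bytes.
    # Real NCII systems use perceptual hashes so that resized or
    # re-compressed copies of the same image still produce a match.
    return hashlib.sha256(image_bytes).hexdigest()

# Key privacy property: the image itself never leaves the victim's
# device; only this short fingerprint is shared with the database.
blocklist = {fingerprint(b"victim-submitted image bytes")}

def should_remove(candidate_bytes: bytes) -> bool:
    # A search index hashes each candidate image and checks the
    # fingerprint against the shared blocklist before serving it.
    return fingerprint(candidate_bytes) in blocklist
```

This is why the approach preserves privacy: the database stores only irreversible fingerprints, never the images themselves, yet any platform holding the same blocklist can detect matching uploads.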
Affected users can request the removal of sexually explicit images of themselves that were uploaded without their permission via Microsoft's all-in-one reporting portal. The tech giant says it will remove material flagged directly by users worldwide, NGOs, and other partners.
“In search, we also continue to take a range of measures to demote low quality content and to elevate authoritative sources, while considering how we can further evolve our approach in response to expert and external feedback,” adds Courtney Gregoire, Microsoft’s chief digital safety officer.
Gregoire emphasizes that intimate image abuse disproportionately affects women and girls. Through a pilot program, Microsoft has already taken action on nearly 269,000 explicit images using StopNCII's database.
This partnership aligns Microsoft with other major platforms like Facebook, Instagram, and TikTok, which also use StopNCII’s digital fingerprints. On the flip side, Google Search has reportedly faced internal criticism for not partnering with StopNCII, despite offering its own tools to report and remove explicit images.
The issue remains complex, as the US lacks comprehensive federal legislation addressing AI-generated non-consensual imagery. Microsoft’s initiative represents a step towards protecting individuals from the unauthorized sharing of intimate images, though broader solutions are still needed to fully address this problem.
[via TechCrunch and Neowin, cover illustration 43452770 © 3quarks | Dreamstime.com]