Meta has teamed up with a UK nonprofit to create a tool that lets users submit hashes (unique identifiers) of unwanted or non-consensual explicit images to a database. Using that database, social media sites such as Facebook and Instagram can detect whether the specific images have been shared, and remove them where possible.
The new tool, called StopNCII, will be open to users over the age of 18 who suspect that nude images of them may have been uploaded to social media without consent, a practice known as nonconsensual intimate imagery (NCII) or, colloquially, revenge porn.
Previously, in 2017, Facebook launched an initiative to stop the circulation of revenge porn on its site, but the system drew criticism and ire from the public. It required users to submit sexually explicit images of themselves via Messenger so the site could check whether they’d been shared without consent. Understandably, many weren’t comfortable with the idea of trusting a network with the very images they were trying to get removed.
According to Protocol, the tool will this time be run by the UK Revenge Porn Helpline instead of Meta itself, and will let users create hashes directly on their own devices rather than uploading the images to a separate server.
“While participating companies use the hash they receive from StopNCII.org to identify images that someone has shared or is trying to share on their platforms, the original image never leaves the person’s device. Only hashes, not the images themselves, are shared with StopNCII.org and participating tech platforms,” said Antigone Davis, Facebook’s Global Head of Safety.
“This feature prevents further circulation of that content and keeps those images securely in the possession of the owner,” she explained.
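To make the “hash locally, share only the fingerprint” idea concrete, here is a minimal Python sketch. It is an illustration under simplifying assumptions, not StopNCII’s actual implementation: the file path is hypothetical, and the real system is reported to use perceptual hashing (which can still match resized or re-encoded copies), whereas this sketch uses a plain SHA-256 digest for brevity.

```python
import hashlib

def hash_image_locally(image_path: str) -> str:
    """Return a hex fingerprint of an image file, computed entirely on-device."""
    digest = hashlib.sha256()
    with open(image_path, "rb") as f:
        # Read in chunks so large files never need to fit in memory at once.
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

if __name__ == "__main__":
    # Hypothetical file path; only the resulting hex string would be submitted
    # to the hash database. The image itself never leaves the user's device.
    print(hash_image_locally("my_photo.jpg"))
```

The point of the design is in that last line: the platforms receive and compare only the short fingerprint, never the photo.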
[via Protocol, cover image via Meta]