Apple Confirms It Will Scan Existing Photos On iCloud For Child Abuse Imagery
By Alexa Heah, 10 Aug 2021
Last week, Apple unveiled plans to strengthen its devices’ child safety features, including updates to Siri and Search, and new tools for parents to prevent their children from sending and receiving explicit messages.
One of its new offerings, which scans iPhone and iCloud photos for any child abuse material, has worried privacy advocates, who fear the technology could be used for surveillance.
In a new statement, the tech giant said it will only begin scanning iCloud Photos libraries for potential child abuse imagery later in 2021, and confirmed that the system won’t be limited to new uploads.
It clarified that the scanning and matching will take place on the user’s device itself, comparing unreadable hashes of known child abuse imagery against the images in the user’s iCloud Photos library.
“This set of image hashes is based on images acquired and validated to be child sex abuse material (CSAM) by child safety organizations. Apple is able to use these hashes to learn only about iCloud Photos accounts that are storing collections of photos that match to these known CSAM images, and is then only able to learn about photos that are known CSAM, without learning or seeing any other photos,” Apple explained.
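In essence, the approach boils down to checking each photo’s fingerprint against a database of fingerprints of known images. The sketch below illustrates that idea in Python, but it is a deliberate simplification: it uses an exact cryptographic hash (SHA-256), which only matches byte-identical files, whereas Apple’s actual system uses a perceptual hash (NeuralHash) that also matches resized or recompressed copies, wrapped in a cryptographic protocol that keeps both the hash database and the match results unreadable on-device. The hash database and file paths here are hypothetical placeholders.

```python
# Minimal sketch of hash-set matching, assuming an exact-hash model.
# NOT Apple's actual NeuralHash / private-set-intersection protocol.
import hashlib
from pathlib import Path

# Hypothetical database of known-image hashes. In Apple's system these are
# derived from images validated by NCMEC and other child safety organizations,
# and are stored on-device in a blinded, unreadable form.
KNOWN_HASHES: set[str] = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def file_hash(path: Path) -> str:
    """Fingerprint a file's bytes. SHA-256 is a stand-in: it only matches
    byte-identical files, while a perceptual hash tolerates re-encoding."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def matching_photos(library: Path) -> list[Path]:
    """Return photos whose fingerprints appear in the known-hash set."""
    return [p for p in library.glob("*.jpg") if file_hash(p) in KNOWN_HASHES]

if __name__ == "__main__":
    for photo in matching_photos(Path("~/Pictures").expanduser()):
        # In the real system a match produces an encrypted "safety voucher"
        # rather than a visible flag, and nothing is reported until an
        # account crosses a threshold number of matches.
        print(f"match: {photo}")
```

The key property Apple is claiming is in that last comment: the device learns only whether a photo’s hash is in the known set, and the company learns nothing about photos that don’t match.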
As reported by SlashGear, a company spokesperson told CNBC that images already uploaded to iCloud Photos will also be scanned. However, photo libraries not uploaded to iCloud Photos will not be examined, and “the system does not work for users who have iCloud Photos disabled.”
In response to privacy concerns, Apple said it does not add to the existing set of CSAM hashes, which are created and validated by external experts. It confirmed that reports will only be made to the authorities after the flagged content has been verified as CSAM by the company, and also promised to refuse any demands to repurpose the system for surveillance.
“Apple will refuse any such demands. Apple’s CSAM detection capability is built solely to detect known CSAM images stored in iCloud Photos that have been identified by experts at NCMEC and other child safety groups,” Apple said.
Despite the company’s reassurances, it remains unclear whether this explanation will quell fears of possible future surveillance.
[via SlashGear, cover image via Drop of Light / Shutterstock.com]