- Apple has attempted to clarify, via a blog post, that these images will be scanned without anyone actually viewing them; the software looks only for a ‘fingerprint match’
- Apple is also planning to scan users’ encrypted text messages for sexually explicit content as a measure to protect children
- While the latest measures have been celebrated by human rights organisations, security watchdogs have raised concerns over the privacy creep their rollout entails
In a move that has attracted plaudits and criticism alike, Apple announced the rollout of new features that will scan images on iPhones and other Apple devices for known images of child sexual abuse, and text messages for sexually explicit content.
The software, said Apple in a statement, will “help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material (CSAM).”
The tool, ‘neuralMatch,’ is designed to scan for known images of child sexual abuse before an image is uploaded onto iCloud. If it detects a match, the image will be sent for review by a human. If the image in question is found to be child pornography, the user’s account will be disabled immediately and the National Center for Missing and Exploited Children (NCMEC) notified.
Apple has attempted to clarify, via a blog post, that these images will be scanned without anyone actually viewing them; the software looks only for a ‘fingerprint match.’ Only when the number of matches exceeds a specific threshold will the NCMEC be informed.
It said that the database of known child sexual abuse images is turned into ‘an unreadable set of hashes that is securely stored on users’ devices.’ Before an image is uploaded onto iCloud, the operating system will check it against the known CSAM hashes using a ‘cryptographic technology called private set intersection.’
A ‘cryptographic safety voucher’ that encodes the match result along with other encrypted data will then be saved on iCloud with the image. These vouchers, Apple has said, cannot be deciphered by Apple unless a certain CSAM threshold is crossed. Apple is also planning to scan users’ encrypted text messages for sexually explicit content as a measure to protect children.
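To illustrate the general idea, here is a minimal Python sketch of threshold-based fingerprint matching. It is illustrative only: the SHA-256 fingerprint, the example hash values and the `MATCH_THRESHOLD` constant are placeholder assumptions, and the real system relies on a perceptual hash and private set intersection, neither of which is reproduced here.

```python
import hashlib

# Stand-in fingerprint: the real tool uses a perceptual hash so that visually
# similar images map to the same fingerprint; SHA-256 is used here purely for
# illustration and only matches byte-identical inputs.
def fingerprint(image_bytes: bytes) -> str:
    return hashlib.sha256(image_bytes).hexdigest()

# The 'unreadable set of hashes' shipped to the device (illustrative values).
KNOWN_CSAM_HASHES = {fingerprint(b"known-image-1"), fingerprint(b"known-image-2")}

MATCH_THRESHOLD = 3  # placeholder; Apple has not published the real threshold

def upload_pipeline(images):
    """Check each image before 'upload' and count fingerprint matches."""
    match_flags = []
    for img in images:
        matched = fingerprint(img) in KNOWN_CSAM_HASHES
        # In Apple's design this flag is sealed inside an encrypted 'safety
        # voucher' that the server can only open once enough matches exist;
        # here we simply record it in the clear.
        match_flags.append(matched)
    if sum(match_flags) >= MATCH_THRESHOLD:
        print("Threshold crossed: vouchers become readable; human review follows.")
    else:
        print("Below threshold: vouchers remain undecipherable.")

upload_pipeline([b"known-image-1", b"holiday.jpg", b"known-image-2", b"known-image-1"])
```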
Privacy creep concerns
While the latest measures have been celebrated by human rights organisations, security watchdogs have raised concerns over the privacy creep their rollout entails. Will Cathcart, the head of WhatsApp, whose end-to-end encryption prevents anyone other than the sender and recipient from gaining access to content shared on the text-messaging service, said, “I think this is the wrong approach and a setback for people’s privacy all over the world.”
“Can this scanning software running on your phone be error-proof? Researchers have not been allowed to find out,” he continued on Twitter.
Matthew Green, a prominent cryptography researcher at Johns Hopkins University, expressed similar sentiments, also warning that the tool may be used by malicious actors to frame individuals by sending them harmless images engineered to trigger child pornography matches, effectively fooling the software.
“Researchers have been able to do this pretty easily,” he said. He also pointed to the possibility of governments or government agencies abusing the software to quell dissidents. “What happens when the Chinese government says, ‘Here is a list of files that we want you to scan for’? Does Apple say no? I hope they say no, but their technology won’t say no,” he said.
It bears mentioning that tech companies like Apple, Google and Facebook are coming under increasing pressure from governments and law enforcement authorities to provide access to encrypted content shared between users. Striking a balance between cracking down on criminal activity, such as the sharing and consumption of child pornography, and maintaining the high levels of privacy these companies have promised their users is proving to be a challenge.
However, others have welcomed Apple’s latest features. John Clark, the president and CEO of the NCMEC, said in a statement, “Apple’s expanded protection for children is a game-changer.”
“With so many people using Apple products, these new safety measures have lifesaving potential for children,” he added.