Apple is about to start scanning iPhone users' devices for banned content, professor warns

For now it's child abuse material but the tech has mission creep written in

Updated Apple is about to announce a new technology for scanning individual users' iPhones for banned content. While it will be billed as a tool for detecting child abuse imagery, the details entering the public domain suggest its potential for misuse is vast.

The neural network-based tool will scan individual users' iDevices for child sexual abuse material (CSAM), respected cryptography professor Matthew Green told The Register today.

Rather than using age-old hash-matching technology, however, Apple's new tool – due to be announced today along with a technical whitepaper, we are told – will use machine learning techniques to identify images of abused children.

"What I know is that it involves a new 'neural matching function' and this will be trained on [the US National Centre for Missing and Exploited Children]'s corpus of child sexual abuse images. So I was incorrect in saying that it's a hash function. It's much more powerful," said Green, who tweeted at length about the new initiative overnight.

"I don't know exactly what the neural network does: can it find entirely new content that "looks" like sexual abuse material, or just recognize exact matches?" the US Johns Hopkins University academic told El Reg.

Indiscriminately scanning end-user devices for CSAM is a new step in the ongoing global fight against this type of criminal content. In the UK, the Internet Watch Foundation's hash list of prohibited content is shared with ISPs, which then block the material at source. Using machine learning to intrusively scan end-user devices is new, however – and may shake public confidence in Apple's privacy-focused marketing.

Apple infamously refuses to talk to The Register, so asking it to comment on this is a fruitless exercise. Doubtless Cupertino will point to its scanning of (deliberately) unencrypted iCloud backups as precedent for this, saying it's just an incremental step in the ongoing fight against the true evil of child sexual exploitation. Nonetheless, we've asked the fruity firm to comment and faithfully promise here to reproduce its response for the delight and delectation of El Reg's readership.

Governments in the West and in authoritarian states alike will be delighted by this initiative, Green feared. What's to stop China (or some other censorious regime such as Russia or the UK) from feeding images of wanted fugitives into this technology and using it to physically locate them?

"That is the horror scenario of this technology," said Green. "Apple is the only service that still operates a major E2EE service in China, in iMessage. With this technology public, will China demand that Apple add scanning capability to iMessage? I don't know. But I'm sure a lot more worried about it than I was two days ago."

According to Green, who said he had spoken to people who had been briefed about the scheme, the scanning tech will be implemented in a "two party" design. As he explained it: "Apple will hold the unencrypted database of photos (really the training data for the neural matching function) and your phone will hold the photos themselves. The two will communicate to scan the photos on your phone. Alerts will be sent to Apple if *multiple* photos in your library match, it can't just be a single one."
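Taking Green's description at face value, the key property is the threshold: no report is generated for a lone match, only when several photos in a library hit the database. Here's a minimal sketch of that idea – again, the matcher, the threshold value, and the alert path are assumptions for illustration, not details Apple has confirmed.

```swift
// Sketch of threshold-based alerting as Green describes it: the device counts
// matches locally and only raises an alert once several photos match, never on
// a single hit. The matcher, threshold, and data here are invented stand-ins.

struct ThresholdScanner {
    let knownHashes: Set<UInt64>   // stands in for the provider-held database
    let threshold: Int             // minimum number of matching photos before alerting

    // Returns true only if at least `threshold` photos in the library match.
    func shouldAlert(libraryHashes: [UInt64]) -> Bool {
        let matchCount = libraryHashes.filter { knownHashes.contains($0) }.count
        return matchCount >= threshold
    }
}

let scanner = ThresholdScanner(knownHashes: [0x1111, 0x2222, 0x3333], threshold: 2)

print(scanner.shouldAlert(libraryHashes: [0x1111, 0xAAAA]))          // false: a single match stays silent
print(scanner.shouldAlert(libraryHashes: [0x1111, 0x2222, 0xAAAA]))  // true: multiple matches trigger an alert
```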

The privacy-busting scanning tech will be deployed against America-based iThing users first, with the idea of gradually expanding it around the world as time passes. Green said it would initially be deployed against photos backed up in iCloud before expanding to full handset scanning.

If this is the future of using Apple devices, it might not only be sex offenders who question Apple's previously stated commitment to protecting user privacy.

Updated on 9 August to add:

Apple announced its "child safety measures" here. ®
