Apple is about to start scanning iPhone users' devices for banned content, professor warns

For now it's child abuse material but the tech has mission creep written in


Updated Apple is about to announce a new technology for scanning individual users' iPhones for banned content. While it will be billed as a tool for detecting child abuse imagery, details entering the public domain suggest its potential for misuse is vast.

The neural network-based tool will scan individual users' iDevices for child sexual abuse material (CSAM), respected cryptography professor Matthew Green told The Register today.

Rather than using age-old hash-matching technology, however, Apple's new tool – due to be announced today along with a technical whitepaper, we are told – will use machine learning techniques to identify images of abused children.

"What I know is that it involves a new 'neural matching function' and this will be trained on [the US National Centre for Missing and Exploited Children]'s corpus of child sexual abuse images. So I was incorrect in saying that it's a hash function. It's much more powerful," said Green, who tweeted at length about the new initiative overnight.

"I don't know exactly what the neural network does: can it find entirely new content that "looks" like sexual abuse material, or just recognize exact matches?" the US Johns Hopkins University academic told El Reg.

Indiscriminately scanning end-user devices for CSAM is a new step in the ongoing global fight against this type of criminal content. In the UK, the Internet Watch Foundation's hash list of prohibited content is shared with ISPs, which then block the material at source. Using machine learning to intrusively scan end-user devices is new, however – and may shake public confidence in Apple's privacy-focused marketing.

Apple infamously refuses to talk to The Register, so asking it to comment on this is a fruitless exercise. Doubtless Cupertino will point to its scanning of (deliberately) unencrypted iCloud backups as precedent, saying this is just an incremental step in the ongoing fight against the true evil of child sexual exploitation. Nonetheless, we've asked the fruity firm to comment and faithfully promise here to reproduce its response for the delight and delectation of El Reg's readership.

Governments in the West and authoritarian regions alike will be delighted by this initiative, Green feared. What's to stop China (or some other censorious regime such as Russia or the UK) from feeding images of wanted fugitives into this technology and using it to physically locate them?

"That is the horror scenario of this technology," said Green. "Apple is the only service that still operates a major E2EE service in China, in iMessage. With this technology public, will China demand that Apple add scanning capability to iMessage? I don't know. But I'm sure a lot more worried about it than I was two days ago."

According to Green, who said he had spoken to people who had been briefed about the scheme, the scanning tech will be implemented in a "two party" design. As he explained it: "Apple will hold the unencrypted database of photos (really the training data for the neural matching function) and your phone will hold the photos themselves. The two will communicate to scan the photos on your phone. Alerts will be sent to Apple if *multiple* photos in your library match, it can't just be a single one."
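As a rough sketch of the split Green describes – one side holds the matching data, the handset holds the photos, and an alert fires only past a threshold of matches – consider the following Python. The class names, the check logic, and the threshold value are all our assumptions for illustration; the actual protocol has not been published.

    # Hypothetical sketch of the "two party" design as Green describes it.
    # All names and the threshold value are assumptions, not Apple's protocol.
    ALERT_THRESHOLD = 5  # per Green, a single matching photo never alerts

    class MatchingService:
        # Stands in for the Apple-held side, which keeps the matching database.
        def __init__(self, known_items):
            self.known_items = set(known_items)

        def check(self, photo_fingerprint):
            # Placeholder for whatever comparison the real system performs.
            return photo_fingerprint in self.known_items

    def scan_library(photo_fingerprints, service):
        # The device-side pass: count matches across the whole photo library
        # and, per Green's description, alert only when multiple photos match.
        matches = sum(service.check(p) for p in photo_fingerprints)
        return matches >= ALERT_THRESHOLD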

The privacy-busting scanning tech will be deployed against US-based iThing users first, with the intention of gradually expanding it around the world. Green said it would initially be deployed against photos backed up in iCloud before expanding to full handset scanning.

If this is the future of using Apple devices, it might not only be sex offenders who question Apple's previously stated commitment to protecting user privacy.

Updated on 9 August to add:

Apple announced its "child safety measures" here. ®
