
Apple quietly deletes details of derided CSAM scanning tech from its Child Safety page without explanation

Non-consensual on-device image analysis was perhaps difficult to reconcile with privacy protestations

Updated Apple evidently has decided against forcing customers to run its sex crime detection software on their iPhones in order to refer those stashing illegal child abuse images in iCloud to authorities.

We say "evidently" because the iTitan has simply erased the explanatory text it posted in August that describes its non-consensual image vetting system and has not responded to a request to clarify its plans.

That month, Apple announced its intention to implement Child Sexual Abuse Material (CSAM) Detection [PDF], one of two initiatives that it detailed on its Child Safety webpage. The other, a parental control for its Messages app to "warn children and their parents when receiving or sending sexually explicit photos," debuted in iOS 15.2, which was released on Monday.

Apple's CSAM Detection scheme prompted a deluge of criticism from advocacy groups and the computer security community because it used customers' own devices against them and shifted part of the expense of policing from Apple – which incurs a compute cost by scanning images on its iCloud servers – to its customers and their devices.

The tech giant clearly intended to shoulder some of that cost with a server-side component. But its plan also called for an on-device matching algorithm to compare the hashes of local images headed for iCloud synchronization against a database of known CSAM image hashes. The approach is based on a cryptographic technique called private set intersection, which ostensibly can determine whether a local iCloud-bound photo is substantially similar to a known CSAM image without revealing the result on the device – for the sake of privacy.
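To give a rough sense of the on-device half of that pipeline, here is a minimal, hypothetical sketch in Swift of matching an image digest against a set of known hashes. It is not Apple's NeuralHash or its private set intersection protocol – which, among other things, keeps the match result hidden from the device – and the type and function names below are ours, for illustration only.

    import Foundation
    import CryptoKit

    // Hypothetical sketch of on-device hash matching. A real deployment would
    // use a perceptual hash (Apple's design uses NeuralHash) so that visually
    // similar images map to the same value; SHA-256 stands in here purely to
    // illustrate the lookup.
    struct KnownHashDatabase {
        // Digests of known images. In Apple's published design these arrive on
        // the device in blinded form; they are kept in the clear here for brevity.
        private let knownDigests: Set<Data>

        init(knownDigests: Set<Data>) {
            self.knownDigests = knownDigests
        }

        // Returns true if the photo's digest appears in the database. Apple's
        // private set intersection protocol would instead emit a cryptographic
        // voucher whose match status only the server can learn.
        func matches(photoData: Data) -> Bool {
            let digest = Data(SHA256.hash(data: photoData))
            return knownDigests.contains(digest)
        }
    }

    // Usage with placeholder data: an empty database never matches.
    let database = KnownHashDatabase(knownDigests: [])
    let photo = Data([0x01, 0x02, 0x03])
    print(database.matches(photoData: photo))   // prints "false"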

Apple never explained why it developed this hybrid approach – starting scans on-device and relaying results for further analysis on its servers. Alex Stamos, director of the Stanford Internet Observatory and former CSO of Facebook, speculated in a since-deleted tweet that the scheme might be intended to allow Apple to offer encrypted backups with user-held private keys, which would put the company at odds with locked-out law enforcement agencies unless it could assure authorities its customers weren't storing unlawful content.

Whatever its intentions, the iBiz thought it sufficient to say, more or less, "Think of the children," an appeal to human protective instinct that's difficult to challenge.

"We want to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material (CSAM)," the multibillion-dollar corporation said in its now vanished explainer.

Advocacy groups, security professionals, and other concerned parties, all at pains to emphasize that protecting children is a worthy goal, nonetheless managed a rejoinder along the lines of "Think of the consequences."

Aside from normalizing the permissionless use (a.k.a. theft) of user-owned computing resources – something the ad industry has managed – the predicted consequence was that every government and other entity with the power to do so would demand similar access to people's devices to run algorithmic content scans, or whatever, and phone home with the results.

Apple insisted it would refuse such demands, a dubious claim given that it has repeatedly done the opposite in countries like China and Russia.

Amid these implausible assurances, more than 90 advocacy groups voiced opposition to the plan. "Once this capability is built into Apple products, the company and its competitors will face enormous pressure – and potentially legal requirements – from governments around the world to scan photos not just for CSAM, but also for other images a government finds objectionable," the signatories' letter warned.

NSA whistleblower Edward Snowden published a lengthy post against the "spyPhone."

Matthew Green, associate professor of computer science at the Johns Hopkins Information Security Institute, said Apple's CSAM scheme would compromise the privacy of a billion iPhone users.

This went on until early September when Apple, recognizing it had miscalculated, said it would delay deploying the technology.

"Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features," the iPhone maker said in a note added to its Child Safety webpage on Friday, September 3.


Now that note is gone, along with any mention of Apple's non-consensual scanning plan.

In October, to make sure Apple got the message, fourteen of the world's most prominent computer security and cryptography experts issued a paper arguing against the use of client-side scanning because it creates security and privacy risks.

Green, who has followed Apple's plans closely, said via Twitter that he expects Apple will ditch client-side scanning for server-side scanning, which is likely to complicate the company's ability to offer end-to-end encryption in iCloud.

"It is amazing how close we came to a world where law enforcement runs powerful scanning algorithms on the private data stored on your personal device, and how many smart, decent people were on board with that," Green observed.

There's always next time. ®

Updated to add

Apple has briefed journalists that it hasn't completely scrapped its CSAM Detection plan: it's just still delayed from September. No word on why exactly the Child Safety webpage was quietly edited, though.
