
Apple stalls CSAM auto-scan on devices after 'feedback' from everyone on Earth

Critics celebrate reconsideration of 'spyPhone' regime

Apple on Friday said it intends to delay the introduction of its plan to commandeer customers' own devices to scan their iCloud-bound photos for illegal child exploitation imagery, a concession to the broad backlash the initiative provoked.

"Previously we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them and to help limit the spread of Child Sexual Abuse Material," the company said in a statement posted to its child safety webpage.

"Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features."


Last month, Apple announced its child safety initiative, which involves adding a nudity detection algorithm to its Messages chat client, to provide a way to control the sharing of explicit images, and running code on customers' iDevices to detect known child sexual abuse material among on-device photos destined for iCloud storage.
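In broad strokes, the on-device matching step amounts to computing a fingerprint of each iCloud-bound photo and checking it against a list of fingerprints derived from known CSAM. The Swift sketch below is a loose illustration of that idea only, not Apple's implementation: it uses a plain SHA-256 digest where Apple's system uses its NeuralHash perceptual hash plus cryptographic blinding and a private set intersection protocol, and every name in it is hypothetical.

import CryptoKit
import Foundation

// Loose illustration of hash-list matching, not Apple's implementation.
// A SHA-256 digest of the raw bytes stands in for a perceptual fingerprint;
// the real system also blinds the hash list and uses private set intersection.
// All names here are hypothetical.

typealias ImageFingerprint = String

func fingerprint(of imageData: Data) -> ImageFingerprint {
    // Digest the image bytes and render the result as hex.
    SHA256.hash(data: imageData).map { String(format: "%02x", $0) }.joined()
}

func matchesKnownList(_ imageData: Data, knownFingerprints: Set<ImageFingerprint>) -> Bool {
    // The match is a bare set lookup: the code cannot tell what the
    // fingerprints on the list actually depict.
    knownFingerprints.contains(fingerprint(of: imageData))
}

// Hypothetical usage: check a photo queued for iCloud upload.
let knownFingerprints: Set<ImageFingerprint> = []   // would be supplied by the provider
let photo = Data("example image bytes".utf8)
print(matchesKnownList(photo, knownFingerprints: knownFingerprints)) // false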

These features were due to debut in the public releases of iOS 15, iPadOS 15, watchOS 8, and macOS Monterey operating system software, expected later this month or next. But faced with objections from more than 90 advocacy organizations, Apple has opted to pause the rollout.

ACLU attorney Jennifer Granick heralded the delay via Twitter as a victory for civil liberties advocacy. "It's great that Apple plans to engage with independent privacy and security experts before announcing their genius plans," she said. "They should start with end to end encryption for iCloud backups."

Matthew Green, associate professor of computer science at the Johns Hopkins Information Security Institute, urged Apple via Twitter to engage with the technical and policy communities, and to talk to the public, before rolling out new technology.

"This isn’t a fancy new Touchbar," he said "It’s a privacy compromise that affects 1bn users."

Apple's declared goal, keeping children safe and preventing the distribution of illegal child sexual abuse material (CSAM), has broad support. But its approach does not. Its explicit photo intervention in its Messages app has been described as more of a danger to children than a benefit.

And its decision to conduct CSAM scans using customers' own hardware and computing resources has been widely characterized as an erosion of property rights and a backdoor that will be used for government surveillance and control.

As NSA whistleblower Edward Snowden put it, "Apple plans to erase the boundary dividing which devices work for you, and which devices work for them."

Apple's plan also contradicts its own marketing about privacy. The Electronic Frontier Foundation, one of dozens of organizations that expressed concerns about Apple's plans, highlighted the company's reversal by citing the text of its 2019 CES billboard: "What happens on your iPhone, stays on your iPhone."

"Now that Apple has built [a backdoor], they will come," wrote EFF deputy executive director Kurt Opsahl in a post last month. "With good intentions, Apple has ​​paved the road to mandated security weakness around the world, enabling and reinforcing the arguments that, should the intentions be good enough, scanning through your personal life and private communications is acceptable."

In a statement emailed to The Register, Evan Greer, director of Fight for the Future, condemned Apple's "spyPhone" proposal.

"Apple’s plan to conduct on-device scanning of photos and messages is one of the most dangerous proposals from any tech company in modern history," she said. "Technologically, this is the equivalent of installing malware on millions of people’s devices – malware that can be easily abused to do enormous harm."


Apple – rather than actually engaging with the security community and the public – published a list of Frequently Asked Questions and responses to address the concern that censorious governments will demand access to the CSAM scanning system to look for politically objectionable images.

"Could governments force Apple to add non-CSAM images to the hash list?" the company asked in its interview of itself, and then responded, "No. Apple would refuse such demands and our system has been designed to prevent that from happening."

Apple, however, has not refused government demands in China with regard to VPNs or censorship. Nor has it refused government demands in Russia arising from that country's 2019 law requiring pre-installed Russian apps.

Tech companies uniformly say they comply with all local laws. So if China, Russia, or the US were to pass a law requiring on-device scanning to be adapted to address "national security concerns" or some other plausible cause, Apple's choice would be to comply or face the consequences – it would no longer be able to say, "We can't do on-device scanning." ®
