EU attempt to sneak through new encryption-eroding law slammed by Signal, politicians
If you call 'client-side scanning' something like 'upload moderation,' it still undermines privacy, security
On Thursday, the EU Council is scheduled to vote on a legislative proposal that would attempt to protect children online by disallowing confidential communication.
The vote had been set for Wednesday but got pushed back [PDF].
Known to detractors as Chat Control, the proposal seeks to prevent the online dissemination of child sexual abuse material (CSAM) by requiring internet service providers to scan digital communication – private chats, emails, social media messages, and photos – for unlawful content.
The proposal [PDF], recognizing the difficulty of explicitly outlawing encryption, calls for "client-side scanning" or "upload moderation" – analyzing content on people's mobile devices and computers for certain wrongdoing before it gets encrypted and transmitted.
The idea is that algorithms running locally on people's devices will reliably recognize CSAM (and whatever else is deemed sufficiently awful), block it, and/or report it to authorities. This act of automatically policing and reporting people's stuff before it's even had a chance to be securely transferred rather undermines the point of encryption in the first place.
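For readers unfamiliar with the mechanism, here is a minimal, purely illustrative sketch of the client-side scanning concept: the sender's device checks outgoing content against a blocklist of fingerprints before any encryption happens. Real proposals envisage perceptual hashing or machine-learning classifiers rather than exact SHA-256 matching, and every name below is hypothetical.

```python
# Illustrative sketch only - not any vendor's actual implementation.
# The key point: the scan runs on the sender's device, BEFORE encryption,
# which is why critics argue it defeats the purpose of end-to-end encryption.
import hashlib

# Hypothetical blocklist of fingerprints of known unlawful material,
# distributed to every device by the provider or an authority.
BLOCKLIST_FINGERPRINTS: set[str] = {
    "0000000000000000000000000000000000000000000000000000000000000000",  # placeholder
}

def fingerprint(content: bytes) -> str:
    # Real systems would use perceptual hashing (robust to resizing, etc.),
    # not a cryptographic hash; SHA-256 is used here only for illustration.
    return hashlib.sha256(content).hexdigest()

def send_message(content: bytes, encrypt, transmit, report) -> None:
    """Scan, then (maybe) encrypt and send. encrypt/transmit/report are
    hypothetical callables standing in for the messaging app's internals."""
    if fingerprint(content) in BLOCKLIST_FINGERPRINTS:
        report(content)   # flagged to a reporting authority
        return            # and/or blocked from being sent at all
    transmit(encrypt(content))  # only now does end-to-end encryption apply
```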
We've been here before. Apple announced plans to implement a client-side scanning scheme back in August 2021, only to face withering criticism from the security community and civil society groups. In late 2021, the iGiant essentially abandoned the idea.
Europe's planned "regulation laying down rules to prevent and combat child sexual abuse" is not the only legislative proposal that contemplates client-side scanning as a way to front-run the application of encryption. The US EARN IT Act imagines something similar.
In the UK, the Online Safety Act of 2023 includes a content scanning requirement, though the government has acknowledged that enforcing it isn't presently feasible. The law allows telecoms regulator Ofcom to require online platforms to adopt an "accredited technology" to identify unlawful content, but no such technology currently exists and it's unclear how accreditation would work.
With the EU proposal vote approaching, opponents of the plan have renewed their calls to shelve the pre-crime surveillance regime.
In an open letter [PDF] on Monday, Meredith Whittaker, CEO of Signal, which threatened to withdraw its app from the UK if the Online Safety Act disallowed encryption, reiterated why the EU client-side scanning plan is unworkable and dangerous.
"There is no way to implement such proposals in the context of end-to-end encrypted communications without fundamentally undermining encryption and creating a dangerous vulnerability in core infrastructure that would have global implications well beyond Europe," wrote Whittaker.
"Instead of accepting this fundamental mathematical reality, some European countries continue to play rhetorical games.
"They’ve come back to the table with the same idea under a new label. Instead of using the previous term 'client-side scanning,' they’ve rebranded and are now calling it 'upload moderation.'
"Some are claiming that 'upload moderation' does not undermine encryption because it happens before your message or video is encrypted. This is untrue."
The Internet Architecture Board, part of the Internet Engineering Task Force, offered a similar assessment of client-side scanning in December.
Encrypted comms service Threema published its own variation on this theme on Monday, arguing that mass surveillance is incompatible with democracy, is ineffective, and undermines data security.
"Should it pass, the consequences would be devastating: Under the pretext of child protection, EU citizens would no longer be able to communicate in a safe and private manner on the internet," the biz wrote.
"The European market’s location advantage would suffer a massive hit due to a substantial decrease in data security. And EU professionals like lawyers, journalists, and physicians could no longer uphold their duty to confidentiality online. All while children wouldn’t be better protected in the least bit."
Threema said if it isn't allowed to offer encryption, it will leave the EU.
And on Tuesday, 37 Members of the European Parliament signed an open letter to the EU Council urging legislators to reject Chat Control.
"We explicitly warn that the obligation to systematically scan encrypted communication, whether called 'upload-moderation' or 'client-side scanning,' would not only break secure end-to-end encryption, but will to a high probability also not withstand the case law of the European Court of Justice," the MEPs said. "Rather, such an attack would be in complete contrast to the European commitment to secure communication and digital privacy, as well as human rights in the digital space." ®