Europe proposes tackling child abuse by killing privacy, strong encryption
If we're gonna go through this again, can we just literally go back in time?
Proposed European regulations that purport to curb child abuse by imposing mass surveillance would be a "disaster" for digital privacy and strong encryption, say cybersecurity experts.
A number of options have been put forward for lawmakers to mull, each aiming to encourage or require online service providers and messaging apps to tackle the "detection, removal, and reporting of previously-known and new child sexual abuse material and grooming."
These options range from voluntary detection and reporting of child sexual abuse material (CSAM) and grooming, to legally mandating that service providers find and report such material using whatever detection technology they wish — essentially scanning all private communications and, if necessary, breaking end-to-end (E2E) encryption for everyone.
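To make the stakes concrete: detecting "previously-known" material generally means comparing what users send against a database of fingerprints of already-identified imagery. Here's a minimal sketch of that idea, assuming a hypothetical hash database; real deployments, such as Microsoft's PhotoDNA, use perceptual hashes that survive resizing and re-encoding rather than exact digests like this one.

```python
# Minimal sketch: flag an attachment if its digest appears in a hypothetical
# database of previously-identified material. Real systems use perceptual
# hashing (e.g. Microsoft's PhotoDNA) so that resized or re-encoded copies
# still match; an exact SHA-256 digest is defeated by changing a single byte.
import hashlib

KNOWN_HASHES: set[str] = set()  # hypothetical: digests of known CSAM

def matches_known_material(attachment: bytes) -> bool:
    return hashlib.sha256(attachment).hexdigest() in KNOWN_HASHES
```

The catch, and the crux of the dispute, is that a server can only run this check on content it can read. On a properly end-to-end encrypted service the ciphertext reveals nothing, so mandated detection implies either scanning on the device before encryption or weakening the encryption itself.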
If rubber-stamped, the rules will apply to online hosting services, interpersonal communication services such as messaging apps, app stores, and internet access providers.
- Privacy is for paedophiles, UK government seems to be saying while spending £500k demonising online chat encryption
- UK.gov is launching an anti-Facebook encryption push. Don't think of the children: Think of the nuances and edge cases instead
- Apple quietly deletes details of derided CSAM scanning tech from its Child Safety page without explanation
- WhatsApp's got your back(ups) with encryption for stored messages
"If this proposal were to come to pass, it could result in countries banning true end-to-end encryption," EFF Senior Policy Analyst Joe Mullin told The Register, noting that requiring service providers to detect suspected child grooming requires them to analyze all private messages.
"The EU proposal is incompatible with end-to-end encryption and with basic privacy rights," Mullin continued. "There's no way to do what the EU proposal seeks to do, other than for governments to read and scan user messages on a massive scale. If it becomes law, the proposal would be a disaster for user privacy not just in the EU but throughout the world."
Here's what the proposal says service providers would need to do after receiving a "detection order" to scan for, report, and remove any CSAM or grooming activity:
This regulation leaves to the provider concerned the choice of the technologies to be operated to comply effectively with detection orders … That includes the use of end-to-end encryption technology, which is an important tool to guarantee the security and confidentiality of the communications of users, including those of children. When executing the detection order, providers should take all available safeguard measures to ensure that the technologies employed by them cannot be used by them or their employees for purposes other than compliance with this Regulation, nor by third parties, and thus to avoid undermining the security and confidentiality of the communications of users.
It's worth noting that this finding-and-stopping-pedophiles argument is frequently used to oppose E2E encryption and drum up support for mass-surveillance proposals — like Apple's plan to scan photos on iPhones and iPads for CSAM, which it subsequently and quietly walked back late last year.
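Apple's shelved scheme was the client-side variant of the same idea: match photos on the device, before upload, against fingerprints of known material. Apple's actual design used a neural "NeuralHash" plus cryptographic private set intersection; the toy sketch below, with a hypothetical blocklist and an arbitrary threshold, shows only the underlying perceptual-hash matching.

```python
# Toy illustration of perceptual-hash matching, the general technique behind
# client-side image scanning. This is far simpler than any real deployment:
# it downscales an image to 8x8 grayscale and thresholds each pixel against
# the mean ("average hash"), then compares hashes by Hamming distance.
from PIL import Image

def average_hash(path: str, size: int = 8) -> int:
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a: int, b: int) -> int:
    return bin(a ^ b).count("1")

BLOCKLIST: set[int] = set()  # hypothetical database of known-image hashes
MATCH_THRESHOLD = 5          # bits of tolerance; arbitrary for illustration

def is_flagged(path: str) -> bool:
    h = average_hash(path)
    return any(hamming(h, known) <= MATCH_THRESHOLD for known in BLOCKLIST)
```

That threshold is where the false positives critics warned about come from: loosen it and innocent photos can match; tighten it and slightly altered copies slip through.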
EU 'war on E2E encryption'
"In case you missed it, today is the day that the European Union declares war upon end-to-end encryption, and demands access to every persons private messages on any platform in the name of protecting children," tweeted Alec Muffet, who architected and led Facebook Messenger's end-to-end encryption effort.
He has first-hand experience with this: the UK government's ongoing rumblings against end-to-end encryption also rely heavily on similar think-of-the-children and Facebook-harbors-pedophiles rhetoric.
Matthew Green, a cryptography professor at Johns Hopkins University, called the Euro proposal "the most terrifying thing I've ever seen."
If signed into law, this regulation would likely require service providers to use AI to read entire text messages to figure out if a user is "grooming" children for sexual abuse, he added.
"It is potentially going to do this on encrypted messages that should be private. It won't be good, and it won't be smart, and it will make mistakes," he said. "But what's terrifying is that once you open up 'machines reading your text messages' for any purpose, there are no limits." ®