Dear Europe, here again are the reasons why scanning devices for unlawful files is not going to fly

Antivirus-but-for-pictures would trample rights, not even work as expected, say academics


While Apple has, temporarily at least, backed away from last year's plan to run client-side scanning (CSS) software on customers' iPhones to detect and report child sexual abuse material (CSAM) to authorities, European officials in May proposed rules to protect children that involve the same highly criticized approach.

The European Commission has suggested several ways to deal with child abuse imagery, including scanning online private communication and breaking encryption. It has done so undeterred by a paper penned last October by 14 prominent computer scientists and security experts dismissing CSS as a source of serious security and privacy risks.

In response, a trio of academics aims to convey just how ineffective and rights-violating CSS would be to those who missed the memo the first time around. And the last time, and the time before that.

In an arXiv paper titled "YASM (Yet Another Surveillance Mechanism)," Kaspar Rosager Ludvigsen and Shishir Nagaraja, of the University of Strathclyde, and Angela Daly, of the Leverhulme Research Center for Forensic Science and Dundee Law School, in Scotland, revisit CSS as a way to ferret out CSAM and conclude the technology is both ineffective and unjustified.

Client-side scanning in this context involves running software on people's devices to identify unlawful images – generally those related to the exploitation of children, though EU lawmakers have also discussed using CSS to flag content related to terrorism and organized crime.

Apple's approach involved using its NeuralHash machine-learning model to compute an identifier for each image due to be synced to iCloud and check it against a list of known CSAM identifiers. It didn't fare all that well: security researchers found they could create hash collisions with non-CSAM images. European officials haven't settled on a specific technical approach, but as far as the paper's authors are concerned, CSS isn't fit for the task.
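
At its core, this kind of scanning computes a fingerprint for each outbound image and checks it against a blocklist of fingerprints of known material. The sketch below is a deliberately simplified illustration of that flow, not Apple's actual NeuralHash pipeline: it substitutes an ordinary cryptographic digest for the perceptual-hash model, and the blocklist entries and function names are invented for the example.

    import hashlib
    from pathlib import Path

    # Hypothetical blocklist of fingerprints of known unlawful images.
    # A real system would use perceptual hashes (NeuralHash-style model outputs),
    # designed to match near-duplicate images, not exact cryptographic digests.
    KNOWN_BAD_FINGERPRINTS = {
        "placeholder_digest_1",
        "placeholder_digest_2",
    }

    def fingerprint(image_path: Path) -> str:
        """Toy stand-in for a perceptual hash: digest of the raw image bytes."""
        return hashlib.sha256(image_path.read_bytes()).hexdigest()

    def should_flag_before_sync(image_path: Path) -> bool:
        """Return True if the image matches the blocklist and would be reported."""
        return fingerprint(image_path) in KNOWN_BAD_FINGERPRINTS

The gap between this toy and a deployable system is exactly where the criticism bites: an exact digest misses any re-encoded copy, while a fuzzier perceptual hash invites the collisions and false positives researchers demonstrated against NeuralHash.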

Ludvigsen, Nagaraja, and Daly argue that CSS can no more prevent the distribution of CSAM than antivirus scanning can prevent the distribution of malware.

Even if you assume, they argue, that a CSS system catches all the CSAM it encounters – an unrealistic assumption – there is no clear definition of what it should be catching. There's a legal definition of CSAM, they say, but it cannot be translated into rules a CSS system could apply.

So adversaries will respond to CSAM scanning by finding ways to craft images that evade detection.
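
To make the evasion point concrete: with exact-match fingerprints, changing a single byte of a file yields an unrelated digest, so the most trivial edit defeats the check. The snippet below is an illustrative toy, not something from the paper; perceptual hashes tolerate such edits by design, and that tolerance is precisely what let researchers manufacture NeuralHash collisions.

    import hashlib

    original = b"...image bytes..."       # placeholder payload for illustration
    tweaked = original[:-1] + b"\x00"     # a one-byte edit an adversary might make

    print(hashlib.sha256(original).hexdigest())
    print(hashlib.sha256(tweaked).hexdigest())
    # The digests are completely different, so an exact-match blocklist no longer fires.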

"CSS contains in its very notion constant surveillance upon the system, and unlike pure logging, attempts to oversee all events within a given framework," the boffins explain. "This makes it very similar to software like antivirus, which we cannot be 'perfect' as the definition of malicious software can never define all the types in existence."

What's more, the researchers claim the cost of trying to tackle CSAM this way far outweighs the benefits, and that's likely to remain the case however the technology evolves. Presumably there would be some benefit in finding CSAM images loaded onto phones by child exploiters unaware that their devices now surveil for the state, but it would be overshadowed by the constant violation of everyone else's privacy rights and the loss of the protections encryption provides.

"Surveillance systems are well known to violate rights, but CSS present systems which will do this routinely or constantly, which is why we find them to be dangerous and cannot justify [them] by the goals they aim to serve," the computer scientists argue.

They are, however, convinced that EU legislators will attempt to move forward with some sort of CSAM scanning scheme, so they've also attempted to explain the legal problems they expect will follow.

"We find that CSS systems will violate several rights within the European Convention of Human Rights, but our analysis is not exhaustive," the researchers state in their paper. "They will likely violate the Right to a Fair Trial, in particular the Right to Remain Silent and Not Incriminate Oneself, Right to Privacy, and if implemented further than current examples, Freedom of Assembly and Association as well."

For example, a trial cannot be fair, the researchers argue, if defendants cannot easily challenge evidence produced by an undisclosed algorithm. There's always the possibility that the imagery was planted by authorities, fabricated, or downloaded as a result of entrapment.

The authors go on to chide the European Commission for the techno-solutionist belief that CSS is the only possible way to combat CSAM. The Commission, they say, "disregards and does not analyze the potential consequences either CSS or server-side scanning would have on cybersecurity and privacy, while they justify the victim’s potential positive outcomes outweighing the negative of everyone else."

The researchers conclude that CSS is just too disruptive.

"If you want to dig for gold, you predict accurately where it is," they say. "What you usually do not do, is to dig up the entire crust of the surface of the earth. CSS systems and mass surveillance represent the latter." ®

