GCHQ pushes for 'virtual crocodile clips' on chat apps – the ability to silently slip into private encrypted comms
Sliding into your DMs unnoticed, literally
Analysis Britain's surveillance nerve-center GCHQ is trying a different tack in its effort to introduce backdoors into encrypted apps: reasonableness.
In an essay by the technical director of the spy agency's National Cyber Security Centre, Ian Levy, and technical director for cryptanalysis at GCHQ, Crispin Robinson, the authors go out of their way to acknowledge public concerns over government access to personal communication.
They also promise a return to a time when the authorities used their exceptional powers only in limited cases, where a degree of accountability is written into spying programs, and they promise a more open discussion about what spy agencies are allowed to do and how they do it.
But the demand for backdoors is still there, this time couched in terms of "virtual crocodile clips" on modern telephone lines, namely the encrypted chat and call apps that have become ubiquitous on smart phones.
"For over 100 years, the basic concept of voice intercept hasn’t changed much: crocodile clips on telephone lines," the authors note. "Sure, it's evolved from real crocodile clips in early systems through to virtual crocodile clips in today’s digital exchanges that copy the call data. But the basic concept has remained the same. Many of the early digital exchanges enacted lawful intercept through the use of conference calling functionality."
Strong end-to-end encryption has largely killed off the conference-call approach, but Levy and Robinson note that it is still theoretically possible for companies to silently grant access to the authorities.
"It's relatively easy for a service provider to silently add a law enforcement participant to a group chat or call," they argue. "The service provider usually controls the identity system and so really decides who's who and which devices are involved – they’re usually involved in introducing the parties to a chat or call."
Such an approach would retain strong end-to-end encryption but introduce "an extra 'end' on this particular communication," they argue. And it would be "no more intrusive than the virtual crocodile clips that our democratically elected representatives and judiciary authorize today in traditional voice intercept solutions and certainly doesn’t give any government power they shouldn’t have."
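The trust model Levy and Robinson are exploiting can be sketched in a few lines. The following is a toy illustration with invented names, not any real app's protocol (real messengers use schemes such as Signal's Sender Keys): the point is simply that clients encrypt each message to whatever roster the server hands back, so a server that silently appends an extra key adds an extra "end" without touching the cryptography at all.

```python
import secrets

class Server:
    """Controls the identity system: maps each chat to its members' public keys."""
    def __init__(self):
        self.rosters = {}          # chat_id -> {user: public_key}
        self.notifications = []    # membership-change alerts shown to users

    def create_chat(self, chat_id, members):
        self.rosters[chat_id] = dict(members)

    def add_member(self, chat_id, user, public_key, notify=True):
        self.rosters[chat_id][user] = public_key
        if notify:                 # a "ghost" add simply sets notify=False
            self.notifications.append(f"{user} joined {chat_id}")

    def roster(self, chat_id):
        return self.rosters[chat_id]

def send(server, chat_id, plaintext):
    """Fan the message key out to every key the server lists --
    including any silently added one."""
    message_key = secrets.token_hex(16)
    # Stand-in for real per-recipient encryption of the message key.
    return {user: (pk, message_key, plaintext)
            for user, pk in server.roster(chat_id).items()}

server = Server()
server.create_chat("chat1", {"alice": "pk_a", "bob": "pk_b"})
server.add_member("chat1", "ghost", "pk_gchq", notify=False)  # silent extra 'end'

envelopes = send(server, "chat1", "meet at noon")
print(sorted(envelopes))        # the ghost gets a copy of every envelope
print(server.notifications)     # but no membership alert was ever generated
```

Note that end-to-end encryption is intact throughout: every recipient, ghost included, gets a properly encrypted copy. The subversion lives entirely in the roster and the suppressed notification.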
In effect, the super-snoops are proposing that they be allowed to subvert a cornerstone of encrypted apps – public key verification – to eavesdrop on conversations, and that the companies that develop the apps turn a blind eye to it. Rather than crack or weaken the underlying cryptography, the spies want to warp the software and user interfaces wrapped around it to let them silently eavesdrop on conversations.
The spy agencies would be allowed to order a company to silently add government snoops to conversations, presumably turning off any notifications that alert users to the fact that a new person has been added to the chat, or an existing one changed. And the companies would in turn refrain from improving their current systems, or making public key verification more visible and user friendly.
The key thing here, no pun intended, is that agents would be added to a chat just like any other conversation partner, complete with the correct public-private key exchanges, except there would be no notification and no way to spot or inspect the spies' public keys.
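This is why visible key verification matters. A minimal sketch, using an assumed fingerprint scheme rather than any real app's safety-number format: if the UI showed every key in the session, a ghost key would stick out as an entry no human participant can vouch for, which is precisely what a crippled interface would hide.

```python
import hashlib

def fingerprint(public_key: str) -> str:
    """Short hex digest a user could read aloud to a contact out-of-band."""
    return hashlib.sha256(public_key.encode()).hexdigest()[:16]

def visible_key_list(roster: dict) -> list:
    """What an uncrippled verification screen would display: (user, fingerprint)."""
    return sorted((user, fingerprint(pk)) for user, pk in roster.items())

# The chat as the users believe it to be, versus the server's actual roster.
expected_members = {"alice", "bob"}
roster = {"alice": "pk_a", "bob": "pk_b", "ghost": "pk_gchq"}

shown = visible_key_list(roster)
extras = [user for user, _ in shown if user not in expected_members]
print(extras)   # the ghost is detectable -- but only if the app surfaces the list
```

The GCHQ proposal depends on apps never showing users this list, or never letting them compare it against who they believe is in the conversation.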
To GCHQ's mind, this is a perfect solution: it doesn't require app developers to scale back security on their existing software, beyond crippling the user interface and notifications, natch. And, because the tapping would be at the vendor level, it would be hard for hackers and other malicious actors to exploit the same approach.
And the big plus, they reason, is that it would be hard to scale up to mass-surveillance levels, and it wouldn't undermine encryption. It's a win-win... for the security services, anyway.
On the surface at least, this seems like a reasonable compromise that app developers could get behind. People who build their own chat software from source code they can inspect won't be too bothered by this approach, either.
The truth is that while there is no shortage of fierce privacy advocates who insist that the government should never be granted access to any private conversation, those in positions of power who receive classified briefings about how such technologies are misused are open to granting reasonable access to the communications of dangerous people.
But scratch the surface, and the same GCHQ – the one that wants access to as much information as possible and which is wholly opposed to any form of real accountability – lurks beneath.
Open and honest?
While advocating for "open and honest conversations between experts that can inform the public debate about what’s right," the authors completely ignore the fact that the current crop of encrypted apps exists precisely because of Edward Snowden's revelations: the security services had massively abused their powers, building a body of secret and highly questionable law that gave them access to pretty much everyone's communications.
Instead, every reference to the fact that no one trusts the spy agencies to do what they say they will do is painted as the public being hoodwinked.
"The public has been convinced that a solution in this case is impossible," the authors argue, "so we need to explain why we’re not proposing magic."
Later: "Much of the public narrative on this topic talks about security as a binary property; something is either secure or it’s not. This isn’t true – every real system is a set of design trade-offs."
Also: "The public will also want to know how these systems are used, as it has been convinced that governments want access to every single one of these encrypted things."
The truth is that the spy agencies – GCHQ and the NSA in particular – have been dragged kicking and screaming to this point. Even after the scale of their misuse of systems and laws was exposed, they continued – and continue – to fight tooth-and-nail any effort to scale back their programs, reveal how they work, or add real accountability to their systems.
It is worth noting that just this week a number of organizations wrote to the US Department of Justice urging it not to authorize the UK authorities' access to American corporate data because current UK law doesn't adhere to human rights obligations and commitments.
That data sharing would happen under the CLOUD Act – the same legislation that today's blog post holds up as a great example for how to introduce global accountability in data sharing.
And then there's the fact that the European Court of Human Rights has heavily criticized the UK's approach, in particular the lack of decent oversight when it comes to bulk interception of communications.