In an arms race with criminals to protect our privacy, it's too early to admit defeat
Government agents who really want your messages, though? Well, that can be another story
Register Debate Welcome to the latest Register Debate in which writers discuss technology topics, and you, the reader, choose the winning argument. The format is simple: we propose a motion, the arguments for the motion will run this Monday and Wednesday, and the arguments against on Tuesday and Thursday. During the week you can cast your vote on which side you support using the poll embedded below, choosing whether you're in favor of or against the motion. The final score will be announced on Friday, revealing whether the for or against argument was most popular.
This week's motion is: In the digital age, we should not expect our communications to remain private.
Security pro Dave Cartwright is our first contributor arguing AGAINST the motion.
We should not expect our communications to remain private, it is asserted. And my response is: why the hell not?
One does, of course, have to ask what precisely we mean by our communications. Should we expect our employer not to snoop on our email inbox? No, we shouldn't, as long as they do it reasonably, because it's their mailbox, not ours.
But what about, say, communications containing our personal data between the HR department and the company's lawyers? Yes, we should absolutely be entitled to expect those to be confidential. And in our non-work life, how about our private mailboxes? Our personal files? Our family photos? Yes, absolutely, we should be allowed to expect those to be private.
We all know that cyber-criminals and the tech they use get better just as quickly as the mechanisms we use to defend the privacy of our data – which makes that privacy difficult to enforce. But to say we can't expect privacy at all is too close to admitting defeat.
"We can't guarantee the privacy of your data" equates to "we can't be bothered," or "it's too expensive and will hit the profits for the year" – both of which come in the category of "not good enough."
And what about law enforcement? Is it reasonable for hardware and software vendors to build backdoors in their technology so that the police can use them to catch child abusers, gangsters, and drug dealers?
No, it's not reasonable, because the number of times these backdoors will be abused – either by corrupt cops or people who've compromised the law enforcement agencies' systems – is non-zero.
One of my favorite YouTube clips features Private Eye editor Ian Hislop pointing out that the courts sometimes convict the wrong people and hence it'd be dangerous to bring back capital punishment. The same applies, albeit generally less fatally, to building backdoors into security systems: in a small but non-zero number of cases, there will be unintended consequences.
The downside is that it becomes harder to catch criminals, but in the interests of maintaining privacy I think we have to accept that … and the likes of Apple seem to agree. And if we have to take our own steps to defend our data and communications, over and above what our employers and service providers give us, well, so be it.
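Taking our own steps can be as simple as encrypting a message before it ever touches anyone else's infrastructure. The sketch below is purely illustrative – it assumes the third-party Python cryptography package and glosses over the hard part, which is exchanging the key securely with the recipient – but it shows the principle: the provider only ever stores ciphertext.

```python
# Minimal sketch: encrypt a message client-side before handing it to any
# employer or service provider. Assumes 'pip install cryptography';
# key distribution is deliberately simplified for illustration.
from cryptography.fernet import Fernet

key = Fernet.generate_key()          # would be shared out-of-band with the recipient
cipher = Fernet(key)

token = cipher.encrypt(b"See you at the usual place, 7pm")
print(token)                         # this is all the provider ever sees or stores

print(cipher.decrypt(token).decode())  # only a key holder can recover the plaintext
```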
There's a problem, though: in the UK the law already dents that expectation, in the form of the Regulation of Investigatory Powers Act, Part III of which lets the authorities force disclosure of encryption keys and the like. It reads: "A disclosure requirement in respect of any protected information is necessary on grounds falling within this subsection if it is necessary: (a) in the interests of national security; (b) for the purpose of preventing or detecting crime; or (c) in the interests of the economic well-being of the United Kingdom."
Maybe I'm expecting too much, then. I'll just have to temper my expectation with a grudging acknowledgement that I'll have to trust the law enforcement bods to use the law fairly. ®
Dave Cartwright is based in Jersey and is a head of IT risk and security in the banking industry. A former chairman (and now treasurer) of the Jersey branch of the BCS, he is also deputy chair of the Channel Islands Information Security Forum and chair of the Jersey Charitable Skills Pool, which enables local cyber professionals to offer free advice to not-for-profits and charities. Outside the day job he also chairs a youth charity.
Cast your vote below. We'll close the poll on Thursday night and publish the final result on Friday. You can track the debate's progress here.