Arriving at a recent conference organised by one of the government's many regulatory bodies, I received my obligatory lanyard – and something else, credit-card-shaped, emblazoned with the event's branding. "What's this?" I asked.
"Oh, that's a USB key."
I presume the conference organisers mistook my wild-eyed stare of disbelief as one of benevolent gratitude and admiration for their consideration of my storage needs. Who could have thought this gift a good idea? Someone who had never heard of Stuxnet, or of any of the now-too-numerous-to-count stories of USB keys being used to infiltrate organisations, exfiltrate data, even destroy computers?
Then I wondered if I was becoming paranoid – or not paranoid enough.
Times have changed. Technology has become a multitrillion-dollar business upon which the fate of nations and whole economies now depends. Over the last 30 years, technological risk has become a major component of national strategic risk, and once that happened it became impossible to view the operations of the technology sector through a purely economic lens.
We spend plenty of time digesting earnings reports while blithely ignoring other considerations – the politics and mechanisms of power – because they don't fit neatly onto balance sheets, leaving us open to all sorts of attacks – everything from industrial espionage to sabotage to the poisoning of algorithms to support political ends. (Hello, Facebook!)
Maybe that's because those of us with long careers in technology don't want to wear the full weight of responsibilities that have grown as computing has become fundamental to the operation of almost every process, everywhere. Where once we obsessed about "keeping things up and running", there's a tacit recognition this now means "keeping the world from imploding", a task that feels as though it becomes more difficult by the day, as others step in and work to steer things to serve their own ends.
All of that awareness landed in my hand with that USB key, at once just an innocent gift and simultaneously inspiring a dark chain of thought about provenance, chain of custody, country of production, and the economic benefits of having access to a very select set of government agencies and commercial firms represented at this conference. A rich surface ripe for attack, and this device a possible vector.
Attacks and penetrations represent costs of business in a connected world. They too have been reduced to lines on a balance sheet, carefully hidden away in an accounting of "cyber" costs that banks and other financial institutions bury so that their customers continue to see them as stable, reliable and secure. Yet for as long as these attacks continue – and succeed – hiding that fact may be doing us more harm than good.
Somewhere in the wide range between ignorant and terrified we have to find a new place to have a conversation about security. A little nuance could serve us and our institutions well, giving us some ways to think constructively and proactively about how we want to inform our practice as technologists with a broader awareness of this sector's importance to both economic and national security.
A few organisations already send out fake phishing emails, offering the employees caught up in these pseudo-scams additional training (and, presumably, unbeknownst to them, additional monitoring). While a good beginning, we need a broader education in "Practical Paranoia": how to tell the difference between commercial interest and national interest; between marketing hype and political propaganda; between authentic relationship and clever manipulation. Without that training – and the techniques flowing from it – technology will remain the plaything of those who have mastered the arts of control. This industry will continue to be exposed as ignorantly serving the ends of the powerful, rather than our customers.
In an era of pervasive autonomous systems, we need to provide assurance that autonomous devices will perform as designed, will not spy or go rogue or otherwise act to destroy the confidence essential both to commercial success and to the public's perception of safety. We don't have that. We don't even have a strong sense of why we need it, operating as though we're still in the halcyon world of the mid-1990s, when there were no enemies anywhere. Perhaps a bit more paranoia would serve us well. ®