What would sustainable security even look like?
Clue: Nothing like what’s on offer today
Opinion "There seems to be something wrong with our bloody ships today," fumed Admiral David Beatty during 1916's Battle of Jutland. Fair enough: three of the Royal Navy's finest vessels had just blown up and sank.
It was all the worse because Beatty had promoted the policy of spending enormous sums of cash on the Navy, to keep it numerically and technologically way ahead of the rest of the world. Jutland was supposed to prove the wisdom of this, crushing Germany's navy with imperious ease. "Bang-bang, glug-glug" was the strategy, but the Royal Navy wanted to dole it out, not receive it. Oops.
More than a century on, and the modern-day Beattys in charge of the Jutland of cybersecurity should be equally apoplectic about our bloody systems. "Bang-bang, glug-glug" goes Microsoft's Azure AD, as a Chinese outfit makes off with a private key – and access to boatloads of US government emails and who knows what else. "Bang-bang, glug-glug" goes the employee payroll data of some 20 million workers at 400 organizations thanks to a supply chain hack. Russian actors this time. Just another week in cybersecurity.
But today's challenge is actually far more serious than any naval engagement. Jutland ended as a draw on the day and a strategic success for the Brits. It revealed systematic failures in policy, ship design, and tactical execution from which lessons were learned. It seems impossible to draw any lessons from the ongoing failure of cybersecurity to cybersecure, except that the near quarter-trillion-dollar and growing global spend isn't turning the tide. If one good shot can blow an organization open, where's the money going? More pertinently, why don't more people care?
Fiddling while Rome burns
A better analogy, and one that may be fruitful, is climate change. The scale of the problem is too big to think about, fatigue is setting in – an apparent drop in attacks on smaller businesses in the UK is being put down to fewer people bothering to report them – and those in charge are saying one thing while doing another. If that's politically acceptable in climate policy while large parts of the world are literally burning during the hottest month on record, where will the political will come from to fix the much more abstruse problems of cybersecurity?
In fact, the systemic failure here is worse than that in climate change mitigation. We at least know the technologies and policies that can replace the primary causes of anthropogenic warming. Nobody knows what the alternative is for the toxicities in data safety. Nobody talks about sustainable security. It's expensive and inconvenient to decarbonize your life or your business, but it can be done and increasingly is. There's nowhere to go with cybersecurity even if your vendor has just compromised its major clients by having its key swiped.
Let the history of climate awareness help here. Characterize the problem: in one case, greenhouse gases are a byproduct of an economy wedded to cheap energy; in the other, single points of failure are a byproduct of an industry wedded to huge datasets in under-engineered systems. In both cases, the infrastructure providers effortlessly accrue ever greater profits while sticking to the business models that cause the problems. While this remains true, the problems are not being fixed.
Decarbonizing is easy to imagine, and very hard to achieve. De-zero-daying by removing single points of failure is much harder to even think about, if we continue to think in silos.
Aerospace, medicine and other life-critical systems understand redundancy, fail-safe and failover systems. We think of these concepts as very expensive and time-consuming to design, build and test. So? In a quarter-trillion-dollar market that's not working, what does too expensive mean?
We have the technologies to make data safer, should we choose to apply them. They encourage plurality in design to enhance redundancy, constant research, sharing of ideas, strong regulation based on engineering best practices, audit trails, rigorous testing, and a culture of parsimony and skepticism. You can't eliminate zero-days in code any more than you can eliminate stuck valves on spaceships, but you can make sure they don't crash the mission.
It’s easy to see why these notions are ignored in a market that’s been sold on the idea of buying into a single ecosystem, of gorging on ever more data, of faith in facile explanations instead of demonstrably adequate – let alone good – engineering, of constant lobbying against and resistance to regulation instead of engaging with it constructively.
It's also easy to see why open source, in theory immune from commercial pressures to pervert security engineering, may also be immune to the sort of industry-wide discipline and focus needed to make sustainable security work.
Academia too has dropped the ball. There are many conceptual problems in moving best practice to the global market. There's no model of optimal data safety in contemporary IT, nor even a sniff of one. Yet the ivory towers are replete with deep thinkers on climate, so if the analogy is even slightly useful, there's a lot to learn and reapply.
It's no wonder that the infrastructure and security industry, left to its own devices, does little better than the carbon czars' execrable "blue hydrogen" and "clean coal." It is in any established industry's interests to draw attention away from effective alternatives, no matter how necessary. If we are to win the security wars, we must draw up the maps of the new territory, then resolve to march there. Otherwise, the spirit of Admiral Beatty will be with us as we too sink beneath the waves.
Bang-bang, glug-glug? No thanks. ®