So why haven't the capabilities of each successive generation of devices increased exponentially?
In practice, the growing sophistication of software has meant that while computers certainly feel faster than they did thirty or forty years ago, the difference - as far as our perceptions are concerned - isn’t nearly as great. Files might load a thousand times faster, but we still experience a perceptible interval between selecting “Open...” and being able to work on a file.
Thirty years ago, Ted Nelson, one of the great visionaries of computing, said that our devices had to deliver a ‘bingo effect’ - as soon as you reached out for a document, it should be there, ready to edit. Today we open a document in Microsoft Word - even on a multi-GHz machine with a solid-state disk and plenty of RAM - in a process that always takes a few seconds. And it always has. Sure, it takes a few seconds fewer than it may have back in 1986, using Microsoft Word on the first Macintosh Plus, but where’s that thousand-fold improvement from Moore’s Law?
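A back-of-envelope sketch makes the gap concrete. Assuming the textbook formulation of Moore's Law - capability doubling roughly every two years - thirty years should have delivered far more than a thousand-fold improvement (the function name and parameters below are illustrative, not from any source):

```python
def moores_law_factor(start_year: int, end_year: int,
                      doubling_years: float = 2.0) -> float:
    """Expected improvement factor if capability doubled every
    `doubling_years` years between the two dates."""
    doublings = (end_year - start_year) / doubling_years
    return 2.0 ** doublings

# Thirty years at one doubling every two years is fifteen doublings:
print(moores_law_factor(1986, 2016))  # 2**15 = 32768
```

Even on this conservative two-year doubling period, the expected gain is roughly 32,000-fold - which is precisely why a document that still takes seconds to open feels like a broken promise.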
A decade ago, virtual reality pioneer Jaron Lanier noted that the complexity of software seems to outpace improvements in hardware, giving us the sense that we’re running in place. Our computers, he argued, have become more complex and less reliable. We can see the truth of this everywhere: networked systems provide massive capacities but introduce great vulnerabilities. Simple programs bloat with endless features. Things get worse, not better.
Anyone who’s built a career in IT understands this technical debt. Legacy systems persist for decades. Every major operating system - desktop and mobile - has bugs so persistent they seem more like permanent features than temporary mistakes. Yet we constantly build new things on top of these increasingly rickety scaffolds. We do more, so we crash more - and our response has been to make crashes as painless as possible. The hard lockups and BSODs of a few years ago have morphed into a momentary disappearance, as if nothing of real consequence has happened.
Worse still, we seem to regard every aspect of IT with a ridiculous and undeserved sense of permanence. We don’t want to throw away our old computers while they still work. We don’t want to abandon our old programs. Some of that is pure sentimentality - after all, why keep using something that’s slow and increasingly less useful? More of it reflects the investment of time and attention spent learning a sophisticated piece of software.
The processes that software encapsulates will inevitably be examined, improved, refined, and repackaged as other processes.
Yet a commitment to obsolescence is the unspoken agreement for all things IT. Yes, you may treasure that NetWare server with sixteen years of continuous uptime, but does it really have utility when everyone, everywhere can access cloud-based data storage APIs? Embracing the new requires us to loosen our grip on the old.
Some may well be thinking: that way lies madness. If we changed our systems all the time, nothing would work. Consider: nearly every organisation of any scale has legacy (but functioning) systems so old they can no longer be upgraded or even maintained properly. Uptime has become a god, and capacity has been sacrificed on its altar.
Another view is that the industry that creates disruption is ironically terrified to disrupt itself. The biggest vendors cleverly act more as psychiatrists than problem-solvers, soothing fears, reassuring IT managers with gentle whispers of ‘Everything will be alright,’ as both walk down the garden path into irrelevance.
Embracing change means abandoning the false sense of stability IT has offered management as part of its bargain to increase productivity. Productivity is not a function of stability. It’s about the wholesale revision of business processes to meet or generate market needs. Productivity demands that we junk everything comfortable, everything safe, everything stable, set our faces to the wind, and explore the unknown. The IT department that fails to heed this lesson fails the business it serves. A recent quip from Saul Kaplan puts it best: “Marginal cost of staying the same is rising. Think of it as inflation eating away at your relevancy rather than capital.”
Hostage to forces that want to contain its disruptive nature, IT has become infrastructure where it should always be a strategic asset, wielded like a blade, cutting a swath through markets and competitors. How many IT departments can say they are the most important element of the business? Not many. That’s the sure sign that IT is itself ready to be utterly disrupted. ®