Disruptive innovation's like a party. It's always happening elsewhere

Citation needed?

Opinion There may be little to agree on in these fractured, fractious times, but nobody can deny the fact of progress. We see it in tech up close and personal: news keeps coming thick and fast from medicine, materials science, energy, cosmology, palaeontology, environmental sciences, you name it. The speed of change is just breathtaking.

Not so fast, chides a new study. By analyzing millions of scientific papers and patents published between 1945 and 2010, it claims to have found a precipitous and unrelenting fall in disruptive innovation. Almost all fields in science and technology show more than 90 percent less disruption now than in 1945, in some cases none whatsoever.

The analyses seem sound. A research paper might look like a straightforward report of experimental results, or a new theory tested, but it's built on citations – links to previous papers. Those papers have their own lists of citations, so every paper in a field is part of a huge interlinked network.

Papers may be the fruiting bodies of science, but citations are the mycelium of science, the underlying structure which describes both the state of the art and its history. Analyze those, and you can find events that change everything, papers whose revolutionary impact propagates out over time much more than any of their contemporaries. And every year, the study says, there are fewer and fewer of these.
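The usual way to put a number on this is a "disruption" or CD-style index: for each focal paper, look at the later papers that cite it or its references, and ask whether they cite the focal paper instead of its predecessors (disruptive) or alongside them (consolidating). A minimal sketch of that idea, assuming a toy citation graph held as a dict – all the paper names here are illustrative, and this is a simplification of the measure the study actually uses:

```python
def cd_index(focal, cites):
    """Simplified disruption (CD-style) index for a focal paper.

    cites maps each paper to the set of papers it cites.
    Returns a score from -1 (consolidating: later work cites the
    focal paper alongside its predecessors) to +1 (disruptive:
    later work cites the focal paper instead of its predecessors).
    """
    predecessors = cites.get(focal, set())
    n_i = n_j = n_k = 0
    for paper, refs in cites.items():
        if paper == focal:
            continue
        hits_focal = focal in refs
        hits_preds = bool(refs & predecessors)
        if hits_focal and not hits_preds:
            n_i += 1  # cites focal only: disruptive signal
        elif hits_focal and hits_preds:
            n_j += 1  # cites focal plus its references: consolidating
        elif hits_preds:
            n_k += 1  # cites only the predecessors, ignoring focal
    total = n_i + n_j + n_k
    return (n_i - n_j) / total if total else 0.0

# Toy network: paper "T" (think: the transistor) cites "A" and "B";
# most later papers cite T without A or B, so T scores as disruptive.
graph = {
    "T": {"A", "B"},
    "P1": {"T"},
    "P2": {"T"},
    "P3": {"T", "A"},
}
print(cd_index("T", graph))  # positive: mostly disruptive
```

Run over millions of papers per year, a measure like this lets you chart how often genuinely field-redirecting work appears – and, per the study, that count keeps falling.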

As for patents, their role as spreaders of innovation has long been subsumed by the commercial imperative to fight over every square meter of obviousness. You could argue, then, that with so many of them being piles of lawyer guano, they're poor markers of true innovation.

Scientific papers, despite the publish-or-perish imperative, do have the merits of peer review, so should be better indicators of underlying truth.

The framework is making it invisible

So how do we square what the study says with what we think we see? If innovation had all but stopped by the start of the 21st century, how come we have giant AIs connected by gigabit radio networks to supercomputers in our pockets, bathing in solid-state lighting and reading about exoplanet atmospheric analysis? This is a hugely different world to that of twenty years ago. It sure feels disrupted. Yet the math of the study looks good. Paradox.

Despite disruptive innovation, y'know, actually still happening, there are plenty of potential explanations for the study's findings.

The start of the time period – 1945 – saw the release of a huge amount of wartime research into civilian science and technology, a metaphorical and literal bombshell of ideas. It goes downhill from there.

Also, lots of fundamental ideas arrived fairly quickly thereafter that couldn't happen again: the start of semiconductor engineering with the 1947 invention of the transistor, say, or Claude Shannon's 1948 paper "A Mathematical Theory of Communication," which created information theory. Ideas more disruptive than those are hard to find, and you can't do either of them twice. If you calibrate disruption by events like these, you're going to run out quickly.

A more general explanation along the same lines is what the study calls the low-hanging fruit idea. In the early days of any discipline, there are lots of big ideas to explore. Once they've been picked, the structure of the field is set in place and disruption becomes much less likely. A corollary of this is the rapid expansion of science and technology fields once they've proved fruitful – the same data sets the study looked at also show about 14 times as much science being published at the end of the study period as at the start. The odd big idea is going to look a lot smaller in the modern context.

Which is all very good, except that innovation hasn't stopped. Whatever the stats show – even if they show it very clearly – it's not the heat death of the universe of ideas.

The first clue is that the reality of science, even more so technology, is not defined by papers and patents. Many enormously significant innovations – Berners-Lee's WWW, Google's PageRank – escape analysis by not depending on peer review or the legal protection of patents. They show up and they change the world on their own terms. They certainly rely on earlier ideas, just not in a formulaic framework.

Or take something as significant as flash memory, a recent invention that has left its footprint in both papers and patents: neither reflects the scale at which flash has changed things in the real world – a limited space of ideas with outsize effects. And the concept of FOSS is actively hostile to patents, incompatible with that whole atmosphere, and has its own network of ideas outside paper citations. Yet it is most certainly disruptive.

Disruptive innovation, it turns out, isn't always amenable to analysis. It's also not the touchstone of evolution, no matter what billionaires say. If the study shows anything, it's the effect of the meteor strike of World War II – hugely significant but a statistical outlier.

To make a biological analogy, the Big Five planetary extinction events in the geological record were certainly disruptive, but the good stuff in evolution happened between them. Evolutionary selection can only take place where stable environments provide time for pressures to act on populations. A reset may be good, but more disruption most certainly does not mean better.

Rest easy. There are inventions, discoveries and massive surprises aplenty still to come. Even if the core ideas of digital technology as we've known it are settled, they provide an incredible platform to build upon, a stable ecosystem for new species to make their own. AI/ML? Quantum systems? Exotic physics? Metamaterials? There's plenty of disruption to come – just let's try to steer clear of extinction. ®