Please sir, may we have some Moore? Doesn't look that way
We're on a roadmap to nowhere. Come on inside
Opinion Nvidia has just shown off its vision of the near future in the shape of its Blackwell Ultra. Aptly for a company that helps gamers explore dystopian science-fiction hellscapes, Nvidia's actual future involves vast, heat-soaked stacks of silicon, guzzling energy by the half-gigawatt.
For datacenters, as discussed by The Register's fully sentient bio-brains ticking along very nicely on 40 watts of mammalian metabolism, this offers a path to higher performance per cubic meter, but with highly challenging problems of power and cooling. 'Twas ever thus, but it's much worse now: there's close to a 1:1 relationship between hiking FLOPS and the power needed to do so.
In the days of our forebears, there was Moore's Law. Of course it was never actually a law, more an early observation of a trend in semiconductor physics. Starting in the mid-1960s, as fabrication techniques improved, the individual components on a silicon chip got smaller. As these features shrank, more fitted into the same space while drawing less power apiece – smaller transistors take less energy to switch on and off. Every two years or so, you got double the speed and capacity for the same unit cost in silicon. You could push clock speeds up too, which took more power but delivered even more processing oomph.
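To put rough numbers on the sweet deal – ours, not Gordon Moore's – here's a toy Python sketch of the two regimes. Under classic Dennard scaling, power per transistor fell as fast as transistor counts rose, so performance doubled while chip power held roughly steady; with that scaling gone, watts now track FLOPS almost 1:1. The two-year cadence and clean 2x factors are textbook idealizations, not measurements:

```python
# Toy model (idealized numbers, not measurements) contrasting classic
# Dennard-era scaling with today's post-Dennard regime.

GENERATIONS = 5  # roughly a decade at one process node every two years

perf = 1.0  # relative performance

print("Dennard era: power per transistor falls as fast as density rises")
for gen in range(1, GENERATIONS + 1):
    perf *= 2.0  # double the devices, double the throughput
    # Total chip power stays roughly flat: each transistor is smaller,
    # runs at lower voltage, and switches with less energy.
    print(f"  gen {gen}: {perf:4.0f}x performance, ~1x power")

print("Post-Dennard: voltage scaling has stalled, watts track FLOPS")
for gen in range(1, GENERATIONS + 1):
    # More performance now means proportionally more power drawn.
    print(f"  gen {gen}: {2.0 ** gen:4.0f}x performance, ~{2.0 ** gen:.0f}x power")
```

Run it and the punchline drops out: 32x the performance for flat power then, 32x the performance for 32x the power now – which is how you end up with racks drawing hundreds of kilowatts.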
It was a very sweet deal, especially as cheap chips started to be capable of more than pocket calculators. A marvelous feedback loop developed, with ever more useful devices driving the digitization of business and pleasure, and that increased digitization driving demand for ever faster, ever cheaper, ever more capable machinery. The final tipping point, just as computers became good enough to displace analog consumer devices, came when exponential growth made broadband and wireless networking affordable. It takes copious amounts of digital signal processing to move big data long distances at high speed, and here it was. Upgrades all round, hurrah!
That's how digital technology got to eat everything in half a human lifetime, and now it is no Moore. The industry has been extremely reluctant to admit it, constantly reframing the law in ever more baroque ways to disguise the smell. Performance was increasing because of multiple cores? See, it's still working. You can squash chips together more closely? Same again, barkeep. Clever on-chip zoning of voltage and clock, boosting performance without cooking the circuits? Moore, please.
There is still a bit to squeeze out of the silicate stone. Better interconnects, engineering of vias on multi-layer devices, packaging, device geometry, perhaps even photonics – all will keep engineers busy for a while.
The basic driver has hit the buffers, with the physics of silicon semiconductors pressing against fundamental limits as the cost of even trying rises astronomically. Spintronics, graphene, pure optical computing, and any number of exotic non-CMOS silicon ideas have had their moment on the lab bench before being quietly shelved.
The result is what Nvidia is offering datacenters: chips drawing kilowatts apiece to do highly specific, highly parallel tasks. It makes sense if AI grows as hyped, although even then there will be serious pencil sharpening as the equations of money, sustainability, and markets look ever more non-deterministic. But how does it look for the rest of us?
We know, because we've been there for a while. Upgrading for better performance has been more cargo cult than canny calculation. Windows 11 needs performant hardware? You can get it pre-installed on a wheezy little Pi-like N100-based mini PC for the price of a Taylor Swift concert ticket. Any flagship phone of the past five years will feel identical to a normal person, one who thinks snapdragons are flowers.
Discretionary consumer spending on new digital tech is approaching the audiophile event horizon, with most waiting for the old stuff to break before looking at new, and a few splashing cash according to groupthink. Business IT, already happy to limit capital spending until the wheels actually fall off, has called the Windows 11 bluff. If you have an M1 MacBook and are good at shepherding the battery, you may well not need another laptop for decades.
The future is going to look very different. Efficiency will be everything, and here the biggest gains will be in software engineering. "Gordon Moore giveth and Bill Gates taketh away" used to be an accurate maxim. Software has always grown fatter in the knowledge that the next generation of hardware will have faster CPUs and more memory to disguise the fact. But that's no longer the case.
Software has many ways to get better; hardware has few. Look to software for innovation and the unexpected, just not at the hands of Big Tech, which is very much against that sort of thing.
As for hardware, we are moving towards the aviation model, where physics and economics have led to two big players producing variations of a basic architecture laid down in the 1960s. Efficiency and cost control over product life is everything. Meanwhile, the daily experience of flying has become a bit better in some ways and worse in others, but the core A-to-B performance remains unchanged.
For any new companies, the cost of entry is insurmountable, at least until political upheaval breaks things. Intel knows this, but its attempts to become the Boeing of binary have emulated aviation's disasters rather than its commercial acumen, so who the final two big players will be is still an open question.
For those who have lived and worked through Moore's finest years, this may seem a miserable prospect. It's not. We have built such an astonishing new world that there will be breakthrough innovations unseating apparently invincible incumbents. That could come from anywhere except the old guard. They're left with racks of molten silicon that can't do with 600 kW what any of us can do with a kilogram of neurons and the energy of a soldering iron.
Thanks for the tools, Gordon. We'll take it from here. ®