Good luck building a VR PC: Ethereum miners are buying all the GPUs
Once the tap turns on again, GPUs will restore PCs and edge computing to glory
Last month, one of my friends noted he’d been having enormous trouble trying to buy the components to assemble a virtual-reality-ready PC. Motherboards, memory, CPUs and solid-state drives were easy to find, but the one absolutely essential component - a beefy GPU to drive a head-mounted display at a vomit-preventing 90 Hz - he couldn’t find anywhere. Every online vendor seemed to be out of stock, with long waiting times and stern warnings restricting purchases to ‘ONLY TWO PER HOUSEHOLD’. Why would anyone need two graphics cards? One for each eye?
This shortage had developed suddenly, over the month of May, in a curious lock-step with a seemingly unrelated development - an enormous rise in the price of a cryptocurrency known as Ethereum. A successor to Bitcoin, this second generation of ‘magic internet money’ algorithmically grows its total money supply, rather like a central bank. Rather than employ legions of macroeconomists and quantitative easing, Ethereum demands ‘proof of work’ - solutions to a cryptographic puzzle that require billions of educated guesses. Those guesses - you guessed it - can be tremendously accelerated by the very same GPUs that my friend had tried to buy.
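To make those ‘educated guesses’ concrete, here is a minimal proof-of-work sketch in Python. It hashes with SHA-256 against a simple leading-zeros target rather than Ethereum’s actual Ethash algorithm, so the function name, the block_data string and the difficulty setting are illustrative assumptions, not anything lifted from Ethereum itself.

```python
import hashlib
from itertools import count

def mine(block_data: str, difficulty: int) -> int:
    """Find a nonce whose SHA-256 hash of (block_data + nonce) starts with
    `difficulty` zero hex digits - a toy stand-in for Ethereum's Ethash."""
    target = "0" * difficulty
    for nonce in count():
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce

# Each extra zero of difficulty multiplies the expected number of guesses
# by 16; real networks demand billions of them per block.
print(mine("example block", difficulty=4))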
Economics, it turns out, was the culprit. As Ethereum ballooned to its highest historical value (nearly US$400 for a single ETH), it became a wise investment to buy a cheap PC with a beefy power supply, stock it up with GPUs, and let it compute its way into profits. One such PC could - in the right circumstances - earn up to $10,000 a year on a $2,500 outlay. The formula, simply put: GPUs + electricity + time = profits!
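For a sense of how that formula pencils out, here’s a back-of-the-envelope sketch. Every number in it - hashrate, payout rate, power draw, electricity price - is an assumption chosen to land near the ‘up to $10,000 a year on a $2,500 outlay’ figure above, not a quote from any exchange or vendor.

```python
# Back-of-the-envelope mining economics -- every input is an assumption.
HOURS_PER_YEAR = 24 * 365

rig_cost_usd = 2500            # six mid-range GPUs plus a cheap PC (assumed)
rig_hashrate_mhs = 150         # ~25 MH/s per card, six cards (assumed)
power_draw_kw = 1.0            # whole-rig draw under load (assumed)
electricity_usd_per_kwh = 0.12 # assumed retail power price
usd_per_mhs_per_year = 80      # assumed payout rate at current prices

gross = rig_hashrate_mhs * usd_per_mhs_per_year
power_cost = power_draw_kw * HOURS_PER_YEAR * electricity_usd_per_kwh
net = gross - power_cost

print(f"Gross: ${gross:,.0f}  Power: ${power_cost:,.0f}  Net: ${net:,.0f}/year")
print(f"Payback on ${rig_cost_usd:,} rig: {rig_cost_usd / net:.1f} years")
```

With those assumptions the rig nets roughly $11,000 a year and pays for itself in a few months - which is precisely why the cards vanished from the shelves.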
Now that all the GPUs have been sucked into get-rich-quick schemes, what happens to a commercial and enterprise VR marketplace still trying to find its footing? This is less a story about kids at home playing PC games (though they’re clearly affected as well) than about all the other things we’ve come to depend on from our GPUs. These fast-and-efficient successors to the ‘math coprocessor’ (remember when those were a thing?) have become the single most important element in computing.
While graphics provide the obvious use case for GPUs, they're also the accelerant for all sorts of workloads.
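As a rough illustration of that point (and not anything from the article itself), the sketch below times the same large matrix multiply - the workhorse operation behind both 3D graphics and machine learning - on the CPU and, where one is available, on the GPU. It assumes PyTorch is installed, with CUDA support for the GPU half.

```python
# A quick illustration of GPU acceleration for a non-graphics workload:
# one large matrix multiply, timed on CPU and (if present) on a CUDA GPU.
import time
import torch

def timed_matmul(device: str, n: int = 4096) -> float:
    a = torch.randn(n, n, device=device)
    b = torch.randn(n, n, device=device)
    if device == "cuda":
        torch.cuda.synchronize()   # make sure setup has finished
    start = time.perf_counter()
    torch.matmul(a, b)
    if device == "cuda":
        torch.cuda.synchronize()   # wait for the GPU kernel to complete
    return time.perf_counter() - start

print(f"CPU: {timed_matmul('cpu'):.3f}s")
if torch.cuda.is_available():
    print(f"GPU: {timed_matmul('cuda'):.3f}s")
```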
The rise of VR, machine learning - even this ‘tulipmania’-quality GPU shortage - points toward a larger shift: the generational tug-of-war between the centre and the periphery. PCs subverted mainframes, then the cloud drew everything back toward the centre. The cloud hasn’t quite peaked, but the next swing of the pendulum - into ‘edge computing’, enabled by highly performant GPUs - looks to be well underway.
This shift to the edge moves away from the general-purpose CPU (Intel has spent four years at 14nm) and will make the GPU more important than the CPU. The GPU is central to all the roles we expect 21st-century computers to fill. That may be why Apple recently ditched chip designer Imagination in favour of its own, home-grown GPU. Within a few years, every computing device of consequence - supercomputer, desktop or smartphone - will be driven by architectures and operating systems that centre around the GPU.
Meanwhile, the global shortage of GPUs implies fat profits for Nvidia and AMD as they shovel their chips into the maw of a market hungry to turn maths into capacity - and money. The PC, nearly driven into irrelevance by tablet computing, comes roaring back as a platform for visualisation and learning, transformed by the GPU into a power-hungry, expensive, finicky and absolutely essential tool for modern business. The pendulum swings again, and suddenly the edge is (yet again) the most interesting place to be - the place where the real work of computing happens.
For the moment, though, my friend has to patiently wait out this shortage. The chips will come: there’s too much money on the table. And as they arrive in their billions, the entire face of computing will change completely. ®