Good luck building a VR PC: Ethereum miners are buying all the GPUs

Once the tap turns on again, GPUs will restore PCs and edge computing to glory


Last month, one of my friends noted he’d been having enormous trouble trying to buy the components to assemble a virtual-reality-ready PC. Motherboards, memory, CPUs and solid-state drives were easy to find, but the one absolutely essential component - a beefy GPU to drive a head-mounted display at a vomit-preventing 90 Hz - was nowhere to be found. Every online vendor seemed to be out of stock, with long waiting times and stern warnings restricting purchases to ‘ONLY TWO PER HOUSEHOLD’. Why would anyone need two graphics cards? One for each eye?

This shortage had developed suddenly, over the month of May, in curious lock-step with a seemingly unrelated development - an enormous rise in the price of a cryptocurrency known as Ethereum. A successor to Bitcoin, this second generation of ‘magic internet money’ algorithmically grows its total money supply, rather like a central bank. But rather than employing legions of macroeconomists and rounds of quantitative easing, Ethereum demands ‘proof of work’ - solutions to a cryptographic puzzle that take billions of educated guesses. Those guesses - you guessed it - can be tremendously accelerated by the very same GPUs that my friend had tried to buy.
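To make those ‘educated guesses’ concrete, here is a toy proof-of-work loop - a deliberate simplification, not Ethereum’s real Ethash puzzle (which is memory-hard, which is precisely why GPUs suit it), and the function and parameters are purely illustrative. The basic shape, though, is the same: hash, check, repeat until you get lucky.

```python
# A minimal proof-of-work sketch, NOT Ethereum's actual Ethash algorithm
# (Ethash is deliberately memory-hard). This only illustrates the
# "billions of educated guesses" idea: hash, check, repeat until the
# result falls below a difficulty target.
import hashlib

def find_nonce(block_data: bytes, difficulty_bits: int = 20) -> int:
    """Brute-force a nonce whose SHA-256 hash has `difficulty_bits` leading zero bits."""
    target = 2 ** (256 - difficulty_bits)
    nonce = 0
    while True:
        digest = hashlib.sha256(block_data + nonce.to_bytes(8, "big")).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce  # the 'proof' that the work was done
        nonce += 1        # another educated guess

if __name__ == "__main__":
    print(find_nonce(b"example block header"))
```

A GPU can churn through guesses of this sort many millions of times per second across thousands of cores, which is rather the point.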

Economics, it turns out, was the culprit. As Ethereum ballooned to its highest-ever value (nearly US$400 for a single ETH), it became a wise investment to buy a cheap PC with a beefy power supply, stock it up with GPUs, and let it compute its way into profits. One such PC could - in the right circumstances - earn up to $10,000 a year on a $2500 outlay. The formula, simply put: GPUs + electricity + time = profits!
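For the curious, that formula pencils out roughly as follows - a back-of-the-envelope sketch only, where the daily yield, power draw and electricity tariff are all assumed for illustration rather than measured:

```python
# Back-of-the-envelope mining arithmetic. Every figure below is an
# assumption chosen to match the article's ballpark, not a measurement.
RIG_COST_USD = 2500           # the article's multi-GPU rig outlay
ETH_PRICE_USD = 400           # near the mid-2017 peak cited above
DAILY_ETH_MINED = 0.07        # hypothetical yield for such a rig
POWER_DRAW_KW = 1.0           # hypothetical draw at the wall
ELECTRICITY_USD_PER_KWH = 0.12

daily_revenue = DAILY_ETH_MINED * ETH_PRICE_USD
daily_power_cost = POWER_DRAW_KW * 24 * ELECTRICITY_USD_PER_KWH
daily_profit = daily_revenue - daily_power_cost

print(f"Yearly profit:  ${daily_profit * 365:,.0f}")
print(f"Payback period: {RIG_COST_USD / daily_profit:.0f} days")
```

On those (generous) assumptions the rig pays for itself in roughly three months - which is all the explanation the empty GPU shelves really need.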

Now that all the GPUs have been sucked into get-rich-quick schemes, what happens to a commercial and enterprise VR marketplace still trying to find its footing? This is less about kids at home playing PC games (though they’re clearly affected by this as well) than a story of all the other things we’ve come to depend on from our GPUs. These fast-and-efficient successors to the ‘math coprocessor’ (remember when those were a thing?) have become the single most important component in computing.

While graphics provide the obvious use case for GPUs, they're also the accelerant for all sorts of workloads.

The rise of VR and machine learning - even this ‘tulipmania’-grade GPU shortage - points toward a larger shift: the generational tug-of-war between the centre and the periphery. PCs subverted mainframes, then the cloud drew everything back toward the centre. While the cloud hasn’t quite peaked, the next swing of the pendulum - into ‘edge computing’, enabled by highly performant GPUs - looks to be well underway.

As computing moves away from the general-purpose CPU (Intel has been at 14nm for four years now), this shift to the edge will make the GPU more important than the CPU. The GPU is central to all the roles we expect 21st-century computers to fill. This may be why Apple recently ditched chip designer Imagination in favour of its own, home-grown GPU. Within a few years, every computing device of consequence - supercomputer, desktop or smartphone - will be driven by architectures and operating systems that centre on the GPU.

Meanwhile, the global shortage of GPUs implies fat profits for Nvidia and AMD as they shovel their chips into the maw of a market hungry to turn maths into capacity - and money. The PC, nearly driven into irrelevance by tablet computing, comes roaring back as a platform for visualisation and learning, transformed by the GPU into a power-hungry, expensive, finicky and absolutely essential tool for modern business. The pendulum swings again, and suddenly the edges are (yet again) the most interesting place to be, the place where the real work of computing happens.

For the moment, though, my friend has to patiently wait out this shortage. The chips will come: there’s too much money on the table. And as they arrive in their billions, the entire face of computing will change completely. ®
