Digital Realty CTO weighs in on AI's insatiable thirst for power

If the grid can't keep up, DCs may be forced to roll their own primary supplies, Sharp tells El Reg

Interview It's no secret the GPUs used to train and run generative AI models are power-hungry little beasts.

So hungry, in fact, that by some estimates datacenters in the US could consume as much as nine percent of the nation's electricity generation by the end of the decade. In light of this explosive growth, some have warned that AI infrastructure could drain the grid.

Chris Sharp, CTO at colocation provider Digital Realty, is well aware of the challenges associated with accommodating these workloads.

Compared to the servers that run traditional workloads, such as virtual machines, containers, storage, and databases, hardware-accelerated AI is a different animal. A single rack of GPU servers today can easily consume 40 kW or more. Next-gen rack-scale systems from Nvidia and others will require at least 100 kW.
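To put those figures in perspective, here is a rough back-of-envelope sketch using the rack power numbers above. The ~10 kW figure for a traditional enterprise rack is an assumption for comparison; the article gives no baseline figure.

```python
# Back-of-envelope rack power comparison.
TRADITIONAL_RACK_KW = 10   # assumed typical traditional rack (not from the article)
GPU_RACK_KW = 40           # "40 kW or more" per the article
NEXT_GEN_RACK_KW = 100     # "at least 100 kW" for next-gen rack-scale systems

print(f"GPU rack vs traditional: {GPU_RACK_KW / TRADITIONAL_RACK_KW:.0f}x")
print(f"Next-gen rack vs traditional: {NEXT_GEN_RACK_KW / TRADITIONAL_RACK_KW:.0f}x")

# Annual energy draw of a single next-gen rack running at full load.
hours_per_year = 24 * 365
mwh_per_year = NEXT_GEN_RACK_KW * hours_per_year / 1000
print(f"One 100 kW rack at full load: {mwh_per_year:.0f} MWh/year")
```

At full load, a single 100 kW rack works out to roughly 876 MWh a year, which helps explain why operators are eyeing onsite generation.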

According to Sharp, accommodating these demanding systems at scale isn't easy and requires a different way of thinking about datacenter power and cooling, which you can learn more about in this interview with The Register below.

[YouTube video: The Register's interview with Digital Realty CTO Chris Sharp]

It's possible datacenters could end up looking wildly different. Sharp suggests small modular reactors (SMRs) and other primary onsite power generation may play a role. ®
