Digital Realty CTO on why storage is the datacenter challenge no one's talking about

And why supporting quantum computing is easier than supporting a 1MW rack

Interview When the great and the good of the datacenter world got together in Cannes last week for Datacloud Global Congress, storage was barely mentioned, with shortages of GPUs, power and land the key talking points.

In fact, storage was literally not on the agenda. The CEO of one datacenter operator told us: "It's still GPUs, and then at the end of the day, we only lean into the design requirements that our customers want. We don't design the internals of the data[center]. So I haven't heard of any particular challenges."

Another senior executive told us that while storage is rarely a key topic of conversation, power is. But with enterprises holding vast amounts of redundant data on their arrays, actively managing duplicates – using the tools they've usually already paid for – could have a meaningful impact on power consumption. More efficient software could boost that further.
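To make that concrete, the crudest form of that housekeeping is simply finding byte-identical copies. The sketch below is ours, not anything the executives described – real arrays deduplicate inline at block level with vendor tooling – but it shows the idea at file level, grouping files by a SHA-256 content hash (the scan path is illustrative):

```python
# Minimal file-level duplicate finder - an illustration of the idea only.
# Enterprise arrays do this at block level, inline, with built-in dedup engines.
import hashlib
from collections import defaultdict
from pathlib import Path

def sha256_of(path: Path, chunk_size: int = 1 << 20) -> str:
    """Hash a file in 1 MiB chunks so large files don't exhaust memory."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def find_duplicates(root: str) -> dict[str, list[Path]]:
    """Group every file under `root` by content hash; keep only multi-file groups."""
    groups = defaultdict(list)
    for path in Path(root).rglob("*"):
        if path.is_file():
            groups[sha256_of(path)].append(path)
    return {h: paths for h, paths in groups.items() if len(paths) > 1}

if __name__ == "__main__":
    for digest, paths in find_duplicates("/srv/data").items():  # path is illustrative
        print(digest[:12], [str(p) for p in paths])
```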

However, that doesn't mean storage is no longer a problem. Rather, as the AI explosion driving datacenter expansion picks up pace, traditional storage has to be reshaped, says Digital Realty CTO Chris Sharp.

"Traditional storage has been solved," says Sharp. "But what we're starting to see is with a lot of these models data is everything. AI is only as smart as the data sets you feed it ..."

As GPUs ramp up in capability and price, CIOs are increasingly fretful about utilization rates. Broadcom's Peter Del Vecchio, launching the company's Tomahawk 6 switching chip last week, said that 57 percent of the "time" invested in LLM training was due to data transfers. So far, much of the focus on improving this has been on boosting networking speeds and building out vast amounts of ultra-fast memory.
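The arithmetic behind that anxiety is brutal. Taking Del Vecchio's 57 percent figure at face value and plugging in a cluster size, run length, and price of our own choosing – the numbers below are illustrative assumptions, not from Broadcom – the stalled GPU-hours add up quickly:

```python
# Back-of-envelope cost of data-transfer stalls during LLM training.
# Only the 57 percent transfer share comes from Broadcom's Del Vecchio;
# cluster size, run length, and hourly rate are illustrative assumptions.

TRANSFER_SHARE = 0.57    # fraction of wall-clock time spent moving data
GPUS = 1024              # assumed cluster size
RUN_DAYS = 30            # assumed length of the training run
HOURLY_RATE = 2.50       # assumed $/GPU-hour

total_gpu_hours = GPUS * 24 * RUN_DAYS
idle_gpu_hours = total_gpu_hours * TRANSFER_SHARE

print(f"GPU-hours paid for:       {total_gpu_hours:,.0f}")
print(f"GPU-hours stalled on I/O: {idle_gpu_hours:,.0f}")
print(f"Cost of stalled time:     ${idle_gpu_hours * HOURLY_RATE:,.0f}")
```

On those assumptions, a single month-long run wastes a seven-figure sum on idle silicon – which explains the industry's fixation on networking and memory.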

But that has inevitable knock-on effects for storage, Sharp says, in terms of "the speed at which storage needs to be accessed, where it's now just ultra fast memory as a part of the broader model. So storage is embedded in all of this accelerated compute. Without it, it doesn't work."

"Big data arrays, that stuff's been solved for," he continues. But this was not necessarily the case with "performant storage, and then how you architect your data so that it's actionable."

Sharp says this is "more of a philosophy thing, less of a hardware thing." But there are obvious questions around read-write rates and "the size that you can put this into and the power."

"We're looking for efficiencies across the entire tech stack, from optical networking, like all the way through storage, through any type of accelerated compute, all the way through the CPUs that feed this. So there's an entire system. Storage is getting more performant, but the power is increasing across all of the tech stack."

He says storage is "an underpin" of the company's Service Fabric global interconnect and orchestration platform. Asked if current storage architectures are "good enough" to support it, he replies: "It's OK."

In most environments, storage is good enough. But in fast-evolving, high-performance applications, "where the data is so disparate and where it's being generated, storage is not solved for."

One key talking point at Datacloud Global Congress was the increased focus on inference, which is where organizations should start to reap the benefits of all those investments in training – whether by them or, more likely, the hyperscalers. As Sharp puts it: "It's about the monetization of AI."

He says enterprises are beginning to realize their storage needs to be more performant, "so that my AI algorithms can be more actionable in a quicker fashion. Because, just to kind of cut all the noise and buzzword bingo, everybody wants an actionable outcome out of their data as quickly as possible."

After all, he says, when it comes down to it, gen AI is all about tokens. And "What is a token? It's probably just an actual piece of data that a business wants to be able to get a hold of quickly."
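He's close: a token is just a short fragment of text (or other data) mapped to an integer ID. Here's what that looks like using OpenAI's open source tiktoken library – our example, not one Sharp offered, and the sample sentence is our own:

```python
# What a token actually is: a text fragment mapped to an integer ID.
# Uses the tiktoken library (pip install tiktoken) purely for illustration.
import tiktoken

enc = tiktoken.get_encoding("cl100k_base")  # encoding used by GPT-4-era models
text = "Storage is embedded in all of this accelerated compute."

tokens = enc.encode(text)
print(tokens)                              # the integer IDs the model consumes
print([enc.decode([t]) for t in tokens])   # the text fragment behind each ID
```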

The problem is that storage still tends to be dwarfed by more headline-grabbing challenges. For Sharp, these include the advent of quantum computing.

"As hard as it is to look at AI," he says, "like, when is AI? Where is AI? Is it agentic? When is AGI, sentience?.. Quantum is even harder."

Sharp is confident that some finance firms are already exploiting the technology. "But when I really scratch at the surface to get under the hood, I really don't get a good answer."

Likewise, he says, while Digital Realty is "always watching" the technology, "No two systems are alike, and no qubits are alike… the repeatability and the survivability of these qubits, I get a different answer every week."

One thing he is confident about is that the company can "support the environment" for quantum, which right now means temperatures near absolute zero.

"I wouldn't say it's trivial, but it's easier than doing some of these one megawatt thermal displacement designs for Nvidia… the thermal dynamics with that are harder than what quantum is representing today." ®
