It's nice y'all like our chips but half our data-center sales are from cloud giants, FYI, says Nvidia's chief beancounter

Hyperscalers snap up GPUs for AI inference

Sales to hyperscalers now account for half of all Nvidia’s data-center revenues, the graphics giant’s chief financial officer Colette Kress said during a speech at the Bank of America Securities 2020 Global Technology Conference.

Kress didn’t elaborate on what exactly Nvidia’s hyperscale clients are hungry for, but highlighted strong enthusiasm for the outfit's AI inference-accelerating GPU hardware, which now accounts for “double digits” of the chip slinger's overall data-center business. Hyperscalers are the top-tier cloud giants – think Google, Amazon, Microsoft, Baidu, Apple, and so on.

“Inferencing adoption continues to go quite well; this quarter we still have reached where inferencing is in the double-digits percent of our overall data center business,” she told BofA’s Vivek Arya.

“Importantly, it's doubled year-over-year in terms of revenue and growth of using a GPU for many of the overall inferencing workloads that are out there.”

This reveal adds context to Nvidia’s earlier Q1 figures, which painted a rosy picture of the GPU goliath's data-center business. That division's revenues were up 80 per cent year-on-year, and accounted for 37 per cent of the corp's overall revenue.

Kress hinted at continued strength in Nvidia’s data-center business in Q2, with sales bolstered by new products, including its first Ampere-based accelerator, the A100.

Announced at Nvidia’s GTC conference, the A100 is the first part from Nvidia built on a 7nm process. It packs 54.2 billion transistors, 6,912 FP32 CUDA cores, 3,456 FP64 CUDA cores, and 432 Tensor cores.

Nvidia plans to ship the silicon to customers in a single rack-mounted box: the DGX A100, which retails at a touch under $200,000 and promises five petaflops of AI performance.
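For the curious, that five-petaflops figure looks like a peak AI-throughput number rather than sustained FP64 compute. A rough back-of-envelope sketch, assuming Nvidia's quoted 624 TFLOPS per A100 (FP16 Tensor cores with structured sparsity enabled) and eight GPUs per DGX A100 chassis, lands within a whisker of it:

# Back-of-envelope check on the DGX A100's five-petaflops claim
# (per-GPU figure assumed from Nvidia's published A100 spec sheet)
tflops_per_a100 = 624          # peak FP16 Tensor TFLOPS with structured sparsity (assumed)
gpus_per_dgx = 8               # A100 GPUs in one DGX A100 box
print(tflops_per_a100 * gpus_per_dgx / 1000, "PFLOPS")   # ~5.0 PFLOPS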

The A100 will also be included in kit from third-party manufacturers, with Supermicro having taken the lead, offering three systems packing A100 GPUs. ®
