Nvidia promises annual updates across CPU, GPU, DPU lines
Arm one year, x86 the next, and always faster than a certain chip shop that still can't ship even one standalone GPU
Computex Nvidia's push deeper into enterprise computing will see its practice of introducing a new GPU architecture every two years extended to its CPUs and data processing units (DPUs, aka SmartNICs).
Speaking in the company's pre-recorded keynote, released to coincide with the Computex exhibition in Taiwan this week, senior vice president of hardware engineering Brian Kelleher spoke of the company's "reputation for unmatched execution on silicon." That's language best considered in the context of Intel, an Nvidia rival, again delaying its planned entry into the discrete GPU market.
"We will extend our execution excellence and give each of our chip architectures a two-year rhythm," Kelleher added.
Products will be updated in the years between new architecture releases.
"One year we will focus on x86, the next on Arm," Kelleher said, without explaining what happens if that cadence means a single year features both a product refresh and an architecture update – which could be a lot to swallow. We've asked Nvidia to clarify what this policy means.
Whatever the details, Nvidia now considers itself to have three architectures: one for GPUs, one for CPUs such as the Grace-Hopper superchips announced today, and one for its BlueField range of DPUs.
- Nvidia teases server designs for Grace-Hopper Superchips
- Nvidia CEO Jensen Huang talks chips, GPUs, metaverse
- Intel's Habana unit reveals new Nvidia A100 challengers
- Nvidia brings liquid cooling to A100 PCIe GPU cards for ‘greener’ datacenters
In its keynote, Nvidia execs said the three are inseparable in its vision for enterprise computing – with CPUs managing systems, GPUs doing the heavy lifting, and DPUs providing networking and isolation services. Using all three types of processor in harness, execs said, is its secret sauce for the kind of scalability and speed Nvidia feels is needed to run AI, or seize opportunities like the $100 billion market for cloud-streamed games.
Speaking of games, the keynote also featured news of an Nvidia-commissioned study that found the company's RTX visual computing platform allows players of shooting games to achieve better firing accuracy. We mention this because the study found that the lowest quartile of players improved more than the best players did.
If this enterprise AI thing doesn't work out, your correspondent imagines marketing those RTX results to parents everywhere could see Nvidia grow regardless. ®