Nvidia adds AI peanut butter to Nokia's 6G network chocolate, throws in $1 billion
The pair intends to develop cellular infrastructure for running edge AI workloads
Nvidia CEO Jensen Huang on Tuesday announced a partnership with Nokia to integrate AI technology into the Finnish firm's mobile network infrastructure, bringing accelerated computing to the edge and paving the way for 6G-ready networks. As part of the deal, Nvidia will invest $1 billion in Nokia. The pair say Team Green's gear will boost spectral efficiency and make AI inference more accessible from mobile devices.
Speaking at the company's GTC conference in Washington, DC, Huang said, "This is a brand new opportunity. Remember, the internet enabled communications. But amazingly smart companies, AWS, built a cloud computing system on top of the internet. We are now going to do the same thing on top of the wireless telecommunications network. This new cloud will be an edge, industrial, robotics cloud."
Nvidia's deal with Nokia, one of the world's largest telecom equipment vendors, will give the Finnish firm access to Nvidia's AI-RAN products, which apply AI to radio access networks to improve spectral efficiency (AI for RAN) and deliver AI services over the wireless network via cloud computing (AI on RAN). In return, Nokia will ensure that its 5G and 6G software runs on Nvidia hardware.
The partnership comes with a $1 billion Nvidia investment in Nokia, which lifted the telecom company's shares more than 25 percent in the hours after the announcement. Nvidia has also recently announced investments in Intel ($5 billion) and OpenAI ($100 billion). The GPU biz has a lot of spare change lying around.
"Telecommunications is a critical national infrastructure – the digital nervous system of our economy and security," said Huang in a statement. "Built on Nvidia CUDA and AI, AI-RAN will revolutionize telecommunications – a generational platform shift that empowers the United States to regain global leadership in this vital infrastructure technology."
In conjunction with the partnership, Nvidia is rolling out Aerial RAN Computer Pro (ARC-Pro), a 6G-ready accelerated computing platform for telecommunication companies.
"ARC is built from three fundamental new technologies: the Grace CPU, the Blackwell GPU, and our Mellanox ConnectX networking designed for this application," said Huang during his keynote. "... Aerial is essentially a wireless communication system running atop CUDA-X."
Nvidia and Nokia expect to develop infrastructure that will allow telecom providers to run distributed edge AI inferencing workloads at scale. According to Nvidia, almost half of ChatGPT's 800 million weekly active users access the service from mobile devices.
"The next leap in telecom isn't just from 5G to 6G – it's a fundamental redesign of the network to deliver AI-powered connectivity, capable of processing intelligence from the data center all the way to the edge," said Nokia president and CEO Justin Hotard in a statement. "Our partnership with Nvidia, and their investment in Nokia, will accelerate AI-RAN innovation to put an AI data center into everyone's pocket."
Nokia and Nvidia say that they intend to work together on AI networking solutions, such as data center switching. This will involve pairing Nokia's SR Linux software with the Nvidia Spectrum-X Ethernet networking platform and adapting Nokia's telemetry and fabric management systems to Nvidia AI hardware.
T-Mobile US intends to work with the two companies as part of its effort to deploy 6G wireless technology, with testing scheduled to begin in 2026. Dell will also participate, supplying PowerEdge servers to run the AI-RAN system.
Nvidia suggests, with no small amount of self-interest, that investing in infrastructure to run AI workloads on mobile networks is like purchasing a money multiplication machine. "One estimate suggests that telco operators can earn roughly $5 in AI inference revenue from every $1 invested in new AI radio access network (AI-RAN) infrastructure," the company said in a blog post earlier this month.
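Taken at face value, the claim is a simple multiplier. A minimal sketch of the arithmetic — the 5:1 ratio is Nvidia's own estimate, and the capex figure below is purely illustrative, not a real deployment cost:

```python
# Back-of-envelope for Nvidia's claimed AI-RAN return.
# The 5x multiplier comes from Nvidia's blog post; the capex
# figure in the example is hypothetical.
CLAIMED_REVENUE_PER_DOLLAR = 5.0  # Nvidia's estimate: $5 revenue per $1 invested


def projected_inference_revenue(capex_usd: float) -> float:
    """Revenue a telco would see if Nvidia's 5:1 estimate held exactly."""
    return capex_usd * CLAIMED_REVENUE_PER_DOLLAR


if __name__ == "__main__":
    capex = 200_000_000  # hypothetical $200M AI-RAN build-out
    print(f"${projected_inference_revenue(capex):,.0f}")  # → $1,000,000,000
```

Whether operators actually see anything like that multiple will depend on inference demand materializing at the edge, which remains to be seen.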
Separately, Nvidia said it plans to work with Oracle to build an AI supercomputer for the US Department of Energy. The Solstice system is expected to incorporate 100,000 Nvidia Blackwell GPUs. It is not clear when that machine will debut, but Nvidia said another supercomputer, dubbed Equinox, fitted with 10,000 Blackwell GPUs, is scheduled to be ready during the first half of 2026.
Solstice and Equinox will be housed at Argonne National Laboratory, and the expectation is that they'll be connected via Nvidia networking to provide 2,200 exaflops of AI performance. That's a lot of hallucination. ®