UK govt finds £225M for Isambard-AI supercomputer powered by Nvidia

5,448 Grace Hopper Superchips and 200 PFLOPS get you somewhere in the global public top ten


The UK government says it will cough up £225 million ($273 million) for a supercomputer capable of more than 200 petaFLOPS of double-precision performance, with the University of Bristol to house the machine and Nvidia providing the core computing components.

Detailed during the UK's AI Safety Summit on Wednesday, the machine – dubbed Isambard-AI – is expected to come online next year and may be able to help organizations studying everything from automated drug discovery and climate change to the application of neural networks in robotics, big data, and national security.

"Isambard-AI represents a huge leap forward for AI computational power in the UK," Simon McIntosh-Smith, director of the Isambard National Research facility, argued in a statement. "Today's Isambard-AI would rank in the top 10 fastest supercomputers in the world and, when in operation later in 2024, it will be one of the most powerful AI systems for open science anywhere."

The number-ten most-powerful publicly known supercomputer in the world right now is China's Tianhe-2A, which theoretically peaks at 100 petaFLOPS and benchmarks at 61 petaFLOPS. A peak 200-petaFLOPS machine at FP64 in the UK would rival America's Summit, which was number one until 2020 and today is fifth in the public rankings.

The United Kingdom's fastest machine on the current Top500 public list is ARCHER2, in 30th place with a theoretical peak speed of 26 petaFLOPS and a benchmarked 20 petaFLOPS. On paper, Isambard-AI is therefore about ten times faster.

According to the University of Bristol, Isambard-AI, first talked about in September, will employ 5,448 Nvidia GH200 Grace Hopper Superchips.

Announced in 2022, Nvidia's GH200 meshes a 72-core Arm Grace CPU and a Hopper GPU over a 900GB/s NVLink-C2C interconnect. Each superchip comes equipped with up to 480GB of LPDDR5x memory and either 96GB or 144GB of high-bandwidth memory, depending on the configuration.

The chips will be integrated into a liquid-cooled chassis developed by HPE Cray, networked using the manufacturer's custom Slingshot 11 interconnect, and supported by almost 25 petabytes of storage.

The full system will be housed in a self-cooled, self-contained datacenter alongside the previously announced Isambard-3 machine at the National Composites Centre (NCC), located in the Bristol and Bath Science Park. It will feature a heat-reuse system to warm neighbouring buildings.

Isambard-3, which is due to come online next northern spring, will offer early access to UK scientists during the first phase of the broader Isambard-AI project. That system comprises 384 Nvidia Grace CPU Superchips, each of which packs a pair of 72-core Arm-compatible CPUs and up to 960 gigabytes of LPDDR5x memory.

Among the first beneficiaries of Isambard-AI will be Britain's Frontier AI Task Force, which aims to mitigate the risks advanced AI applications pose to national security. The task force will also work with the AI Safety Institute to develop a research program to evaluate the safety of machine learning models.

Flexible precision an opportunity for both AI and HPC

While Isambard-AI will no doubt be capable in double-precision high-performance compute (HPC) applications, a major focus of the system is AI and other workloads that can take advantage of lower-precision floating-point calculations. Turn down the precision slider to FP8 and optimize for sparsity, and Nvidia says it expects researchers will be able to extract 21 or more exaFLOPS from the system. Presumably that's the theoretical peak.
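As a rough sanity check on those headline figures, dividing them by the superchip count gives the implied per-chip throughput. This is back-of-envelope arithmetic using only the numbers quoted above, not official per-chip specifications:

```python
# Implied per-superchip throughput from the quoted system-level figures:
# 5,448 GH200 superchips, 200 PFLOPS of FP64, and 21 exaFLOPS of sparse FP8.
chips = 5_448

fp64_system_pflops = 200
fp64_per_chip_tflops = fp64_system_pflops * 1_000 / chips
print(f"FP64:       ~{fp64_per_chip_tflops:.1f} TFLOPS per superchip")

fp8_system_exaflops = 21
fp8_per_chip_pflops = fp8_system_exaflops * 1_000 / chips
print(f"Sparse FP8: ~{fp8_per_chip_pflops:.2f} PFLOPS per superchip")
```

The results – roughly 37 FP64 TFLOPS and 3.9 sparse-FP8 PFLOPS per superchip – are in the same ballpark as Nvidia's published per-GPU Hopper peaks, suggesting the system figures are straightforward multiples of the chip count.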

As the performance figures imply, lower-precision floating-point calculations trade accuracy for speed. FP8 and FP16 are widely employed for AI training and inferencing for this reason, but as our sibling site The Next Platform has previously pointed out, reduced precision also has applications in HPC.

Researchers at Riken have been exploring the use of 32-bit and even 16-bit mathematics in HPC on the Fugaku supercomputer for years now. Meanwhile, the European Centre for Medium-Range Weather Forecasts has already demonstrated the benefits of 32-bit precision in weather and climate modelling. Researchers at the University of Bristol have had similar successes with their own atmospheric simulations, and were able to eke out a 3.6x speedup by dropping down to lower precision.
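The accuracy side of that trade-off is easy to demonstrate in a few lines. The sketch below – an illustrative toy, not how any of the systems above handle mixed precision – simulates single precision (binary32) in pure Python by round-tripping values through struct, then naively sums 0.1 a million times in each precision:

```python
import struct

def to_f32(x: float) -> float:
    """Round a Python double to the nearest IEEE 754 binary32 value."""
    return struct.unpack('f', struct.pack('f', x))[0]

n = 1_000_000
exact = n * 0.1

total64 = 0.0          # double-precision accumulator
total32 = 0.0          # simulated single-precision accumulator
for _ in range(n):
    total64 += 0.1
    total32 = to_f32(total32 + to_f32(0.1))

err64 = abs(total64 - exact)
err32 = abs(total32 - exact)
# The single-precision total drifts by orders of magnitude more than
# the double-precision one as rounding errors accumulate.
print(f"FP64 error: {err64:.2e}  FP32 error: {err32:.2e}")
```

Production codes that drop precision, like the weather models above, keep this drift in check with techniques such as compensated summation or by confining reduced precision to error-tolerant parts of the computation.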

Because of this, Isambard-AI's support for a variety of floating-point precisions, ranging from FP64 down to sparse FP8, should allow researchers to explore low- and mixed-precision workloads in both the emerging AI and traditional HPC arenas.

The Register expects to learn more about Isambard-AI and other upcoming systems when we attend the Supercomputing 2023 event in Denver, Colorado, later this month.

The British government also mentioned on Wednesday another forthcoming UK super called Dawn that we'll cover on Thursday. Nvidia described Isambard-AI as "Britain’s most powerful supercomputer" and the government said it will be the nation's "most advanced computer."

Suffice to say, the people behind Dawn say their computer will be the fastest. We guess we'll find out for sure when the machines are eventually powered on and benchmarked. ®
