US weather forecasters triple supercomputing oomph with latest machines
NOAA makes it rain for General Dynamics IT, HPE, AMD
Predicting the weather is a notoriously tricky enterprise, but that’s never held back America's National Oceanic and Atmospheric Administration (NOAA).
After more than two years of development, the agency brought a pair of supercomputers online this week that it says are three times as powerful as the machines they replace, enabling more accurate forecast models.
Developed and maintained by General Dynamics Information Technology under an eight-year contract, the Cactus and Dogwood supers — named after the flora native to the machines' homes in Phoenix, Arizona, and Manassas, Virginia, respectively — will support larger, higher-resolution models than previously possible.
The cost to build, house, and support these machines, now operational, will be at least $150 million, we understand; the contract can run to an estimated $505 million.
“People are looking for the best possible weather forecast information that they can get,” Brian Gross, director of the Environmental Modeling Center for the National Weather Service, told The Register.
To provide better predictions, larger and more complex models are needed, and that’s where Cactus and Dogwood come in, he explained.
The new systems “will allow us to better capture the physical processes that are going on in the Earth's systems, like formation of clouds, the formation of precipitation, and so forth,” Gross said, adding that they will also enable the National Weather Service to make multiple predictions simultaneously.
The agency claims the improved storage and compute capabilities offered by the twin systems also open the door to more diverse physics models. These include an upgrade to the US Global Forecast System planned for this fall, as well as a new hurricane forecast model called the Hurricane Analysis and Forecast System slated to come online ahead of the 2023 storm season.
Beyond traditional forecast modeling, Cactus and Dogwood will also serve as a testbed for novel applications created by an open community of public and private developers as part of the US Unified Forecast System over the next five years.
Once the models have been updated to take advantage of the systems, expect to see longer, more accurate forecasts and predictions, David Michaud, director for the Office of Central Processing at the National Weather Service, told The Register. “For things like hurricanes or tropical weather-related forecasts, that can be a very important thing.”
- Astra fails, sends NASA's Tropics weather satellites back to Earth
- Germany to host Europe's first exascale supercomputer
- AI and ML could save the planet – or add more fuel to the climate fire
- All-AMD US Frontier supercomputer ousts Japan's Fugaku as No. 1 in Top500
The systems running these models are among NOAA’s most powerful to date, each boasting 12.1 petaflops of peak FP64 performance in production and 26PB of storage. They are said to be three times faster than their predecessors: Cray and IBM supercomputers in Reston, Virginia, and Orlando, Florida.
The latest duo were built by Cray, the Hewlett Packard Enterprise supercomputing division responsible for the Oak Ridge National Laboratory’s chart-topping Frontier supercomputer.
For this spring’s Top500 list, the systems each utilized AMD’s 64-core Epyc Rome processors — 327,680 cores worth — and HPE’s Slingshot-10 interconnects to achieve 10.01 petaflops of performance in the Linpack bench. This landed the twins in the No. 49 (Cactus) and 50 (Dogwood) spots, notably without the assistance of GPU acceleration.
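Those headline figures hang together on a back-of-the-envelope check. As a rough sketch — assuming, since the article doesn't say, a clock of about 2.3GHz and 16 FP64 FLOPs per core per cycle (Epyc Rome's two 256-bit FMA pipes) — the quoted core count lines up with the 12.1-petaflop peak, and the Linpack result implies an efficiency in the low 80s percent:

```python
# Rough peak-FLOPS estimate for one of the twins.
# Assumptions (ours, not the article's): AMD Epyc "Rome" cores do
# 16 FP64 FLOPs per cycle (two 256-bit FMA pipes) at roughly 2.3 GHz.
cores = 327_680
flops_per_cycle = 16
clock_hz = 2.3e9

peak_flops = cores * flops_per_cycle * clock_hz
print(f"Estimated peak: {peak_flops / 1e15:.1f} petaflops")  # ~12.1 PF

# Linpack delivers a fraction of peak; 10.01 PF against this estimate:
linpack_flops = 10.01e15
print(f"Implied Linpack efficiency: {linpack_flops / peak_flops:.0%}")  # ~83%
```

An efficiency around 83 percent is in the range you'd expect for a CPU-only Slingshot cluster running HPL, which is one reason the Linpack number sits below the quoted peak.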
According to Michaud, the decision to forgo GPUs was influenced in part by the existing code base used by the models. In this case, a GPU-less system offered the greatest performance per dollar.
“It takes a lot of effort to shift the code base to be optimized for a certain type of processor,” he said, adding that that’s not to say NOAA isn’t investigating use cases for GPU acceleration of these models. “We do have a research and development component of our NOAA high-performance computing program, and within that program… we've been exploring GPUs for a number of years.”
According to NOAA, the Cactus and Dogwood systems boost its supercomputing capacity beyond 42 petaflops of combined performance. The agency has research supercomputers in West Virginia, Tennessee, Mississippi, and Colorado, which have a combined capacity of 18 petaflops.
It should be noted that its latest systems won’t be individually tasked. Instead, Cactus and Dogwood will serve as the administration's primary and backup forecasting systems, enabling a degree of redundancy should one of the systems experience a fault.
Whether the agency's five-day forecasts turn out to be more accurate or not, well, you can quite easily be the judge of that, if you live stateside.
For more analysis and data points, check out Timothy Prickett Morgan's take of the news on The Next Platform. ®