Nvidia's latest AI climate model takes aim at severe weather

That tornado warning couldn't possibly be a hallucination... could it?

While enterprises struggle to quantify the return on investment of AI, the technology continues to show promise in bolstering weather forecasting and climate modeling.

On Monday, Nvidia unveiled a new generative AI diffusion model, developed in collaboration with Lawrence Berkeley National Lab and the University of Washington, which promises to track the development of storm cells faster and more accurately than existing methods.

The model, dubbed StormCast, targets weather patterns larger than your typical storm but smaller than a hurricane. It was trained on 3.5 years of climate data from the US National Oceanic and Atmospheric Administration (NOAA), gathered over the American heartland, where supercells and tornadoes are common foes, especially in the hot summer months.

Compared to existing machine-learning-assisted weather sims, which typically offer a spatial resolution of 30 kilometers and a temporal resolution of six hours, Nvidia says StormCast not only sharpens the picture, down to three kilometers in this case, but can also generate new forecasts on an hourly basis.
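For a sense of what that finer grid implies, here's some back-of-the-envelope Python using the figures quoted above. It's purely illustrative arithmetic, not a reflection of how any of these models actually represent their data:

```python
# Back-of-the-envelope comparison using the figures quoted above;
# purely illustrative, not how any of these models store their grids.
coarse_km, fine_km = 30, 3   # grid spacing: typical ML weather sim vs StormCast
coarse_hr, fine_hr = 6, 1    # hours between forecast outputs

cells = (coarse_km / fine_km) ** 2   # cells per unit area on a 2D grid
steps = coarse_hr / fine_hr          # forecast outputs per unit time

print(f"{cells:.0f}x the grid cells, {steps:.0f}x the forecast steps")
# prints: 100x the grid cells, 6x the forecast steps
```

In other words, StormCast works with roughly 100 times as many grid cells per region and refreshes its forecast six times as often.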

When combined with precipitation radar, the GPU giant claims, StormCast has already proven to be 10 percent more accurate than NOAA's best 3-kilometer regional weather models while allowing lead times of up to six hours.

StormCast is one of several AI models Nvidia has recently developed. At Computex this spring, the GPU giant detailed CorrDiff, a diffusion model trained to rapidly generate high-resolution weather imagery, down to two kilometers, using data from regional weather models over Taiwan.

According to Nvidia, these images are 12.5x higher resolution (two kilometers versus the 25-kilometer grids of existing numerical models) and can be generated 1,000x faster. Taiwan is already using CorrDiff to predict the impact of typhoons in the region.

Both models are part of Nvidia's Earth-2 climate simulation platform. When the two are combined, the company says, StormCast allows CorrDiff to make new predictions based on past ones.
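Nvidia hasn't spelled out the plumbing, but that description reads like a standard autoregressive rollout, in which each forecast is fed back in as the starting point for the next. Here's a purely hypothetical Python sketch of that pattern; stormcast_step and corrdiff_downscale are toy stand-ins, not real Earth-2 APIs:

```python
# Hypothetical sketch of chained forecasting; these functions are toy
# stand-ins for illustration only, not real Earth-2 or StormCast APIs.
import numpy as np

rng = np.random.default_rng(0)

def stormcast_step(state):
    # Stand-in for one hourly StormCast prediction on the coarse grid.
    return state + rng.normal(0.0, 0.1, state.shape)

def corrdiff_downscale(state):
    # Stand-in for CorrDiff sharpening a coarse field to a finer grid
    # (a naive 2x upsample here, rather than an actual diffusion model).
    return np.kron(state, np.ones((2, 2)))

def rollout(state, hours):
    """Autoregression: each hourly output becomes the input to the next step."""
    frames = []
    for _ in range(hours):
        state = stormcast_step(state)             # advance the atmosphere an hour
        frames.append(corrdiff_downscale(state))  # render that hour at fine scale
    return frames

frames = rollout(np.zeros((4, 4)), hours=6)  # a toy six-hour forecast
```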

Nvidia isn't the only one exploring ways to augment climate and forecast models with AI. In a paper published in the journal Nature late last month, Google shared its work with the European Centre for Medium-Range Weather Forecasts (ECMWF) to not only enhance physics-based climate models with machine learning, but also port them to TPUs and GPUs, making them far less costly to run than on CPU-based compute clusters.

Dubbed NeuralGCM, the model works in part by replacing less accurate secondary models, called parameterizations, which are used to approximate smaller-scale phenomena like clouds and precipitation, with neural networks trained on ECMWF's existing weather data.
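In broad strokes, that hybrid scheme looks something like the following Python sketch, in which a conventional dynamical core and a learned parameterization each contribute a tendency to the next time step. Both components here are toy stand-ins, and none of the names reflect NeuralGCM's actual (JAX-based) code:

```python
# Toy sketch of a hybrid climate model step; every name and number here is
# an illustrative assumption, not NeuralGCM's real implementation.
import numpy as np

DT = 300.0  # toy time step, in seconds

def dynamical_core(state):
    # Stand-in for the resolved physics a GCM solves explicitly on its grid.
    return -1e-5 * (state - 280.0)  # relax a toy temperature field, in K/s

def learned_physics(state):
    # Stand-in for the neural network that replaces hand-tuned
    # parameterizations of sub-grid effects like clouds and precipitation.
    return 1e-6 * np.tanh(state - 280.0)

def hybrid_step(state):
    """One time step: resolved dynamics plus learned sub-grid tendencies."""
    return state + DT * (dynamical_core(state) + learned_physics(state))

state = np.full((8, 8), 285.0)  # toy temperature field, in kelvin
state = hybrid_step(state)
```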

Of the three models Google developed using this approach, it claimed its 1.4-degree model could simulate a year of the atmosphere in just eight minutes, versus the 20 days the same job takes on a state-of-the-art climate model like X-SHiELD: a speedup of roughly 3,600x.

While machine learning and generative AI models have the potential to improve global weather forecasts, there's still considerable work to be done. NeuralGCM, Google noted, isn't a full-blown climate model, at least not yet. Meanwhile, Nvidia's CorrDiff was designed specifically to track the weather around Taiwan. ®
