Mystic Met Office predicts neighbourhood Thermageddon

Modelling 'totally inadequate' last year - why trust it now?


On Thursday, the Met Office launched its new report on global warming: UK Climate Projections 2009, otherwise known as UKCP09. This is based on the output of Hadley Centre climate models that predict temperature increases of up to 6°C with wetter winters, drier summers, more heatwaves, rising sea levels, more floods and all the other catastrophes that one would expect from similar exercises in alarmism.

What makes this report different from any of its predecessors is the resolution of the predictions that the Met Office is making. They are not just presenting a general impression of what might happen globally during this century, or even how climate change could affect the UK as a whole. They are claiming that they can predict what will happen in individual regions of the country - down to a 25km square. You can enter your postcode and find out how your street will be affected by global warming in 2040 or 2080.

All this is rather unexpected. In May last year, I posted here and here about a world summit of climate modellers that took place at Reading University. On the agenda was one very important problem for them: even the most powerful supercomputers developed so far are not capable of running the kind of high-resolution models that, they claim, would allow them to reduce the uncertainty in their predictions and make the detailed regional predictions that policy makers would like to have so that they can build climate change into infrastructure planning.

Here are a couple of excerpts from the conference website:

The climate modelling community is therefore faced with a major new challenge: Is the current generation of climate models adequate to provide societies with accurate and reliable predictions of regional climate change, including the statistics of extreme events and high impact weather, which are required for global and local adaptation strategies? It is in this context that the World Climate Research Program (WCRP) and the World Weather Research Programme (WWRP) asked the WCRP Modelling Panel (WMP) and a small group of scientists to review the current state of modelling, and to suggest a strategy for seamless prediction of weather and climate from days to centuries for the benefit of and value to society.

A major conclusion of the group was that regional projections from the current generation of climate models were sufficiently uncertain to compromise this goal of providing society with reliable predictions of regional climate change.

Modellers also fretted that the GCMs, or General Circulation Models, were blunt instruments.

Current generation climate models have serious limitations in simulating regional features, for example, rainfall, mid-latitude storms, organized tropical convection, ocean mixing, and ecosystem dynamics. What is the scientific strategy to improve the fidelity of climate models?

This was summed up by Julia Slingo (at that time Professor of Meteorology at Reading University, who also chaired part of the conference) in a report by Roger Harrabin on the BBC News website:

So far modellers have failed to narrow the total bands of uncertainties since the first report of the Intergovernmental Panel on Climate Change (IPCC) in 1990.

And Julia Slingo from Reading University admitted it would not get much better until they had supercomputers 1,000 times more powerful than at present.

“We’ve reached the end of the road of being able to improve models significantly so we can provide the sort of information that policymakers and business require,” she told BBC News.

“In terms of computing power, it’s proving totally inadequate. With climate models we know how to make them much better to provide much more information at the local level… we know how to do that, but we don’t have the computing power to deliver it.”

Doom Your Way

Professor Slingo said several hundred million pounds of investment were needed.

“In terms of re-building something like the Thames Barrier, that would cost billions; it’s a small fraction of that.

“And it would allow us to tell the policymakers that they need to build the barrier in the next 30 years, or maybe that they don’t need to.”

If, since the conference, several hundred million pounds had been invested in producing a new generation of supercomputers, a thousand times more powerful than the present generation, and the Met Office had already developed and run the kind of high-resolution models which were so far beyond the scientists' grasp just a year ago, then I suspect that this might have seeped into the media and we would have heard about it. So far as I am aware, the fastest supercomputers are still a thousand times slower than what the modellers consider necessary for credible regional-scale modelling of the climate.

So I wondered whether Professor Slingo had anything to say about the Met Office’s new report.
