Global Warming is real, argues sceptic mathematician - it just isn't Thermageddon

IPCC hid the good news? Let's find out


'Noisy' data

Q. So away from the modelling, which is questionable as evidence, it seems that climate science really comes down to disputes about how you process the numbers statistically. Tell me where Bayes fits in.

Lewis: Bayesian maths went out of favour in a big way because of its subjectivity. It came back into favour about 30-40 years ago, and it's stayed: it's very well suited to computer simulation and Monte Carlo methods. It has a good theoretical basis. The problem is that people haven't got to grips thoroughly with the choice of Prior.

Q. A prior is a "seed" or a "nudge"?

Lewis: Yes, it's a nudge or seeding. In most cases, it gets overridden by the data and it doesn't matter what prior you use. But it's problematic in climate science - there isn't much data: we only have one climate. And the data is very noisy. Because of that, the Prior has a huge influence.
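To see how that plays out, here is a minimal sketch (my construction, not Lewis's, with invented numbers): a single positive parameter is estimated on a grid from a Gaussian likelihood, once from one noisy observation and once from a hundred, under two different priors. With plenty of data the two priors give near-identical 95th percentiles; with one noisy data point they diverge sharply.

```python
# Sketch with invented numbers: how much the prior matters when data are sparse
# and noisy. A single positive parameter is estimated on a grid from a Gaussian
# likelihood, under a flat prior and a tail-suppressing 1/x^2 prior.
import numpy as np

rng = np.random.default_rng(0)
grid = np.linspace(0.01, 20.0, 5000)   # candidate parameter values
true_value, noise_sd = 2.0, 1.5        # hypothetical truth and noise level

def posterior_p95(observations, prior):
    """95th percentile of the grid-based posterior."""
    log_like = sum(-0.5 * ((obs - grid) / noise_sd) ** 2 for obs in observations)
    post = np.exp(log_like - log_like.max()) * prior
    post /= post.sum()
    return grid[np.searchsorted(np.cumsum(post), 0.95)]

flat_prior = np.ones_like(grid)        # uniform in the parameter
decay_prior = 1.0 / grid ** 2          # an alternative prior that damps the tail

one_obs = list(true_value + noise_sd * rng.standard_normal(1))
many_obs = list(true_value + noise_sd * rng.standard_normal(100))

for label, data in [("1 observation   ", one_obs), ("100 observations", many_obs)]:
    print(label,
          "p95 flat prior:", round(posterior_p95(data, flat_prior), 2),
          "| p95 1/x^2 prior:", round(posterior_p95(data, decay_prior), 2))
```

The point is not the particular priors chosen, only that with a single noisy observation the answer is largely a statement about the prior rather than the data.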

In an ideal world they would not use either Uniform or Expert Priors. They would either use an Objective Bayesian method, or a non-Bayesian method - a more classical statistical approach such as profile likelihood. Profile likelihood and objective Bayesian methods give almost the same results, although the agreement is not 100 per cent.
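As an illustration of that near-agreement, here is a sketch under a toy model (invented numbers, not any published study): a feedback-like quantity lam is observed with Gaussian error, the quantity of interest is the ratio S = F/lam, and a 90 per cent profile-likelihood interval for S is compared with a 5-95 per cent objective Bayesian interval using the transformation (Jeffreys-type) prior F/S². In this simple one-parameter case the two intervals essentially coincide.

```python
# Toy comparison (invented numbers): profile likelihood vs objective Bayes for
# a sensitivity-like ratio S = F / lam, where lam is observed with Gaussian error.
import numpy as np
from scipy.stats import norm, chi2

F = 3.7                        # hypothetical forcing-like constant
lam_obs, sigma = 1.9, 0.5      # hypothetical feedback estimate and its uncertainty
S_grid = np.linspace(0.5, 20.0, 20000)

# Likelihood of each candidate S: it implies lam = F / S, compared with the data.
log_like = norm.logpdf(lam_obs, loc=F / S_grid, scale=sigma)

# 90 per cent profile-likelihood interval: keep S where the likelihood-ratio
# statistic stays below the chi-square(1) 90 per cent quantile.
keep = S_grid[2 * (log_like.max() - log_like) <= chi2.ppf(0.90, df=1)]
print(f"profile likelihood (90%):   {keep.min():.2f} to {keep.max():.2f}")

# Objective Bayesian: posterior = likelihood x transformation prior F / S^2
# (the prior induced by assuming the Gaussian error model on lam).
post = np.exp(log_like - log_like.max()) * (F / S_grid ** 2)
post /= post.sum()
lo, hi = np.interp([0.05, 0.95], np.cumsum(post), S_grid)
print(f"objective Bayesian (5-95%): {lo:.2f} to {hi:.2f}")
```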

The Uniform Prior hugely fattens the tail, even for a quite well defined estimate such as Gregory's.

Adding a prior creates Thermageddon

ECS estimates in the IPCC AR5 WG1 report, simplified. The bars show a 5-95 per cent certainty range. The blob is the best estimate. The mauve dashed bar illustrates the effect of using a uniform prior on the same data. Lewis (2012) is included for comparison.



Lewis: You can see how much this matters in the AR5's chart of sensitivity estimates. Those mauve lines are Gregory 2006. [Unlike almost all contemporary ECS estimates, Gregory's 2006 study didn't run too hot.] The original study basis is the short solid bar - the best estimate is 1.6, and the top is 3.5. The dashed mauve line is what happens when they put it on a Uniform Prior basis - it goes up 50 per cent. It pushes the top of the 95 per cent certainty range from 3.5˚C up to 8˚C. Some go higher, actually, to 14˚C - but they cut them all off at 10˚C and renormalise them. That's entirely the Uniform Prior.
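The mechanism, though not the specific numbers, can be reproduced with a toy calculation (invented values, not a reproduction of Gregory 2006 or the AR5 chart): if the data constrain a feedback-like quantity lam reasonably well and sensitivity is S = F/lam, a prior that is uniform in S piles weight onto large S, because equal slices of high S correspond to ever-thinner slices of lam near zero, and the 95 per cent bound inflates accordingly.

```python
# Toy illustration of the tail effect only - invented numbers. Data constrain a
# feedback-like lam; sensitivity is S = F / lam; compare the 95 per cent bound
# under a transformation prior and under a uniform-in-S prior, both truncated
# at S = 10 and renormalised, as described above.
import numpy as np
from scipy.stats import norm

F = 3.7                       # hypothetical forcing-like constant
lam_obs, sigma = 1.9, 0.5     # hypothetical feedback estimate and uncertainty
S_grid = np.linspace(0.5, 10.0, 20000)

like = np.exp(norm.logpdf(lam_obs, loc=F / S_grid, scale=sigma))

def p95(prior):
    """95th percentile of the truncated, renormalised posterior."""
    post = like * prior
    post /= post.sum()
    return float(np.interp(0.95, np.cumsum(post), S_grid))

print("95% bound, transformation prior (1/S^2):", round(p95(F / S_grid ** 2), 2))
print("95% bound, uniform-in-S prior:          ", round(p95(np.ones_like(S_grid)), 2))
```

The absolute values depend entirely on the invented inputs; what the sketch shows is the relative inflation of the upper bound once the prior is flat in S rather than in the observable quantity.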

Q. So catastrophic warming predictions are a trick of one subjective statistical input?

Lewis: Or models. The problem with expert priors here is that they dominate the results. The studies are valueless - all they show you is what Prior or seed you put in. The level of understanding of Bayes in climate science is very poor.

Q. How many use objective or subjective priors?

Lewis: The Objective Prior school is in the minority. However, the Subjective Prior Bayesian school - where you pick your own Prior - only produces subjective results, valid only for the person doing the study. If you read Lindley - he's one of the great statisticians, and he died recently - there's a lovely quote from him: "the probability you get is personal to the investigator". If you hear Rougier lecturing, he doesn't say "this is a climate table" - he says "this is my climate table".


Comment

It's left as an exercise for the reader to account for how "global warming" became "catastrophic warming" over a 20-year period. But this dynamic can't be discounted:

"I can’t overstate the HUGE amount of political interest in the project as a message that the Government can give on climate change to help them tell their story. They want the story to be a very strong one and don’t want to be made to look foolish," DEFRA official Kathryn Humphrey wrote to the head of the East Anglia climate research unit in 2009. Many such pleadings have flown between the bureaucracy and the Academy in recent years - and politicians echoed the demand to be told "what to do" recently.

Nobody wanted to demur. And here we all are. ®

Useful links

Lewis & [Marcel] Crok: "How the IPCC hid the good news on Global Warming" (Short Version and Long Versions from here)


Other stories you might like

  • Warehouse belonging to Chinese payment terminal manufacturer raided by FBI

    PAX Technology devices allegedly infected with malware

    US feds were spotted raiding a warehouse belonging to Chinese payment terminal manufacturer PAX Technology in Jacksonville, Florida, on Tuesday, with speculation abounding that the machines contained preinstalled malware.

    PAX Technology is headquartered in Shenzhen, China, and is one of the largest electronic payment providers in the world. It operates around 60 million point-of-sale (PoS) payment terminals in more than 120 countries.

    Local Jacksonville news anchor Courtney Cole tweeted photos of the scene.

    Continue reading
  • Everything you wanted to know about modern network congestion control but were perhaps too afraid to ask

    In which a little unfairness can be quite beneficial

    Systems Approach It’s hard not to be amazed by the amount of active research on congestion control over the past 30-plus years. From theory to practice, and with more than its fair share of flame wars, the question of how to manage congestion in the network is a technical challenge that resists an optimal solution while offering countless options for incremental improvement.

    This seems like a good time to take stock of where we are, and ask ourselves what might happen next.

    Congestion control is fundamentally an issue of resource allocation — trying to meet the competing demands that applications have for resources (in a network, these are primarily link bandwidth and router buffers), which ultimately reduces to deciding when to say no and to whom. The best framing of the problem I know traces back to a paper [PDF] by Frank Kelly in 1997, when he characterized congestion control as “a distributed algorithm to share network resources among competing sources, where the goal is to choose source rate so as to maximize aggregate source utility subject to capacity constraints.”

    Continue reading
  • How business makes streaming faster and cheaper with CDN and HESP support

    Ensure a high video streaming transmission rate

    Paid Post Here is everything about how the HESP integration helps CDN and the streaming platform by G-Core Labs ensure a high video streaming transmission rate for e-sports and gaming, efficient scalability for e-learning and telemedicine and high quality and minimum latencies for online streams, media and TV broadcasters.

    HESP (High Efficiency Stream Protocol) is a brand new adaptive video streaming protocol. It allows delivery of content with latencies of up to 2 seconds without compromising video quality and broadcasting stability. Unlike comparable solutions, this protocol requires less bandwidth for streaming, which allows businesses to save a lot of money on delivery of content to a large audience.

    Since HESP is based on HTTP, it is suitable for video transmission over CDNs. G-Core Labs was among the world’s first companies to have embedded this protocol in its CDN. With 120 points of presence across 5 continents and over 6,000 peer-to-peer partners, this allows a service provider to deliver videos to millions of viewers, to any devices, anywhere in the world without compromising even 8K video quality. And all this comes at a minimum streaming cost.

    Continue reading
