Japan's boffins: Global warming isn't man-made

Climate science is 'ancient astrology', claims report


Predicting the Future with Numerical Simulation

Kanya Kusano, Japan Agency for Marine-Earth Science & Technology (JAMSTEC)

Forecast models used in numerical simulation are generally classified as either theoretical or empirical. The former carry out predictive calculations from universal laws; the latter construct models judged to be realistic from observational data of the phenomenon. The two approaches cannot be strictly separated: empirical methods generally become more theoretical over time, until they finally harden into generally accepted dogma.
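
As a rough illustration of this distinction (ours, not the report's), the Python sketch below contrasts a "theoretical" prediction, obtained from a known physical law, with an "empirical" one, obtained by fitting a curve to observed data. The cooling scenario, the function names and every parameter value are invented purely for illustration.

    import numpy as np

    # Hypothetical scenario: predicting how a hot drink cools.
    # Theoretical model: Newton's law of cooling, dT/dt = -k * (T - T_env),
    # solved in closed form from a physically meaningful rate constant k.
    def theoretical_prediction(t, T0=90.0, T_env=20.0, k=0.05):
        return T_env + (T0 - T_env) * np.exp(-k * t)

    # "Observed" data: the true process plus measurement noise (synthetic here).
    rng = np.random.default_rng(0)
    t_obs = np.linspace(0, 30, 16)
    T_obs = theoretical_prediction(t_obs) + rng.normal(0, 0.5, t_obs.size)

    # Empirical model: a quadratic fitted to the observations, with no physics inside.
    coeffs = np.polyfit(t_obs, T_obs, deg=2)
    def empirical_prediction(t):
        return np.polyval(coeffs, t)

    # Both agree where data exist, but they can diverge once extrapolated.
    for t in (15.0, 60.0):
        print(f"t={t:5.1f} min  theory={theoretical_prediction(t):6.2f} C  "
              f"fit={empirical_prediction(t):6.2f} C")

Where observations exist the two curves agree closely; extrapolated well beyond them, the fitted quadratic has no physics to constrain it and can drift far from the law-based prediction, which is the provisional character the report attributes to empirical methods.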

Celestial mechanics originated in the astrological prediction of solar and lunar eclipses, and early calendars were empirical predictions; mechanistic theory evolved only once an era of accurate computation was reached. As a result the predictability of celestial mechanics became extremely high, and practical estimates gave way to proof. Modern global climate models, by comparison, still depend largely on empirical modeling. To work from fundamental principles they must resolve very complex physical, chemical and biological processes and phenomena, which is why many artificial optimization operations (parameterization tuning) are needed before the phenomena can be reproduced at all. Because of this, quite apart from mathematical accuracy, the model builders' choice of processes and of tuning guidelines has a large effect on the calculated results.
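
To make the point about parameterization tuning concrete (again our illustration, not the report's), here is a minimal zero-dimensional energy-balance sketch in Python. The feedback parameter lam stands in for the kind of tunable quantity a model builder must choose; the heat capacity and parameter range are illustrative assumptions, and the same forcing yields quite different projected warming depending on the tuning.

    # Toy zero-dimensional energy-balance model:
    #     C * dT/dt = F - lam * T
    # T   : global-mean temperature anomaly (K)
    # F   : radiative forcing (W/m^2); ~3.7 W/m^2 is the commonly quoted value
    #       for a doubling of CO2
    # lam : feedback parameter (W/m^2/K) -- the "tunable" quantity here
    # C   : effective heat capacity (J/m^2/K), illustrative value only
    def integrate(F=3.7, lam=1.2, C=8.0e8, years=200, dt_days=10):
        dt = dt_days * 86400.0            # time step in seconds
        steps = int(years * 365 * 86400 / dt)
        T = 0.0
        for _ in range(steps):
            T += dt * (F - lam * T) / C   # explicit Euler step
        return T

    # The same forcing under three different, equally "plausible" tunings:
    for lam in (0.8, 1.2, 1.8):
        print(f"lam = {lam:.1f} W/m^2/K  ->  warming after 200 yr: "
              f"{integrate(lam=lam):.2f} K  (equilibrium {3.7 / lam:.2f} K)")

Real parameterizations concern sub-grid processes such as cloud formation rather than a single global number, but the sensitivity of the result to the chosen value is the same effect described above.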

1. Scientific Understanding and Uncertainty

When constructing a model, processes that are poorly understood scientifically cannot be properly captured in it. Even so, attention must be paid to the potential importance of naturally occurring processes whose scientific understanding has not yet been settled.

In the IPCC's Fourth Assessment Report, a few potentially major processes were discussed, but because scientific understanding of them was judged too low, their evaluation was omitted. To understand scientifically how uncertain the estimates are in light of the potential importance of these processes, "the cause of the lack of scientific understanding and the uncertainty" must itself be assessed.

Finally, uncertainty estimates should be included. For example, the effect on clouds of variations in cosmic-ray flux driven by sunspot activity, and the NOx and ozone effects produced when energetic protons from solar flares strike the upper atmosphere [*], among others, are neither sufficiently understood nor incorporated into the models.

There are also great uncertainties in reconstructing historical TSI (total solar irradiance), and estimates of climate sensitivity related to TSI fluctuation and spectral change are inadequate.
