Why Climate Alarmist Reports Should Be Ignored Where They Use Bad Methodology and Data

http://ift.tt/2fwc8u5

Like other liberal news outlets, the New York Times has been busy printing unapproved internal Trump Administration material this year. On August 8, 2017, it printed a Draft Report prepared as part of a new National Climate Assessment. The report was prepared primarily during the Obama Administration by a Federal inter-agency group and still resides on an outside server from an earlier public comment period. Its authors concluded, among other things, that “Many lines of evidence demonstrate that human activities, especially emissions of greenhouse (heat-trapping) gases, are primarily responsible for recent observed climate change.”

The problem is not that the viewpoints expressed are new, or that the draft was not already publicly available; rather, they represent a tired repetition of the usual climate alarmist ideology with only occasional updates. This is unfortunate, since it is becoming ever clearer that the ideology has become scientifically indefensible and needs to be abandoned in favor of a new approach to climate science.

Perhaps the Most Basic Problem

Perhaps the most basic problem with this Draft Report, like most of the major Climate Industrial Complex (CIC) reports, is that it depends for its justification primarily on the IPCC’s bottom-up global climate models (as discussed in Section 4.3 of the Draft Report). The Draft Report shows that the climate alarmists have by no means given up their horrifically expensive and misguided crusade to reduce carbon dioxide (CO2) emissions, even though their very extensive attempt to justify it is hopeless.

Their conclusion is not only that global warming is primarily due to human activity, but also that temperatures will increase significantly because of increases in anthropogenic atmospheric CO2. Their basic methodology rests on the UN Intergovernmental Panel on Climate Change’s (IPCC’s) analyses conducted over many years. The Heartland Institute has gone to great effort to point out many of the problems and inconsistencies in the conclusions reached using these models. But it is increasingly clear why the IPCC has been having a hard time explaining the growing divergence between their models and actual temperatures. One of the basic problems is that alarmists have always used a bottom-up approach in their methodology (aggregating the results for individual geographic areas based on the application of subjective physical relationships between various physical effects). This approach cannot produce valid results no matter how much is spent on it, how often it is repeated, or how large the climate models become. As Mike Jonas has recently written:

    In this very uncertain world of climate, one thing is just about certain: No bottom-up computer model will ever be able to predict climate. We learned above [in the article this was excerpted from] that there isn’t enough computer power now even to model GCRs [galactic cosmic rays], let alone all the other climate factors. But the issue of computer model ability goes way beyond that. In a complex non-linear system like climate, there are squillions of situations where the outcome is indeterminate. That’s because the same influence can give very different results in slightly different conditions. Because we can never predict the conditions accurately enough – in fact we can’t even know what all the conditions are right now – our bottom-up climate models can never ever predict the future. And the climate models that provide guidance to governments are all bottom-up.

The bottom-up GCM was a bad approach from the start and should never have been paid for by the taxpayers. All that we have are computer models that were designed and then tuned to lead to the IPCC’s desired answers and have had a difficult time even doing that.

So not only are the results claiming that global temperatures are largely determined by atmospheric CO2 wrong, but the basic methodology is useless. Climate is a coupled, non-linear chaotic system, and the IPCC agrees that this is the case. It cannot be usefully modeled by using necessarily limited models which assume the opposite.

An Entirely New Approach Is Needed

Despite repeated claims by climate alarmists that climate science is settled, nothing could be further from the case. In fact, an entirely new approach is needed if much progress is to be made in characterizing and understanding the climate system. This approach must be top-down rather than bottom-up. To my knowledge, only one such study (and earlier versions thereof) takes this approach; I will call it the 2017 WCD report, after the authors’ last names. It appears to give plausible results. It says that CO2 does not have a significant effect on global temperatures and that global temperatures since about 1960 can be fully explained by entirely natural factors, without requiring any human activity to explain what has occurred. This rules out many if not most of the Draft Report’s conclusions.

A second very recent report including two of the same authors as WCD 2017 concludes that the keepers of the official global surface temperature records have repeatedly “adjusted” them to the point that they are no longer representative of the underlying data. Accordingly, the authors argue that the data used in the Draft Report from surface temperature sources and the conclusions reached from using this data are too unreliable for policy use.

The Time Has Come to Abandon the IPCC’s Bottom-up Approach and Correct the Basic Data Used Before Further Expenditures Are Made

It is time to totally abandon the IPCC’s bottom-up climate models as an ultra-expensive sunk cost and start over. The 2017 WCD report would be a good place to start in redoing the basic climate analyses. Until this is done, little progress is possible on many of the major issues in climate science, and no further expenditures should be made responding to climate alarmism until the new methodology has been thoroughly tested and the basic surface temperature data has been reconstituted in a useful form. The mistaken choice of methodology has cost taxpayers tens of billions of dollars in research and has reportedly resulted in about $1.5 trillion per year in renewable-energy and related construction, which needs to be written off as well.

I recommend that the Trump Administration issue the Draft Report with an added section explaining how useless and biased the rest of the Draft Report is, because it relies primarily on meaningless model results and unreliable surface temperature data. If such a combined report were issued, it would be one of the first government reports anywhere to seriously question the IPCC’s results, which has long been needed. Scientific hypotheses and data that have never been rigorously tested are not fit for public policy purposes, particularly those involving multi-trillion-dollar expenditures per year.

Via The Freedom Pub http://ift.tt/2fvspzw
