There were some questions on my previous post on Man Made Global Warming, so I decided to post about some of the scientific errors in the pseudoscience of Global Warming....
Image from article:
https://www.hoover.org/research/flawed-climate-models
Article by David R. Henderson addressing the scientific mistakes in the pseudoscience of Man Made Global Warming:
https://www.hoover.org/research/flawed-climate-models
As I had already posted about the problems with the science of Man Made Global Warming, here is a peer-reviewed (AKA real science) paper on the scientific errors in the current "scientific" analysis. His analysis is pretty good, missing only the fact that CO2 is feedback-corrected by plant growth, so it is not just a chemical acting in isolation. But his analysis is spot on.....
Scientists present measurement error by describing the range around their measurements. They might, for example, say that a temperature is 20˚C ±0.5˚C. The temperature is probably 20.0˚C, but it could reasonably be as high as 20.5˚C or as low as 19.5˚C.
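To make that notation concrete, here is a tiny sketch of my own (not from the article) showing what a reading of 20˚C ±0.5˚C means in practice:

```python
# My own illustration: a single measurement of 20.0 C with a +/-0.5 C error.
value = 20.0        # measured temperature, deg C
uncertainty = 0.5   # measurement error, deg C

low, high = value - uncertainty, value + uncertainty
print(f"Measured: {value} +/- {uncertainty} C (reasonable range: {low} C to {high} C)")
```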
Now consider the temperatures that are recorded by weather stations around the world.
Patrick Frank is a scientist at the Stanford Synchrotron Radiation Lightsource (SSRL), part of the SLAC National Accelerator Laboratory at Stanford University. Frank has published papers that explain how the errors in temperatures recorded by weather stations have been incorrectly handled. Temperature readings, he finds, have errors over twice as large as generally recognized. Based on this, Frank stated, in a 2011 article in Energy & Environment, “…the 1856–2004 global surface air temperature anomaly with its 95% confidence interval is 0.8˚C ± 0.98˚C.” The error bars are wider than the measured increase. It looks as if there’s an upward temperature trend, but we can’t tell definitively. We cannot reject the hypothesis that the world’s temperature has not changed at all.
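To see why error bars wider than the trend are a problem, plug Frank's numbers into a quick check (my own sketch, using only the figures quoted above):

```python
# Frank's figures as quoted above: anomaly of 0.8 C, 95% CI of +/- 0.98 C.
anomaly = 0.8   # deg C, 1856-2004 global surface air temperature anomaly
ci_95 = 0.98    # deg C, 95% confidence interval

lower, upper = anomaly - ci_95, anomaly + ci_95
print(f"95% interval: {lower:+.2f} C to {upper:+.2f} C")   # -0.18 C to +1.78 C

# The interval contains zero, so "no change at all" cannot be ruled out
# at the 95% confidence level.
print("Zero change ruled out?", not (lower <= 0.0 <= upper))   # False
```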
We need several more decades of data to even begin to estimate temperature on a global scale.
Climate models are used to assess the CO2-global warming hypothesis and to quantify the human-caused CO2 “fingerprint.”
How big is the human-caused CO2 fingerprint compared to other uncertainties in our climate model? For tracking energy flows in our model, we use watts per square meter (Wm–2). The sun’s energy that reaches the Earth’s atmosphere provides 342 Wm–2—an average of day and night, poles and equator—keeping it warm enough for us to thrive. The estimated extra energy from excess CO2—the annual anthropogenic greenhouse gas contribution—is far smaller, according to Frank, at 0.036 Wm–2, or 0.01 percent of the sun’s energy. If our estimate of the sun’s energy were off by more than 0.01 percent, that error would swamp the estimated extra energy from excess CO2. Unfortunately, the sun isn’t the only uncertainty we need to consider.
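The arithmetic behind that 0.01 percent figure is easy to check (a quick sketch using only the numbers quoted above):

```python
# Checking the ratio quoted above.
solar_input = 342.0   # W/m^2, average solar energy reaching the atmosphere
co2_forcing = 0.036   # W/m^2, Frank's estimate of the annual anthropogenic GHG increment

ratio = co2_forcing / solar_input
print(f"CO2 increment is {ratio:.4%} of the solar input")   # about 0.0105%
# Any uncertainty in the solar term larger than ~0.01% swamps the signal.
```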
The more complex a system is, the more difficult it is to predict the result, and the more careful your science must be. As the system gets larger, it becomes critical to define every variable and to remember that every variable must be measured to an accuracy at least ten times finer than the change you are trying to detect.
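Here is a small sketch of my own (hypothetical numbers) showing why that resolution matters: an instrument that only resolves 0.5˚C steps simply cannot see a 0.1˚C change.

```python
# My own illustration (hypothetical numbers): a 0.1 C change is invisible
# to an instrument that only resolves 0.5 C steps.
true_values = [20.00, 20.10]   # the real temperatures, deg C
resolution = 0.5               # instrument resolution, deg C

readings = [round(v / resolution) * resolution for v in true_values]
print(readings)   # [20.0, 20.0] -- the 0.1 C change has vanished
```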
Other complications come into play with variables like clouds:
Clouds reflect incoming radiation and also trap it as it is outgoing. A world entirely encompassed by clouds would have dramatically different atmospheric temperatures than one devoid of clouds. But modeling clouds and their effects has proven difficult.
CO2 is a poor choice of control variable:
Even the relationship between CO2 concentrations and temperature is complicated.
The glacial record shows geological periods with rising CO2 and global cooling and periods with low levels of atmospheric CO2 and global warming. Indeed, according to a 2001 article in Climate Research by astrophysicist and geoscientist Willie Soon and his colleagues, “atmospheric CO2 tends to follow rather than lead temperature and biosphere changes.”
Every one of the climate models has been tested to see if it can predict recorded temperature variations:
Before we put too much credence in any climate model, we need to assess its predictions. The following points highlight some of the difficulties of current models.
Vancouver, British Columbia, warmed by a full degree in the first 20 years of the 20th century, then cooled by two degrees over the next 40 years, and then warmed to the end of the century, ending almost where it started. None of the six climate models tested by the IPCC reproduced this pattern. Further, according to scientist Patrick Frank in a 2015 article in Energy & Environment, the projected temperature trends of the models, which all employed the same theories and historical data, were as far apart as 2.5˚C.
According to a 2002 article by climate scientists Vitaly Semenov and Lennart Bengtsson in Climate Dynamics, climate models have done a poor job of matching known global rainfall totals and patterns.
This is a simple cross-check of a computer model: feed it recorded information and check its accuracy against historic temperatures. NONE of them have done very well, and until they do, we can NOT use them to predict future temperature variations!
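A minimal sketch of that cross-check (my own illustration with made-up numbers, not any particular model): run the model over years we already have records for, then score the mismatch.

```python
import math

# Hypothetical numbers for illustration only.
recorded = [13.9, 14.0, 13.8, 14.1, 14.3]   # recorded annual mean temperatures, deg C
hindcast = [13.7, 14.2, 14.1, 14.0, 14.6]   # a model's output for the same years, deg C

# Root-mean-square error between the hindcast and the record.
rmse = math.sqrt(sum((m - r) ** 2 for m, r in zip(hindcast, recorded)) / len(recorded))
print(f"Hindcast RMSE: {rmse:.2f} C")

# A model whose hindcast error is larger than the trend it is asked to
# project has not demonstrated any predictive skill.
```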
:'(