Global Bullies, Climate Models and Dishonesty in Climate Science

By Dr. DAVID EVANS | THE REAL AGENDA | MAY 25, 2012

The debate about global warming has reached ridiculous proportions and is full of micro-thin half-truths and misunderstandings. I am a scientist who was on the carbon gravy train, understand the evidence, was once an alarmist, but am now a skeptic. Watching this issue unfold has been amusing but, lately, worrying. This issue is tearing society apart, making fools and liars out of our politicians.

Let’s set a few things straight.

The whole idea that carbon dioxide is the main cause of the recent global warming is based on a guess that was proved false by empirical evidence during the 1990s. But the gravy train was too big, with too many jobs, industries, trading profits, political careers, and the possibility of world government and total control riding on the outcome. So rather than admit they were wrong, the governments, and their tame climate scientists, now cheat and lie outrageously to maintain the fiction that carbon dioxide is a dangerous pollutant.

Let’s be perfectly clear. Carbon dioxide is a greenhouse gas, and other things being equal, the more carbon dioxide in the air, the warmer the planet. Every bit of carbon dioxide that we emit warms the planet. But the issue is not whether carbon dioxide warms the planet, but how much.

Most scientists, on both sides, also agree on how much a given increase in the level of carbon dioxide raises the planet’s temperature, if just the extra carbon dioxide is considered. These calculations come from laboratory experiments; the basic physics have been well known for a century.

The disagreement is about what happens next.

The planet reacts to that extra carbon dioxide, which changes everything. Most critically, the extra warmth causes more water to evaporate from the oceans. But does the water hang around and increase the height of moist air in the atmosphere, or does it simply create more clouds and rain? Back in 1980, when the carbon dioxide theory started, no one knew. The alarmists guessed that it would increase the height of moist air around the planet, which would warm the planet even further, because the moist air is also a greenhouse gas.

This is the core idea of every official climate model: for each bit of warming due to carbon dioxide, they claim it ends up causing a total of three bits of warming, the extra two bits coming from the extra moist air. The climate models amplify the carbon dioxide warming by a factor of three – so two thirds of their projected warming is due to extra moist air (and other factors), and only one third is due to extra carbon dioxide.
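For readers who want to check the arithmetic, the amplification claim above can be written out in a few lines. The factor of three is the article's figure; the one-unit direct warming is purely illustrative:

```python
# Sketch of the claimed model amplification: total warming is
# three times the direct CO2 warming, the remainder being feedbacks.
AMPLIFICATION = 3.0  # the article's claimed factor

def total_warming(direct: float) -> float:
    """Total projected warming given one 'bit' of direct CO2 warming."""
    return AMPLIFICATION * direct

direct = 1.0                    # one bit of direct CO2 warming (arbitrary units)
total = total_warming(direct)
feedback = total - direct       # warming attributed to moist air and other feedbacks

print(feedback / total)         # two thirds of the projection
print(direct / total)           # one third of the projection
```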

I’ll bet you didn’t know that. Hardly anyone in the public does, but it’s the core of the issue. All the disagreements, lies, and misunderstanding spring from this. The alarmist case is based on this guess about moisture in the atmosphere, and there is simply no evidence for the amplification that is at the core of their alarmism. Which is why the alarmists keep so quiet about it and you’ve never heard of it before. And it tells you what a poor job the media have done in covering this issue.

Weather balloons have been measuring the atmosphere since the 1960s, many thousands of them every year. The climate models all predict that as the planet warms, a hot-spot of moist air will develop over the tropics about 10 km up, as the layer of moist air expands upwards into the cool dry air above. During the warming of the late 1970s, 80s, and 90s, the weather balloons found no hot-spot. None at all. Not even a small one. This evidence proves that the climate models are fundamentally flawed and that they greatly overestimate the temperature increases due to carbon dioxide.

This evidence first became clear around the mid 1990s.

At this point official “climate science” stopped being a science. You see, in science empirical evidence always trumps theory, no matter how much you are in love with the theory. If theory and evidence disagree, real scientists scrap the theory. But official climate science ignored the crucial weather balloon evidence, and other subsequent evidence that backs it up, and instead clung to their carbon dioxide theory — that just happens to keep them in well-paying jobs with lavish research grants, and gives great political power to their government masters.

There are now several independent pieces of evidence showing that the earth responds to the warming due to extra carbon dioxide by dampening the warming. Every long-lived natural system behaves this way, counteracting any disturbance, otherwise the system would be unstable. The climate system is no exception, and now we can prove it.

But the alarmists say the exact opposite, that the climate system amplifies any warming due to extra carbon dioxide, and is potentially unstable. Surprise surprise, their predictions of planetary temperature made in 1988 to the US Congress, and again in 1990, 1995, and 2001, have all proved much higher than reality.

They keep lowering the temperature increases they expect, from 0.30°C per decade in 1990, to 0.20°C per decade in 2001, and now 0.15°C per decade – yet they have the gall to tell us “it’s worse than expected”. These people are not scientists. They over-estimate the temperature increases due to carbon dioxide, selectively deny evidence, and now they cheat and lie to conceal the truth.

One way they cheat is in the way they measure temperature.

The official thermometers are often located in the warm exhaust of air conditioning outlets, over hot tarmac at airports where they get blasts of hot air from jet engines, at wastewater plants where they get warmth from decomposing sewage, or in hot cities choked with cars and buildings. Global warming is measured in tenths of a degree, so any extra heating nudge is important. In the US, nearly 90% of official thermometers surveyed by volunteers violate official siting requirements that they not be too close to an artificial heating source. Nearly 90%! The photos of these thermometers are on the Internet; you can get to them via the corruption paper at my site, sciencespeak.com. Look at the photos, and you’ll never trust a government climate scientist again.

They place their thermometers in warm localities, and call the results “global” warming. Anyone can understand that this is cheating. They say that 2010 is the warmest recent year, but it was only the warmest at various airports, selected air conditioners, and certain car parks.

Global temperature is also measured by satellites, which measure nearly the whole planet 24/7 without bias. The satellites say the hottest recent year was 1998, and that since 2001 the global temperature has leveled off.

So it’s a question of trust.

If it really is warming up as the government climate scientists say, why do they present only the surface thermometer results and not mention the satellite results? And why do they put their thermometers near artificial heating sources? This is so obviously a scam now.

So what is really going on with the climate?

The earth has been in a warming trend since the depth of the Little Ice Age around 1680. Human emissions of carbon dioxide were negligible before 1850 and have nearly all come after WWII, so human carbon dioxide cannot possibly have caused the trend. Within the trend, the Pacific Decadal Oscillation causes alternating global warming and cooling for 25 – 30 years at a go in each direction. We have just finished a warming phase, so expect mild global cooling for the next two decades.

We are now at an extraordinary juncture.

Official climate science, which is funded and directed entirely by government, promotes a theory that is based on a guess about moist air that is now a known falsehood. Governments gleefully accept their advice, because the only way to curb emissions is to impose taxes and extend government control over all energy use. And to curb emissions on a world scale might even lead to world government — how exciting for the political class!

A carbon tax?

Even if Australia stopped emitting all carbon dioxide tomorrow, completely shut up shop and went back to the stone age, according to the official government climate models it would be cooler in 2050 by about 0.015 degrees. But their models exaggerate tenfold – in fact our sacrifices would make the planet in 2050 a mere 0.0015 degrees cooler!

Sorry, but you’ve been had.

Finally, to those of you who still believe the planet is in danger from our carbon dioxide emissions: sorry, but you’ve been had. Yes, carbon dioxide is a cause of global warming, but it’s so minor it’s not worth doing much about.

This article first appeared on JoNova

Dr David Evans consulted full-time for the Australian Greenhouse Office (now the Department of Climate Change) from 1999 to 2005, and part-time 2008 to 2010, modeling Australia’s carbon in plants, debris, mulch, soils, and forestry and agricultural products. Evans is a mathematician and engineer, with six university degrees including a PhD from Stanford University in electrical engineering. The area of human endeavor with the most experience and sophistication in dealing with feedbacks and analyzing complex systems is electrical engineering, and the most crucial and disputed aspects of understanding the climate system are the feedbacks. The evidence supporting the idea that CO2 emissions were the main cause of global warming reversed itself from 1998 to 2006, causing Evans to move from being a warmist to a skeptic.

Climate Sensitivity Reconsidered Part 1

A special report from Christopher Monckton of Brenchley for all Climate Alarmists, Consensus Theorists and Anthropogenic Global Warming Supporters

January 20, 2011

Abstract

The Intergovernmental Panel on Climate Change (IPCC, 2007) concluded that anthropogenic CO2 emissions probably caused more than half of the “global warming” of the past 50 years and would cause further rapid warming. However, global mean surface temperature TS has not risen since 1998 and may have fallen since late 2001. The present analysis suggests that the failure of the IPCC’s models to predict this and many other climatic phenomena arises from defects in its evaluation of the three factors whose product is climate sensitivity:

1) Radiative forcing ΔF;
2) The no-feedbacks climate sensitivity parameter κ; and
3) The feedback multiplier f.
Some reasons why the IPCC’s estimates may be excessive and unsafe are explained. More importantly, the conclusion is that, perhaps, there is no “climate crisis”, and that currently-fashionable efforts by governments to reduce anthropogenic CO2 emissions are pointless, may be ill-conceived, and could even be harmful.

The context

GLOBALLY-AVERAGED land and sea surface absolute temperature TS has not risen since 1998 (Hadley Center; US National Climatic Data Center; University of Alabama at Huntsville; etc.). For almost seven years, TS may even have fallen (Figure 1). There may be no new peak until 2015 (Keenlyside et al., 2008).

The models heavily relied upon by the Intergovernmental Panel on Climate Change (IPCC) had not projected this multidecadal stasis in “global warming”; nor (until trained ex post facto) the fall in TS from 1940-1975; nor 50 years’ cooling in Antarctica (Doran et al., 2002) and the Arctic (Soon, 2005); nor the absence of ocean warming since 2003 (Lyman et al., 2006; Gouretski & Koltermann, 2007); nor the onset, duration, or intensity of the Madden-Julian intraseasonal oscillation, the Quasi-Biennial Oscillation in the tropical stratosphere, El Nino/La Nina oscillations, the Atlantic Multidecadal Oscillation, or the Pacific Decadal Oscillation that has recently transited from its warming to its cooling phase (oceanic oscillations which, on their own, may account for all of the observed warmings and coolings over the past half-century: Tsonis et al., 2007); nor the magnitude nor duration of multicentury events such as the Medieval Warm Period or the Little Ice Age; nor the cessation since 2000 of the previously-observed growth in atmospheric methane concentration (IPCC, 2007); nor the active 2004 hurricane season; nor the inactive subsequent seasons; nor the UK flooding of 2007 (the Met Office had forecast a summer of prolonged droughts only six weeks previously); nor the solar Grand Maximum of the past 70 years, during which the Sun was more active, for longer, than at almost any similar period in the past 11,400 years (Hathaway, 2004; Solanki et al., 2005); nor the consequent surface “global warming” on Mars, Jupiter, Neptune’s largest moon, and even distant Pluto; nor the eerily-continuing 2006 solar minimum; nor the consequent, precipitate decline of ~0.8 °C in TS from January 2007 to May 2008 that has canceled out almost all of the observed warming of the 20th century.

Figure 1
Mean global surface temperature anomalies (°C), 2001-2008


An early projection of the trend in TS in response to “global warming” was that of Hansen (1988), amplifying Hansen (1984) on quantification of climate sensitivity. In 1988, Hansen showed Congress a graph projecting rapid increases in TS to 2020 through “global warming” (Fig. 2):

Figure 2
Global temperature projections and outturns, 1988-2020


To what extent, then, has humankind warmed the world, and how much warmer will the world become if the current rate of increase in anthropogenic CO2 emissions continues? Estimating “climate sensitivity” – the magnitude of the change in TS after doubling CO2 concentration from the pre-industrial 278 parts per million to ~550 ppm – is the central question in the scientific debate about the climate. The official answer is given in IPCC (2007):

“It is very likely that anthropogenic greenhouse gas increases caused most of the observed increase in [TS] since the mid-20th century. … The equilibrium global average warming expected if carbon dioxide concentrations were to be sustained at 550 ppm is likely to be in the range 2-4.5 °C above pre-industrial values, with a best estimate of about 3 °C.”

Here as elsewhere the IPCC assigns a 90% confidence interval to “very likely”, rather than the customary 95% (two standard deviations). There is no good statistical basis for any such quantification, for the object to which it is applied is, in the formal sense, chaotic. The climate is “a complex, nonlinear, chaotic object” that defies long-run prediction of its future states (IPCC, 2001), unless the initial state of its millions of variables is known to a precision that is in practice unattainable, as Lorenz (1963; and see Giorgi, 2005) concluded in the celebrated paper that founded chaos theory –
“Prediction of the sufficiently distant future is impossible by any method, unless the present conditions are known exactly. In view of the inevitable inaccuracy and incompleteness of weather observations, precise, very-long-range weather forecasting would seem to be nonexistent.”

The Summary for Policymakers in IPCC (2007) says –

“The CO2 radiative forcing increased by 20% in the last 10 years (1995-2005).”

Natural or anthropogenic CO2 in the atmosphere induces a “radiative forcing” ΔF, defined by IPCC (2001: ch.6.1) as a change in net (down minus up) radiant-energy flux at the tropopause in response to a perturbation. Aggregate forcing is natural (pre-1750) plus anthropogenic-era (post-1750) forcing. At 1990, aggregate forcing from CO2 concentration was ~27 W m–2 (Kiehl & Trenberth, 1997). From 1995-2005, CO2 concentration rose 5%, from 360 to 378 ppm, with a consequent increase in aggregate forcing (from Eqn. 3 below) of ~0.26 W m–2, or <1%. That is one-twentieth of the value stated by the IPCC. The absence of any definition of “radiative forcing” in the 2007 Summary led many to believe that the aggregate (as opposed to anthropogenic) effect of CO2 on TS had increased by 20% in 10 years. The IPCC – despite requests for correction – retained this confusing statement in its report.

Such solecisms throughout the IPCC’s assessment reports (including the insertion, after the scientists had completed their final draft, of a table in which four decimal points had been right-shifted so as to multiply tenfold the observed contribution of ice-sheets and glaciers to sea-level rise), combined with a heavy reliance upon computer models unskilled even in short-term projection, with initial values of key variables unmeasurable and unknown, with advancement of multiple, untestable, non-Popper-falsifiable theories, with a quantitative assignment of unduly high statistical confidence levels to non-quantitative statements that are ineluctably subject to very large uncertainties, and, above all, with the now-prolonged failure of TS to rise as predicted (Figures 1, 2), raise questions about the reliability and hence policy-relevance of the IPCC’s central projections.

Dr. Rajendra Pachauri, chairman of the UN Intergovernmental Panel on Climate Change (IPCC), has recently said that the IPCC’s evaluation of climate sensitivity must now be revisited. This paper is a respectful contribution to that re-examination.

The IPCC’s method of evaluating climate sensitivity

We begin with an outline of the IPCC’s method of evaluating climate sensitivity. For clarity we will concentrate on central estimates. The IPCC defines climate sensitivity as equilibrium temperature change ΔTλ in response to all anthropogenic-era radiative forcings and consequent “temperature feedbacks” – further changes in TS that occur because TS has already changed in response to a forcing – arising in response to the doubling of pre-industrial CO2 concentration (expected later this century). ΔTλ is, at its simplest, the product of three factors: the sum ΔF2x of all anthropogenic-era radiative forcings at CO2 doubling; the base or “no-feedbacks” climate sensitivity parameter κ; and the feedback multiplier f, such that the final or “with-feedbacks” climate sensitivity parameter λ = κ f. Thus –

ΔTλ = ΔF2x κ f = ΔF2x λ, (1)
where f = (1 – bκ)⁻¹, (2)

such that b is the sum of all climate-relevant temperature feedbacks. The definition of f in Eqn. (2) will be explained later. We now describe seriatim each of the three factors in ΔTλ: namely, ΔF2x, κ, and f.
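Eqns. (1) and (2) are straightforward to evaluate numerically. A minimal sketch in Python: ΔF2x ≈ 3.405 W m⁻² is the paper’s value (Eqn. 4 below), while the values of κ and b are illustrative assumptions only, chosen so the product lands near the IPCC’s ~3 °C best estimate; the paper goes on to examine each factor in turn.

```python
def feedback_multiplier(b: float, kappa: float) -> float:
    """Eqn. (2): f = (1 - b*kappa)^-1, where b is the sum of all
    climate-relevant temperature feedbacks (W m^-2 K^-1)."""
    return 1.0 / (1.0 - b * kappa)

def climate_sensitivity(dF2x: float, kappa: float, b: float) -> float:
    """Eqn. (1): dT_lambda = dF2x * kappa * f."""
    return dF2x * kappa * feedback_multiplier(b, kappa)

dF2x = 3.405    # W m^-2, the paper's anthropogenic-era forcing at CO2 doubling
kappa = 0.313   # K W^-1 m^2 -- illustrative assumption, not from this excerpt
b = 2.16        # W m^-2 K^-1 -- illustrative assumption, not from this excerpt

f = feedback_multiplier(b, kappa)            # roughly a threefold amplification
dT = climate_sensitivity(dF2x, kappa, b)     # lands near the IPCC's ~3 C estimate
print(f"f = {f:.2f}, dT = {dT:.2f} K")
```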

1. Radiative forcing ΔFCO2, where (C/C0) is a proportionate increase in CO2 concentration, is given by several formulae in IPCC (2001, 2007). The simplest, following Myhre (1998), is Eqn. (3) –

ΔFCO2 ≈ 5.35 ln(C/C0) ==> ΔF2xCO2 ≈ 5.35 ln 2 ≈ 3.708 W m–2. (3)

To ΔF2xCO2 is added the slightly net-negative sum of all other anthropogenic-era radiative forcings, calculated from IPCC values (Table 1), to obtain total anthropogenic-era radiative forcing ΔF2x at CO2 doubling (Eqn. 3). Note that forcings occurring in the anthropogenic era may not be anthropogenic.
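Eqn. (3) can be checked numerically. A short sketch, using the 1995-2005 CO2 concentrations (360 rising to 378 parts per million) discussed earlier:

```python
import math

def co2_forcing(C: float, C0: float) -> float:
    """Eqn. (3), after Myhre (1998): dF = 5.35 * ln(C/C0), in W m^-2."""
    return 5.35 * math.log(C / C0)

# Forcing at a doubling of CO2 concentration (C/C0 = 2):
print(co2_forcing(2.0, 1.0))      # ~3.708 W m^-2, as in Eqn. (3)

# Forcing from the 1995-2005 rise, 360 -> 378 ppm:
print(co2_forcing(378.0, 360.0))  # ~0.26 W m^-2
```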

Table 1
Evaluation of ΔF2x from the IPCC’s anthropogenic-era forcings


From the anthropogenic-era forcings summarized in Table 1, we obtain the first of the three factors –
ΔF2x ≈ 3.405 W m–2. (4)

Continue to Part 2