Wednesday, 13 April 2011

Staggering Drop In Global Temperature

WOW, that's the biggest drop in global temperature EVER!

Note: Pay attention to the green line in the graph, NOT the red line. The red line will deceive you into thinking the world is warming. It's only there for purposes of illustration later. Please ignore it until then.

The current temperature anomaly for March 2011 has just come in at -0.1C. That's MINUS 0.1C, which is below the freezing point of water, so how can the Arctic be melting?
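
(Pedants will tell you that an 'anomaly' is a departure from a long-term average, not a thermometer reading, so a -0.1C anomaly says nothing about freezing. A minimal sketch of their argument, with invented baseline numbers:)

```python
# How an anomaly is computed. All values invented for illustration.
baseline_mean = 14.0   # long-term average global temperature (C), invented
march_2011 = 13.9      # hypothetical absolute March 2011 temperature (C)

anomaly = march_2011 - baseline_mean
print(round(anomaly, 1))  # -0.1: 0.1 C below average, while the air stays well above freezing
```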

The total temperature drop highlighted by the green line is almost as great as the entire warming that occurred since 1900! So much for global warming then! If this continues, we will enter an ice age next year!

You might remember being told to focus on a similar drop and minus numbers in 2008. You might even remember how important that event was as a sign of things to come. The eminent Joe Bastardi at the time described the 2007-2008 drop as "straight out of the book of climate. The pattern is so much like the 1949-1950 La Nina, which was signaling the start of the reversal of the warming of the earth’s climate in the 1930s, ‘40s and early 50s. Only someone choosing to ignore it, or not wanting to see it, would not be cognizant of it."

I couldn't find the book of climate at the library to confirm his claim, but it certainly rings true with what I want to believe. Besides, when heavyweights like Piers Corbyn and David Archibald are predicting the world will cool in coming decades, who can argue?

Anyway, needless to say, Bastardi was right. Temperature just kept falling after 2008 and now it's falling all over again! There was a briefly mild kind of slight flattish temperature bump in 2010, but that was just noise caused by an El Nino and entirely predictable. The current temperature drop, however, coincides with a La Nina, and the last thing climate scientists expected to happen during a La Nina was for global temperature to fall! The cause of the drop is probably that the oceans have turned upside down due to something we experts call the PDO oscillation, which basically means an ice age is coming.

The Side Bar: Trends and Noise

The side bar is a feature I use to stroke my ego, I mean, teach my readers the "dos" and "don'ts" of science. Previous side bars have explained precisely why exponentials should not be used and why data should never be plotted.

In this side bar I want to explain why the use of noise is preferable to long-term trends for predicting the future.

The use of trends and noise is one of the main sources of statistical disagreement between deniers and alarmists. We of course consult professional statisticians, whereas climate scientists don't. In their typical deceptive style, warmists insist you should focus on irrelevant long-term trends in data rather than short-term 'noise'. For example, in the graph above a long-term trend is depicted by the red arrow, whereas they would insist the green line was just noise. Deniers like me point out that noise is more important than trends; after all, the green line is steeper than the red line and is also going in the right direction.
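
If you doubt any of this, check it yourself. Here's a quick sketch (with invented numbers standing in for the graph's data, not the real temperature record) that builds a slow warming trend plus year-to-year noise, then fits a slope to the whole record versus just the last three years:

```python
import numpy as np

rng = np.random.default_rng(42)
years = np.arange(1900, 2012)
# Invented series: a slow warming trend plus noise, chosen only to
# mimic the shape of the graph above.
temps = 0.007 * (years - 1900) + rng.normal(0.0, 0.1, years.size)

long_slope = np.polyfit(years, temps, 1)[0]             # the red line: full record
short_slope = np.polyfit(years[-3:], temps[-3:], 1)[0]  # the green line: last 3 years

print(f"long-term trend: {long_slope:+.4f} C/yr")
print(f"3-year 'trend':  {short_slope:+.4f} C/yr")  # usually steeper, often the wrong sign
```

Run it with a few different seeds and the three-year slope will happily point in either direction, which is exactly why I prefer it.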

But perhaps the fairest way to arbitrate a scientific disagreement, and build some bridges towards reconciliation at the same time, is to check what God thinks. In this case God thinks the warmists are wrong, for if God didn't want our attention to be drawn to short-term noises, why did he build ears on the sides of our heads?

Religion-based facts aside, I can prove noise is the most important part of the data. If we ignored the noise in the temperature record and just focused on the long-term trend we would, of all preposterous things, be forced to conclude the world was warming.

Sunday, 3 April 2011

The IPCC Forecast Is Simply Wrong

If you have paid attention to the recent congressional hearing on Climategate you will no doubt have heard that forecasting guru Dr. J. Scott Armstrong has proven the IPCC models are outperformed by a simple model.

Armstrong argues:
"Those involved in the global warming alarm have violated the “simple methods” principle."
He recommends that:
"To help ensure objectivity, government funding should not be provided for climate-change forecasting. As we have noted, simple methods are appropriate for forecasting for climate change. Large budgets are therefore not necessary."
If you doubt Dr Armstrong is a forecasting guru check the testimony:
Dr Armstrong ... is the author of Long-range Forecasting, the creator of forecastingprinciples.com, and editor of Principles of Forecasting (Kluwer 2001), an evidence-based summary of knowledge on forecasting methods. He is a founder of the Journal of Forecasting, the International Journal of Forecasting, and the International Symposium on Forecasting. He has spent 50 years doing research and consulting on forecasting.
So yes, he's very much involved in forecasting.
From the same testimony: "We conducted a validation test of the IPCC forecasts based on the assumption that there would be no interventions. This test found that the errors for IPCC model long-term forecasts (91 to 100 years in the future) were 12.6 times larger than those from an evidence-based “no change” model. Based on our analyses, we concluded that the global warming alarm is an anti-scientific political movement."
This is music to my ears, and to the ears of other deniers Internet-wide. At last we have a scientific-sounding justification for our claims that the experts know less than simple folk. We can figure it out ourselves. Oh, they might have fancy equations and computers, but what really counts is wild-ass guesses from those willing to think out of their armchairs.

The conclusion I like to draw is that simple models always work better than more complex models. Sounds right to me. And of course Armstrong is right; he was, after all, the first man on the moon.

Glowing recommendations abound. No one quite understands what Armstrong did, but we share absolute conviction that he's justified our basic dogma:
"I have not heard any testimony but am under the impression Scott Armstrong knows a great deal about complex modeling and has rejected it as failed (at least long term modeling)" - blog comment
An Analysis of Armstrong's Validation Test of the IPCC Forecasting Model

But unlike other denier blogs, let's go a bit further and actually try to understand what Armstrong did to demonstrate a simple model beats the IPCC models at making long-term forecasts. This is a technical blog, after all.

The validation test Armstrong performed is detailed in his 2009 paper, Validity of climate change forecasting for public policy decision making, co-authored with Willie Soon and published in the International Journal of Forecasting (wait, where have I heard of that before?).

What Armstrong did was to use discredited global temperature data published by the university at the center of Climategate. But in this case we can trust the data, because it leads to a conclusion we want to believe.

HadCRUT3, the temperature data used to test the IPCC model and simple benchmark model forecasts.

Armstrong made a simple benchmark model that forecasts temperature. It is very simple: it just predicts that future temperature will be identical to today's. So his simple benchmark model's 100-year forecast starting from 1851 predicts that the 1951 temperature anomaly will be the same as the 1851 temperature anomaly.
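
For readers struggling to keep up with such sophistication, here is a minimal sketch of the benchmark (the -0.3C starting value is the 1851 HadCRUT3 anomaly discussed below):

```python
def no_change_forecast(origin_anomaly, horizon_years):
    """Armstrong's 'no change' benchmark: every forecast year,
    however far ahead, equals the anomaly at the forecast origin."""
    return [origin_anomaly] * horizon_years

# The 100-year forecast issued from 1851:
forecast = no_change_forecast(-0.3, 100)
print(forecast[-1])  # -0.3, the predicted 1951 anomaly
```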

Because forecasting single annual anomalies is exactly the kind of thing the IPCC does.

Armstrong first tested his benchmark model against IPCC forecasts made in 1992. Unfortunately this way he could only test a 17-year forecast made by the IPCC, and he noted that policymakers were more interested in long-term forecasts (e.g. more like 100 years ahead, not 17):
"Policymakers are concerned with long-term climate forecasting, and the ex ante analysis we have described was limited to a small sample of short-horizon projections. To address this limitation, we calculated rolling projections from 1851 to illustrate a proper validation procedure."

What he really wanted was to test something like a 100-year IPCC forecast made in 1851 against the forecast made by his benchmark model. But just how could he obtain 100-year IPCC forecasts made in 1851, when the IPCC didn't even exist in 1851? Armstrong found a simple solution:

"Dangerous manmade global warming became an issue of public concern after NASA scientist James Hansen testified on the subject to the US Congress on June 23, 1988 (McKibben, 2007), after a 13-year period from 1975 over which global temperature estimates were up more than they were down. The IPCC (2007) authors explained, however, that “Global atmospheric concentrations of carbon dioxide, methane and nitrous oxide have increased markedly as a result of human activities since 1750” (p. 2). There have even been claims that human activity has been causing global warming for at least 5000 years (Bergquist, 2008).

"It is not unreasonable, then, to suppose, for the purposes of our validation illustration, that scientists in 1850 had noticed that the increasing industrialization of the world was resulting in an exponential growth in “greenhouse gases”, and projected that this would lead to global warming of 0.03 C per year."

Yes, that's right: the IPCC didn't exist in 1851, but we can always imagine what they would have said if they had existed in 1851. After all, it isn't like the 0.03C per year warming rate is based on a complicated model. The IPCC models are simple, right? 0.03C/year, wherever that comes from, is clearly based on nothing more than the notion that temperature will go up. 0.3C per decade is just a kind of universal warming rate that any IPCC scientist will eventually fixate on, even if that IPCC scientist exists in 1851.

The alternative to making it up would have been to take GCM hindcasts and compare them to HadCRUT3. But that's quite involved. The idea here is to take the simpler route: it's simpler just to make shit up. That's one of the principles of forecasting, in fact: make shit up.

So now let's compare the simple benchmark forecast with the IPCC forecast. At 0.03C warming per year, the 1851 IPCC would have predicted the HadCRUT3 1951 temperature anomaly to be +2.7C, compared to the actual anomaly of -0.17C. Armstrong's simple benchmark model performs much better, predicting a 1951 temperature anomaly of -0.3C.


The mean absolute error in this case for Armstrong's model is 0.13C. For the IPCC model it's a massive 2.87C. The 1851 IPCC loses.
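
For full technical transparency, here is the arithmetic behind those error numbers, reconstructed from the figures quoted above:

```python
actual_1951 = -0.17   # HadCRUT3 1951 anomaly, as quoted above (C)
benchmark = -0.30     # no-change forecast carried forward from the 1851 anomaly
ipcc_1851 = -0.30 + 0.03 * 100  # imaginary 1851 IPCC: +0.03 C/yr for 100 years = +2.70

print(f"benchmark error: {abs(benchmark - actual_1951):.2f} C")  # 0.13 C
print(f"1851 IPCC error: {abs(ipcc_1851 - actual_1951):.2f} C")  # 2.87 C
```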

So when you next hear that simple models perform better at forecasting than complex IPCC climate models, now you know the technical details behind that fact. Thank God someone with the competence of Armstrong was brought in to testify before Congress on such an important issue.