In many debates on whether the earth has been cooling this decade, we often hear:
This decade is the warmest on record
(Note: the reference is to the “naughties” decade, i.e. the 2000s.)
This post isn’t about whether the temperature has gone up or down; it just draws attention to a subject that you would expect climate scientists and their marketing departments to handle better.
An Economic Analogy
Analogies don’t prove anything, but they can be useful illustrations, especially for those whose heads start to spin as soon as statistics are mentioned.
Suppose that the nineties were a roaring decade of economic progress, as measured by the GDP of industrialized nations (and ignoring all the problems of what that really means). And suppose that the last half century, with a few ups and downs, had been one of strong economic progress.
Now suppose that around the start of the new millennium the industrialized nations fell into a mild recession and it dragged on for the best part of the decade. Towards the end of the decade a debate starts up amongst politicians about whether we are in recession or not.
There would be various statistics put forward, and of these the politicians out of power would favor the indicators that showed how bad things were. The politicians in power would favor the indicators that showed how good things were, or at least “the first signs of economic spring”.
Suppose in this debate some serious economists stood up and said,
But listen everyone, this decade has the highest GDP of any decade since records began.
What would we all think of these economists?
The progress that had taken the world to the start of the millennium would be the reason for the high GDP in the “naughties” decade. It doesn’t mean there isn’t a recession. In fact, it tells you almost nothing about the last few years. Why would these economists be bringing it up unless they didn’t understand “Economics 101”?
GDP and other measures of economic prosperity have a property that they share with the world’s temperature. The status at the end of this year depends in large part on the status at the end of last year.
In economics we can all see how this works. Prosperity is stored up year after year within the economic system. Even if some are spending like crazy, others are making money as a result. When hard times come we don’t suddenly reappear, in economic terms, in 1935.
In climate the same thing happens because the earth’s climate system stores energy: primarily in the oceans and cryosphere (ice), but also in the atmosphere.
Auto-Correlation for the total layman/woman who doesn’t want to hear about statistics
For those not statistically inclined, don’t worry: this isn’t a technical treatment.
When various people analyze the temperature series for the last few decades they usually try and work out some kind of trend line and also other kinds of statistical treatments like “standard deviation”.
You can find lots of these on the web. I’m probably in a small minority but I don’t see the point of most of them. More on this at Is the climate more than weather? Is weather just noise?
However, for those who do see the point and carry out these analyses to prove or disprove that the world is warming or cooling in a “statistically significant” way, there is one point the more statistically inclined will be sure to mention. Because the temperature in any year is strongly related to the temperature in the years immediately before it – or in technical language, “auto-correlated” – the maths changes and the error bars widen.
Auto-correlation in layman’s terms is what I described in the economic analogy. Next year depends in large part on what happened last year.
Why mention this?
First, a slightly longer explanation of auto-correlation – skip that section if you are not interested.
Auto-Correlation in a little more detail
If you have ever read anything about statistics you will have read about “the coin toss”.
I toss a coin – it’s 50/50 whether it comes up heads or tails. I have one here, flipping.. catching.. ok, trust me it’s heads.
Now I’m going to toss the coin again. What are the odds of heads or tails? Still 50/50. Ok, tossing.. heads again.
Now I’m going to toss the coin a 3rd time. At this point you check the coin and get it scientifically analyzed. Finally, much poorer, you hand me back the coin because it’s been independently verified as a “normal coin”. Ok so I toss the coin a 3rd time and it’s still 50/50 whether it lands heads or tails.
Many people who have never been introduced to statistics – like all the people who play roulette for real money that matters to them – have no concept of independent statistical events.
It’s a simple concept. What happened previously to the coin when I flipped it has absolutely no effect on a future toss of the coin. The coin has no memory. The law of averages doesn’t change the future. If I have tossed 10 heads in a row the next toss of this standard coin is no more likely to be tails than heads.
In statistics, the first kinds of problems covered are ones where each event or each measurement is “independent”. Like the coin toss. This makes calculating the mean (average) and standard deviation (how spread out the results are) quite simple.
Once a measurement or event is dependent in some way on the last reading (or an earlier reading) it gets much more complicated.
In technical language: Autocorrelation is the correlation of a signal with itself.
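To make the difference concrete, here is a minimal sketch in Python (my own illustration, not part of the original argument) contrasting a coin-toss-like independent series with an autocorrelated series. The 0.8 coefficient is just an illustrative choice.

```python
# A minimal sketch contrasting an independent (coin-toss-like) series with an
# autocorrelated AR(1) series. The 0.8 coefficient is an illustrative assumption.
import numpy as np

rng = np.random.default_rng(0)
n = 1000

# Independent series: each value ignores everything that came before.
independent = rng.normal(size=n)

# Autocorrelated series: each value = 0.8 * previous value + fresh noise.
ar1 = np.zeros(n)
for t in range(1, n):
    ar1[t] = 0.8 * ar1[t - 1] + rng.normal()

def lag1_correlation(x):
    """Correlation of a series with itself shifted by one step."""
    return np.corrcoef(x[:-1], x[1:])[0, 1]

print("lag-1 correlation, independent series:", round(lag1_correlation(independent), 2))  # ~0.0
print("lag-1 correlation, AR(1) series:      ", round(lag1_correlation(ar1), 2))          # ~0.8
```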
If you want to assess a series of temperature measurements and work out a trend line and statistical significance of the results you need to take account of its auto-correlation.
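And to make “widens the error bars” concrete, here is a rough sketch (again my own illustration, with made-up numbers) of one common first-order correction: estimate the lag-1 autocorrelation of the residuals and shrink the sample size to an “effective” number of independent points before working out the uncertainty on the trend.

```python
# A rough sketch of why autocorrelation widens the error bars on a trend line.
# The series, the trend and the 0.7 autocorrelation are illustrative assumptions;
# the correction shown is the common AR(1) "effective sample size" adjustment:
#     n_eff = n * (1 - r1) / (1 + r1)
import numpy as np

rng = np.random.default_rng(1)
n = 120  # e.g. 120 months of data
t = np.arange(n)

# Synthetic "temperature" series: small trend plus strongly autocorrelated noise.
noise = np.zeros(n)
for i in range(1, n):
    noise[i] = 0.7 * noise[i - 1] + rng.normal(scale=0.1)
y = 0.002 * t + noise

# Ordinary least-squares trend and its naive standard error
# (which assumes every point is independent).
slope, intercept = np.polyfit(t, y, 1)
residuals = y - (slope * t + intercept)
s2 = np.sum(residuals**2) / (n - 2)
se_naive = np.sqrt(s2 / np.sum((t - t.mean())**2))

# AR(1) correction: fewer "effectively independent" points, so a bigger error bar.
r1 = np.corrcoef(residuals[:-1], residuals[1:])[0, 1]
n_eff = n * (1 - r1) / (1 + r1)
se_adjusted = se_naive * np.sqrt((n - 2) / (n_eff - 2))

print(f"trend = {slope:.4f} per step")
print(f"naive standard error    = {se_naive:.4f}")
print(f"adjusted standard error = {se_adjusted:.4f}  (r1 = {r1:.2f}, n_eff = {n_eff:.0f})")
```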
What’s the Point?
What motivated this post was watching the behavior of some climate scientists, or at least their marketing departments. You can see them jump into many debates to point out that the error bars aren’t big enough on a particular graph, with a sad shake of their head as if to say “why aren’t people better at stats? why do we have to keep explaining the basics? you have to use an ARMA(1,1) process..”
But the same people, in debates about current cooling or warming, keep repeating
This decade IS the warmest decade on record
as if they hadn’t heard the first thing about auto-correlation.
Statistically minded climate scientists, like our mythical economists earlier, should be the last people to make that statement. And they should be the first to be coughing slightly and putting up a hand when others make that claim in the context of whether the current decade is warming or cooling.
Conclusion
Figuring out whether the current decade is cooling or warming isn’t as easy as it might seem and isn’t the subject of this post.
But next time someone tells you “This decade IS the warmest decade on record” – which means in the last 150 years, or a drop in the geological ocean – remember that it is true, but doesn’t actually answer the question of whether the last 10 years have seen warming or cooling.
And if they are someone who appears to know statistics, you have to wonder. Are they trying to fool you?
After all, if they know what auto-correlation is there’s no excuse.
Thanks again for an excellent post. The economics analogy is very helpful I think. The more analogies we can find the better, as it helps to educate the public in these basic concepts.
I think that phrases such as “warmest decade on record” are an important part of the public’s perception of the AGW hypothesis.
In the spirit of this blog, I don’t wish to dwell on the psychology, but I do think that these misleading messages that the media keep pumping out are a major player in the AGW alarmism movement.
For global temperature data, one can reduce autocorrelation by averaging monthly data into yearly data. Does it not follow that averaging over a whole decade reduces autocorrelation more significantly?
I thought that these announcements were made because global climate is a (multi-)decadal phenomenon, and decadal comparisons were a better, if imperfect, indicator of global climate change.
Not so much. An annual average removes most of the seasonal variation, which can also be done by a seasonal adjustment. However, simply averaging over a longer period like five or ten years doesn’t help all that much, IIRC, especially when there’s an underlying long term trend. If weather is chaotic and climate is actually a construct, all bets are off.
Barry wrote: “For global temperature data, one can reduce autocorrelation by averaging monthly data into yearly data. Does it not follow that averaging over a whole decade reduces autocorrelation more significantly?”
Since most noise in GMST (global mean surface temperature) is due to ENSO – fluctuations that last 6-12 months and are usually split over two calendar years – the noise in annual temperature data appears random, not highly correlated like the noise in monthly data. Decadal averaging does help minimize the impact of the RANDOM noise that remains in annual averages.
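A small numerical sketch (illustrative numbers only, using an AR(1) process as a crude stand-in for ENSO-like monthly fluctuations) of how averaging reduces the autocorrelation:

```python
# Sketch: averaging monthly values into annual and decadal means reduces the
# lag-1 autocorrelation of the noise. The 0.85 coefficient is an illustrative
# assumption, not a property of any real temperature series.
import numpy as np

rng = np.random.default_rng(2)
n_months = 1200  # 100 years of monthly values

monthly = np.zeros(n_months)
for i in range(1, n_months):
    monthly[i] = 0.85 * monthly[i - 1] + rng.normal()

annual = monthly.reshape(-1, 12).mean(axis=1)   # 100 annual means
decadal = annual.reshape(-1, 10).mean(axis=1)   # 10 decadal means

def lag1(x):
    return np.corrcoef(x[:-1], x[1:])[0, 1]

print("lag-1 autocorrelation, monthly:", round(lag1(monthly), 2))  # high, ~0.85
print("lag-1 autocorrelation, annual: ", round(lag1(annual), 2))   # much lower
print("lag-1 autocorrelation, decadal:", round(lag1(decadal), 2))  # only 10 points, so a noisy estimate
```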
Comparing two decadal average temperatures is simply a crude way of representing the trend over two decades. With a long term trend over the last half century of 0.19 K/decade and the typical chaotic variation in temperature we have observed for many decades, there is a very high probability each decade will be warmer than the previous one. Before mid-century – when radiative forcing increased more slowly – decadal average temperature didn’t increase regularly. Rising decadal average temperature is a convenient way for the IPCC to convey a simple message about a complicated subject.
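A back-of-envelope sketch of that point (the trend and scatter are illustrative values loosely matching the 0.19 K/decade figure above, and the year-to-year noise is treated as independent for simplicity):

```python
# Back-of-envelope sketch: with a sustained underlying trend, the second of two
# consecutive decades is almost always warmer on average than the first, even
# though the comparison says little about what happened within the last decade.
# Trend and scatter values are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(3)
trend = 0.019   # K per year, roughly 0.19 K/decade
sigma = 0.1     # K, year-to-year scatter (treated as independent for simplicity)

trials = 10_000
years = np.arange(20)
warmer = 0
for _ in range(trials):
    temps = trend * years + rng.normal(scale=sigma, size=20)
    warmer += temps[10:].mean() > temps[:10].mean()

print(f"fraction of trials where the second decade is warmer: {warmer / trials:.3f}")
```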
Drawing INFERENCES (meaning or significance) from the observed increase in mean decadal temperature is a complicated subject. One of my favorite references is a 1991 talk/paper by Lorenz: “Chaos, Spontaneous Climatic Variation, and the Detection of the [Enhanced] Greenhouse Effect”. We have the discoverer of chaos telling his peers what they need to do to correctly draw inferences about changes in average decadal temperature.
“Certainly no observations have told us that decadal-mean temperatures are nearly constant under constant external circumstances.”
This is why climate models, and not just observations, are used to detect AGW.
Click to access Chaos_spontaneous_greenhouse_1991.pdf