The title should really be:
The Real Measure of Global Warming – Part Two – How Big Should Error Bars be, and the Sad Case of the Expendable Bathythermographs
But that was slightly too long.
This post picks up from The Real Measure of Global Warming, which in turn followed Why Global Mean Surface Temperature Should be Relegated, Or Mostly Ignored.
The discussion was about ocean heat content being a better measure of global warming than air temperature. However, ocean heat down into the deep has been measured much less than air temperature, so it is subject to more uncertainty the further back in time we travel.
We had finished up with a measure of changes in OHC (ocean heat content) over 50 years from Levitus (2005):
Some of the earlier graphs were a little small but you could probably see that the error bars further back in time are substantial. Unfortunately, it’s often the case that the error bars themselves are placed with too much confidence, and so it transpired here.
In 2006, GRL (Geophysical Research Letters) published the paper How much is the ocean really warming? by Gouretski and Koltermann.
They pointed out a significant error source in XBTs (expendable bathythermographs). XBTs record temperature against depth, but the depth is estimated from an assumed fall rate, a value which was found to be inaccurate.
The largest discrepancies are found between the expendable bathythermographs (XBT) and bottle and CTD data, with XBT temperatures being positively biased by 0.2–0.4°C on average. Since the XBT data are the largest proportion of the dataset, this bias results in a significant World Ocean warming artefact when time periods before and after introduction of XBT are compared.
And conclude:
Comparison with LAB2005 [Levitus 2005] results shows that the estimates of global warming are rather sensitive to the data base and analysis method chosen, especially for the deep ocean layers with inadequate sampling. Clearly instrumental biases are an important issue and further studies to refine estimates of these biases and their impact on ocean heat content are required. Finally, our best estimate of the increase of the global ocean heat content between 1957–66 and 1987–96 is 12.8 ± 8.0 ×10^22 J with the XBT offsets corrected. However, using only the CTD and bottle data reduces this estimate to 4.3 ± 8.0 ×10^22 J.
If we refer back to Levitus, they had calculated a value over the same time period of 15×10^22 J.
Gouretski and Koltermann are saying, in layman’s terms, if I might paraphrase:
Might be around what Levitus said, might be a lot less, might even be zero.. we don’t know.
Some readers might be asking, does this heretical stuff really get published?
Well, moving back to ocean heat content, we don’t want to drown in statistical analysis because anything more than a standard deviation and I am out of my depth, so to speak.. Better just to see what the various experts have concluded as our measure of uncertainty.
Ocean Heat Content is one of the hot topics, so no surprise to see others weighing in..
Domingues et al
In 2008, Nature published Improved estimates of upper-ocean warming and multi-decadal sea-level rise by Domingues et al.
Remember that the major problem with ocean heat content was first a lack of data, and is now, as just revealed, problematic data in the major data source.. Domingues et al say in the abstract:
..using statistical techniques that allow for sparse data coverage..
My brief excursion into statistics was quickly abandoned when the first paper cited (Reduced space optimal interpolation of historical marine sea level pressure: 1854-1992, Kaplan 2000) states:
..A novel procedure of covariance adjustment brought the results of the analysis to the consistency with the a priori assumptions on the signal covariance structure..
Let's avoid the need for strong headache medication and just look at their main points, interesting asides and conclusions. Which are interesting.
The black line is their story. Note their "error bars" in the top graph: the grey shading around the black line is one standard deviation. This helps us see "a measure" of uncertainty as we go back in time. The red line is the paper we have just considered, Levitus 2005.
Domingues et al calculate the 1961–2003 increase in OHC as 16×10^22 J, with error bars of ±3×10^22 J. They calculate a number very close to Levitus (2005).
Interesting aside:
Climate models, however, do not reproduce the large decadal variability in globally averaged ocean heat content inferred from the sparse observational database.
From one of the papers they cite (Simulated and observed variability in ocean temperature and heat content, AchutaRao 2007) :
Several studies have reported that models may significantly underestimate the observed OHC variability, raising concerns about the reliability of detection and attribution findings.
And on to Levitus et al 2009
From GRL, Global ocean heat content 1955–2008 in light of recently revealed instrumentation problems
Levitus, having almost the last word with his updated paper:
The red line being the updated version, the black dotted line the old version.
Willis Back, 2006 and Forwards, 2009
In the meantime, Josh Willis, using the brand new Argo floats (see part one for the Argo floats), published a paper (GRL 2006) showing a reduction in ocean heat from 2003 to 2005 so sharp that there was no explanation for it.
And then a revised paper in 2009 in the Journal of Atmospheric and Oceanic Technology showed that the earlier result was a mistake, instrument problems again.. now it's all flat for a few years:
no significant warming or cooling is observed in upper-ocean heat content between 2003 and 2006
There are probably more papers we could investigate, including one which I planned to cover before realizing I can't find it, and this post has gone on way too long already.
Conclusion
We are looking at a very important measurement, ocean heat content. We aren’t as sure as we would like to be about the history of OHC and not much can be done about that, although novel statistical methods of covariance adjustment may have their place.
Some could say, based on one of the papers presented here, "No ocean warming for 50 years". It's a possibility, but probably a distant one. One day, when we get to the sea level "budget", more plainly called "sea level rise", we will probably find that the rise in sea level is usefully explained by ocean heat content going up.
We do have excellent measurements in place now, and have had since around 2000, although even that exciting project has been confused by instrument uncertainty, or uncertainty about the instrument uncertainty.
We have seen a great example of how error bars aren't always really error bars. They are "statistics", not real life.
And perhaps, most useful of all, we might have seen that papers which show "a lot less warming" and "unexplained cooling" still make it into print in peer-reviewed science journals like GRL. This last factor may give us more confidence than anything that we are seeing real science in progress. And it saves us from having to analyze 310,000 temperature profiles with and without covariance adjustments. Instead, we can wait for the next few papers to see what the final consensus is.
Or spend a lifetime in study of statistics.
I empathize . . . I’ve had to learn some statistics to read the literature in my field (medicine) but beyond a chi-squared test, I get vertigo. My ambitions are not great, but someday I would like to be able to duplicate that regression analysis thing all the statisticians do.
In medicine, as in climate science, it’s difficult to make any use of the raw data without statistical analysis. I avoided statistics in college — seemed dry. Mistake.
But enough about my inadequacies. Great post.
Interesting. It certainly seems as if oceans should be the focus of most efforts to determine planetary heat balance.
I remain concerned that the heretical papers are not making it into Nature and Science due to overwhelming groupthink influence in the climate cloister. The CRU emails and Phil Jones testimony that it is “common practise” to withhold data from other researchers fail to reassure on this point.
As for the third-tier popular science journals such as Scientific American and New Scientist, as the "wise guys" say, fuggiduhbowdet! Pure one-sided propaganda at those locations.
Is there any paleoclimate proxy that could help reconstruct sea temperature over different depths and timescales?
scienceofdoom,
There is a new post over at WUWT on CO2 that you might want to read and comment on.
scienceofdoom,
Good post at WUWT.
Alexandre:
There are a few proxies but nothing that would give any kind of resolution that we need, or the accuracy either.
The proxies are things like the ratio of Cadmium:Calcium in bottom-dwelling foraminifera in sediments at the bottom of the ocean. And the ratio of 12C and 13C (carbon isotopes) in plankton.
But the reconstructions are more "a temperature indicator" than a temperature chart. And the dating of the time series is not so accurate either.
These are very useful for helping to reconstruct past climate, but not for any real measure of ocean heat content.
Good post. A useful summary of the state of play. I learned a few things.
The bottom line seems to be that there is a big range in the estimates and a long way to go before that range is likely to narrow significantly. The instrumentation limitations and sample size limitations for earlier data restrict our capacity to estimate the change.
PS The other thing we will never know is whether the increase over the last 50 years or so is in any way unusual.
And another thing: I calculate that 15E+22 Joules of heat is equivalent to raising the ocean temperature (top 700m) by about 0.37K. Does this mean that there is more heating to go or that eventually the air delta T and ocean delta T would have to equalize at a level somewhere in between the two? (Assuming no further increases in CO2, that is.)
Alex Heyworth:
I get a different number – 0.14°C. It doesn't matter too much for the question, but..
Surface area of ocean = 360×10^6 km^2 = 360×10^12 m^2
Volume of ocean in top 700m = 2.52×10^17 m^3
Mass of ocean = 2.5×10^20 kg
Heat capacity of ocean = 1.1×10^24 J/K
Temperature rise = 15×10^22 / 1.1×10^24 = 0.14°C
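For anyone who wants to check the arithmetic, here is the same calculation as a short Python sketch, using only the figures above (note a later comment points out that seawater's specific heat is nearer 3990 J/kg/K than the fresh-water value used here):

```python
# Rough check of the 0.14 °C figure above, using the same inputs.

area = 360e6 * 1e6        # ocean surface area: 360x10^6 km^2, in m^2
depth = 700.0             # layer depth, m
density = 1000.0          # approximate density of water, kg/m^3
c = 4200.0                # specific heat capacity, J/(kg K) -- fresh water
Q = 15e22                 # OHC increase over ~50 years, J (Levitus 2005)

volume = area * depth     # ~2.52e17 m^3
mass = volume * density   # ~2.5e20 kg
heat_capacity = mass * c  # ~1.1e24 J/K

dT = Q / heat_capacity    # temperature rise of the layer
print(f"dT = {dT:.2f} K") # ~0.14 K
```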
Well, the question is about the equilibrium temperature.
Are you thinking about the rule of thumb no-feedback temperature rise from the increase in CO2?
Radiative forcing at TOA leading to an approx 1.2°C temperature rise at the surface since pre-industrial times?
Interesting question.. Perhaps too many other questions relating to what the pre-industrial OHC was, what feedbacks, positive or negative..
But let's suppose that we know that OHC has increased 15×10^22 J since pre-industrial times and that the equilibrium temperature will end up 1.2°C higher than pre-industrial times (ie no positive or negative feedback or "natural variation").
It’s a tough question even then.
We are thinking about a radiative balance. The surface temperature of the earth needs to increase by 1.2°C to radiate enough energy to balance the "greenhouse" increase.
But how much heat ends up at the surface and how well-mixed is the ocean?
If my calculation above was accurate (0.14°C in the top 700m), then suppose we said that all the heat ends up in the top 100m; in that case the top 100m has increased by 1°C.
Hopefully you see the problem with answering the question.
Energy is radiated out by the very surface of the ocean. Temperature at the very surface of the ocean is determined by the whole OHC as well as the distribution of that heat, which is affected by:
– global ocean currents driven by density (from temperature and salinity)
– surface mixing (from winds and waves)
– general thermal diffusivity of the ocean
Anyone know the answer?
Indeed I can see the problem, which is why I asked the question.
Re the calculation: I am with you up to the mass of the top 700m of the ocean. (I got 2.35E+20 kg due to having a different value for the ocean area). After that you seem to have skipped a step relating to the specific heat of water. (4.186 J/g/K according to the source I consulted). I don’t know what this translates to in terms of the heat capacity of the top 700m of ocean, as I didn’t tackle the calculation this way.
In your last step, you should be dividing the heat capacity by the number of joules, not the other way round.
Alex Heyworth:
On the calculation – aka the easy bit – if you divide the heat capacity by the number of joules, then that means as the number of joules goes up, the temperature rise goes down..
And as the heat capacity of a substance goes up – which means it needs more heat to raise it 1°C – the calculation your way will increase the temperature rise for a given amount of heat.
Look at the units:
Specific heat capacity of water is 4.2 kJ kg^-1 K^-1, or 4200 J kg^-1 K^-1 (which is also 4.2 J/g/K)
So that means if you take heat capacity = specific heat capacity x mass, the units are J/K.
Heat capacity / Heat – look at the units in the result:
(J/K)/J = 1/K – so the result is 1/temperature.
Whereas if you do it the old-fashioned way – Heat / Heat capacity – the units are
J/(J/K) = K, so the result is in K, or temperature.
Hope this helps.
Q = mc·dT, where Q is energy, m is mass, c is specific heat capacity, and dT is the change in temperature.
=> dT = Q/(mc)
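If in doubt, a units library makes the dimensional argument concrete. A minimal sketch, assuming the third-party pint package (just an illustration of the units argument, not part of the calculation above):

```python
# Dimensional check: Q/(m*c) comes out in kelvin; (m*c)/Q does not.
import pint

ureg = pint.UnitRegistry()

Q = 15e22 * ureg.joule                                 # heat added
m = 2.5e20 * ureg.kilogram                             # mass of top 700m
c = 4200 * ureg.joule / (ureg.kilogram * ureg.kelvin)  # specific heat

dT = Q / (m * c)
print(dT.to(ureg.kelvin))   # ~0.14 kelvin -- a temperature

wrong = (m * c) / Q
print(wrong.units)          # 1 / kelvin -- not a temperature
```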
On the hard question perhaps some people have tried to answer this question, in due course I will see if I can find anything.
If the oceans were static – no major ocean currents – I think it would be relatively easy to work out the answer. You could do a 1D analysis for a number of depths around the world and different amounts of radiation. This would just be a thermal problem.
Then you would have the profiles and the answer as to how much ocean heat before steady state was reached.
But of course, nothing is static. Freshwater runoff, ice melting and precipitation change the salinity. Wind, the earth's rotation and the shape of the continents induce different ocean movements at different depths. And temperature and salinity change the major circulations – the thermohaline circulation.
Not such a simple problem. Or not such a simple answer anyway.
PS, I get 0.15K doing it your way, with the ocean area figure I had. I seem to recall vaguely seeing this figure quoted.
“..A novel procedure of covariance adjustment brought the results of the analysis to the consistency with the a priori assumptions on the signal covariance structure..”
I stop reading when I hit the word "novel"!
Is there some statistical/mathematical reason that the trend line in the "Ocean heat change 1955–2009 – Levitus (2009)" graph starts 15 years after the start of the data?
While it’s obvious the trend would still be up, it seems like that would exaggerate the trend by quite a bit!
insurgent
Very good question. Levitus says:
So running an earlier trend line would, in their opinion, reduce confidence in the result. It seems plausible at least.
You can see the error “shading” in the earlier graph by Domingues – much larger than any trend in the 1950-1970 period.
The trend line does coincide with land surface temperature measurement (and SST measurement) increases from 1970 onwards. Between 1940 and 1970 GMST was more or less flat.
People visiting this site should be very interested in the statistics even though it is a drag. Cointegration is required for determining valid relationships in temperature time series because of the presence of a "Unit Root" in that data (high persistence). Normal confidence interval calculations and trend lines also break down. Case in point: global temperature is now well underneath some of the projected temperature lower bounds.
A long discussion about time series analysis and unit root as it relates to temperature is here:
http://ourchangingclimate.wordpress.com/2010/03/01/global-average-temperature-increase-giss-hadcru-and-ncdc-compared
You will get the point about a quarter of the way into the blog. After seeing the last post by VS there is no need to read further.
For those not statistically inclined:
A trend line through temperature data is a lot like a trend line through an evening of casino winnings. It tells you correctly what happened, but has no predictive power.
Of course there is a physical process generating the observed temperatures. However, even simple non-linear systems can display chaotic behavior. See for a classic example: http://en.wikipedia.org/wiki/Lorenz_attractor.
So the task of extracting meaningful inferences from a very noisy signal with a unit root, representing a chaotic process, is indeed very difficult. At a minimum, it requires state-of-the-art statistics. [snip – please check the etiquette]
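For the curious, here is a minimal sketch of what a unit root test looks like in practice, using Python's statsmodels on synthetic data (a toy example, not the actual temperature analysis from the linked thread):

```python
# Augmented Dickey-Fuller test: the null hypothesis is that the series
# has a unit root. Toy data only -- a random walk vs white noise.
import numpy as np
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(0)
steps = rng.normal(size=500)

random_walk = np.cumsum(steps)   # unit root by construction
white_noise = steps              # stationary

for name, series in [("random walk", random_walk),
                     ("white noise", white_noise)]:
    stat, pvalue = adfuller(series)[:2]
    print(f"{name}: ADF statistic = {stat:.2f}, p-value = {pvalue:.3f}")

# The random walk typically fails to reject the null (large p-value);
# the white noise rejects it easily (tiny p-value).
```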
Will
Reporting ocean heat content and changes in ocean heat content in units of 10^22 Joules is pretty meaningless to non-specialists like me. I tried to translate these units into temperature change using 4.18 J/g water/degK as the heat capacity of water and 335,258,000 sq km as the surface area of the ocean. Unfortunately, the existence of the continental shelf means that I can't properly calculate the total volume of sea water down to 300, 700, 1000 or 3000 m, but this probably produces no worse than a two-fold error. If the sides of the continents were vertical, there would be 10^9 cu km = 10^24 cu cm = 10^24 g of water in the ocean down to 3000 m, and 10^23 g of water down to 300 m. It takes about 4 times as many Joules as grams of water to raise the temperature 1 degK. So 10^22 J appears to be enough energy to raise the top 300 m of the ocean 1/40–1/20 of a degK, or the top 3000 m of the ocean 1/400–1/200 of a degK.
Are my calculations correct? Are scientists really drawing important conclusions from such tiny changes in temperature? Surely no one really believes that we have reliable ocean heat content data going back to 1950! The error bars on Figure 1 that represent sampling variability are bad enough and they don’t include systematic errors. If thousands of Argo buoys designed to measure ocean heat content are having a difficult time tracking changes in ocean heat content, why should anyone pay attention to this primitive data?
Frank
On the calculation:
Using your value of surface area and converting to m^2:
A= 3.4×10^14 m^2
The average depth of the ocean around the world is about 4km=4000m.
Volume of ocean = 1.36×10^18 m^3
Density of water = 1000 kg/m^3 (approx)
Mass of ocean = 1.36×10^21 kg
And specific heat capacity, c = 4200 J / kg.K
And dT= Q/mc, where dT = temp change and Q=energy
For Q = 10^22 J:
dT = 0.002°C – so you are correct,
and for the top 300m:
dT = 0.002 × 4000/300 = 0.02°C
So at 10^22 J per year over 50 years, dT = 1°C (if all the heat is in the top 300m)
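And the same conversion as a Python sketch, per 10^22 J (the unit used in the papers), if anyone wants to vary the assumptions:

```python
# Temperature change per 10^22 J, for the whole ocean and for
# the case where all the heat stays in the top 300 m.

area = 3.4e14       # ocean surface area, m^2 (Frank's 335,258,000 km^2)
mean_depth = 4000   # average ocean depth, m
density = 1000      # kg/m^3 (approx)
c = 4200            # specific heat capacity, J/(kg K)
Q = 1e22            # one "unit" of ocean heat content change, J

mass = area * mean_depth * density        # ~1.36e21 kg
dT_whole = Q / (mass * c)                 # ~0.002 K, whole ocean
dT_300m = dT_whole * mean_depth / 300     # ~0.02 K, top 300 m only

print(f"whole ocean: {dT_whole:.4f} K per 10^22 J")
print(f"top 300 m:   {dT_300m:.3f} K per 10^22 J")
```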
On your comments:
Clearly there are measurement issues, but it’s an important subject.
The error bars are big going back, but that’s the data available.
If no one tried to measure it then no one would know whether it was 1°C or 0.1°C or 5°C.
At least now a few scientists have tried to put a value on the problem and the rest of us can decide how useful it is.
Based on the information in your post and response, we are looking at an increase in ocean heat content of 10-15*10^22 J in the top 300, 700 and 3000 m of the ocean, mostly over the past 40 years. Using the upper figure, this corresponds to a 40-YEAR CHANGE of 0.38 degC in the top 300 m; 0.16 degC change in the top 700 m; and an absurdly small 0.038 degC change in the top 3000m.
The uncertainty in measurements covering the top 3000 m is ridiculously large. The gray error bars should be at least 5 times bigger for 3000 m than for 700 m, with one standard deviation – half of the 95% confidence interval – covering the full vertical scale of any graph. Can you post a graph of ocean heat content down to 3000 m WITH ERROR BARS? Why should anyone pay the slightest attention to the 3000 m data?
The problem with data for the top 300 m is that the temperature of the top 100 m varies with the season, the local weather, and the strength of the wind (which mixes the top layer). Other than sea surface temperature, we probably don’t have historical data with the temporal resolution to accurately track the change in temperature with depth in the top 100 m. Therefore even though we may have the ability to accurately measure a 0.38 degK change in the top 300 m, this change is occurring against a background of high natural variability in the top 100 m.
Uncertainty is probably the reason the majority of analyses track the heat content in the top 700 m of the ocean. Still the 40-year change is only a total of 0.16 degK. Most of this data was collected before global warming became a major scientific concern, possibly with equipment and procedures that weren’t designed to provide the precision needed for today’s needs.
Until we have 10-20 years of data from the Argo buoys, our understanding of energy flux from the surface into the deep oceans is completely inadequate. Developers of GCMs don't have the observational evidence (energy flux into the top 300 m, then the top 700 m, then the top 3000 m) to demonstrate that their models accurately reproduce "thermal diffusivity". Thermal diffusivity is a critical parameter of GCMs that has a major impact on climate sensitivity and on whether global temperature lags radiative forcing by a few years or by a few decades.
Will Kernkamp,
While the presence of a unit root would indeed imply that regular OLS regression is strictly speaking not valid (the error bounds will be underestimated, though the central estimate of the slope is probably not affected much), your comparison with ‘an evening of casino winnings’ misses the mark.
Exactly because, as you also say, the temperatures are governed by physical processes (i.e. the planetary energy balance and internal modes of variability such as ENSO).
A better analogy would be if a time series of your body weight contained a unit root. We all know that our body weight is governed by physical-biological processes (our personal 'energy balance'). And if we eat more than our body needs, we'll gain weight, irrespective of the presence of a unit root.
See also http://ourchangingclimate.wordpress.com/2010/04/01/a-rooty-solution-to-my-weight-gain-problem/
Of course, that doesn’t negate the need for appropriate statistics in analyzing time series.
Science of doom
Your calculation of the volume of the top 700m of the ocean assumes that the ocean area is constant throughout the top 700m, whereas it obviously reduces as depth increases. (I have been trying to find the volume of the top 3000m of the ocean, for which this is a bigger issue.)
Also, the specific heat of seawater is about 3,990 J/K/kg, not 4180 – you are thinking of fresh water.