
In Impacts – VIII – Sea level 3 – USA I suggested this conclusion:

So the cost of sea level rise for 2100 in the US seems to be a close to zero cost problem.

Probably the provocative way I wrote the conclusion confused some people. I should have said that it was a very expensive problem. But that it wasn’t a problem that society should pay for, given that anyone moving to the coast since 2005 at the latest would have known that future sea level was considered to be a major problem. By 2100 the youngest people still living right on the sea front, who bought property there before 2005, would be at least 115 years old.

The idea is that “externalities”, as economists call them, should be paid for by the creators of the problem, not by the people who incur it. In this case, the “victims” are people who ignored the evidence and moved to the coast anyway. Are they still victims? That was my point.

Well, what about outside the US?

Some mega cities have huge problems. Here is Nicholls 2011:

Coastal areas constitute important habitats, and they contain a large and growing population, much of it located in economic centers such as London, New York, Tokyo, Shanghai, Mumbai, and Lagos. The range of coastal hazards includes climate-induced sea level rise, a long-term threat that demands broad response.

Global sea levels rose 17 cm through the twentieth century, and are likely to rise more rapidly through the twenty-first century when a rise of more than 1 m is possible.

In some locations, these changes may be exacerbated by

(1) increases in storminess due to climate change, although this scenario is less certain
(2) widespread human-induced subsidence due to ground fluid withdrawal from, and drainage of, susceptible soils, especially in deltas.

Subsidence?

Over the twentieth century, the parts of Tokyo and Osaka built on deltaic areas subsided up to 5 m and 3 m, respectively, a large part of Shanghai subsided up to 3 m, and Bangkok subsided up to 2 m.

This human-induced subsidence can be mitigated by stopping shallow, subsurface fluid withdrawals and managing water levels, but natural “background” rates of subsidence will continue, and RSLR will still exceed global trends in these areas. A combination of policies to mitigate subsidence has been instituted in the four delta cities mentioned above, combined with improved flood defenses and pumped drainage systems designed to avoid submergence and/or frequent flooding.

In contrast, Jakarta and Metro Manila are subsiding significantly, with maximum subsidence of 4 m and 1 m to date, respectively (e.g., Rodolfo and Siringan, 2006; Ward et al., 2011), but little systematic policy response is in place in either city, and future flooding problems are anticipated.

Subsidence graphic:

From Nicholls 2011

Figure 1

To put these figures in context, sea level rise from 1900-2000 was about 0.2m and according to the latest IPCC report the forecast of sea level rise by 2100 might be around an additional 0.5m (for RCP 6.0, see earlier article). In the light of the idea that global society should pay for problems to people caused by global society, perhaps the problems of Shanghai, Bangkok and other sinking cities are not global problems?
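
As a sanity check on those numbers, here is the arithmetic in a few lines of Python – a rough comparison only, using the maxima quoted from Nicholls 2011:

```python
# Back-of-envelope comparison of 20th-century subsidence in delta cities
# (maxima quoted from Nicholls 2011) against global mean sea level rise.

global_rise_1900_2000_m = 0.2   # observed global mean rise over the 20th century
projected_rise_2100_m = 0.5     # roughly the RCP 6.0 projection discussed above

max_subsidence_m = {            # maximum 20th-century subsidence, from Nicholls 2011
    "Tokyo": 5.0,
    "Osaka": 3.0,
    "Shanghai": 3.0,
    "Bangkok": 2.0,
}

for city, sub in max_subsidence_m.items():
    ratio = sub / global_rise_1900_2000_m
    print(f"{city}: subsided up to {sub} m, {ratio:.0f}x the global sea level rise")
```

So in these cities the 20th-century land movement dwarfed the 20th-century ocean movement.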

Here is Wang et al from 2012:

Shanghai is low-lying, with an elevation of 3–4 m. A quarter of the area lies below 3 m. The city’s flood-control walls are currently more than 6 m high. However, given the trend of sea level rise and land subsidence, this is inadequate. Shanghai is frequently affected by extreme tropical storm surges. The risk of flooding from overtopping is considerable..

..From 1921 to 1965, the average cumulative subsidence of the city center was 1.76 m, with a maximum of 2.63 m. From 1966 to 1985, a monitoring network was established and subsidence was mitigated through artificial recharge. Land subsidence was stabilized at an average of 0.9 mm/year. As a result of rapid urban development and large-scale construction projects between 1986 and 1997, subsidence of the downtown area increased rapidly, at an average rate of 10.2 mm/year..

..In 2100, sea level rise and land subsidence will be far greater than before. Sea level rise is estimated to be 43 cm, while land subsidence is estimated to be 3–229 cm, and neotectonic subsidence is estimated to be 14 cm. Flooding will be severe in 2100 (Fig. 8).

[Note: I rounded the figures in the last cited paragraph to whole centimetres, rather than the paper’s values quoted to 0.01 cm – for example, 43 cm instead of the paper’s 43.31 cm, etc.]
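
Adding up the Wang et al components makes the point plainly (rounded figures as quoted above):

```python
# Rough relative sea level budget for Shanghai in 2100, using the rounded
# figures quoted from Wang et al (2012). Units: cm.

global_slr = 43                # projected sea level rise
subsidence_range = (3, 229)    # land subsidence, low to high estimate
neotectonic = 14               # natural background (neotectonic) subsidence

low = global_slr + subsidence_range[0] + neotectonic
high = global_slr + subsidence_range[1] + neotectonic

print(f"Relative rise: {low}-{high} cm, of which {global_slr} cm is sea level itself")
```

In the high case, subsidence accounts for roughly 85% of the relative rise.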

So for Shanghai at least global sea level rise is not really the problem.

Given that I don’t pay much attention to media outlets I probably missed the big Marches against Ground Water Depletion Slightly Accentuating Global Warming’s Sea Level Rise in Threatened Megacities.

As with the USA data, the question of increased storm surges accentuating global sea level rise is still on the agenda (i.e., has not yet been discussed).

References

Planning for the impacts of sea level rise, RJ Nicholls, Oceanography (2011)

Evaluation of the combined risk of sea level rise, land subsidence, and storm surges on the coastal areas of Shanghai, China, Jun Wang, Wei Gao, Shiyuan Xu & Lizhong Yu, Climatic Change (2012)

 

[I started writing this some time ago and got side-tracked, initially because aerosol interaction in clouds and rainfall is quite fascinating with lots of current research and then because there are many papers on higher resolution simulations of convection that also looked interesting.. so decided to post it less than complete because it will be some time before I can put together a more complete article..]

In Part Four of this series we looked at the paper by Mauritsen et al (2012). Isaac Held has a very interesting post on his blog – and people interested in understanding climate science will benefit from reading his blog (he has been in the field writing papers for 40 years). He highlighted this paper: Cloud tuning in a coupled climate model: Impact on 20th century warming, Jean-Christophe Golaz, Larry W. Horowitz, and Hiram Levy II, GRL (2013).

Their paper has many similarities to Mauritsen et al (2012). Here are some of their comments:

Climate models incorporate a number of adjustable parameters in their cloud formulations. They arise from uncertainties in cloud processes. These parameters are tuned to achieve a desired radiation balance and to best reproduce the observed climate. A given radiation balance can be achieved by multiple combinations of parameters. We investigate the impact of cloud tuning in the CMIP5 GFDL CM3 coupled climate model by constructing two alternate configurations.

They achieve the desired radiation balance using different, but plausible, combinations of parameters. The present-day climate is nearly indistinguishable among all configurations.

However, the magnitude of the aerosol indirect effects differs by as much as 1.2 W/m², resulting in significantly different temperature evolution over the 20th century..

 

..Uncertainties that arise from interactions between aerosols and clouds have received considerable attention due to their potential to offset a portion of the warming from greenhouse gases. These interactions are usually categorized into first indirect effect (“cloud albedo effect”; Twomey [1974]) and second indirect effect (“cloud lifetime effect”; Albrecht [1989]).

Modeling studies have shown large spreads in the magnitudes of these effects [e.g., Quaas et al., 2009]. CM3 [Donner et al., 2011] is the first Geophysical Fluid Dynamics Laboratory (GFDL) coupled climate model to represent indirect effects.

As in other models, the representation in CM3 is fraught with uncertainties. In particular, adjustable cloud parameters used for the purpose of tuning the model radiation can also have a significant impact on aerosol effects [Golaz et al., 2011]. We extend this previous study by specifically investigating the impact that cloud tuning choices in CM3 have on the simulated 20th century warming.

What did they do?

They adjusted the “autoconversion threshold radius”, which controls when water droplets turn into rain.

Autoconversion converts cloud water to rain. The conversion occurs once the mean cloud droplet radius exceeds rthresh. Larger rthresh delays the formation of rain and increases cloudiness.

The default in CM3 was 8.2 μm. They selected alternate values from other GFDL models: 6.0 μm (CM3w) and 10.6 μm (CM3c). Of course, they then have to adjust other parameters to achieve radiation balance – the “erosion time” (a lateral mixing effect reducing water in clouds), which they note is poorly constrained (that is, we don’t have external knowledge of the correct value for this parameter), and the “velocity variance”, which affects how quickly water vapor condenses out onto aerosols.
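
To make the idea concrete, here is a toy threshold-style autoconversion rule in Python. This is my own illustrative sketch, not the actual CM3 parameterization – the rate constant and cloud values are invented – but it shows how a larger threshold radius suppresses rain:

```python
# Illustrative only (not GFDL's code): a threshold-style autoconversion rule.
# Cloud water converts to rain only once the mean droplet radius exceeds
# r_thresh, so a larger r_thresh delays rain and leaves clouds longer-lived.

import math

def mean_droplet_radius_um(lwc_kg_m3, n_drops_m3, rho_w=1000.0):
    """Mean volume radius from liquid water content and droplet number."""
    vol_per_drop = lwc_kg_m3 / (n_drops_m3 * rho_w)        # m^3 per droplet
    return (3 * vol_per_drop / (4 * math.pi)) ** (1/3) * 1e6  # metres -> microns

def autoconversion_rate(lwc_kg_m3, n_drops_m3, r_thresh_um, k=1e-3):
    """Zero below the threshold radius; proportional to cloud water above it.
    k is an arbitrary rate constant, purely for illustration."""
    r = mean_droplet_radius_um(lwc_kg_m3, n_drops_m3)
    return k * lwc_kg_m3 if r > r_thresh_um else 0.0

# The same cloud, tested against the three threshold values used by Golaz et al:
cloud = dict(lwc_kg_m3=2e-4, n_drops_m3=1e8)
for r_thresh in (6.0, 8.2, 10.6):
    raining = autoconversion_rate(**cloud, r_thresh_um=r_thresh) > 0
    print(f"r_thresh = {r_thresh} um -> raining: {raining}")
```

The same cloud rains in one configuration and not in another, which is exactly why the compensating parameters then have to be retuned.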

Here is the time evolution in the three models (and also observations):

 

From Golaz et al 2013


Figure 1 – Click to enlarge

In terms of present day climatology, the three variants are very close, but in terms of 20th century warming two variants are very different and only CM3w is close to observations.

Here is their conclusion, well worth studying. I reproduce it in full:

CM3w predicts the most realistic 20th century warming. However, this is achieved with a small and less desirable threshold radius of 6.0 μm for the onset of precipitation.

Conversely, CM3c uses a more desirable value of 10.6 μm but produces a very unrealistic 20th century temperature evolution. This might indicate the presence of compensating model errors. Recent advances in the use of satellite observations to evaluate warm rain processes [Suzuki et al., 2011; Wang et al., 2012] might help understand the nature of these compensating errors.

CM3 was not explicitly tuned to match the 20th century temperature record.

However, our findings indicate that uncertainties in cloud processes permit a large range of solutions for the predicted warming. We do not believe this to be a peculiarity of the CM3 model.

Indeed, numerous previous studies have documented a strong sensitivity of the radiative forcing from aerosol indirect effects to details of warm rain cloud processes [e.g., Rotstayn, 2000; Menon et al., 2002; Posselt and Lohmann, 2009; Wang et al., 2012].

Furthermore, in order to predict a realistic evolution of the 20th century, models must balance radiative forcing and climate sensitivity, resulting in a well-documented inverse correlation between forcing and sensitivity [Schwartz et al., 2007; Kiehl, 2007; Andrews et al., 2012].

This inverse correlation is consistent with an intercomparison-driven model selection process in which “climate models’ ability to simulate the 20th century temperature increase with fidelity has become something of a show-stopper as a model unable to reproduce the 20th century would probably not see publication” [Mauritsen et al., 2012].

Very interesting paper, and freely available. Kiehl’s paper, referenced in the conclusion, is also well worth reading. In his paper he shows that models with the highest sensitivity to GHGs have the highest negative value from 20th century aerosols, while the models with the lowest sensitivity to GHGs have the lowest negative value from 20th century aerosols. Therefore, both ends of the range can reproduce 20th century temperature anomalies, while suggesting very different 21st century temperature evolution.
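
Kiehl’s point is easy to demonstrate with a toy energy balance. The numbers below are illustrative, not taken from any particular model – what matters is that only the product of sensitivity and net forcing is constrained by the 20th century:

```python
# Sketch of Kiehl's (2007) inverse correlation: 20th-century warming constrains
# roughly the product of climate sensitivity and net forcing, so a
# high-sensitivity/strong-aerosol model and a low-sensitivity/weak-aerosol
# model can both fit it. All numbers are illustrative.

F_2X = 3.7  # W/m^2 forcing for doubled CO2

def warming(sensitivity_K, ghg_forcing, aerosol_forcing):
    """Quasi-equilibrium warming from net forcing (transient effects ignored)."""
    net = ghg_forcing + aerosol_forcing     # aerosol forcing is negative
    return sensitivity_K * net / F_2X

# Two "models": high sensitivity with strong aerosol cooling,
# low sensitivity with weak aerosol cooling.
high = warming(sensitivity_K=4.0, ghg_forcing=2.6, aerosol_forcing=-1.9)
low  = warming(sensitivity_K=2.0, ghg_forcing=2.6, aerosol_forcing=-1.2)

print(f"high-sensitivity model: {high:.2f} K, low-sensitivity model: {low:.2f} K")
```

Both land at the same 20th-century warming, yet once aerosol forcing stops growing their 21st-century projections diverge sharply.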

A paper on higher resolution models, Seifert et al 2015, did some model experiments with “large eddy simulations” (LES), which are much higher resolution than GCMs. The best resolution GCMs today typically have a grid size around 100 km x 100 km. Their LES model had a grid size of 25 m x 25 m, with 2048 x 2048 x 200 grid points, spanning a simulated volume of 51.2 km x 51.2 km x 5 km, and ran for a simulated 60-hour time span.
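
To see why this resolution gap exists, count the grid columns needed to cover the Earth at each spacing – back-of-envelope only:

```python
# Why GCMs can't simply run at LES resolution: count the grid columns needed
# to cover the globe at 25 m versus 100 km horizontal spacing.

EARTH_SURFACE_KM2 = 510e6

les_cell_km2 = 0.025 * 0.025    # 25 m x 25 m
gcm_cell_km2 = 100 * 100        # 100 km x 100 km

les_columns = EARTH_SURFACE_KM2 / les_cell_km2
gcm_columns = EARTH_SURFACE_KM2 / gcm_cell_km2

print(f"GCM columns: {gcm_columns:.0f}")          # ~51,000
print(f"LES columns: {les_columns:.2e}")          # ~8.2e11
print(f"ratio: {les_columns / gcm_columns:.1e}")  # ~16 million times as many
```

And that understates it: a finer grid also forces a shorter timestep, so the computational cost grows even faster than the column count.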

They had this to say about the aerosol indirect effect:

It has also been conjectured that changes in CCN might influence cloud macrostructure. Most prominently, Albrecht [1989] argued that processes which reduce the average size of cloud droplets would retard and reduce the rain formation in clouds, resulting in longer-lived clouds. Longer living clouds would increase cloud cover and reflect even more sunlight, further cooling the atmosphere and surface. This type of aerosol-cloud interaction is often called a lifetime effect. Like the Twomey effect, the idea that smaller particles will form rain less readily is based on sound physical principles.

Given this strong foundation, it is somewhat surprising that empirical evidence for aerosol impacts on cloud macrophysics is so thin.

Twenty-five years after Albrecht’s paper, the observational evidence for a lifetime effect in the marine cloud regimes for which it was postulated is limited and contradictory. Boucher et al. [2013] who assess the current level of our understanding, identify only one study, by Yuan et al. [2011], which provides observational evidence consistent with a lifetime effect. In that study a natural experiment, outgassing of SO2 by the Kilauea volcano is used to study the effect of a changing aerosol environment on cloud macrophysical processes.

But even in this case, the interpretation of the results are not without ambiguity, as precipitation affects both the outgassing aerosol precursors and their lifetime. Observational studies of ship tracks provide another inadvertent experiment within which one could hope to identify lifetime effects [Conover, 1969; Durkee et al., 2000; Hobbs et al., 2000], but in many cases the opposite response of clouds to aerosol perturbations is observed: some observations [Christensen and Stephens, 2011; Chen et al., 2012] are consistent with more efficient mixing of smaller cloud drops leading to more rapid cloud desiccation and shorter lifetimes.

Given the lack of observational evidence for a robust lifetime effect, it seems fair to question the viability of the underlying hypothesis.

In their paper they show a graphic of what their model produced, it’s not the main dish but interesting for the realism:

From Seifert et al 2015


Figure 2 – Click to expand

It is an involved paper, but here is one of the conclusions, relevant for the topic at hand:

Our simulations suggest that parameterizations of cloud-aerosol effects in large-scale models are almost certain to overstate the impact of aerosols on cloud albedo and rain rate. The process understanding on which the parameterizations are based is applicable to isolated clouds in constant environments, but necessarily neglects interactions between clouds, precipitation, and circulations that, as we have shown, tend to buffer much of the impact of aerosol change.

For people new to parameterizations, a couple of articles that might be useful:

Conclusion

Climate models necessarily have some massive oversimplifications, as we can see from the large eddy simulation which has 25 m x 25 m grid cells while GCMs have 100 km x 100 km at best. Even LES models have simplifications – to get to direct numerical simulation (DNS) of the equations for turbulent flow we would need a grid scale closer to a few millimeters than meters.

The over-simplifications in GCMs require “parameters” which are not actually intrinsic material properties, but are more an average of some part of a climate process over a large scale. (Note that even if we had the resolution for the actual fundamental physics we wouldn’t necessarily know the material parameters required, especially in the case of aerosols, which are heterogeneous in time and space.)

As the climate changes will these “parameters” remain constant, or change as the climate changes?

References

Cloud tuning in a coupled climate model: Impact on 20th century warming, Jean-Christophe Golaz, Larry W. Horowitz, and Hiram Levy II, GRL (2013) – free paper

Twentieth century climate model response and climate sensitivity, Jeffrey T. Kiehl, GRL (2007) – free paper

Large-eddy simulation of the transient and near-equilibrium behavior of precipitating shallow convection, Axel Seifert et al, Journal of Advances in Modeling Earth Systems (2015) – free paper

In Parts VI and VII we looked at past and projected sea level rise. It is clear that the sea level has risen over the last hundred years, and it’s clear that with more warming sea level will rise some more. The uncertainties (given a specific global temperature increase) are more around how much more ice will melt than how much the ocean will expand (warmer water expands). Future sea level rise will clearly affect some people in the future, but very differently in different countries and regions. This article considers the US.

A month or two ago, via a link from a blog, I found a paper which revised upwards a current calculation (or average of such calculations) of damage due to sea level rise in 2100 in the US. Unfortunately I can’t find the paper, but essentially the idea was that people would continue moving to the ocean in ever increasing numbers, and combined with a possible 1 m+ sea level rise (see Parts VI & VII) the cost in the US would be around $1TR. (I can’t remember the details, but my memory tells me this paper concluded costs were 3x previous calculations due to this ever increasing population move to coastal areas – in any case, the exact numbers aren’t important.)

Two examples that I could find (on global movement of people rather than just in the US), Nicholls 2011:

..This threatened population is growing significantly (McGranahan et al., 2007), and it will almost certainly increase in the coming decades, especially if the strong tendency for coastward migration continues..

And Anthoff et al 2010

Fifthly, building on the fourth point, FUND assumes that the pattern of coastal development persists and attracts future development. However, major disasters such as the landfall of hurricanes could trigger coastal abandonment, and hence have a profound influence on society’s future choices concerning coastal protection as the pattern of coastal occupancy might change radically.

A cycle of decline in some coastal areas is not inconceivable, especially in future worlds where capital is highly mobile and collective action is weaker. As the issue of sea-level rise is so widely known, disinvestment from coastal areas may even be triggered without disasters..

I was struck by the “trillion dollar problem” paper and the general issues highlighted in other papers. The future cost of sea level rise in the US is not just bad, it’s extremely expensive because people will keep moving to the ocean.

Why are people moving to the coast?

So here is an obvious take on the subject that doesn’t need an IAM (integrated assessment model).. Perhaps lots of people missed the IPCC TAR (third assessment report) in 2001. Perhaps anthropogenic global warming fears had not reached a lot of the population. Maybe it didn’t get a lot of media coverage. But surely no one could have missed Al Gore’s movie. I mean, I missed it by choice, but how could anyone in rich countries not know about the discussion?

So anyone since 2006 (arbitrary line in the sand) who bought a house that is susceptible to sea level rise is responsible for their own loss that they incur around 2100. That is, if the worst fears about sea level rise play out, combined with more extreme storms (subject of a future article) which create larger ocean storm surges, their house won’t be worth much in 2100.

Now, barring large increases in life expectancy, anyone who bought a house in 2005 will almost certainly be dead in 2100. There will be a few unlucky centenarians.

Think of it as an estate tax. People who have expensive ocean-front houses will pass on their now worthless house to their children or grandchildren. Some people love the idea of estate taxes – in that case you have a positive. Some people hate the idea of estate taxes – in that case strike it up as a negative. And, drawing a long bow here, I suspect a positive correlation between concern about climate change and belief in the positive nature of estate taxes, so possibly it’s a win-win for many people.

Now onto infrastructure.

From time to time I’ve had to look at depreciation and official asset life for different kinds of infrastructure and I can’t remember seeing one for 100 years. 50 years maybe for civil structures. I’m definitely not an expert. That said, even if the “official depreciation” gives something a life of 50 years, much is still being used 150 years later – buildings, railways, and so on.

So some infrastructure very close to the ocean might have to be abandoned. But it will have had 100 years of useful life and that is pretty good in public accounting terms.

Why is anyone building housing, roads, power stations, public buildings, railways and airports in the US in locations that will possibly be affected by sea level rise in 2100? Maybe no one is.

So the cost of sea level rise for 2100 in the US seems to be a close to zero cost problem.

These days, if a particular area is recognized as a flood plain people are discouraged from building on it and no public infrastructure gets built there. It’s just common sense.

Some parts of New Orleans were already below sea level when Hurricane Katrina hit. Following that disaster, lots of people moved out of New Orleans to a safer suburb. Lots of people stayed. Their problems will surely get worse with a warmer climate and a higher sea level (and also if storms get stronger – subject of a future article). But they already had a problem. Infrastructure was at or below sea level and sufficient care was not taken of their coastal defences.

A major problem that happens overnight, or over a year, is difficult to deal with. A problem 100 years from now that affects a tiny percentage of the land area of a country, even with a large percentage (relatively speaking) of population living there today, is a minor problem.

Perhaps the costs of recreating current threatened infrastructure a small distance inland are very high, and the existing infrastructure would in fact have lasted more than 100 years. In that case, people who believe Keynesian economics might find the economic stimulus to be a positive. People who don’t think Keynesian economics does anything (no multiplier effect) except increase taxes, or divert productive resources into less productive resources, will find it to be a negative. Once again, drawing a long bow, I see a correlation between people more concerned about climate change also being more likely to find Keynesian economics a positive. Perhaps again, there is a win-win.

In summary, given the huge length of time to prepare for it, US sea level rise seems like a minor planning inconvenience combined with an estate tax.

Articles in this Series

Impacts – I – Introduction

Impacts – II – GHG Emissions Projections: SRES and RCP

Impacts – III – Population in 2100

Impacts – IV – Temperature Projections and Probabilities

Impacts – V – Climate change is already causing worsening storms, floods and droughts

Impacts – VI – Sea Level Rise 1

Impacts – VII – Sea Level 2 – Uncertainty

References

Planning for the impacts of sea level rise, RJ Nicholls, Oceanography (2011)

The economic impact of substantial sea-level rise, David Anthoff et al, Mitig Adapt Strateg Glob Change (2010)

In Part VI we looked at past and projected sea level rise. There is significant uncertainty in future sea level rise, even assuming we know the future global temperature change. The uncertainty results from “how much ice will melt?”

We can be reasonably sure of sea level rise from thermal expansion (so long as we know the temperature). By contrast, we don’t have much confidence in the contribution from melting ice (on land). This is because ice sheet dynamics (glaciers, Greenland & Antarctic ice sheet) are non-linear and not well understood.

Here’s something surprising. Suppose you live in Virginia near the ocean. And suppose all of the Greenland ice sheet melted in a few years (not possible, but just suppose). How much would sea level change in Virginia? Hint: the entire Greenland ice sheet converted into global mean sea level is about 7m.

Zero change in Virginia.

Here are charts of relative sea level change across the globe for Greenland & West Antarctica, based on a 1mm/yr contribution from each location – click to expand:

From Tamisiea 2011


Figure 1 – Click to Expand

We see that the sea level actually drops close to Greenland, stays constant around mid-northern latitudes in the Atlantic and rises in other locations. The reason is simple – the Greenland ice sheet is a local gravitational attractor and is “pulling the ocean up” towards Greenland. Once it is removed, the local sea level drops.

Uncertainty

If we knew for sure that the global mean temperature in 2100 would be +2ºC or +3ºC compared to today we would have a good idea in each case of the sea level rise from thermal expansion. But not much certainty on any rise from melting ice sheets.

Let’s consider someone thinking about the US for planning purposes. If the Greenland ice sheet contributes lots of melting ice, the sea level on the US Atlantic coast won’t be affected at all and the increase on the Pacific coast will be significantly less than the overall sea level rise. In this case, the big uncertainty in the magnitude of sea level rise is not much of a factor for most of the US.

If the West Antarctic ice sheet contributes lots of melting ice, the sea level on the east and west coasts of the US will be affected by more than the global mean sea level rise.

For example, imagine the sea level was expected to rise 0.3m from thermal expansion by 2100. But there is a fear that ice melting will cause 0 – 0.5m global rise. A US policymaker really needs to know which ice sheet will melt. The “we expect at most an additional 0.5m from melting ice” tells her that she might have – in total – a maximum sea level rise of 0.3m on the east coast and a little more than 0.3m on the west coast if Greenland melts; but she instead might have – in total – a maximum of almost 1m on each coast if West Antarctica melts.
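
To make her arithmetic concrete, here it is as a few lines of Python. The “fingerprint” factors are my own illustrative assumptions (zero east-coast effect from Greenland melt, a reduced west-coast effect, an amplified effect from West Antarctica), chosen only to match the cases described above:

```python
# Reproducing the policymaker arithmetic in the text. Fingerprint factors are
# illustrative: melt from a distant ice sheet raises local sea level by more
# than the global mean, melt from a nearby one by less (or not at all).

thermal_m = 0.3       # expected thermosteric rise by 2100 (scenario in text)
ice_melt_max_m = 0.5  # feared maximum global-mean contribution from ice melt

fingerprint = {       # local rise per metre of global-mean melt (assumed values)
    "Greenland melts": {"US east coast": 0.0, "US west coast": 0.2},
    "WAIS melts":      {"US east coast": 1.25, "US west coast": 1.25},
}

for source, coasts in fingerprint.items():
    for coast, factor in coasts.items():
        total = thermal_m + ice_melt_max_m * factor
        print(f"{source}, {coast}: up to {total:.2f} m")
```

Same global-mean melt, very different local outcomes – which is the point.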

The source of the melting ice just magnifies the uncertainty for policy and economics.

If this 10th century legend was still with us maybe it would be different (we only have his tweets):

Donaeld the Unready

Articles in this Series

Impacts – I – Introduction

Impacts – II – GHG Emissions Projections: SRES and RCP

Impacts – III – Population in 2100

Impacts – IV – Temperature Projections and Probabilities

Impacts – V – Climate change is already causing worsening storms, floods and droughts

Impacts – VI – Sea Level Rise 1

References

The moving boundaries of sea level change: Understanding the origins of geographic variability, ME Tamisiea & JX Mitrovica, Oceanography (2011)

In Part V we looked at the IPCC – an outlier organization – which claimed floods, droughts and storms had not changed in a measurable way globally in the last 50–100 years (of course, some regions have seen increases and some decreases; some decades have been bad, some good).

This puts them at a disadvantage compared with the overwhelming mass of NGOs, environmental groups, media outlets and various government departments who claim the opposite, but the contrarian in me found their research too interesting to ignore. Plus, they come with references to papers in respectable journals.

We haven’t looked at future projections of these events as yet. Whenever there are competing effects to create a result we can expect it to be difficult to calculate future effects. In contrast, one climate effect that we can be sure about is sea level. If the world warms, as it surely will with more GHGs, we can expect sea level to rise.

In my own mental list of “bad stuff to happen”, I had sea level rise as an obvious #1 or #2. But ideas and opinions need to be challenged and I had not really investigated the impacts.

The world is a big place and rising sea level will have different impacts in different places. Generally the media presentation on sea level is unrelentingly negative, probably following the impact of the impressive 2004 documentary directed by Roland Emmerich, and the dramatized adaptation by Al Gore in 2006 (directed by Davis Guggenheim).

Let’s start by looking at some sea level basics.

Like everything else related to climate, getting an accurate global dataset on sea level is difficult – especially when we want consistency over decades.

The world is a big place and past climatological measurements were mostly focused on collecting local weather data for the country or region in question. Satellites started measuring climate globally in the late 1970s, but satellites for sea level and mass balance didn’t begin measurements until 10-20 years ago. So, climate scientists attempt to piece together disparate data systems, to reconcile them, and to match up the results with what climate models calculate – call it “a sea level budget”.

“The budget” means balancing two sides of the equation:

  • how has sea level changed year by year and decade by decade?
  • what contributions to sea level do we calculate from the effects of warming climate?
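
As a toy version of balancing the two sides, with round, roughly satellite-era numbers for the components (all approximate, purely for illustration):

```python
# Closing "the sea level budget": the observed rise should equal the sum of
# the calculated contributions. Illustrative round numbers in mm/yr; the
# residual measures how well the budget closes.

observed_rate = 3.2          # altimetry-era global mean rise, mm/yr (approx.)

contributions = {            # component estimates, mm/yr (approx.)
    "thermal expansion": 1.1,
    "glaciers": 0.8,
    "Greenland": 0.6,
    "Antarctica": 0.4,
    "land water storage": 0.3,
}

total = sum(contributions.values())
residual = observed_rate - total
print(f"sum of components: {total:.1f} mm/yr, residual: {residual:+.1f} mm/yr")
```

When the residual is large, either an observation or a calculated contribution is wrong – and finding out which is much of the work.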

Components of Sea Level Rise

If we imagine sea level as the level in a large bathtub it is relatively simple conceptually. As the ocean warms the level rises for two reasons:

  • warmer water expands (increasing the volume of existing mass)
  • ice melts (adding mass)

The “material properties” of water are well known and not in doubt. With lots of measurements of ocean temperature around the globe we can be relatively sure of the expansion. Ocean temperature coverage has increased over the last 100 years, especially since the Argo project began a little more than 10 years ago. But if we go back 30 years we have far fewer measurements, and usually only at the surface. If we go back 100 years we have fewer still. So there are questions and uncertainties.
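
A back-of-envelope version of the expansion calculation, with an illustrative expansion coefficient and warming profile (real calculations integrate over depth, temperature and salinity):

```python
# Back-of-envelope thermosteric rise: a warmed layer of depth H expands by
# roughly alpha * dT * H. The values below are illustrative only.

alpha = 2.0e-4  # thermal expansion coefficient of seawater, per K
                # (varies with temperature, salinity and pressure)
H = 700.0       # depth of the warmed upper-ocean layer, m
dT = 0.5        # average warming of that layer, K

rise_m = alpha * dT * H
print(f"thermal expansion alone: ~{rise_m * 100:.0f} cm")
```

So a half-degree warming of the upper 700 m gives several centimetres – the thermosteric term is modest but well constrained.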

Arctic sea ice melting has no impact on sea level because the ice is already floating. Water or ice that is already floating doesn’t change the sea level by melting or freezing. Ice on a continent that melts and runs into the ocean raises sea level by adding mass. Some Antarctic ice shelves extend into the ocean but are fed by the Antarctic ice sheet grounded on the continent – if that grounded ice melts or flows into the ocean, it will add to sea level.
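
Archimedes’ principle makes the floating-ice point in two lines: floating ice already displaces seawater equal to its own mass, so its meltwater exactly fills the hole it leaves (ignoring a small salinity correction):

```python
# Archimedes in numbers: floating ice displaces water equal to its own mass,
# so when it melts the meltwater exactly fills the displaced volume.
# (Freshwater density used throughout, for simplicity.)

rho_ice = 917.0     # kg/m^3
rho_water = 1000.0  # kg/m^3

ice_volume = 1.0                               # m^3 of floating ice
displaced = ice_volume * rho_ice / rho_water   # volume of water pushed aside
melted = ice_volume * rho_ice / rho_water      # volume the ice becomes as water

print(f"displaced {displaced:.3f} m^3, meltwater {melted:.3f} m^3 -> no change")
```

Ice resting on land displaces nothing, so its meltwater is all new ocean volume.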

Sea level over the last 100 years has increased by about 0.20m (about 8 inches, if we use advanced US units).

To put it into one perspective, 20,000 years ago the sea level was about 120m lower than today – this was the end of the last ice age. About 130,000 years ago the sea level was a few meters higher (no one is certain of the exact figure). This was the time of the last “interglacial” (called the Eemian interglacial).

If we melted all of Greenland’s ice sheet we would see a further 7 m rise from today, and Greenland and Antarctica together would lead to a 70 m rise. Over millennia (but not within a century), complete melting of the Greenland ice sheet is a possibility, but Antarctica’s is not (at around -30ºC, it is a very long way below freezing).
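
The 7 m figure can be recovered with rough numbers: convert the ice volume to water and spread it over the ocean’s surface area. The volume used here is an approximate textbook figure:

```python
# Converting an ice sheet's volume into global-mean sea level equivalent:
# spread the meltwater over the ocean's surface. Rough figures only, used
# here just to recover the ~7 m Greenland number.

OCEAN_AREA_M2 = 3.6e14           # ~3.6e8 km^2 of ocean surface
RHO_ICE, RHO_WATER = 917.0, 1000.0

def sea_level_equivalent_m(ice_volume_km3):
    water_volume_m3 = ice_volume_km3 * 1e9 * RHO_ICE / RHO_WATER
    return water_volume_m3 / OCEAN_AREA_M2

print(f"Greenland (~2.9e6 km^3 of ice): {sea_level_equivalent_m(2.9e6):.1f} m")
```

(This is a global-mean figure; as the Tamisiea charts above show, the local change can be very different.)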

Complications

Why not use tide gauges to measure sea level rise? Some have been around for 100 years and a few have been around for 200 years.

There aren’t many tide gauges going back a long time, and anyway in many places the ground is moving relative to the ocean. Take Scandinavia. At the end of the last ice age Stockholm was buried under perhaps 2km of ice. No wonder Scandinavians today appear so cheerful – life is all about contrasts. As the ice melted, the load on the ground was removed and it is “springing back” into a pre-glacial position. So in many places around the globe the land is moving vertically relative to sea level.

In Nedre Gavle, Sweden, the land is moving up twice as fast as the average global sea level rise (so relative sea level is falling). In Galveston, Texas the land is moving down faster than sea level rise (more than doubling apparent sea level rise).

That is the first complication.
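The relationship can be sketched as follows. The relative (local) rate seen by a tide gauge is the global rate minus the local land uplift; the uplift and subsidence rates below are illustrative, not the actual measured values for those sites:

```python
# Relative (local) sea level trend = global mean rise minus land uplift.
# All rates in mm/yr. Land-motion values are illustrative assumptions.
gmsl_rate = 1.7  # approximate 20th-century global mean rate

def relative_rate(land_uplift_mm_yr: float) -> float:
    """Apparent sea level change measured by a tide gauge on moving land."""
    return gmsl_rate - land_uplift_mm_yr

# Uplift twice the global rate (Scandinavia-like): relative sea level falls.
print(relative_rate(3.4))
# Subsidence (Galveston-like): apparent rise more than doubles.
print(relative_rate(-2.0))
```

Negative relative rates mean the sea appears to fall at that gauge even while the global mean rises – exactly the Scandinavian situation.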

The second complication is due to wind, and to local density changes from salinity. As an example, picture a constant global sea level, but with Pacific winds switching from blowing W→E to E→W. The water will “pile up” in the west instead of the east due to the force of the wind: relative sea level will increase in the west and decrease in the east. Likewise, if the local density changes from melting ice (or from ocean currents carrying water of different salinity), local sea level will shift relative to the global mean.

Here is AR5, chapter 3, p. 288:

Large-scale spatial patterns of sea level change are known to high precision only since 1993, when satellite altimetry became available.

These data have shown a persistent pattern of change since the early 1990s in the Pacific, with rates of rise in the Warm Pool of the western Pacific up to three times larger than those for GMSL, while rates over much of the eastern Pacific are near zero or negative.

The increasing sea level in the Warm Pool started shortly before the launch of TOPEX/Poseidon, and is caused by an intensification of the trade winds since the late 1980s that may be related to the Pacific Decadal Oscillation (PDO).

The lower rate of sea level rise since 1993 along the western coast of the United States has also been attributed to changes in the wind stress curl over the North Pacific associated with the PDO..

Measuring Systems

We can find a little about the new satellite systems in IPCC, AR5, chapter 3, p. 286:

Satellite radar altimeters in the 1970s and 1980s made the first nearly global observations of sea level, but these early measurements were highly uncertain and of short duration. The first precise record began with the launch of TOPEX/Poseidon (T/P) in 1992. This satellite and its successors (Jason-1, Jason-2) have provided continuous measurements of sea level variability at 10-day intervals between approximately ±66° latitude. Additional altimeters in different orbits (ERS-1, ERS-2, Envisat, Geosat Follow-on) have allowed for measurements up to ±82° latitude and at different temporal sampling (3 to 35 days), although these measurements are not as accurate as those from the T/P and Jason satellites.

Unlike tide gauges, altimetry measures sea level relative to a geodetic reference frame (classically a reference ellipsoid that coincides with the mean shape of the Earth, defined within a globally realized terrestrial reference frame) and thus will not be affected by VLM, although a small correction that depends on the area covered by the satellite (~0.3 mm yr–1) must be added to account for the change in location of the ocean bottom due to GIA relative to the reference frame of the satellite (Peltier, 2001; see also Section 13.1.2).

Tide gauges and satellite altimetry measure the combined effect of ocean warming and mass changes on ocean volume. Although variations in the density related to upper-ocean salinity changes cause regional changes in sea level, when globally averaged their effect on sea level rise is an order of magnitude or more smaller than thermal effects (Lowe and Gregory, 2006).

The thermal contribution to sea level can be calculated from in situ temperature measurements (Section 3.2). It has only been possible to directly measure the mass component of sea level since the launch of the Gravity Recovery and Climate Experiment (GRACE) in 2002 (Chambers et al., 2004). Before that, estimates were based either on estimates of glacier and ice sheet mass losses or using residuals between sea level measured by altimetry or tide gauges and estimates of the thermosteric component (e.g., Willis et al., 2004; Domingues et al., 2008), which allowed for the estimation of seasonal and interannual variations as well. GIA also causes a gravitational signal in GRACE data that must be removed in order to determine present-day mass changes; this correction is of the same order of magnitude as the expected trend and is still uncertain at the 30% level (Chambers et al., 2010).

The GRACE satellite lets us see how much ice has melted into the ocean. It’s not easy to calculate this otherwise.

The fourth assessment report from the IPCC in 2007 reported that sea level rise from the Antarctic ice sheet for the previous decade was between -0.3mm/yr and +0.5mm/yr. That is, without the new satellite measurements, it was very difficult to confirm whether Antarctica had been gaining or losing ice.
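The “residual” method described in the AR5 quote above – inferring the ocean-mass contribution by subtracting the thermosteric component from the total – can be sketched with illustrative numbers:

```python
# Pre-GRACE "residual" method from the AR5 passage quoted above:
# mass contribution = total trend (altimetry) - thermosteric trend.
total_rate = 3.2          # mm/yr, altimetry era 1993-2010 (AR5 figure)
thermosteric_rate = 1.1   # mm/yr, from in situ temperatures (illustrative)

mass_rate = total_rate - thermosteric_rate
print(f"Inferred mass contribution: {mass_rate:.1f} mm/yr")
```

The thermosteric value here is an assumption for illustration; the point is that before GRACE, the mass term could only be estimated indirectly, inheriting the uncertainties of both inputs.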

Historical Sea Level Rise

From AR5, chapter 3, p. 287:

From AR5, chapter 3

Figure 1 – Click to expand

  • The top left graph shows that various researchers are fairly close in their calculations of overall sea level rise over the past 130 years
  • The bottom left graph shows that over the last 40 years the impact of melting ice has been more important than the expansion of a warmer ocean (“thermosteric component” = the effect of a warmer ocean expanding)
  • The bottom right graph shows that over the last 7 years the measurements are consistent – satellite measurement of sea level change matches the sum of mass loss (melting ice) plus an expanding ocean (the measurements from Argo turned into sea level rise).

This gives us the mean sea level. Remember that local winds, ocean currents and changes in salinity can change this trend locally.

Many people have written about the recent accelerating trends in sea level rise. Here is AR5 again, with a graph of the 18-year trend at each point in time. We can see that different researchers reach different conclusions and that the warming period in the first half of the 20th century created sea level rise comparable to today:

From AR5, chapter 3

The conclusion in AR5:

It is virtually certain that globally averaged sea level has risen over the 20th century, with a very likely mean rate of 1.7 [1.5 to 1.9] mm/yr between 1900 and 2010 and of 3.2 [2.8 to 3.6] mm/yr between 1993 and 2010.

This assessment is based on high agreement among multiple studies using different methods, and from independent observing systems (tide gauges and altimetry) since 1993.

It is likely that a rate comparable to that since 1993 occurred between 1920 and 1950, possibly due to a multi-decadal climate variation, as individual tide gauges around the world and all reconstructions of GMSL show increased rates of sea level rise during this period.
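As a quick cross-check, the AR5 century-scale rate is consistent with the ~0.20 m rise over the last 100 years mentioned at the start of the article:

```python
# Does the AR5 mean rate reproduce the ~0.2 m twentieth-century rise?
rate_mm_yr = 1.7        # AR5 very likely mean rate, 1900-2010
years = 2010 - 1900

total_mm = rate_mm_yr * years
print(f"{total_mm:.0f} mm")  # 187 mm, i.e. about 0.19 m
```

The two figures agree to within the stated uncertainty, which is reassuring given they come from partly independent methods.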

Forecast Future Sea Level Rise

AR5, chapter 13 is the place to find predictions of the future on sea level, p. 1140:

For the period 2081–2100, compared to 1986–2005, global mean sea level rise is likely (medium confidence) to be in the 5 to 95% range of projections from process-based models, which give:

  • 0.26 to 0.55 m for RCP2.6
  • 0.32 to 0.63 m for RCP4.5
  • 0.33 to 0.63 m for RCP6.0
  • 0.45 to 0.82 m for RCP8.5

For RCP8.5, the rise by 2100 is 0.52 to 0.98 m..

We have considered the evidence for higher projections and have concluded that there is currently insufficient evidence to evaluate the probability of specific levels above the assessed likely range. Based on current understanding, only the collapse of marine-based sectors of the Antarctic ice sheet, if initiated, could cause global mean sea level to rise substantially above the likely range during the 21st century.

This potential additional contribution cannot be precisely quantified but there is medium confidence that it would not exceed several tenths of a meter of sea level rise during the 21st century.

I highlighted RCP6.0 as it seems to correspond to past development pathways with few CO2 mitigation policies. No one knows the future; barring major changes from the recent past, this is just my pick.
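For easy reference, the quoted likely ranges can be collected into a small lookup (the values are exactly as quoted above; the midpoint is just a convenience, not anything AR5 endorses as a central estimate):

```python
# AR5 "likely" (5-95% of process-based models) GMSL rise, 2081-2100
# relative to 1986-2005, in metres, as quoted above.
likely_range_m = {
    "RCP2.6": (0.26, 0.55),
    "RCP4.5": (0.32, 0.63),
    "RCP6.0": (0.33, 0.63),
    "RCP8.5": (0.45, 0.82),
}

lo, hi = likely_range_m["RCP6.0"]
print(f"RCP6.0 range: {lo}-{hi} m, midpoint {(lo + hi) / 2:.2f} m")
```

Note the ranges overlap heavily: the difference between scenarios by 2081–2100 is smaller than the model spread within each scenario.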

In the next article we will consider impacts of future sea level rise in various regions.

Articles in this Series

Impacts – I – Introduction

Impacts – II – GHG Emissions Projections: SRES and RCP

Impacts – III – Population in 2100

Impacts – IV – Temperature Projections and Probabilities

Impacts – V – Climate change is already causing worsening storms, floods and droughts

References

Observations: Oceanic Climate Change and Sea Level. In: Climate Change 2007: The Physical Science Basis. Contribution of Working Group I to the Fourth Assessment Report of the Intergovernmental Panel on Climate Change, NL Bindoff et al (2007)

Observations: Ocean. In: Climate Change 2013: The Physical Science Basis. Contribution of Working Group I to the Fifth Assessment Report of the Intergovernmental Panel on Climate Change, M Rhein et al (2013)

Sea Level Change. In: Climate Change 2013: The Physical Science Basis. Contribution of Working Group I to the Fifth Assessment Report of the Intergovernmental Panel on Climate Change, JA Church et al (2013)

I generally try and avoid the media as much as possible (although the 2016 Circus did suck me in) but it’s still impossible to miss claims like the following:

Climate change is already causing worsening storms, floods and droughts

Before looking at predictions for the future I thought it was worth reviewing this claim, seeing as it is so prevalent and is presented as being the current consensus of climate science.

Droughts

SREX 2012, p. 171:

There is medium confidence that since the 1950s some regions of the world have experienced more intense and longer droughts (e.g., southern Europe, west Africa) but also opposite trends exist in other regions (e.g., central North America, northwestern Australia).

The report cites Sheffield and Wood 2008 who show graphs on a variety of drought metrics from around the world over the last 50 years – click to enlarge:

From Sheffield & Wood 2008

Figure 1 – Click to enlarge

The results above were calculated from models based on available meteorological data. According to their analysis, some places have experienced more droughts and other places fewer. Because the results are model-based, we can expect that other researchers may produce different results.

AR5, published a year after SREX, says, chapter 2, p. 214-215:

Because drought is a complex variable and can at best be incompletely represented by commonly used drought indices, discrepancies in the interpretation of changes can result. For example, Sheffield and Wood (2008) found decreasing trends in the duration, intensity and severity of drought globally. Conversely, Dai (2011a,b) found a general global increase in drought, although with substantial regional variation and individual events dominating trend signatures in some regions (e.g., the 1970s prolonged Sahel drought and the 1930s drought in the USA and Canadian Prairies). Studies subsequent to these continue to provide somewhat different conclusions on trends in global droughts and/or dryness since the middle of the 20th century (Sheffield et al., 2012; Dai, 2013; Donat et al., 2013c; van der Schrier et al., 2013)..

..In summary, the current assessment concludes that there is not enough evidence at present to suggest more than low confidence in a global-scale observed trend in drought or dryness (lack of rainfall) since the middle of the 20th century, owing to lack of direct observations, geographical inconsistencies in the trends, and dependencies of inferred trends on the index choice.

Based on updated studies, AR4 conclusions regarding global increasing trends in drought since the 1970s were probably overstated.

The paper by Dai is Drought under global warming: a review, A Dai, Climate Change (2011) – for some reason I am unable to access it.

A later paper in Nature, Trenberth et al 2013 (including both Sheffield and Dai as co-authors) said:

Two recent papers looked at the question of whether large-scale drought has been increasing under climate change. A study in Nature by Sheffield et al entitled ‘Little change in global drought over the past 60 years’ was published at almost the same time that ‘Increasing drought under global warming in observations and models’ by Dai appeared in Nature Climate Change (published online in August 2012). How can two research groups arrive at such seemingly contradictory conclusions?

Another later paper on droughts, Orlowski & Seneviratne 2013, likewise shows overwhelming evidence of more droughts – click to enlarge:

From Orlowsky & Seneviratne 2013

Figure 2 – Click to enlarge

Floods

SREX 2012, p. 177:

Overall, there is low confidence (due to limited evidence) that anthropogenic climate change has affected the magnitude and frequency of floods, though it has detectably influenced several components of the hydrological cycle, such as precipitation and snowmelt, that may impact flood trends. The assessment of causes behind the changes in floods is inherently complex and difficult.

AR5, Chapter 2, p. 214:

AR5 WGII assesses floods in regional detail accounting for the fact that trends in floods are strongly influenced by changes in river management (see also Section 2.5.2). Although the most evident flood trends appear to be in northern high latitudes, where observed warming trends have been largest, in some regions no evidence of a trend in extreme flooding has been found, for example, over Russia based on daily river discharge (Shiklomanov et al., 2007).

Other studies for Europe (Hannaford and Marsh, 2008; Renard et al., 2008; Petrow and Merz, 2009; Stahl et al., 2010) and Asia (Jiang et al., 2008; Delgado et al., 2010) show evidence for upward, downward or no trend in the magnitude and frequency of floods, so that there is currently no clear and widespread evidence for observed changes in flooding except for the earlier spring flow in snow-dominated regions (Seneviratne et al., 2012).

In summary, there continues to be a lack of evidence and thus low confidence regarding the sign or trend in the magnitude and/or frequency of floods on a global scale.

[Note: the text in the bottom line cited says: “..regarding the sign of trend in the magnitude..” which I assume is a typo, and so I changed of into or]

Storms

SREX, p. 159:

Detection of trends in tropical cyclone metrics such as frequency, intensity, and duration remains a significant challenge..

..Natural variability combined with uncertainties in the historical data makes it difficult to detect trends in tropical cyclone activity. There have been no significant trends observed in global tropical cyclone frequency records, including over the present 40-year period of satellite observations (e.g., Webster et al., 2005). Regional trends in tropical cyclone frequency have been identified in the North Atlantic, but the fidelity of these trends is debated (Holland and Webster, 2007; Landsea, 2007; Mann et al., 2007a). Different methods for estimating undercounts in the earlier part of the North Atlantic tropical cyclone record provide mixed conclusions (Chang and Guo, 2007; Mann et al., 2007b; Kunkel et al., 2008; Vecchi and Knutson, 2008).

Regional trends have not been detected in other oceans (Chan and Xu, 2009; Kubota and Chan, 2009; Callaghan and Power, 2011). It thus remains uncertain whether any observed increases in tropical cyclone frequency on time scales longer than about 40 years are robust, after accounting for past changes in observing capabilities (Knutson et al., 2010)..

..Time series of power dissipation, an aggregate compound of tropical cyclone frequency, duration, and intensity that measures total energy consumption by tropical cyclones, show upward trends in the North Atlantic and weaker upward trends in the western North Pacific over the past 25 years (Emanuel, 2007), but interpretation of longer-term trends in this quantity is again constrained by data quality concerns.

The variability and trend of power dissipation can be related to SST and other local factors such as tropopause temperature and vertical wind shear (Emanuel, 2007), but it is a current topic of debate whether local SST or the difference between local SST and mean tropical SST is the more physically relevant metric (Swanson, 2008).

The distinction is an important one when making projections of changes in power dissipation based on projections of SST changes, particularly in the tropical Atlantic where SST has been increasing more rapidly than in the tropics as a whole (Vecchi et al., 2008). Accumulated cyclone energy, which is an integrated metric analogous to power dissipation, has been declining globally since reaching a high point in 2005, and is presently at a 40-year low point (Maue, 2009). The present period of quiescence, as well as the period of heightened activity leading up to the high point in 2005, does not clearly represent substantial departures from past variability (Maue, 2009)..

..The present assessment regarding observed trends in tropical cyclone activity is essentially identical to the WMO assessment (Knutson et al., 2010): there is low confidence that any observed long-term (i.e., 40 years or more) increases in tropical cyclone activity are robust, after accounting for past changes in observing capabilities.

AR5, Chapter 2, p. 216:

AR4 concluded that it was likely that an increasing trend had occurred in intense tropical cyclone activity since 1970 in some regions but that there was no clear trend in the annual numbers of tropical cyclones. Subsequent assessments, including SREX and more recent literature indicate that it is difficult to draw firm conclusions with respect to the confidence levels associated with observed trends prior to the satellite era and in ocean basins outside of the North Atlantic.

Lots more tropical storms:

From AR5, wg I

Figure 3

Note that a more important metric than “how many?” is “how severe?” or a combination of both.

And for extra-tropical storms (i.e. outside the tropics), SREX p. 166:

In summary it is likely that there has been a poleward shift in the main Northern and Southern Hemisphere extratropical storm tracks during the last 50 years. There is medium confidence in an anthropogenic influence on this observed poleward shift. It has not formally been attributed.

There is low confidence in past changes in regional intensity.

And AR5, chapter 2, p. 217 & 220:

Some studies show an increase in intensity and number of extreme Atlantic cyclones (Paciorek et al., 2002; Lehmann et al., 2011) while others show opposite trends in eastern Pacific and North America (Gulev et al., 2001). Comparisons between studies are hampered because of the sensitivities in identification schemes and/or different definitions for extreme cyclones (Ulbrich et al., 2009; Neu et al., 2012). The fidelity of research findings also rests largely with the underlying reanalyses products that are used..

..In summary, confidence in large scale changes in the intensity of extreme extratropical cyclones since 1900 is low. There is also low confidence for a clear trend in storminess proxies over the last century due to inconsistencies between studies or lack of long-term data in some parts of the world (particularly in the SH). Likewise, confidence in trends in extreme winds is low, owing to quality and consistency issues with analysed data.

Discussion

The IPCC SREX and AR5 reports were published in 2012 and 2013 respectively. New research published since these reports will analyze the same data and possibly reach different conclusions. When you have large decadal variability in poorly observed data with a small or non-existent trend, different groups will inevitably reach different conclusions about those trends. And if you focus on specific regions you can demonstrate a clear and unmistakeable trend.

If you are looking for a soundbite just pick the right region.

The last 100 years have seen global warming. As this blog has made clear from the physics, more GHGs (all other things remaining equal) result in more warming. What proportion of the warming over the last 100 years is intrinsic climate variability and what proportion is due to anthropogenic GHGs, I have no idea.

The last century has seen no clear globally averaged change in floods, droughts or storms – as best as we can tell with very incomplete observing systems. Of course, some regions have definitely seen more, and some regions have definitely seen less. Whether this is different from the period from 1800-1900 or from 1700-1800 no one knows. Perhaps floods, droughts and tropical storms increased globally from 1700-1900. Perhaps they decreased. Perhaps the last 100 years have seen more variability. Perhaps not. (And in recognition of Poe’s law, I note that a few statements within the article presenting graphs did say the opposite of the graphs presented).

Articles in this Series

Impacts – I – Introduction

Impacts – II – GHG Emissions Projections: SRES and RCP

Impacts – III – Population in 2100

Impacts – IV – Temperature Projections and Probabilities

References

SREX = Managing the Risks of Extreme Events and Disasters to Advance Climate Change Adaptation Special Report, IPCC (2012)

Observations: Atmosphere and Surface. Chapter 2 of Working Group I to AR5, DL Hartmann et al (2013)

Global Trends and Variability in Soil Moisture and Drought Characteristics, 1950–2000, from Observation-Driven Simulations of the Terrestrial Hydrologic Cycle, Justin Sheffield & Eric Wood, Journal of Climate (2008) – free paper

Global warming and changes in drought, Kevin E Trenberth et al, Nature (2013) – free paper

Elusive drought: uncertainty in observed trends and short- and long-term CMIP5 projections, B Orlowsky & SI Seneviratne, Hydrology and Earth System Sciences (2013) – free paper

In Impacts – II – GHG Emissions Projections: SRES and RCP we looked at projections of emissions under various scenarios with the resulting CO2 (and other GHG) concentrations and resulting radiative forcing.

Why do we need these scenarios? Because even if climate models were perfect and could accurately calculate the temperature 100 years from now, we wouldn’t know how much “anthropogenic CO2” (and other GHGs) would have been emitted by that time. The scenarios allow climate modelers to produce temperature (and other climate variable) projections on the basis of each of these scenarios.

The IPCC AR5 (fifth assessment report) from 2013 says (chapter 12, p. 1031):

Global mean temperatures will continue to rise over the 21st century if greenhouse gas (GHG) emissions continue unabated.

Under the assumptions of the concentration-driven RCPs, global mean surface temperatures for 2081–2100, relative to 1986–2005 will likely be in the 5 to 95% range of the CMIP5 models:

  • 0.3°C to 1.7°C (RCP2.6)
  • 1.1°C to 2.6°C (RCP4.5)
  • 1.4°C to 3.1°C (RCP6.0)
  • 2.6°C to 4.8°C (RCP8.5)

Global temperatures averaged over the period 2081–2100 are projected to likely exceed 1.5°C above 1850-1900 for RCP4.5, RCP6.0 and RCP8.5 (high confidence), are likely to exceed 2°C above 1850-1900 for RCP6.0 and RCP8.5 (high confidence) and are more likely than not to exceed 2°C for RCP4.5 (medium confidence). Temperature change above 2°C under RCP2.6 is unlikely (medium confidence). Warming above 4°C by 2081–2100 is unlikely in all RCPs (high confidence) except for RCP8.5, where it is about as likely as not (medium confidence).

I commented in Part II that RCP8.5 seemed to be a scenario that didn’t match up with the last 40-50 years of development. Of course, the various scenario developers give their caveats, for example, Riahi et al 2007:

Given the large number of variables and their interdependencies, we are of the opinion that it is impossible to assign objective likelihoods or probabilities to emissions scenarios. We have also not attempted to assign any subjective likelihoods to the scenarios either. The purpose of the scenarios presented in this Special Issue is, rather, to span the range of uncertainty without an assessment of likely, preferable, or desirable future developments..

Readers should exercise their own judgment on the plausibility of above scenario ‘storylines’..

To me RCP6.0 seems a more likely future (compared with RCP8.5) in a world that doesn’t have any significant attempt to tackle CO2 emissions. That is, no major change in climate policy to today’s world, but similar economic and population development (note 1).

Here is the graph of projected temperature anomalies for the different scenarios:

From AR5, chapter 12

Figure 1

That graph is hard to make out for 2100, here is the table of corresponding data. I highlighted RCP6.0 in 2100 – you can click to enlarge the table:

From AR5, chapter 12, Table 12.2

Figure 2 – Click to expand

Probabilities and Lists

The table above has a “1 std deviation” and a 5%-95% distribution. The graph (which has the same source data) has shading to indicate 5%-95% of models for each RCP scenario.

These have no relation to real probability distributions. That is, the 5-95% range for RCP6.0 doesn’t equate to: “there is a 90% probability that the average temperature for 2081-2100 will be 1.4-3.1ºC higher than the 1986-2005 average”.

A number of climate models are used to produce simulations and the results from these “ensembles” are sometimes pressed into “probability service”. For some concept background on ensembles read Ensemble Forecasting.
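To make the distinction concrete, here is how such a 5–95% model range is typically computed. The ensemble values below are made up for illustration; the point is that this is just the spread of whatever models happened to be submitted, not a calibrated probability distribution:

```python
import numpy as np

# Hypothetical (made-up) ensemble of model warming results (degC) for one
# scenario. The 5-95% "range" is just a percentile of this opportunistic
# sample -- it carries no calibrated probabilistic meaning.
ensemble = np.array([1.4, 1.8, 2.0, 2.2, 2.4, 2.6, 2.9, 3.1])

lo, hi = np.percentile(ensemble, [5, 95])
print(f"5-95% model range: {lo:.2f} to {hi:.2f} degC")
```

Nothing in this calculation accounts for model dependence, shared biases or self-selection by modelling groups – which is exactly the objection AR5 raises in the passage quoted next.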

Here is IPCC AR5 chapter 12:

Ensembles like CMIP5 do not represent a systematically sampled family of models but rely on self-selection by the modelling groups.

This opportunistic nature of MMEs [multi-model ensembles] has been discussed, for example, in Tebaldi and Knutti (2007) and Knutti et al. (2010a). These ensembles are therefore not designed to explore uncertainty in a coordinated manner, and the range of their results cannot be straightforwardly interpreted as an exhaustive range of plausible outcomes, even if some studies have shown how they appear to behave as well calibrated probabilistic forecasts for some large-scale quantities. Other studies have argued instead that the tail of distributions is by construction undersampled.

In general, the difficulty in producing quantitative estimates of uncertainty based on multiple model output originates in their peculiarities as a statistical sample, neither random nor systematic, with possible dependencies among the members and of spurious nature, that is, often counting among their members models with different degrees of complexities (different number of processes explicitly represented or parameterized) even within the category of general circulation models..

..In summary, there does not exist at present a single agreed on and robust formal methodology to deliver uncertainty quantification estimates of future changes in all climate variables. As a consequence, in this chapter, statements using the calibrated uncertainty language are a result of the expert judgement of the authors, combining assessed literature results with an evaluation of models demonstrated ability (or lack thereof) in simulating the relevant processes (see Chapter 9) and model consensus (or lack thereof) over future projections. In some cases when a significant relation is detected between model performance and reliability of its future projections, some models (or a particular parametric configuration) may be excluded but in general it remains an open research question to find significant connections of this kind that justify some form of weighting across the ensemble of models and produce aggregated future projections that are significantly different from straightforward one model–one vote ensemble results. Therefore, most of the analyses performed for this chapter make use of all available models in the ensembles, with equal weight given to each of them unless otherwise stated.

And from one of the papers cited in that section of chapter 12, Jackson et al 2008:

In global climate models (GCMs), unresolved physical processes are included through simplified representations referred to as parameterizations.

Parameterizations typically contain one or more adjustable phenomenological parameters. Parameter values can be estimated directly from theory or observations or by “tuning” the models by comparing model simulations to the climate record. Because of the large number of parameters in comprehensive GCMs, a thorough tuning effort that includes interactions between multiple parameters can be very computationally expensive. Models may have compensating errors, where errors in one parameterization compensate for errors in other parameterizations to produce a realistic climate simulation (Wang 2007; Golaz et al. 2007; Min et al. 2007; Murphy et al. 2007).

The risk is that, when moving to a new climate regime (e.g., increased greenhouse gases), the errors may no longer compensate. This leads to uncertainty in climate change predictions. The known range of uncertainty of many parameters allows a wide variance of the resulting simulated climate (Murphy et al. 2004; Stainforth et al. 2005; M. Collins et al. 2006). The persistent scatter in the sensitivities of models from different modeling groups, despite the effort represented by the approximately four generations of modeling improvements, suggests that uncertainty in climate prediction may depend on underconstrained details and that we should not expect convergence anytime soon.

Stainforth et al 2005 (referenced in the quote above) tried much larger ensembles of coarser resolution climate models, and was discussed in the comments of Models, On – and Off – the Catwalk – Part Four – Tuning & the Magic Behind the Scenes. Rowlands et al 2012 is similar in approach and was discussed in Natural Variability and Chaos – Five – Why Should Observations match Models?

The way I read the IPCC reports and various papers is that the projections are clearly not a probability distribution. Then the data inevitably gets used as a de facto probability distribution.

Conclusion

“All models are wrong but some are useful” as George Box said, actually in a quite unrelated field (i.e., not climate). But it’s a good saying.

Many people who describe themselves as “lukewarmers” believe that climate sensitivity as characterized by the IPCC is too high and the real climate has a lower sensitivity. I have no idea.

Models may be wrong, but I don’t have an alternative model to provide. And therefore, given that they represent climate better than any current alternative, their results are useful.

We can’t currently create a real probability distribution from a set of temperature prediction results (assuming a given emissions scenario).

How useful is it to know that under a scenario like RCP6.0 the average global temperature increase in 2100 has been simulated as variously 1ºC, 2ºC, 3ºC, 4ºC? (note, I haven’t checked the CMIP5 simulations to get each value). And the tropics will vary less, land more? As we dig into more details we will attempt to look at how reliable regional and seasonal temperature anomalies might be compared with the overall number. Likewise rainfall and other important climate values.

I do find it useful to keep the idea of a set of possible numbers with no probability assigned. Then at some stage we can say something like, “if this RCP scenario turns out to be correct and the global average surface temperature actually increases by 3ºC by 2100, we know the following are reasonable assumptions.. but we currently can’t make any predictions about these other values..”

References

Long-term Climate Change: Projections, Commitments and Irreversibility, M Collins et al (2013) – In: Climate Change 2013: The Physical Science Basis. Contribution of Working Group I to the Fifth Assessment Report of the Intergovernmental Panel on Climate Change

Scenarios of long-term socio-economic and environmental development under climate stabilization, Keywan Riahi et al, Technological Forecasting & Social Change (2007) – free paper

Error Reduction and Convergence in Climate Prediction, Charles S Jackson et al, Journal of Climate (2008) – free paper

Notes

Note 1: As explored a little in the last article, RCP6.0 does include some changes to climate policy but it seems they are not major. I believe a very useful scenario for exploring impact assessments would be the population and development path of RCP6.0 (let’s call it RCP6.0A) without any climate policies.

For reasons of “scenario parsimony” this interesting pathway avoids attention.