In Part VI we looked at past and projected sea level rise. There is significant uncertainty in future sea level rise, even if we assume we know the future global temperature change. The uncertainty comes down to one question: how much ice will melt?

We can be reasonably sure of sea level rise from thermal expansion (so long as we know the temperature). By contrast, we don’t have much confidence in the contribution from melting ice (on land). This is because ice sheet dynamics (glaciers, and the Greenland & Antarctic ice sheets) are non-linear and not well understood.

Here’s something surprising. Suppose you live in Virginia near the ocean. And suppose all of the Greenland ice sheet melted in a few years (not possible, but just suppose). How much would sea level change in Virginia? Hint: the entire Greenland ice sheet converted into global mean sea level is about 7m.

Zero change in Virginia.

Here are charts of relative sea level change across the globe for Greenland & West Antarctica, based on a 1mm/yr contribution from each location – click to expand:

From Tamisiea 2011

Figure 1 – Click to Expand

We see that the sea level actually drops close to Greenland, stays constant around mid-northern latitudes in the Atlantic, and rises in other locations. The reason is simple – the Greenland ice sheet is a local gravitational attractor, “pulling the ocean up” towards itself. Remove the ice and the pull weakens, so the local sea level drops.


If we knew for sure that the global mean temperature in 2100 would be +2ºC or +3ºC compared to today, we would have a good idea in each case of the sea level rise from thermal expansion – but not much certainty about any rise from melting ice sheets.

Let’s consider someone thinking about the US for planning purposes. If the Greenland ice sheet contributes lots of melting ice, the sea level on the US Atlantic coast won’t be affected at all and the increase on the Pacific coast will be significantly less than the overall sea level rise. In this case, the big uncertainty in the magnitude of sea level rise is not much of a factor for most of the US.

If the West Antarctic ice sheet contributes lots of melting ice, the sea level on the east and west coasts of the US will be affected by more than the global mean sea level rise.

For example, imagine the sea level was expected to rise 0.3m from thermal expansion by 2100, with a feared additional 0–0.5m of global rise from melting ice. A US policymaker really needs to know which ice sheet will melt. “We expect at most an additional 0.5m from melting ice” tells her that if Greenland melts she faces – in total – a maximum sea level rise of 0.3m on the east coast and a little more than 0.3m on the west coast; but if West Antarctica melts she instead faces a maximum of almost 1m on each coast.
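The policymaker’s arithmetic can be sketched in a few lines. The “fingerprint” factors below – the fraction of the global-mean ice contribution felt at each coast – are illustrative placeholders chosen to match the qualitative description above, not values read off the Tamisiea charts:

```python
# Hedged sketch of the planning arithmetic above. The fingerprint factors
# are illustrative, not values taken from Tamisiea & Mitrovica 2011.
THERMAL_RISE = 0.30   # m by 2100, thermal expansion (example in the text)
MAX_ICE_RISE = 0.50   # m, feared maximum global-mean rise from melting ice

# Fraction of the global-mean ice contribution felt at each coast,
# depending on which ice sheet supplies the meltwater.
fingerprints = {
    "Greenland melts":       {"US east coast": 0.0, "US west coast": 0.7},
    "West Antarctica melts": {"US east coast": 1.2, "US west coast": 1.2},
}

for source, coasts in fingerprints.items():
    for coast, factor in coasts.items():
        total = THERMAL_RISE + factor * MAX_ICE_RISE
        print(f"{source} -> {coast}: at most {total:.2f} m")
```

With these made-up factors the worst case ranges from 0.30 m (east coast, Greenland source) to 0.90 m (either coast, West Antarctic source) – the same melt volume, very different local outcomes.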

The source of the melting ice just magnifies the uncertainty for policy and economics.

If this 10th century legend was still with us maybe it would be different (we only have his tweets):

Donaeld the Unready

Articles in this Series

Impacts – I – Introduction

Impacts – II – GHG Emissions Projections: SRES and RCP

Impacts – III – Population in 2100

Impacts – IV – Temperature Projections and Probabilities

Impacts – V – Climate change is already causing worsening storms, floods and droughts

Impacts – VI – Sea Level Rise 1


The moving boundaries of sea level change: Understanding the origins of geographic variability, ME Tamisiea & JX Mitrovica, Oceanography (2011)

In Part V we looked at the IPCC – an outlier organization – which claimed floods, droughts and storms had not changed in a measurable way globally in the last 50–100 years (of course, some regions have seen increases and some have seen decreases; some decades have been bad, some decades have been good).

This puts them at a disadvantage compared with the overwhelming mass of NGOs, environmental groups, media outlets and various government departments who claim the opposite, but the contrarian in me found their research too interesting to ignore. Plus, they come with references to papers in respectable journals.

We haven’t yet looked at future projections of these events. Whenever competing effects combine to create a result, we can expect future changes to be difficult to calculate. In contrast, one climate effect we can be sure about is sea level: if the world warms, as it surely will with more GHGs, we can expect sea level to rise.

In my own mental list of “bad stuff to happen”, I had sea level rise as an obvious #1 or #2. But ideas and opinions need to be challenged and I had not really investigated the impacts.

The world is a big place and rising sea level will have different impacts in different places. Generally the media presentation on sea level is unrelentingly negative, probably following the impact of the impressive 2004 documentary directed by Roland Emmerich, and the dramatized adaptation by Al Gore in 2006 (directed by Davis Guggenheim).

Let’s start by looking at some sea level basics.

Like everything else related to climate, getting an accurate global dataset on sea level is difficult – especially when we want consistency over decades.

The world is a big place and past climatological measurements were mostly focused on collecting local weather data for the country or region in question. Satellites started measuring climate globally in the late 1970s, but satellites for sea level and mass balance didn’t begin measurements until 10-20 years ago. So, climate scientists attempt to piece together disparate data systems, to reconcile them, and to match up the results with what climate models calculate – call it “a sea level budget”.

“The budget” means balancing two sides of the equation:

  • how has sea level changed year by year and decade by decade?
  • what contributions to sea level do we calculate from the effects of warming climate?
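As a toy example of closing that budget (the rates here are illustrative round numbers, loosely in line with the satellite-era figures quoted later in the article, not a real inventory):

```python
# One side of the budget: the observed rate of global mean sea level rise.
observed_rate = 3.2   # mm/yr, satellite altimetry era (illustrative)

# Other side: the calculated contributions from a warming climate.
contributions = {
    "thermal expansion":     1.1,  # mm/yr
    "glaciers + ice sheets": 1.8,  # mm/yr
    "land water storage":    0.4,  # mm/yr
}

calculated = sum(contributions.values())
residual = observed_rate - calculated
print(f"calculated: {calculated:.1f} mm/yr, residual: {residual:.1f} mm/yr")
# The budget "closes" when the residual is small compared with the
# uncertainties in each term.
```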

Components of Sea Level Rise

If we imagine sea level as the level in a large bathtub it is relatively simple conceptually. As the ocean warms the level rises for two reasons:

  • warmer water expands (increasing the volume of existing mass)
  • ice melts (adding mass)

The “material properties” of water are well known and not in doubt. With lots of measurements of ocean temperature around the globe we can be relatively sure of the expansion. Ocean temperature coverage has been increasing over the last 100 years, especially since the Argo project that started a little more than 10 years ago. But if we go back 30 years we have far fewer measurements, and usually only at the surface. If we go back 100 years we have fewer still. So there are questions and uncertainties.
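A back-of-envelope version of the thermal expansion calculation, using a single representative expansion coefficient (in reality it varies with temperature, salinity and pressure, so real calculations integrate over the measured ocean state):

```python
# dh ~= alpha * dT * H: a column of seawater of depth H, warmed by dT,
# expands by dh. All numbers are illustrative.
alpha = 2.0e-4   # 1/K, representative thermal expansion coefficient
H = 700.0        # m, depth of the warmed layer
dT = 0.5         # K, average warming of that layer

dh = alpha * dT * H
print(f"thermosteric rise ~ {dh * 1000:.0f} mm")
```

Warming the top 700 m by 0.5 °C gives roughly 70 mm – the right order of magnitude for a few decades of observed thermosteric rise.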

Arctic sea ice melting has no impact on sea level because the ice is already floating – water or ice that is already floating doesn’t change the sea level by melting or freezing. Ice on a continent that melts and runs into the ocean does raise sea level, by adding mass. Some Antarctic ice shelves float on the ocean, but they are the seaward extensions of the Antarctic ice sheet, which is supported by the continent – if the grounded ice behind the shelves flows into the ocean and melts, it will add to ocean level.
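The floating-ice point is just Archimedes’ principle, and is easy to check (this sketch ignores the small fresh/salt water density difference, which makes melting sea ice a tiny net contributor in reality):

```python
# A floating block displaces its own weight of water, so the volume it
# stops displacing when it melts equals the volume its meltwater fills.
rho_ice = 917.0    # kg/m^3, typical ice density
rho_sea = 1025.0   # kg/m^3, typical seawater density

block = 1.0                        # m^3 of floating ice
mass = rho_ice * block             # kg
displaced_now = mass / rho_sea     # m^3 displaced while still floating
melt_volume = mass / rho_sea       # m^3 the melt occupies (same density assumed)
print(displaced_now == melt_volume)  # net sea level change: zero
```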

Sea level over the last 100 years has increased by about 0.20m (about 8 inches, if we use advanced US units).

To put that in perspective: 20,000 years ago, at the peak of the last ice age, the sea level was about 120m lower than today. About 130,000 years ago, during the last “interglacial” (the Eemian), the sea level was a few meters higher than now – no one is certain of the exact figure.

If we melted all of Greenland’s ice sheet we would see a further 7m rise from today, and Greenland and Antarctica together would lead to a 70m rise. Over millennia (but not a century), the complete Greenland ice sheet melting is a possibility, but Antarctica is not (at around -30ºC, it is a very long way below freezing).
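The ~7 m Greenland figure can be sanity-checked with round numbers (the ice volume and ocean area below are approximate, and the calculation ignores gravitational fingerprint and land-uplift effects):

```python
# Equivalent global mean sea level rise = ice volume (as water) spread
# over the ocean surface. Round-number inputs, so treat the answer as
# order-of-magnitude only.
ice_volume = 2.9e15   # m^3, approximate Greenland ice sheet volume
rho_ice = 917.0       # kg/m^3
rho_water = 1000.0    # kg/m^3 (fresh water equivalent)
ocean_area = 3.6e14   # m^2, approximate global ocean surface

rise = ice_volume * (rho_ice / rho_water) / ocean_area
print(f"~{rise:.1f} m")   # close to the ~7 m quoted above
```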


Why not use tide gauges to measure sea level rise? Some have been around for 100 years and a few have been around for 200 years.

There aren’t many tide gauges going back a long time, and anyway in many places the ground is moving relative to the ocean. Take Scandinavia. At the end of the last ice age Stockholm was buried under perhaps 2km of ice. No wonder Scandinavians today appear so cheerful – life is all about contrasts. As the ice melted, the load on the ground was removed and it is “springing back” into a pre-glacial position. So in many places around the globe the land is moving vertically relative to sea level.

In Nedre Gavle, Sweden, the land is moving up twice as fast as the average global sea level rise (so relative sea level is falling). In Galveston, Texas the land is moving down faster than sea level rise (more than doubling apparent sea level rise).

That is the first complication.
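The correction is simple in principle: what a tide gauge records is sea level change minus vertical land motion. The rates below are illustrative, chosen only to match the qualitative statements about Nedre Gavle and Galveston above:

```python
# Relative sea level rate at a gauge = sea level rise - vertical land motion.
# All rates in mm/yr; positive land motion = uplift. Illustrative values.
global_rise = 1.7   # mm/yr, roughly the 20th-century global mean

land_motion = {
    "Nedre Gavle (uplift ~ twice the global rise)": +3.4,
    "Galveston (subsidence faster than the rise)":  -5.0,
}

for site, uplift in land_motion.items():
    relative = global_rise - uplift
    print(f"{site}: {relative:+.1f} mm/yr relative sea level change")
```

With these numbers the gauge at Nedre Gavle sees sea level falling, while Galveston sees an apparent rise of more than double the global mean – exactly why raw tide gauge records can’t be averaged naively.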

The second complication is due to wind, and to local density changes from salinity. As an example, picture a constant global sea level, but with the Pacific winds switching from blowing west-to-east to east-to-west. The water will “pile up” in the west instead of the east, due to the force of the wind: relative sea level will increase in the west and decrease in the east. Likewise, if the local density changes – from melting ice, or from ocean currents carrying water of different salinity – the local sea level will shift relative to the global mean.

Here is AR5, chapter 3, p. 288:

Large-scale spatial patterns of sea level change are known to high precision only since 1993, when satellite altimetry became available.

These data have shown a persistent pattern of change since the early 1990s in the Pacific, with rates of rise in the Warm Pool of the western Pacific up to three times larger than those for GMSL, while rates over much of the eastern Pacific are near zero or negative.

The increasing sea level in the Warm Pool started shortly before the launch of TOPEX/Poseidon, and is caused by an intensification of the trade winds since the late 1980s that may be related to the Pacific Decadal Oscillation (PDO).

The lower rate of sea level rise since 1993 along the western coast of the United States has also been attributed to changes in the wind stress curl over the North Pacific associated with the PDO..

Measuring Systems

We can find a little about the new satellite systems in IPCC, AR5, chapter 3, p. 286:

Satellite radar altimeters in the 1970s and 1980s made the first nearly global observations of sea level, but these early measurements were highly uncertain and of short duration. The first precise record began with the launch of TOPEX/Poseidon (T/P) in 1992. This satellite and its successors (Jason-1, Jason-2) have provided continuous measurements of sea level variability at 10-day intervals between approximately ±66° latitude. Additional altimeters in different orbits (ERS-1, ERS-2, Envisat, Geosat Follow-on) have allowed for measurements up to ±82° latitude and at different temporal sampling (3 to 35 days), although these measurements are not as accurate as those from the T/P and Jason satellites.

Unlike tide gauges, altimetry measures sea level relative to a geodetic reference frame (classically a reference ellipsoid that coincides with the mean shape of the Earth, defined within a globally realized terrestrial reference frame) and thus will not be affected by VLM, although a small correction that depends on the area covered by the satellite (~0.3 mm yr–1) must be added to account for the change in location of the ocean bottom due to GIA relative to the reference frame of the satellite (Peltier, 2001; see also Section 13.1.2).

Tide gauges and satellite altimetry measure the combined effect of ocean warming and mass changes on ocean volume. Although variations in the density related to upper-ocean salinity changes cause regional changes in sea level, when globally averaged their effect on sea level rise is an order of magnitude or more smaller than thermal effects (Lowe and Gregory, 2006).

The thermal contribution to sea level can be calculated from in situ temperature measurements (Section 3.2). It has only been possible to directly measure the mass component of sea level since the launch of the Gravity Recovery and Climate Experiment (GRACE) in 2002 (Chambers et al., 2004). Before that, estimates were based either on estimates of glacier and ice sheet mass losses or using residuals between sea level measured by altimetry or tide gauges and estimates of the thermosteric component (e.g., Willis et al., 2004; Domingues et al., 2008), which allowed for the estimation of seasonal and interannual variations as well. GIA also causes a gravitational signal in GRACE data that must be removed in order to determine present-day mass changes; this correction is of the same order of magnitude as the expected trend and is still uncertain at the 30% level (Chambers et al., 2010).

The GRACE satellite lets us see how much ice has melted into the ocean. It’s not easy to calculate this otherwise.

The fourth assessment report from the IPCC in 2007 reported that sea level rise from the Antarctic ice sheet for the previous decade was between -0.3mm/yr and +0.5mm/yr. That is, without the new satellite measurements, it was very difficult to confirm whether Antarctica had been gaining or losing ice.

Historical Sea Level Rise

From AR5, chapter 3, p. 287:

From AR5, chapter 3

Figure 1 – Click to expand

  • The top left graph shows that various researchers are fairly close in their calculations of overall sea level rise over the past 130 years
  • The bottom left graph shows that over the last 40 years the impact of melting ice has been more important than the expansion of a warmer ocean (“thermosteric component” = the effect of a warmer ocean expanding)
  • The bottom right graph shows that over the last 7 years the measurements are consistent – satellite measurement of sea level change matches the sum of mass loss (melting ice) plus an expanding ocean (the measurements from Argo turned into sea level rise).

This gives us the mean sea level. Remember that local winds, ocean currents and changes in salinity can change this trend locally.

Many people have written about the recent accelerating trends in sea level rise. Here is AR5 again, with a graph of the 18-year trend at each point in time. We can see that different researchers reach different conclusions and that the warming period in the first half of the 20th century created sea level rise comparable to today:

From AR5, chapter 3

The conclusion in AR5:

It is virtually certain that globally averaged sea level has risen over the 20th century, with a very likely mean rate between 1900 and 2010 of 1.7 [1.5 to 1.9] mm/yr and 3.2 [2.8 to 3.6] mm/yr between 1993 and 2010.

This assessment is based on high agreement among multiple studies using different methods, and from independent observing systems (tide gauges and altimetry) since 1993.

It is likely that a rate comparable to that since 1993 occurred between 1920 and 1950, possibly due to a multi-decadal climate variation, as individual tide gauges around the world and all reconstructions of GMSL show increased rates of sea level rise during this period.
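As a quick consistency check, the quoted 1900–2010 rate multiplied out over the century matches the ~0.2 m rise mentioned earlier:

```python
# 1.7 mm/yr sustained from 1900 to 2010, expressed in metres.
rate = 1.7             # mm/yr, AR5 very likely mean rate, 1900-2010
years = 2010 - 1900
total_m = rate * years / 1000.0
print(f"{total_m:.2f} m")   # ~0.19 m, consistent with "about 0.20 m"
```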

Forecast Future Sea Level Rise

AR5, chapter 13, p. 1140, is the place to find projections of future sea level:

For the period 2081–2100, compared to 1986–2005, global mean sea level rise is likely (medium confidence) to be in the 5 to 95% range of projections from process-based models, which give:

  • 0.26 to 0.55 m for RCP2.6
  • 0.32 to 0.63 m for RCP4.5
  • 0.33 to 0.63 m for RCP6.0
  • 0.45 to 0.82 m for RCP8.5

For RCP8.5, the rise by 2100 is 0.52 to 0.98 m..

We have considered the evidence for higher projections and have concluded that there is currently insufficient evidence to evaluate the probability of specific levels above the assessed likely range. Based on current understanding, only the collapse of marine-based sectors of the Antarctic ice sheet, if initiated, could cause global mean sea level to rise substantially above the likely range during the 21st century.

This potential additional contribution cannot be precisely quantified but there is medium confidence that it would not exceed several tenths of a meter of sea level rise during the 21st century.

I highlighted RCP6.0 as this seems to correspond to past development pathways with little in the way of CO2 mitigation policy. No one knows the future; this is just my pick, barring major changes from the recent past.

In the next article we will consider impacts of future sea level rise in various regions.

Articles in this Series

Impacts – I – Introduction

Impacts – II – GHG Emissions Projections: SRES and RCP

Impacts – III – Population in 2100

Impacts – IV – Temperature Projections and Probabilities

Impacts – V – Climate change is already causing worsening storms, floods and droughts


Observations: Oceanic Climate Change and Sea Level. In: Climate Change 2007: The Physical Science Basis. Contribution of Working Group I to the Fourth Assessment Report of the Intergovernmental Panel on Climate Change, NL Bindoff et al (2007)

Observations: Ocean. In: Climate Change 2013: The Physical Science Basis. Contribution of Working Group I to the Fifth Assessment Report of the Intergovernmental Panel on Climate Change, M Rhein et al (2013)

Sea Level Change. In: Climate Change 2013: The Physical Science Basis. Contribution of Working Group I to the Fifth Assessment Report of the Intergovernmental Panel on Climate Change, JA Church et al (2013)

I generally try to avoid the media as much as possible (although the 2016 Circus did suck me in) but it’s still impossible to miss claims like the following:

Climate change is already causing worsening storms, floods and droughts

Before looking at predictions for the future I thought it was worth reviewing this claim, seeing as it is so prevalent and is presented as being the current consensus of climate science.


Droughts

SREX 2012, p. 171:

There is medium confidence that since the 1950s some regions of the world have experienced more intense and longer droughts (e.g., southern Europe, west Africa) but also opposite trends exist in other regions (e.g., central North America, northwestern Australia).

The report cites Sheffield and Wood 2008 who show graphs on a variety of drought metrics from around the world over the last 50 years – click to enlarge:

From Sheffield & Wood 2008

Figure 1 – Click to enlarge

The results above were calculated from models based on available meteorological data. According to their analysis some places have experienced more droughts, and other places fewer. Because the results are based on models, we can expect that other researchers may produce different results.

AR5, published a year after SREX, says, chapter 2, p. 214-215:

Because drought is a complex variable and can at best be incompletely represented by commonly used drought indices, discrepancies in the interpretation of changes can result. For example, Sheffield and Wood (2008) found decreasing trends in the duration, intensity and severity of drought globally. Conversely, Dai (2011a,b) found a general global increase in drought, although with substantial regional variation and individual events dominating trend signatures in some regions (e.g., the 1970s prolonged Sahel drought and the 1930s drought in the USA and Canadian Prairies). Studies subsequent to these continue to provide somewhat different conclusions on trends in global droughts and/or dryness since the middle of the 20th century (Sheffield et al., 2012; Dai, 2013; Donat et al., 2013c; van der Schrier et al., 2013)..

..In summary, the current assessment concludes that there is not enough evidence at present to suggest more than low confidence in a global-scale observed trend in drought or dryness (lack of rainfall) since the middle of the 20th century, owing to lack of direct observations, geographical inconsistencies in the trends, and dependencies of inferred trends on the index choice.

Based on updated studies, AR4 conclusions regarding global increasing trends in drought since the 1970s were probably overstated.

The paper by Dai is Drought under global warming: a review, A Dai, Climate Change (2011) – for some reason I am unable to access it.

A later paper in Nature, Trenberth et al 2013 (including both Sheffield and Dai as co-authors) said:

Two recent papers looked at the question of whether large-scale drought has been increasing under climate change. A study in Nature by Sheffield et al entitled ‘Little change in global drought over the past 60 years’ was published at almost the same time that ‘Increasing drought under global warming in observations and models’ by Dai appeared in Nature Climate Change (published online in August 2012). How can two research groups arrive at such seemingly contradictory conclusions?

Another later paper on droughts, Orlowsky & Seneviratne 2013, likewise shows overwhelming evidence of more droughts – click to enlarge:

From Orlowsky & Seneviratne 2013

Figure 2 – Click to enlarge


Floods

SREX 2012, p. 177:

Overall, there is low confidence (due to limited evidence) that anthropogenic climate change has affected the magnitude and frequency of floods, though it has detectably influenced several components of the hydrological cycle, such as precipitation and snowmelt, that may impact flood trends. The assessment of causes behind the changes in floods is inherently complex and difficult.

AR5, Chapter 2, p. 214:

AR5 WGII assesses floods in regional detail accounting for the fact that trends in floods are strongly influenced by changes in river management (see also Section 2.5.2). Although the most evident flood trends appear to be in northern high latitudes, where observed warming trends have been largest, in some regions no evidence of a trend in extreme flooding has been found, for example, over Russia based on daily river discharge (Shiklomanov et al., 2007).

Other studies for Europe (Hannaford and Marsh, 2008; Renard et al., 2008; Petrow and Merz, 2009; Stahl et al., 2010) and Asia (Jiang et al., 2008; Delgado et al., 2010) show evidence for upward, downward or no trend in the magnitude and frequency of floods, so that there is currently no clear and widespread evidence for observed changes in flooding except for the earlier spring flow in snow-dominated regions (Seneviratne et al., 2012).

In summary, there continues to be a lack of evidence and thus low confidence regarding the sign or trend in the magnitude and/or frequency of floods on a global scale.

[Note: the text in the bottom line cited says: “..regarding the sign of trend in the magnitude..” which I assume is a typo, and so I changed of into or]


Storms

SREX, p. 159:

Detection of trends in tropical cyclone metrics such as frequency, intensity, and duration remains a significant challenge..

..Natural variability combined with uncertainties in the historical data makes it difficult to detect trends in tropical cyclone activity. There have been no significant trends observed in global tropical cyclone frequency records, including over the present 40-year period of satellite observations (e.g., Webster et al., 2005). Regional trends in tropical cyclone frequency have been identified in the North Atlantic, but the fidelity of these trends is debated (Holland and Webster, 2007; Landsea, 2007; Mann et al., 2007a). Different methods for estimating undercounts in the earlier part of the North Atlantic tropical cyclone record provide mixed conclusions (Chang and Guo, 2007; Mann et al., 2007b; Kunkel et al., 2008; Vecchi and Knutson, 2008).

Regional trends have not been detected in other oceans (Chan and Xu, 2009; Kubota and Chan, 2009; Callaghan and Power, 2011). It thus remains uncertain whether any observed increases in tropical cyclone frequency on time scales longer than about 40 years are robust, after accounting for past changes in observing capabilities (Knutson et al., 2010)..

..Time series of power dissipation, an aggregate compound of tropical cyclone frequency, duration, and intensity that measures total energy consumption by tropical cyclones, show upward trends in the North Atlantic and weaker upward trends in the western North Pacific over the past 25 years (Emanuel, 2007), but interpretation of longer-term trends in this quantity is again constrained by data quality concerns.

The variability and trend of power dissipation can be related to SST and other local factors such as tropopause temperature and vertical wind shear (Emanuel, 2007), but it is a current topic of debate whether local SST or the difference between local SST and mean tropical SST is the more physically relevant metric (Swanson, 2008).

The distinction is an important one when making projections of changes in power dissipation based on projections of SST changes, particularly in the tropical Atlantic where SST has been increasing more rapidly than in the tropics as a whole (Vecchi et al., 2008). Accumulated cyclone energy, which is an integrated metric analogous to power dissipation, has been declining globally since reaching a high point in 2005, and is presently at a 40- year low point (Maue, 2009). The present period of quiescence, as well as the period of heightened activity leading up to the high point in 2005, does not clearly represent substantial departures from past variability (Maue, 2009)..

..The present assessment regarding observed trends in tropical cyclone activity is essentially identical to the WMO assessment (Knutson et al., 2010): there is low confidence that any observed long-term (i.e., 40 years or more) increases in tropical cyclone activity are robust, after accounting for past changes in observing capabilities.

AR5, Chapter 2, p. 216:

AR4 concluded that it was likely that an increasing trend had occurred in intense tropical cyclone activity since 1970 in some regions but that there was no clear trend in the annual numbers of tropical cyclones. Subsequent assessments, including SREX and more recent literature indicate that it is difficult to draw firm conclusions with respect to the confidence levels associated with observed trends prior to the satellite era and in ocean basins outside of the North Atlantic.

Lots more tropical storms:

From AR5, wg I

Figure 3

Note that a more important metric than “how many?” is “how severe?” or a combination of both.

And for extra-tropical storms (i.e. outside the tropics), SREX p. 166:

In summary it is likely that there has been a poleward shift in the main Northern and Southern Hemisphere extratropical storm tracks during the last 50 years. There is medium confidence in an anthropogenic influence on this observed poleward shift. It has not formally been attributed.

There is low confidence in past changes in regional intensity.

And AR5, chapter 2, p. 217 & 220:

Some studies show an increase in intensity and number of extreme Atlantic cyclones (Paciorek et al., 2002; Lehmann et al., 2011) while others show opposite trends in eastern Pacific and North America (Gulev et al., 2001). Comparisons between studies are hampered because of the sensitivities in identification schemes and/or different definitions for extreme cyclones (Ulbrich et al., 2009; Neu et al., 2012). The fidelity of research findings also rests largely with the underlying reanalyses products that are used..

..In summary, confidence in large scale changes in the intensity of extreme extratropical cyclones since 1900 is low. There is also low confidence for a clear trend in storminess proxies over the last century due to inconsistencies between studies or lack of long-term data in some parts of the world (particularly in the SH). Likewise, confidence in trends in extreme winds is low, owing to quality and consistency issues with analysed data.


Conclusion

The IPCC SREX and AR5 reports were published in 2012 and 2013 respectively. There will be new research published since these reports analyzing the same data and possibly reaching different conclusions. When you have large decadal variability in poorly observed data, with a small or non-existent trend, different groups will inevitably reach different conclusions. And if you focus on specific regions you can demonstrate a clear and unmistakable trend.

If you are looking for a soundbite just pick the right region.

The last 100 years have seen global warming. As this blog has made clear from the physics, more GHGs (all other things remaining equal) result in more warming. What proportion of the last 100 years’ warming is intrinsic climate variability versus anthropogenic GHGs, I have no idea.

The last century has seen no clear globally averaged change in floods, droughts or storms – as best as we can tell with very incomplete observing systems. Of course, some regions have definitely seen more, and some regions have definitely seen less. Whether this is different from the period from 1800-1900 or from 1700-1800 no one knows. Perhaps floods, droughts and tropical storms increased globally from 1700-1900. Perhaps they decreased. Perhaps the last 100 years have seen more variability. Perhaps not. (And in recognition of Poe’s law, I note that a few statements within the article presenting graphs did say the opposite of the graphs presented).

Articles in this Series

Impacts – I – Introduction

Impacts – II – GHG Emissions Projections: SRES and RCP

Impacts – III – Population in 2100

Impacts – IV – Temperature Projections and Probabilities


SREX = Managing the Risks of Extreme Events and Disasters to Advance Climate Change Adaptation Special Report, IPCC (2012)

Observations: Atmosphere and Surface. Chapter 2 of Working Group I to AR5, DL Hartmann et al (2013)

Global Trends and Variability in Soil Moisture and Drought Characteristics, 1950–2000, from Observation-Driven Simulations of the Terrestrial Hydrologic Cycle, Justin Sheffield & Eric Wood, Journal of Climate (2008) – free paper

Global warming and changes in drought, Kevin E Trenberth et al, Nature (2013) – free paper

Elusive drought: uncertainty in observed trends and short- and long-term CMIP5 projections, B Orlowsky & SI Seneviratne, Hydrology and Earth System Sciences (2013) – free paper

In Impacts – II – GHG Emissions Projections: SRES and RCP we looked at projections of emissions under various scenarios with the resulting CO2 (and other GHG) concentrations and resulting radiative forcing.

Why do we need these scenarios? Because even if climate models were perfect and could accurately calculate the temperature 100 years from now, we wouldn’t know how much “anthropogenic CO2” (and other GHGs) would have been emitted by that time. The scenarios allow climate modelers to produce temperature (and other climate variable) projections on the basis of each of these scenarios.

The IPCC AR5 (fifth assessment report) from 2013 says (chapter 12, p. 1031):

Global mean temperatures will continue to rise over the 21st century if greenhouse gas (GHG) emissions continue unabated.

Under the assumptions of the concentration-driven RCPs, global mean surface temperatures for 2081–2100, relative to 1986–2005 will likely be in the 5 to 95% range of the CMIP5 models:

  • 0.3°C to 1.7°C (RCP2.6)
  • 1.1°C to 2.6°C (RCP4.5)
  • 1.4°C to 3.1°C (RCP6.0)
  • 2.6°C to 4.8°C (RCP8.5)

Global temperatures averaged over the period 2081–2100 are projected to likely exceed 1.5°C above 1850-1900 for RCP4.5, RCP6.0 and RCP8.5 (high confidence), are likely to exceed 2°C above 1850-1900 for RCP6.0 and RCP8.5 (high confidence) and are more likely than not to exceed 2°C for RCP4.5 (medium confidence). Temperature change above 2°C under RCP2.6 is unlikely (medium confidence). Warming above 4°C by 2081–2100 is unlikely in all RCPs (high confidence) except for RCP8.5, where it is about as likely as not (medium confidence).

I commented in Part II that RCP8.5 seemed to be a scenario that didn’t match up with the last 40-50 years of development. Of course, the various scenario developers give their caveats, for example, Riahi et al 2007:

Given the large number of variables and their interdependencies, we are of the opinion that it is impossible to assign objective likelihoods or probabilities to emissions scenarios. We have also not attempted to assign any subjective likelihoods to the scenarios either. The purpose of the scenarios presented in this Special Issue is, rather, to span the range of uncertainty without an assessment of likely, preferable, or desirable future developments..

Readers should exercise their own judgment on the plausibility of above scenario ‘storylines’..

To me RCP6.0 seems a more likely future (compared with RCP8.5) in a world that doesn’t have any significant attempt to tackle CO2 emissions. That is, no major change in climate policy to today’s world, but similar economic and population development (note 1).

Here is the graph of projected temperature anomalies for the different scenarios:

From AR5, chapter 12

Figure 1

That graph is hard to make out for 2100, here is the table of corresponding data. I highlighted RCP6.0 in 2100 – you can click to enlarge the table:


Figure 2 – Click to expand

Probabilities and Lists

The table above gives a “1 std deviation” range and a 5%–95% range. The graph (which has the same source data) uses shading to indicate the 5%–95% spread of models for each RCP scenario.

These have no relation to real probability distributions. That is, the 5–95% range for RCP6.0 doesn’t equate to: “there is a 90% probability that the average temperature over 2081–2100 will be 1.4–3.1ºC higher than the 1986–2005 average”.

A number of climate models are used to produce simulations and the results from these “ensembles” are sometimes pressed into “probability service”. For some concept background on ensembles read Ensemble Forecasting.

Here is IPCC AR5 chapter 12:

Ensembles like CMIP5 do not represent a systematically sampled family of models but rely on self-selection by the modelling groups.

This opportunistic nature of MMEs [multi-model ensembles] has been discussed, for example, in Tebaldi and Knutti (2007) and Knutti et al. (2010a). These ensembles are therefore not designed to explore uncertainty in a coordinated manner, and the range of their results cannot be straightforwardly interpreted as an exhaustive range of plausible outcomes, even if some studies have shown how they appear to behave as well calibrated probabilistic forecasts for some large-scale quantities. Other studies have argued instead that the tail of distributions is by construction undersampled.

In general, the difficulty in producing quantitative estimates of uncertainty based on multiple model output originates in their peculiarities as a statistical sample, neither random nor systematic, with possible dependencies among the members and of spurious nature, that is, often counting among their members models with different degrees of complexities (different number of processes explicitly represented or parameterized) even within the category of general circulation models..

..In summary, there does not exist at present a single agreed on and robust formal methodology to deliver uncertainty quantification estimates of future changes in all climate variables. As a consequence, in this chapter, statements using the calibrated uncertainty language are a result of the expert judgement of the authors, combining assessed literature results with an evaluation of models demonstrated ability (or lack thereof) in simulating the relevant processes (see Chapter 9) and model consensus (or lack thereof) over future projections. In some cases when a significant relation is detected between model performance and reliability of its future projections, some models (or a particular parametric configuration) may be excluded but in general it remains an open research question to find significant connections of this kind that justify some form of weighting across the ensemble of models and produce aggregated future projections that are significantly different from straightforward one model–one vote ensemble results. Therefore, most of the analyses performed for this chapter make use of all available models in the ensembles, with equal weight given to each of them unless otherwise stated.
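The “one model – one vote” aggregation described in the quote is easy to state concretely. Here is a minimal sketch with invented warming values (placeholders for illustration, not real CMIP5 output) showing how an ensemble mean, standard deviation and 5–95% model range are computed:

```python
import statistics

# Hypothetical end-of-century warming (degrees C) from 10 models under one
# RCP scenario. These are invented placeholder values, NOT real CMIP5 results.
warming = [1.4, 1.7, 1.9, 2.0, 2.1, 2.2, 2.4, 2.6, 2.9, 3.1]

mean = statistics.mean(warming)   # equal weight: one model, one vote
std = statistics.stdev(warming)   # the "1 std deviation" spread

def percentile(values, p):
    """Percentile of the model spread, with linear interpolation
    between sorted data points."""
    s = sorted(values)
    k = (len(s) - 1) * p / 100.0
    lo, hi = int(k), min(int(k) + 1, len(s) - 1)
    return s[lo] + (s[hi] - s[lo]) * (k - lo)

p5, p95 = percentile(warming, 5), percentile(warming, 95)
print(f"mean={mean:.2f}, std={std:.2f}, 5-95% range={p5:.2f}-{p95:.2f}")
```

The caveat in the quote is precisely that this 5–95% spread describes the particular models that happened to be submitted, not a 90% probability interval for the real climate.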

And from one of the papers cited in that section of chapter 12, Jackson et al 2008:

In global climate models (GCMs), unresolved physical processes are included through simplified representations referred to as parameterizations.

Parameterizations typically contain one or more adjustable phenomenological parameters. Parameter values can be estimated directly from theory or observations or by “tuning” the models by comparing model simulations to the climate record. Because of the large number of parameters in comprehensive GCMs, a thorough tuning effort that includes interactions between multiple parameters can be very computationally expensive. Models may have compensating errors, where errors in one parameterization compensate for errors in other parameterizations to produce a realistic climate simulation (Wang 2007; Golaz et al. 2007; Min et al. 2007; Murphy et al. 2007).

The risk is that, when moving to a new climate regime (e.g., increased greenhouse gases), the errors may no longer compensate. This leads to uncertainty in climate change predictions. The known range of uncertainty of many parameters allows a wide variance of the resulting simulated climate (Murphy et al. 2004; Stainforth et al. 2005; M. Collins et al. 2006). The persistent scatter in the sensitivities of models from different modeling groups, despite the effort represented by the approximately four generations of modeling improvements, suggests that uncertainty in climate prediction may depend on underconstrained details and that we should not expect convergence anytime soon.

Stainforth et al 2005 (referenced in the quote above) tried much larger ensembles of coarser resolution climate models, and was discussed in the comments of Models, On – and Off – the Catwalk – Part Four – Tuning & the Magic Behind the Scenes. Rowlands et al 2012 is similar in approach and was discussed in Natural Variability and Chaos – Five – Why Should Observations match Models?

The way I read the IPCC reports and various papers, the projections are clearly not a probability distribution. Yet the data inevitably gets used as a de facto probability distribution.


“All models are wrong but some are useful” as George Box said, actually in a quite unrelated field (i.e., not climate). But it’s a good saying.

Many people who describe themselves as “lukewarmers” believe that climate sensitivity as characterized by the IPCC is too high and the real climate has a lower sensitivity. I have no idea.

Models may be wrong, but I don’t have an alternative model to provide. And therefore, given that they represent climate better than any current alternative, their results are useful.

We can’t currently create a real probability distribution from a set of temperature prediction results (assuming a given emissions scenario).

How useful is it to know that under a scenario like RCP6.0 the average global temperature increase in 2100 has been simulated as variously 1ºC, 2ºC, 3ºC, 4ºC? (note, I haven’t checked the CMIP5 simulations to get each value). And the tropics will vary less, land more? As we dig into more details we will attempt to look at how reliable regional and seasonal temperature anomalies might be compared with the overall number. Likewise rainfall and other important climate values.

I do find it useful to keep the idea of a set of possible numbers with no probability assigned. Then at some stage we can say something like, “if this RCP scenario turns out to be correct and the global average surface temperature actually increases by 3ºC by 2100, we know the following are reasonable assumptions … but we currently can’t make any predictions about these other values..”


Long-term Climate Change: Projections, Commitments and Irreversibility, M Collins et al (2013) – In: Climate Change 2013: The Physical Science Basis. Contribution of Working Group I to the Fifth Assessment Report of the Intergovernmental Panel on Climate Change

Scenarios of long-term socio-economic and environmental development under climate stabilization, Keywan Riahi et al, Technological Forecasting & Social Change (2007) – free paper

Error Reduction and Convergence in Climate Prediction, Charles S Jackson et al, Journal of Climate (2008) – free paper


Note 1: As explored a little in the last article, RCP6.0 does include some changes to climate policy but it seems they are not major. I believe a very useful scenario for exploring impact assessments would be the population and development path of RCP6.0 (let’s call it RCP6.0A) without any climate policies.

For reasons of “scenario parsimony” this interesting pathway escapes attention.

In Part II we looked at various scenarios for emissions. One important determinant is how the world population will change through this century, so I thought it worth digging into that topic a little.

Here is Lutz, Sanderson & Scherbov, Nature (2001):

The median value of our projections reaches a peak around 2070 at 9.0 billion people and then slowly decreases. In 2100, the median value of our projections is 8.4 billion people with the 80 per cent prediction interval bounded by 5.6 and 12.1 billion.

From Lutz 2001

Figure 1 – Click to enlarge

This paper is behind a paywall but Lutz references the 1996 book he edited for assumptions, which is freely available (link below).

In it the authors comment, p. 22:

Some users clearly want population figures for the year 2100 and beyond. Should the demographer disappoint such expectations and leave it to others with less expertise to produce them? The answer given in this study is no. But as discussed below, we make a clear distinction between what we call projections up to 2030-2050 and everything beyond that time, which we term extensions for illustrative purposes.

[Emphasis added]

And then p.32:

Sanderson (1995) shows that it is impossible to produce “objective” confidence ranges for future population projections. Subjective confidence intervals are the best we can ever attain because assumptions are always involved.

Here are some more recent views.

Gerland et al 2014 – Gerland is from the Population Division of the UN:

The United Nations recently released population projections based on data until 2012 and a Bayesian probabilistic methodology. Analysis of these data reveals that, contrary to previous literature, world population is unlikely to stop growing this century. There is an 80% probability that world population, now 7.2 billion, will increase to between 9.6 and 12.3 billion in 2100. This uncertainty is much smaller than the range from the traditional UN high and low variants. Much of the increase is expected to happen in Africa, in part due to higher fertility and a recent slowdown in the pace of fertility decline..

..Among the most robust empirical findings in the literature on fertility transitions are that higher contraceptive use and higher female education are associated with faster fertility decline. These suggest that the projected rapid population growth could be moderated by greater investments in family planning programs to satisfy the unmet need for contraception, and in girls’ education. It should be noted, however, that the UN projections are based on an implicit assumption of a continuation of existing policies, but an intensification of current investments would be required for faster changes to occur

Wolfgang Lutz & Samir KC (2010). Lutz seems popular in this field:

The total size of the world population is likely to increase from its current 7 billion to 8–10 billion by 2050. This uncertainty is because of unknown future fertility and mortality trends in different parts of the world. But the young age structure of the population and the fact that in much of Africa and Western Asia, fertility is still very high makes an increase by at least one more billion almost certain. Virtually, all the increase will happen in the developing world. For the second half of the century, population stabilization and the onset of a decline are likely..

Although the paper focuses on the period up to 2050 rather than 2100, it does include a graph of probabilistic projections to 2100, and has some interesting commentary on how different forecasting groups deal with uncertainty, the huge role women’s education plays in reducing fertility, and many other topics, for example:

The Demographic and Health Survey for Ethiopia, for instance, shows that women without any formal education have on average six children, whereas those with secondary education have only two (see http://www.measuredhs.com). Significant differentials can be found in most populations of all cultural traditions. Only in a few modern societies does the strongly negative association give way to a U-shaped pattern in which the most educated women have a somewhat higher fertility than those with intermediate education. But globally, the education differentials are so pervasive that education may well be called the single most important observable source of population heterogeneity after age and sex (Lutz et al. 1999). There are good reasons to assume that during the course of a demographic transition the fact that higher education leads to lower fertility is a true causal mechanism, where education facilitates better access to and information about family planning and most importantly leads to a change in attitude in which ‘quantity’ of children is replaced by ‘quality’, i.e. couples want to have fewer children with better life chances..

Lee 2011, another very interesting paper, makes this comment:

The U.N. projections assume that fertility will slowly converge toward replacement level (2.1 births per woman) by 2100

Lutz’s book had a similar hint that many demographers assume societies en masse will somehow converge towards a steady state. Lee also comments that deriving probabilities from “low”, “medium” and “high” scenarios is not very realistic, because the scenario method implicitly treats forecast errors as perfectly correlated between countries, which isn’t true in practice. Lutz makes similar points. Here is Lee:

Special issues arise in constructing consistent probability intervals for individual countries, for regions, and for the world, because account must be taken of the positive or negative correlations among the country forecast errors within regions and across regions. Since error correlation is typically positive but less than 1.0, country errors tend to cancel under aggregation, and the proportional error bounds for the world population are far narrower than for individual countries. The NRC study (20) found that the average absolute country error was 21% while the average global error was only 3%. When the High, Medium and Low scenario approach is used, there is no cancellation of error under aggregation, so the probability coverage at different levels of aggregation cannot be handled consistently. An ongoing research collaboration between the U.N. Population Division and a team led by Raftery is developing new and very promising statistical methods for handling uncertainty in future forecasts.
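Lee’s point about error cancellation under aggregation can be illustrated with a toy Monte Carlo simulation. All the numbers below (country count, the 21% proportional error, a small error correlation) are invented for illustration, chosen so the output lands near the NRC figures he quotes:

```python
import random
import statistics

random.seed(42)

def simulate(n_countries=100, n_trials=2000, sigma=0.21, rho=0.02):
    """Toy model: each country's forecast has a proportional error with
    standard deviation `sigma`; errors share a common component so that
    the pairwise correlation is roughly `rho`. Returns (average absolute
    country error, average absolute world error) as proportions."""
    common_sd = sigma * rho ** 0.5        # shared component of the error
    own_sd = sigma * (1 - rho) ** 0.5     # idiosyncratic component
    country_errs, world_errs = [], []
    for _ in range(n_trials):
        common = random.gauss(0, common_sd)
        errs = [common + random.gauss(0, own_sd) for _ in range(n_countries)]
        country_errs.append(statistics.mean(abs(e) for e in errs))
        world_errs.append(abs(statistics.mean(errs)))  # equal-size countries
    return statistics.mean(country_errs), statistics.mean(world_errs)

country_err, world_err = simulate()
print(f"avg country error {country_err:.1%}, world error {world_err:.1%}")
```

With correlation well below 1, country errors largely cancel in the world total. If the errors were perfectly correlated – as the High/Medium/Low scenario approach implicitly assumes – the world error would be just as large, proportionally, as the country errors.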

And then on UN projections:

One might quibble with this or that assumption, but the UN projections have had an impressive record of success in the past, particularly at the global level, and I expect that to continue in the future. To a remarkable degree, the UN has sought out expert advice and experimented with cutting edge forecasting techniques, while maintaining consistency in projections. But in forecasting, errors are inevitable, and sound decision making requires that the likelihood of errors be taken into account. In this respect, there is much room for improvement in the UN projections and indeed in all projections by government statistical offices.

This comment looks like an oblique academic gentle slapping around (disguised as praise), but it’s hard to tell.


I don’t have a conclusion. I thought it would be interesting to find some demographic experts and show their views on future population trends. The future is always hard to predict – although in demography the next 20 years are usually fairly predictable, short of global plagues and famines.

It does seem hard to have much idea about the population in 2100, but the difference between a population of 8bn and 11bn will have a large impact on CO2 emissions (without very significant CO2 mitigation policies).


The end of world population growth, Wolfgang Lutz, Warren Sanderson & Sergei Scherbov, Nature (2001) – paywall paper

The future population of the world – what can we assume?, edited Wolfgang Lutz, Earthscan Publications (1996) – freely available book

World Population Stabilization Unlikely This Century, Patrick Gerland et al, Science (2014) – free paper

Dimensions of global population projections: what do we know about future population trends and structures? Wolfgang Lutz & Samir KC, Phil. Trans. R. Soc. B (2010)

The Outlook for Population Growth, Ronald Lee, Science (2011) – free paper

In one of the iconic climate model tests, CO2 is doubled from a pre-industrial level of 280ppm to 560ppm “overnight” and we find the new steady state surface temperature. The change in CO2 is an input to the climate model, also known as a “forcing” because it is from outside. That is, humans create more CO2 from generating electricity, driving automobiles and other activities – this affects the climate and the climate responds.
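For reference, the forcing produced by that doubling is commonly approximated with the logarithmic fit of Myhre et al. (1998) – a simplified curve fit, not the full radiative calculation done inside a climate model:

```python
import math

def co2_forcing(c_ppm, c0_ppm=280.0):
    """Simplified radiative forcing from CO2 (W/m^2), using the
    logarithmic fit of Myhre et al. 1998: dF = 5.35 * ln(C/C0)."""
    return 5.35 * math.log(c_ppm / c0_ppm)

# Doubling pre-industrial CO2: 280 -> 560 ppm
print(f"{co2_forcing(560):.2f} W/m^2")  # ~3.7 W/m^2 per doubling
```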

These experiments with simple climate models were first done with 1d radiative-convective models in the 1960s. For example, Manabe & Wetherald (1967) found a 2.3ºC surface temperature increase with constant relative humidity and 1.3ºC with constant absolute humidity (and for many reasons constant relative humidity seems the more realistic assumption of the two).
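Why does the humidity assumption matter so much? The Clausius–Clapeyron relation implies saturation vapor pressure rises roughly 6–7% per degree of warming, so holding relative humidity constant means the absolute amount of water vapor – itself a greenhouse gas – increases with temperature, amplifying the warming. A back-of-envelope sketch (constants approximate; an illustration, not the paper’s calculation):

```python
import math

L = 2.5e6    # latent heat of vaporization of water, J/kg (approximate)
RV = 461.5   # specific gas constant for water vapor, J/(kg K)

def sat_vapor_pressure(t_kelvin):
    """Clausius-Clapeyron saturation vapor pressure (Pa), referenced
    to ~611 Pa at 273.15 K; adequate for a rough estimate."""
    return 611.0 * math.exp((L / RV) * (1.0 / 273.15 - 1.0 / t_kelvin))

t = 288.0  # roughly the global mean surface temperature, K
increase = sat_vapor_pressure(t + 1.0) / sat_vapor_pressure(t) - 1.0
print(f"~{increase:.1%} more water vapor per K at constant relative humidity")
```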

In other experiments, especially more recently, more complex GCMs simulate 100 years with the CO2 concentration gradually increased, in line with projections of future emissions – and we see what happens to temperature over time.

There are also other GHGs (“greenhouse” gases / radiatively-active gases) in the atmosphere that are changing due to human activity – especially methane (CH4) and nitrous oxide (N2O). And of course, the most important GHG is water vapor, but changes in water vapor concentration are a climate feedback – that is, changes in water vapor result from temperature (and circulation) changes.

And there are aerosols, some internally generated within the climate and others emitted by human activity. These also affect the climate in a number of ways.

We don’t know what future anthropogenic emissions will be. What will humans do? Build lots more coal-fired power stations to meet future energy demand? Run the entire world’s power grid from wind and solar by 2040? Finally invent practical nuclear fusion? How many people will there be?

So for this we need some scenarios of future human activity (note 1).

Scenarios – SRES and RCP

SRES was published in 2000:

In response to a 1994 evaluation of the earlier IPCC IS92 emissions scenarios, the 1996 Plenary of the IPCC requested this Special Report on Emissions Scenarios (SRES) (see Appendix I for the Terms of Reference). This report was accepted by the Working Group III (WGIII) plenary session in March 2000. The long-term nature and uncertainty of climate change and its driving forces require scenarios that extend to the end of the 21st century. This Report describes the new scenarios and how they were developed.

The SRES scenarios cover a wide range of the main driving forces of future emissions, from demographic to technological and economic developments. As required by the Terms of Reference, none of the scenarios in the set includes any future policies that explicitly address climate change, although all scenarios necessarily encompass various policies of other types.

The set of SRES emissions scenarios is based on an extensive assessment of the literature, six alternative modeling approaches, and an “open process” that solicited wide participation and feedback from many groups and individuals. The SRES scenarios include the range of emissions of all relevant species of greenhouse gases (GHGs) and sulfur and their driving forces..

..A set of scenarios was developed to represent the range of driving forces and emissions in the scenario literature so as to reflect current understanding and knowledge about underlying uncertainties. They exclude only outlying “surprise” or “disaster” scenarios in the literature. Any scenario necessarily includes subjective elements and is open to various interpretations. Preferences for the scenarios presented here vary among users. No judgment is offered in this Report as to the preference for any of the scenarios and they are not assigned probabilities of occurrence, neither must they be interpreted as policy recommendations..

..By 2100 the world will have changed in ways that are difficult to imagine – as difficult as it would have been at the end of the 19th century to imagine the changes of the 100 years since. Each storyline assumes a distinctly different direction for future developments, such that the four storylines differ in increasingly irreversible ways. Together they describe divergent futures that encompass a significant portion of the underlying uncertainties in the main driving forces. They cover a wide range of key “future” characteristics such as demographic change, economic development, and technological change. For this reason, their plausibility or feasibility should not be considered solely on the basis of an extrapolation of current economic, technological, and social trends.

The RCPs were in part a new version of the same idea as SRES, published in 2011. My understanding is that the Representative Concentration Pathways were constructed around target values of radiative forcing in 2100 drawn from the modeling literature – you can see this in the name of each RCP.

from A special issue on the RCPs, van Vuuren et al (2011)

By design, the RCPs, as a set, cover the range of radiative forcing levels examined in the open literature and contain relevant information for climate model runs.

[Emphasis added]

From The representative concentration pathways: an overview, van Vuuren et al (2011)

This paper summarizes the development process and main characteristics of the Representative Concentration Pathways (RCPs), a set of four new pathways developed for the climate modeling community as a basis for long-term and near-term modeling experiments.

The four RCPs together span the range of year 2100 radiative forcing values found in the open literature, i.e. from 2.6 to 8.5 W/m². The RCPs are the product of an innovative collaboration between integrated assessment modelers, climate modelers, terrestrial ecosystem modelers and emission inventory experts. The resulting product forms a comprehensive data set with high spatial and sectoral resolutions for the period extending to 2100..

..The RCPs are named according to radiative forcing target level for 2100. The radiative forcing estimates are based on the forcing of greenhouse gases and other forcing agents. The four selected RCPs were considered to be representative of the literature, and included one mitigation scenario leading to a very low forcing level (RCP2.6), two medium stabilization scenarios (RCP4.5/RCP6) and one very high baseline emission scenarios (RCP8.5).

Here are some graphs from the RCP introduction paper:

Population and GDP scenarios:


Figure 1 – Click to expand

I was surprised by the population graph for RCP 8.5 and 6 (similar scenarios are generated in SRES). From reading various sources (but not diving into any detailed literature) I understood that the consensus was for population to peak mid-century at around 9bn people and then reduce back to something like 7-8bn people by the end of the century. This is because all countries that have experienced rising incomes have significantly reduced average fertility rates.

Here is Angus Deaton, in his fascinating and accessible book for people interested in The Great Escape as he calls it (that is, our escape from poverty and poor health):

In Africa in 1950, each woman could expect to give birth to 6.6 children; by 2000, that number had fallen to 5.1, and the UN estimates that it is 4.4 today. In Asia as well as in Latin America and the Caribbean, the decline has been even larger, from 6 children to just over 2..

The annual rate of growth of the world’s population, which reached 2.2% in 1960, was only half of that in 2011.

The GDP graph on the right (above) lacks a definition. From the other papers covering the scenarios I understand it to be total world GDP in US$ trillions (at 2000 values, i.e. adjusted for inflation), although the numbers don’t seem to align exactly.

Energy consumption for the different scenarios:

Figure 2 – Click to expand

Annual emissions:

Figure 3 – Click to expand

Resulting concentrations in the atmosphere for CO2, CH4 (methane) and N2O (nitrous oxide):


Figure 4 – Click to expand

Radiative forcing (for explanation of this term, see for example Wonderland, Radiative Forcing and the Rate of Inflation):


Figure 5  – Click to expand

We can see from this figure (fig 5, their fig 10) that the RCP numbers refer to the expected radiative forcing in 2100 – so RCP8.5, often described as the “business as usual” scenario, has a radiative forcing in 2100, compared to pre-industrial values, of 8.5 W/m², and RCP6 has a radiative forcing in 2100 of 6 W/m².

We can also see from the figure on the right that increases in CO2 cause almost all of the increase from current values. For example, only RCP8.5 has a higher methane (CH4) forcing than today.

Business as usual – RCP 8.5 or RCP 6?

I’ve seen RCP8.5 described as “business as usual” but it seems quite an unlikely scenario. Perhaps we need to dive into this scenario more in another article. In the meantime, part of the description from Riahi et al (2011):

The scenario’s storyline describes a heterogeneous world with continuously increasing global population, resulting in a global population of 12 billion by 2100. Per capita income growth is slow and both internationally as well as regionally there is only little convergence between high and low income countries. Global GDP reaches around 250 trillion US2005$ in 2100.

The slow economic development also implies little progress in terms of efficiency. Combined with the high population growth, this leads to high energy demands. Still, international trade in energy and technology is limited and overall rates of technological progress is modest. The inherent emphasis on greater self-sufficiency of individual countries and regions assumed in the scenario implies a reliance on domestically available resources. Resource availability is not necessarily a constraint but easily accessible conventional oil and gas become relatively scarce in comparison to more difficult to harvest unconventional fuels like tar sands or oil shale.

Given the overall slow rate of technological improvements in low-carbon technologies, the future energy system moves toward coal-intensive technology choices with high GHG emissions. Environmental concerns in the A2 world are locally strong, especially in high and medium income regions. Food security is also a major concern, especially in low-income regions and agricultural productivity increases to feed a steadily increasing population.

Compared to the broader integrated assessment literature, the RCP8.5 represents thus a scenario with high global population and intermediate development in terms of total GDP (Fig. 4).

Per capita income, however, stays at comparatively low levels of about 20,000 US $2005 in the long term (2100), which is considerably below the median of the scenario literature. Another important characteristic of the RCP8.5 scenario is its relatively slow improvement in primary energy intensity of 0.5% per year over the course of the century. This trend reflects the storyline assumption of slow technological change. Energy intensity improvement rates are thus well below historical average (about 1% per year between 1940 and 2000). Compared to the scenario literature RCP8.5 depicts thus a relatively conservative business as usual case with low income, high population and high energy demand due to only modest improvements in energy intensity.

When I heard the term “business as usual” I’m sure I wasn’t alone in understanding it like this: the world carries on without adopting serious CO2 limiting policies. That is, no international agreements on CO2 reductions, no carbon pricing, etc. And the world continues on its current trajectory of growth and development. When you look at the last 40 years, it has been quite amazing. Why would growth slow, population not follow the pathway it has followed in all countries that have seen rising prosperity, and why would technological innovation and adoption slow? It would be interesting to see a “business as usual” scenario for emissions, CO2 concentrations and radiative forcing that had a better fit to the name.

RCP 6 seems to be a closer fit than RCP 8.5 to the name “business as usual”.

RCP6 is a climate-policy intervention scenario. That is, without explicit policies designed to reduce emissions, radiative forcing would exceed 6.0 W/m² in the year 2100.

However, the degree of GHG emissions mitigation required over the period 2010 to 2060 is small, particularly compared to RCP4.5 and RCP2.6, but also compared to emissions mitigation requirement subsequent to 2060 in RCP6 (Van Vuuren et al., 2011). The IPCC Fourth Assessment Report classified stabilization scenarios into six categories as shown in Table 1. RCP6 scenario falls into the border between the fifth category and the sixth category.

Its global mean long-term, steady-state equilibrium temperature could be expected to rise 4.9° centigrade, assuming a climate sensitivity of 3.0 and its CO2 equivalent concentration could be 855 ppm (Metz et al. 2007).
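That 4.9ºC figure can be roughly reproduced with the simple logarithmic forcing relation: 855 ppm CO2-equivalent is about 1.6 doublings of a pre-industrial baseline, and 1.6 doublings × 3.0ºC per doubling ≈ 4.9ºC. A back-of-envelope check (the 278 ppm baseline is my assumption, not stated in the quote):

```python
import math

sensitivity = 3.0   # degrees C per CO2 doubling, as assumed in the quote
c_eq = 855.0        # CO2-equivalent concentration in ppm, from the quote
c0 = 278.0          # assumed pre-industrial CO2-equivalent, ppm

doublings = math.log(c_eq / c0) / math.log(2.0)
delta_t = sensitivity * doublings
print(f"{doublings:.2f} doublings -> ~{delta_t:.1f} C")  # ~4.9 C
```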

Some of the background to RCP 8.5 assumptions is in an earlier paper also by the same lead author – Riahi et al 2007, another freely accessible paper (reference below) which is worth a read, for example:

The task ahead of anticipating the possible developments over a time frame as ‘ridiculously’ long as a century is wrought with difficulties. Particularly, readers of this Journal will have sympathy for the difficulties in trying to capture social and technological changes over such a long time frame. One wonders how Arrhenius’ scenario of the world in 1996 would have looked, perhaps filled with just more of the same of his time—geopolitically, socially, and technologically. Would he have considered that 100 years later:

  • backward and colonially exploited China would be in the process of surpassing the UK’s economic output, eventually even that of all of Europe or the USA?
  • the existence of a highly productive economy within a social welfare state in his home country Sweden would elevate the rural and urban poor to unimaginable levels of personal affluence, consumption, and free time?
  • the complete obsolescence of the dominant technology cluster of the day – coal-fired steam engines?

How he would have factored in the possibility of the emergence of new technologies, especially in view of Lord Kelvin’s sobering ‘conclusion’ of 1895 that “heavier-than-air flying machines are impossible”?

Note on Comments

The Etiquette and About this Blog both explain the commenting policy in this blog. I noted briefly in the Introduction that of course questions about 100 years from now mean some small relaxation of the policy. But, in a large number of previous articles, we have discussed the “greenhouse” effect (just about to death) and so people who question it are welcome to find a relevant article and comment there – for example, The “Greenhouse” Effect Explained in Simple Terms which has many links to related articles. Questions on climate sensitivity, natural variation, and likelihood of projected future temperatures due to emissions are, of course, all still fair game in this series.

But I’ll just delete comments that question the existence of the greenhouse effect. Draconian, no doubt.


References

Emissions Scenarios, IPCC (2000) – free report

A special issue on the RCPs, Detlef P van Vuuren et al, Climatic Change (2011) – free paper

The representative concentration pathways: an overview, Detlef P van Vuuren et al, Climatic Change (2011) – free paper

RCP4.5: a pathway for stabilization of radiative forcing by 2100, Allison M. Thomson et al, Climatic Change (2011) – free paper

An emission pathway for stabilization at 6 Wm−2 radiative forcing, Toshihiko Masui et al, Climatic Change (2011) – free paper

RCP 8.5—A scenario of comparatively high greenhouse gas emissions, Keywan Riahi et al, Climatic Change (2011) – free paper

Scenarios of long-term socio-economic and environmental development under climate stabilization, Keywan Riahi et al, Technological Forecasting & Social Change (2007) – free paper

Thermal equilibrium of the atmosphere with a given distribution of relative humidity, S Manabe, RT Wetherald, Journal of the Atmospheric Sciences (1967) – free paper

The Great Escape, Health, Wealth and the Origins of Inequality, Angus Deaton, Princeton University Press (2013) – book


Note 1: Even if we knew future anthropogenic emissions accurately it wouldn’t give us the whole picture. The climate has sources and sinks for CO2 and methane, and there is some uncertainty about them – especially about how they will operate in the future. That is, the atmospheric concentration we actually end up with depends on anthropogenic emissions as modified by these sources and sinks.
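The point in Note 1 can be illustrated with a toy calculation. Historically, roughly 45% of anthropogenic CO2 emissions have remained in the atmosphere (the “airborne fraction”), with the rest taken up by ocean and land sinks – and whether that fraction holds in the future is exactly the uncertainty in question. A minimal sketch with illustrative numbers (the 0.45 fraction and the 2.13 GtC/ppm conversion are stated assumptions, not outputs of any model):

```python
# Toy illustration of how sinks modify emissions, assuming a CONSTANT
# airborne fraction (historically ~0.45). The real fraction may change
# as ocean and land sinks respond to warming - which is the uncertainty
# this note is about.
PPM_PER_GTC = 1 / 2.13  # ~2.13 GtC of emissions corresponds to 1 ppm CO2

def co2_rise_ppm(emissions_gtc, airborne_fraction=0.45):
    """Atmospheric CO2 rise (ppm) from cumulative emissions (GtC)."""
    return emissions_gtc * airborne_fraction * PPM_PER_GTC

# e.g. 10 GtC/yr sustained for a decade = 100 GtC cumulative emissions
print(f"{co2_rise_ppm(100):.1f} ppm")  # ~21 ppm, if sinks keep taking 55%
print(f"{co2_rise_ppm(100, airborne_fraction=0.6):.1f} ppm")  # if sinks weaken
```

The same emissions give a noticeably different concentration if the sinks weaken – that second line is the feedback uncertainty in one number.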

A long time ago, in About this Blog I wrote:

Opinions are often interesting and sometimes entertaining. But what do we learn from opinions? It’s more useful to understand the science behind the subject. What is this particular theory built on? How long has the theory been “established”? What lines of evidence support this theory? What evidence would falsify this theory? What do opposing theories say?

Now I would like to look at impacts of climate change. And so opinions and value judgements are inevitable.

In physics we can say something like “95% of radiation at 667 cm⁻¹ is absorbed within 1m at the surface because of the absorption properties of CO2” and be judged true or false. It’s a number. It’s an equation. And therefore the result is falsifiable – the essence of science. Perhaps in some cases all the data is not in, or the formula is not yet clear, but this can be noted and accepted. There is evidence in favor or against, or a mix of evidence.
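A claim like that is checkable with the Beer-Lambert law: the fraction transmitted through a path of length L is exp(−kL), where k is the absorption coefficient. A minimal sketch – the value k = 3.0 m⁻¹ is an illustrative assumption chosen so that ~95% is absorbed within 1 m; the actual coefficient at 667 cm⁻¹ depends on pressure, temperature and CO2 concentration:

```python
import math

# Beer-Lambert law: transmitted fraction T = exp(-k * L).
# k = 3.0 /m is an ILLUSTRATIVE value chosen to match "95% within 1 m";
# the real coefficient at 667 cm^-1 depends on pressure, temperature
# and CO2 mixing ratio.
k = 3.0

def fraction_absorbed(L, k=k):
    """Fraction of radiation absorbed over a path of L metres."""
    return 1.0 - math.exp(-k * L)

print(f"Absorbed over 1.0 m: {fraction_absorbed(1.0):.1%}")  # ~95%
print(f"Absorbed over 0.5 m: {fraction_absorbed(0.5):.1%}")
```

The point of the paragraph stands: this is a definite number from a definite equation, so the claim is falsifiable in a way that impact projections for 2100 are not.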

As we build equations into complex climate models, judgements become unavoidable. For example: “convection is modeled as a sub-grid parameterization, therefore…” – where the conclusion following the “therefore” is the judgement. We could call it an opinion. We could call it an expert opinion. We could call it science if the result is falsifiable. But it starts to get a bit more “blurry” – at some point we move from a region of settled science to a region of less-settled science.

And once we consider the impacts in 2100 it seems that certainty and falsifiability must be abandoned. “Blurry” is the best case.


Less than a year ago, listening to America and the New Global Economy by Timothy Taylor (via audible.com), I remember he said something like “the economic cost of climate change was all lumped into a fat tail – if the temperature change was on the higher side”. Apologies for my inaccurate memory (one downside of audible.com vs a real book). Well, it sparked my interest in another part of the climate journey.

I’ve been reading IPCC Working Group II (WGII) reports – some of the “TAR” (= Third Assessment Report) from 2001 for background, and AR5, the latest IPCC report, from 2014. Some of the impacts also show up in Working Group I, which covers the physical climate science, and in the IPCC Special Report on Managing the Risks of Extreme Events and Disasters to Advance Climate Change Adaptation from 2012, known as SREX (Special Report on Extremes). These are all available at the IPCC website.

The first chapter of the TAR, Working Group II says:

The world community faces many risks from climate change. Clearly it is important to understand the nature of those risks, where natural and human systems are likely to be most vulnerable, and what may be achieved by adaptive responses. To understand better the potential impacts and associated dangers of global climate change, Working Group II of the Intergovernmental Panel on Climate Change (IPCC) offers this Third Assessment Report (TAR) on the state of knowledge concerning the sensitivity, adaptability, and vulnerability of physical, ecological, and social systems to climate change.

A couple of common complaints in the blogosphere that I’ve noticed are:

  • “all the impacts are supposed to be negative but there are a lot of positives from warming”
  • “CO2 will increase plant growth so we’ll be better off”

Within the field of papers and IPCC reports it’s clear that CO2 increasing plant growth is not ignored. Likewise, there are expected to be winners and losers (often, but definitely not exclusively, geographically distributed), even though the IPCC summarizes the expected overall effect as negative.

Of course, there is a highly entertaining field of “recycled press releases about the imminent catastrophe of climate change” which I’m sure ignores any positives or tradeoffs. Even in what could charitably be called “respected media outlets” there seem to be few correspondents with basic scientific literacy – not even the ability to add up the numbers on an electricity bill, or to distinguish between a company’s press release promising wonderful results in 2025 and today’s reality.

Anyway, entertaining as it is to shoot fish in a barrel, we will try to stay away from discussing newsotainment and stay with the scientific literature and IPCC assessments. Inevitably, we’ll stray a little.

I haven’t tried to do a comprehensive summary of the issues believed to impact humanity, but here are some:

  • sea level rise
  • heatwaves
  • droughts
  • floods
  • more powerful cyclones and storms
  • food production
  • ocean acidification
  • extinction of animal and plant species
  • more pests (added, thanks Tom, corrected thanks DeWitt)
  • disease (added, thanks Tom)

Possibly I’ve missed some.

Covering the subject is not easy but it’s an interesting field.