
Archive for the ‘Climate History’ Category

In #1 we saw an example of natural variability in floods in Europe over 500 years. Clearly, the large ups and downs prior to the 1900s can't be explained by "climate change", i.e., by burning fossil fuels.

If you learnt about climate change via the media then you’ve probably heard very little about natural variability, but it’s at the top of climate scientists’ minds when they look at the past, even if it doesn’t get mentioned much in press releases.

Here’s another example, this time of droughts in the western USA. This is a reconstruction of the pre-instrument period.

To see the whole article, visit the new Science of Doom on Substack page and please consider subscribing for notifications of new articles.

Read Full Post »

Originally, I thought we would have a brief look at the subject of attribution before we went back to the IPCC 6th Assessment Report (AR6). However, it’s a big subject.

In #8, and the few articles preceding, we saw various attempts to characterize “natural variability” from the few records we have. It’s a challenge. I recommend reading the conclusion of #8.

In this article we'll look at a paper by G J van Oldenborgh and colleagues from 2013. They introduce the concept of assessing natural variability using climate models, though that's not the principal idea of the paper. Still, it's interesting to see what they say.

Their basic idea: we can compare weather models against reality because we make repeated weather forecasts and can then see whether we were overconfident or underconfident.

For example, one time we said there was a 10% chance of a severe storm. The storm didn’t happen. That doesn’t mean we were wrong. It was a probability. But if we have 100 examples of this 10% chance we can see – did we get approximately 10 instances of severe storms? If we got 0-3 maybe we were wildly overconfident. If we got 30 maybe we were very underconfident.
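The calibration check described above can be sketched in a few lines of Python. Everything here is simulated for illustration – the 100 forecasts, the outcomes, and the 10% probability are hypothetical, not from any real verification record:

```python
import random

random.seed(0)

# Hypothetical verification record: 100 forecasts that each said
# "10% chance of a severe storm". Here we simulate the outcomes
# from a true 10% probability, as if the forecaster were perfectly calibrated.
forecasts = [0.10] * 100
outcomes = [1 if random.random() < 0.10 else 0 for _ in forecasts]

hits = sum(outcomes)
observed_freq = hits / len(forecasts)

# A well-calibrated forecaster should see roughly 10 storms in 100 such cases.
# Far fewer (0-3) suggests overconfidence; far more (~30) suggests underconfidence.
print(f"storms observed: {hits}/100 (frequency {observed_freq:.2f})")
```

With real forecasts the outcomes come from observations rather than a random draw, but the counting step is the same.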

Now we can't compare climate model outputs of the future with observations because the future hasn't happened yet – there's only one planet, and climate forecasts span decades to a century, not one week.

We can, however, compare the spatial variation of models with reality.

To see the whole article, visit the new Science of Doom on Substack page and please consider subscribing for notifications of new articles.

Read Full Post »

In #7 we looked at Huybers & Curry 2006 and Pelletier 1998 and saw “power law relationships” when we look at past climate variation over longer timescales.

Pelletier also wrote a very similar paper in 1997 that I went through, and in searching for who cited it I came across “The Structure of Climate Variability Across Scales”, a review paper from Christian Franzke and co-authors from 2020:

To summarize, many climatological time series exhibit a power law behavior in their amplitudes or their autocorrelations or both. This behavior is an imprint of scaling, which is a fundamental property of many physical and biological systems and has also been discovered in financial and socioeconomic data as well as in information networks. While the power law has no preferred scale, the exponential function, also ubiquitous in physical and biological systems, does have a preferred scale, namely, the e-folding scale, that is, the amount by which its magnitude has decayed by a factor of e. For example, the average height of humans is a good predictor for the height of the next person you meet as there are no humans that are 10 times larger or smaller than you.

However, the average wealth of people is not a good predictor for the wealth of the next person you meet as there are people who can be more than a 1,000 times richer or poorer than you are. Hence, the height of people is well described by a Gaussian distribution, while the wealth of people follows a power law.
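The contrast the quote draws – a preferred scale versus no preferred scale – is easy to demonstrate with simulated data. The Gaussian parameters and the Pareto exponent below are arbitrary choices for illustration, not estimates of real heights or wealth:

```python
import random

random.seed(1)

# Illustrative sketch (not real data): heights ~ Gaussian (preferred scale),
# wealth ~ Pareto power law (no preferred scale).
heights = [random.gauss(170, 8) for _ in range(100_000)]               # cm
wealth = [random.paretovariate(1.5) * 10_000 for _ in range(100_000)]  # arbitrary units

def max_over_mean(xs):
    """How far the largest sample sits above the average."""
    return max(xs) / (sum(xs) / len(xs))

# The tallest simulated person is only slightly above the mean height,
# while the richest simulated person is hundreds of times the mean wealth.
print(f"heights: max/mean = {max_over_mean(heights):.2f}")
print(f"wealth:  max/mean = {max_over_mean(wealth):.2f}")
```

This is exactly the "good predictor" point in the quote: the mean tells you a lot about the next Gaussian sample and very little about the next power-law sample.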

To see the whole article, visit the new Science of Doom on Substack page and please consider subscribing for notifications of new articles.

Read Full Post »

In #6 we looked in a bit more detail at Imbers and co-authors from 2014. Natural variability is a big topic.

In this article we’ll look at papers that try to assess natural variability over long timescales – Peter Huybers & William Curry from 2006 who also cited an interesting paper from Jon Pelletier from 1998.

Here’s Jon Pelletier:

Understanding more about the natural variability of climate is essential for an accurate assessment of the human influence on climate. For example, an accurate model of natural variability would enable climatologists to make quantitative estimates of the likelihood that the observed warming trend is anthropogenically induced.

He notes another paper with this comment (explained in simpler terms below):

However, their stochastic model for the natural variability of climate was an autoregressive model which had an exponential autocorrelation dependence on time lag. We present evidence for a power-law autocorrelation function, implying larger low-frequency fluctuations than those produced by an autoregressive stochastic model. This evidence suggests that the statistical likelihood of the observed warming trend being larger than that expected from natural variations of the climate system must be reexamined.

In plain language, the paper he refers to used the simplest model of random noise with persistence, the AR(1) model we looked at in the last article.

He is saying that this simple model is “too kind” when trying to weigh up anthropogenic vs natural variations in temperature.
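A minimal sketch of the AR(1) model and its exponentially decaying autocorrelation – Pelletier's point is that real climate records decay much more slowly than this. The persistence parameter 0.8 is chosen purely for illustration:

```python
import random

random.seed(2)

# AR(1): x[t] = phi * x[t-1] + noise -- the simplest "noise with persistence".
phi, n = 0.8, 100_000
x = [0.0]
for _ in range(n - 1):
    x.append(phi * x[-1] + random.gauss(0, 1))

def autocorr(xs, lag):
    """Sample autocorrelation at a given lag."""
    m = sum(xs) / len(xs)
    num = sum((xs[t] - m) * (xs[t + lag] - m) for t in range(len(xs) - lag))
    den = sum((v - m) ** 2 for v in xs)
    return num / den

# For AR(1) the autocorrelation decays exponentially: rho(lag) = phi**lag.
# A power-law decay, rho(lag) ~ lag**(-alpha), falls off far more slowly,
# allowing much larger low-frequency fluctuations than AR(1) predicts.
for lag in (1, 5, 10):
    print(f"lag {lag:2d}: estimated {autocorr(x, lag):.3f}, theory {phi**lag:.3f}")
```

So a test of "is the observed trend larger than natural variability?" that assumes AR(1) underestimates how big natural excursions can be on long timescales – that's the "too kind" point.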

To see the whole article, visit the new Science of Doom on Substack page and please consider subscribing for notifications of new articles.

Read Full Post »

In #1 we took a brief look at Natural Variation – climate varies from decade to decade, century to century. In #2 we took a brief look at attribution from “simple” models and from climate models (GCMs).

Here’s an example of the problem of “what do we make of climate models?”

I wrote about it on the original blog – Opinions and Perspectives – 6 – Climate Models, Consensus Myths and Fudge Factors. I noticed the paper I used in that article came up in Hourdin et al 2017, which in turn was referenced from the most recent IPCC report, AR6.

So this is the idea from the paper by Golaz and co-authors in 2013.

They ran a climate model over the 20th century – this is a standard thing to do to test a climate model on lots of different metrics. How well does the model reproduce our observations of trends?

In this case it was temperature change from 1900 to present.

In one version of the model they used a parameter value (related to aerosols and clouds) that is traditional but wrong; in another version they used the best value based on recent studies; and they added a third, alternative value.

What happens?

To see the whole article, visit the new Science of Doom on Substack page and please consider subscribing for notifications of new articles.

Read Full Post »

In #1 we looked at some examples of natural variability – the climate changes from decade to decade, century to century and out to much longer timescales.

How sure are we that any recent changes are from burning fossil fuels, or other human activity?

In some scientific fields we can run controlled experiments but we just have the one planet. So instead we need to use our knowledge of physics.

In an attempt to avoid a lengthy article I’m going to massively over-simplify.

“Simple Physics”

Some concepts in climate can be modeled by what I’ll call “simple physics”. It often doesn’t look simple.

Let’s take adding CO2 to the atmosphere. We can do this in a mathematical model. If we “keep everything else the same” in a given location we can calculate the change in energy the planet emits to space for more CO2. Less energy is emitted to space with more CO2 in the atmosphere.

The value varies in different locations, but we just calculate it in lots of places and take the average.

As less energy is leaving the planet (while the same amount of sunlight is still being absorbed), the planet warms up.

In our model, we can keep increasing the planet's temperature until the energy emitted to space is back to what it was before. The planetary energy budget is back in balance.

So we’ve calculated a new surface temperature for, say, a doubling of CO2.
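The balance calculation described above can be sketched as a toy model using the Stefan-Boltzmann law. The ~255 K effective emission temperature and the ~3.7 W/m² forcing for doubled CO2 are commonly cited textbook values, not numbers from this article, and the sketch ignores all feedbacks:

```python
# Toy energy-balance sketch (no feedbacks): raise the emission temperature
# until the outgoing flux recovers the CO2-induced deficit.
SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
T_e = 255.0        # effective emission temperature of Earth, K
forcing = 3.7      # reduction in outgoing flux from doubled CO2, W/m^2

outgoing = SIGMA * T_e**4  # ~240 W/m^2 emitted to space in balance

# Increase temperature in small steps until the extra emission
# makes up for the forcing, restoring the energy budget.
T = T_e
while SIGMA * T**4 < outgoing + forcing:
    T += 0.001

print(f"warming needed to restore balance: {T - T_e:.2f} K")
```

With these numbers the answer comes out close to 1 K – the familiar no-feedback response at the emission temperature. Real estimates of climate sensitivity are larger because of feedbacks, which is exactly what this toy model leaves out.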

To see the whole article, visit the new Science of Doom on Substack page and please consider subscribing for notifications of new articles.

Read Full Post »

In the last set of articles we’ve looked at past trends in extreme weather, following the flow of chapter 11 from the 6th assessment report of the IPCC.

How do we know the cause of any changes?

In recent years in most of the media everything that changes is “climate change” which is implicitly or explicitly equated with burning fossil fuels, i.e., adding CO2 into the atmosphere. It’s a genius catchphrase from a marketing point of view, not so helpful for scientific understanding.

I used to prefer the term “anthropogenic global warming” but it has its flaws as well, as some recent trends are believed to be anthropogenic but not from adding CO2 into the atmosphere. An example is changes that result from reduced aerosols in the atmosphere as a result of burning less biomass.

I’ll generally try and stay with “anthropogenic” or “from more CO2”, but there’s no copy editor, so let’s see.

Lots of changes in past climate metrics are simply natural variability. Understanding and quantifying natural variability is a big topic and our knowledge is always going to be imperfect.

For example, there were multi-decadal megadroughts in North America and Europe in the past 1000 years. They were probably "unprecedented" for their time, but clearly weren't caused by burning fossil fuels.

Here is a reconstruction of the drought index over 1000 years in western North America.

To see the whole article, visit the new Science of Doom on Substack page and please consider subscribing for notifications of new articles.

Read Full Post »

Global temperature has been rising since around 1900, and CO2 is the principal cause. The physics behind the inappropriately-named “greenhouse effect” is certain, so burning fossil fuels, which adds CO2 to the atmosphere, is certain to increase the surface temperature. I’ve written many articles on that topic on the original blog and shown how the equations are derived (see Notes).

So it should be no surprise to find that there are more extreme hot days and fewer extreme cold days.

If the temperature goes up, then the number of days with a temperature above, say, 35°C (95°F) or 40°C (104°F) – or whatever number you want to pick – will increase.

Likewise, the number of days with a temperature below, say, -10°C (14°F) or -20°C (-4°F) – again, pick a number for your region – will decrease.
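The threshold effect described above is easy to demonstrate with simulated daily temperatures. The mean, spread, and threshold below are hypothetical numbers chosen for illustration, not taken from AR6:

```python
import random

random.seed(3)

# Illustrative sketch: daily max temperatures as Gaussian noise around a
# seasonal mean (hypothetical numbers, not real station data).
def hot_days(mean_c, threshold_c=35.0, days=365 * 30, sd=5.0):
    """Count days in a 30-year record exceeding a fixed threshold."""
    return sum(1 for _ in range(days)
               if random.gauss(mean_c, sd) > threshold_c)

baseline = hot_days(mean_c=28.0)  # before warming
warmed = hot_days(mean_c=29.0)    # the same climate shifted up 1 C

# A small shift in the mean produces a disproportionate change in the
# number of days crossing a fixed threshold -- the tail moves the most.
print(f"days above 35 C: baseline {baseline}, +1 C {warmed}")
```

The same logic runs in reverse for cold thresholds: shift the distribution up and the count of days below a fixed cold threshold shrinks.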

Here’s a graph of global land and ocean temperature, extracted from a larger graph in chapter 2 of AR6 (see the Notes below for the full map of changes):

To see the whole article, visit the new Science of Doom on Substack page and please consider subscribing for notifications of new articles.

Read Full Post »

In #10-#13 we looked at recent trends in droughts.

After covering Tropical Cyclones in #1-#6, it was worth doing a summary, in part because AR6's summary missed some good news from the report itself.

In #7-#9 we looked at extreme rainfall and floods and the summary was important because AR6’s summary missed some good news.

I’ve included the full text from p. 1575 below in the notes.

If you check the table in the notes section of #11 we can see that there were 8 regions identified with a rainfall deficit and 6 regions identified with a rainfall increase. These are also listed in the report section before the summary. However, the summary section only says:

There is medium confidence in increases in precipitation deficits in a few regions of Africa and South America.

Of course, people can read the section before and find out that there were places with a rainfall increase (a decrease in droughts). But anyone limiting themselves to only the “summary” would miss out on this good news.

On soil moisture droughts – agricultural and ecological droughts – they say:

To see the whole article, visit the new Science of Doom on Substack page and please consider subscribing for notifications of new articles.

Read Full Post »

In #12 we saw that soil moisture droughts – agricultural and ecological droughts – have increased globally.

I’ve been following the flow of AR6 in their discussion of recent trends. They do go on to discuss hydrological droughts without much that’s definitive so perhaps we’ll have a brief look at that in another article, as I’m something of a completionist.

But there’s something important missing from the drought section.

Are plants dying? If not, is there really an increase in soil moisture drought?

Here’s a question from Alexis Berg & Justin Sheffield (2018) to put the problem in a broader context. Here and in all the other papers quoted, bold text is my change:

The notion that a warmer climate leads to a drier land surface, i.e., increased water stress, driven overwhelmingly by the effect of warmer temperatures on evaporative demand, appears, however, inconsistent with paleo-evidence and vegetation reconstructions for different colder and warmer past climates.

To see the whole article, visit the new Science of Doom on Substack page and please consider subscribing for notifications of new articles.

Read Full Post »
