In #1 we took a brief look at Natural Variation – climate varies from decade to decade, century to century. In #2 we took a brief look at attribution from “simple” models and from climate models (GCMs).
Here’s an example of the problem of “what do we make of climate models?”
I wrote about it on the original blog – Opinions and Perspectives – 6 – Climate Models, Consensus Myths and Fudge Factors. I noticed that the paper I used in that article came up in Hourdin et al 2017, which in turn is cited in the most recent IPCC report, AR6.
So here is the idea from the 2013 paper by Golaz and co-authors.
They ran a climate model over the 20th century – a standard way to test a model against observations on many different metrics. How well does the model reproduce the trends we observe?
In this case it was temperature change from 1900 to present.
In one version of the model they used a parameter value (related to aerosols and clouds) that is traditional but wrong; in another version they used the best value based on recent studies; and they added a third version with another alternate value.
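To make the setup concrete, here is a toy sketch of the same kind of experiment – not the actual Golaz et al. setup, which used a full coupled GCM, but a zero-dimensional energy-balance model where one uncertain aerosol-related parameter is varied across three values and the resulting 20th-century warming is compared. Every number and label below is an illustrative assumption, not a value from the paper.

```python
# Toy illustration only: a zero-dimensional energy-balance model, not the
# coupled GCM used by Golaz and co-authors. One uncertain aerosol-related
# parameter is varied to show how it shifts simulated 20th-century warming.
# All values here are assumptions chosen for illustration.
import numpy as np

YEARS = np.arange(1900, 2006)
C = 8.0       # effective heat capacity, W yr m^-2 K^-1 (assumed)
LAM = 1.2     # climate feedback parameter, W m^-2 K^-1 (assumed)

def forcing(aerosol_scale):
    """GHG forcing ramp plus aerosol cooling scaled by the uncertain parameter."""
    frac = (YEARS - 1900) / (2005 - 1900)   # 0 at 1900, 1 at 2005
    ghg = 2.5 * frac                        # W m^-2, ramps up to 2.5 (assumed)
    aerosol = -aerosol_scale * frac         # W m^-2, cooling of uncertain size
    return ghg + aerosol

def run_model(aerosol_scale):
    """Integrate dT/dt = (F - LAM * T) / C with a one-year time step."""
    T = np.zeros(len(YEARS))
    F = forcing(aerosol_scale)
    for i in range(1, len(YEARS)):
        T[i] = T[i - 1] + (F[i - 1] - LAM * T[i - 1]) / C
    return T

# Three hypothetical values of the uncertain parameter, standing in for the
# "traditional", "best estimate from recent studies", and "alternate" versions.
for label, scale in [("traditional", 0.4), ("best estimate", 1.2), ("alternate", 0.8)]:
    T = run_model(scale)
    print(f"{label:>13}: simulated 1900-2005 warming = {T[-1] - T[0]:.2f} K")
```

Running it prints three different warming totals for 1900–2005, which is the basic point: a single uncertain parameter can shift the simulated 20th-century trend substantially, so the choice of value matters when a model is tested against the observed record.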
What happens?
To see the whole article, visit the new Science of Doom on Substack page and please consider subscribing for notifications of new articles.