I have done a partial update of the Roadmap section – creating a few sub-pages and listing the relevant articles under them.
It is a work in progress; the idea is to make it easier for new visitors to find useful articles. Most blogs have a strong bias towards the last few articles.
I have split off:
CO2 – an 8-part series on CO2 as well as a few other related articles
Science Roads Less Traveled – science basics and alternative theories explained
“Back Radiation” – the often misunderstood subject of radiation emitted by the atmosphere
Just a note as well for new visitors: there are many articles here explaining climate science basics. Many people assume from these – and from other simplistic coverage on the internet – that climate science is full of over-simplistic models.
I don’t want to make a sweeping generalization, but almost all the comments I see on this subject attack simplistic models aimed at education rather than the models actually used in climate science.
For example, models aiming to give a simple explanation of the radiative effect of CO2 range from:
- ultra-simplistic/misleading – CO2 works like a “greenhouse”
- simplistic – CO2 is an “insulator” trapping heat
- basic radiative model – blackbody radiator of the surface, atmosphere & solar combination
But a real climate model contains equations from fundamental physics, and atmospheric radiation textbooks contain many more.
Providing a set of equations doesn’t prove anything is right.
But my intent is to highlight that simple models are for illumination. It is easy to prove that simple models are simplistic.
The science of atmospheres and climate is much more sophisticated than these models designed for illumination.
I’m a mere generalist: hopelessly bad at understanding even the simplest equations, but brilliant at putting their essence into English which can be readily understood by any well-educated person.
Are there any blogs that cater to the likes of me?
“…simple models are for illumination.”
Indeed!
Couldn’t agree more 🙂
Once the bulb is lit, then the person needs to move to the next level and add more complexity (if they are up to it).
@Hunt Janin,
you might like to try
http://www.skepticalscience.com/Newcomers-Start-Here.html
The articles there are usually short and to the point.
They even have a list of arguments each containing only one sentence
http://www.skepticalscience.com/argument.php
The “Science Roads less Traveled” link is not working.
Thanks, now fixed.
Where are the fancy equations for the physics and properties of clouds, which are amorphous and move about at the whim of the wind?
In particular what equations are used to calculate the amount of CO2 contained in the water droplets and how it exchanges with free CO2 in the air and how much CO2 is removed from the air in rain drops?
Climate cannot be modeled with any useful skill and accuracy because in real air there is no uniform distribution of the mass of the atmospheric gases and of water vapor, as shown by weather maps of the earth. High pressure cells have more mass and less water vapor than do low pressure cells.
Real air is the term for local air at the intake ports of air separation plants and contains particles of all types, reactive gases, volatile organic compounds, water vapor, CO2 and the fixed gases (i.e., nitrogen, oxygen and the inert gases).
Satellite images show that there is no uniform distribution of clouds. Clouds are troublemakers and are the climate scientist’s worst nightmare.
These fancy radiative transfer equations only apply to squeaky clean virtual air of uniform distribution that exists in the imagination of the physicists.
Harold:
Amorphous clouds.
If the cloud is minimal, its impact on the radiation balance of a 100×100 km grid cell is, obviously, minimal. If it covers a significant part of a grid cell, a fraction of the radiative transfer is explicitly computed in models similar to the ones presented by Steve Carson in the links above. Example: Thomas and Stamnes 1999: Radiative Transfer in the Atmosphere and Ocean, Cambridge University Press, page 468, for instance.
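For what it’s worth, the partial-cloud treatment described above can be sketched as a simple linear weighting of clear-sky and overcast fluxes by cloud fraction. This is only an illustration of the idea – the function name and flux values below are made up, not taken from any actual model:

```python
def cell_flux(f_clear, f_cloudy, cloud_fraction):
    """Outgoing flux (W/m^2) for a grid cell, combining the clear-sky
    and overcast fluxes weighted by the fractional cloud cover."""
    return (1.0 - cloud_fraction) * f_clear + cloud_fraction * f_cloudy

# Illustrative numbers: clear-sky OLR 265 W/m^2, overcast 220 W/m^2, 40% cloud
flux = cell_flux(265.0, 220.0, 0.4)
```

With no cloud the cell radiates the clear-sky value, with full cover the overcast value, and anything in between is a weighted mix – which is why even a crude cloud fraction matters so much to the grid-cell radiation balance.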
CO2 contained in water droplets.
Sure, the dissolution of CO2 in rain affects the concentration of CO2 on a microscopic scale close to the cloud droplets. However, the observed concentration of CO2 (see data from Mauna Loa and similar observatories) does NOT show any significant dependence on rainfall. This effect need not be considered at all on a climate time scale in terms of the radiative transfer equation, since the CO2 concentration is prescribed.
Mass and water vapor.
Climate and weather forecast models explicitly consider mass (pressure) and water vapor as prognostic variables, together with temperature and other variables such as wind, density and so on. The spatial anisotropy of mass, temperature and water vapor mixing ratio is explicitly considered by the models.
Mixture of gases
Any first-year undergraduate textbook on atmospheric physics (Wallace and Hobbs is excellent) already explains that air is a mixture of gases. The equations consider all of the most important ones. VOCs and many other gases are not considered because, obviously, their concentrations are minimal and their impact on the value of the gas constant used in the equation of state is completely negligible.
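To see why trace gases barely move the gas constant, one can compute the specific gas constant of a mixture from mass fractions; the composition figures below are approximate textbook values:

```python
R_UNIVERSAL = 8.314462618  # universal gas constant, J / (mol K)

def specific_gas_constant(mass_fractions, molar_masses):
    """Specific gas constant of a mixture (J/kg/K):
    R_mix = sum(w_i * R_universal / M_i), with M_i in kg/mol."""
    return sum(w * R_UNIVERSAL / m
               for w, m in zip(mass_fractions, molar_masses))

# Dry air by mass: ~75.5% N2, 23.2% O2, 1.3% Ar (approximate)
r_dry = specific_gas_constant([0.755, 0.232, 0.013],
                              [0.0280134, 0.0319988, 0.039948])
```

This lands very close to the familiar 287 J/kg/K for dry air; adding a VOC at parts-per-million mass fraction shifts the result in the fourth or fifth decimal place, which is why such trace species can be ignored in the equation of state.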
Clouds
Clouds are difficult, certainly, but some of them are simulated (imperfectly, but simulated) and the radiative effects of others are also considered. Still others are not, but nobody says models are perfect. Work is currently in progress to improve this, and to some extent clouds are already being considered right now.
Fancy radiative transfer equations.
They apply, with some modifications, including clouds if needed, to any window of the atmosphere, given appropriate data on composition, the vertical profile of temperature and constituents, and so on. Examples are usually presented for the easiest cases (no clouds), but real computations exist for the complex situations too. This is similar to the way physics is taught: in the early years, motion is analyzed for free particles, and until that is mastered it makes no sense to consider friction. The same applies here. If people don’t understand the core of radiative transfer for clear-sky conditions, what is the point in using examples that include clouds?
Hope this helps
ATTN: SOD
Thank you for not whacking my comment as Joe, John and Gavin always do.
SOD: When you present a simplified model, you owe your reader a brief explanation of: 1) The limitations of the model, particularly where it is misleading or incorrect. 2) The existence of more complete models or a full description of the physics.
Rather than mess with the integrated forms of the radiative transfer equations (which typically have no analytical solution), the differential form, or Schwarzschild equation, is much simpler and contains a complete description of the physics. You introduced me to this physics. To save you some work and to review what I learned, hopefully correctly:
The intensity of light (I) leaving a thin layer of atmosphere at a certain wavelength depends on the energy entering that layer, the amount of light absorbed by that layer and the amount of light emitted at that wavelength by that layer (and in some situations we need to include the amount of light scattered or reflected by the layer and clouds).
dI = -absorption + emission
dI = -[kprI]ds + [B(T)kpr]ds
The change in light intensity (dI) at a given wavelength as it passes through a short distance of air (ds) is the emission minus the absorption. Absorption is proportional to:
a) the amount of light entering the layer (I),
b) the density of the molecules absorbing light = atmosphere density (p or rho) times the mixing ratio or weight fraction of the absorber (r, rising from 300 to 400 ppm for CO2),
c) an absorption coefficient (k, absorption per unit mass) that varies with wavelength (often dramatically and rapidly). Since nitrogen, oxygen and argon don’t have a dipole moment, their absorption coefficient is zero at most visible and infrared wavelengths. We often restrict our discussion to the major absorbers, mainly water vapor and CO2. Since they absorb a greater fraction of outgoing (infrared) than incoming (mostly visible) radiation, these molecules are called GHGs.
Emission is proportional to:
a) the density of GHG molecules (pr) emitting light (the same pr factor as for absorption),
b) an emission coefficient (k) which is identical to the absorption coefficient,
c) the Planck function (B(T)), which tells us how the probability of emitting a photon varies with temperature and wavelength.
Absorption in our atmosphere doesn’t change appreciably with T, but emission varies strongly with temperature (roughly as T^4) because of B(T).
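A rough sketch of one Schwarzschild step at a single wavelength might look like this – the physical constants are standard, but the k, density, mixing-ratio and path values you would plug in are purely illustrative:

```python
import math

H = 6.62607015e-34   # Planck constant, J s
C = 2.99792458e8     # speed of light, m/s
KB = 1.380649e-23    # Boltzmann constant, J/K

def planck(wavelength, temp):
    """Planck spectral radiance B(lambda, T), in W m^-3 sr^-1."""
    return (2.0 * H * C**2 / wavelength**5) / \
        math.expm1(H * C / (wavelength * KB * temp))

def schwarzschild_step(intensity, k_abs, rho, r, ds, temp, wavelength):
    """dI = -k*rho*r*I*ds + k*rho*r*B(T)*ds across a thin layer ds,
    i.e. emission minus absorption for the layer."""
    kpr = k_abs * rho * r
    return kpr * (planck(wavelength, temp) - intensity) * ds
```

Note the sign behaviour matches the equation above: if the incoming intensity exceeds the local Planck value, dI is negative (net absorption), and if the layer is brighter than the incoming beam, dI is positive (net emission).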
Integrated forms of the Schwarzschild equation are extremely complicated and usually include:
a) a term, tau (the optical depth), which I think represents the distance over which 1/e of the light is lost (net absorption minus emission?).
b) a geometric factor (1/u?) which corrects for the fact that we want to know about energy flow towards and away from the earth, while B(T) describes light radiating in all directions from a point source.
c) Integrating over all wavelengths to determine energy flux.
SOD’s August post on “Vanishing Nets” presented an integrated solution to the Schwarzschild equation assuming equilibrium between incoming and outgoing energy, a fixed value of k for all outgoing wavelengths, and a constant mixing ratio at all altitudes. This is called the grey atmosphere model.
Absorption varies with wavelength and the concentration of water vapor varies with altitude, so there is no simple equation that describes energy flux through the Earth’s atmosphere. Climate models calculate energy flux through the earth’s atmosphere by numerical integration of the Schwarzschild equation – breaking the atmosphere up into grid cells and calculating the change in light intensity over all wavelengths in each cell, assuming average values for the parameters in each cell.
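As an illustration of that numerical approach, here is a heavily simplified grey (wavelength-independent) layer-by-layer march upward. The layer temperatures and optical depths are invented for the example; real models do this per spectral band with far more care:

```python
import math

SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def integrate_upward(i_surface, layer_temps, layer_taus):
    """March radiance up through layers. Each layer of optical depth
    d_tau and temperature T transforms the radiance as
        I_out = I_in * exp(-d_tau) + B(T) * (1 - exp(-d_tau)),
    where B(T) = sigma*T^4/pi is the grey Planck source per steradian."""
    i = i_surface
    for temp, d_tau in zip(layer_temps, layer_taus):
        b = SIGMA * temp**4 / math.pi
        trans = math.exp(-d_tau)
        i = i * trans + b * (1.0 - trans)
    return i

# Illustrative: 288 K surface radiance through three cooler layers
i_top = integrate_upward(SIGMA * 288.0**4 / math.pi,
                         [270.0, 250.0, 230.0], [0.3, 0.3, 0.3])
```

Two limiting cases show the formula behaves sensibly: a layer of zero optical depth passes the radiance through unchanged, while a very opaque layer replaces it entirely with its own Planck emission – which is why, with cooler layers aloft, the radiance emerging at the top is less than what left the surface.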
@Hunt Janin
A couple of points from my own experience:
(a) Even tho’ you can make progress without following all the equations, there’s a limit to how far you can get with purely qualitative understanding. I think you will still need to try to understand what some of the numbers mean.
(b) Most climate blogs (this one excepted, which is why it is so valuable) have an openly biased standpoint, usually but not always either broadly sceptical or broadly alarmist. This divergence, not surprisingly, colours the way the topic is usually introduced and discussed. Without necessarily being untruthful, a site may, for instance, avoid discussion of serious objections to its point of view.
And, given that the subject is of immense complexity, I suspect there are few if any individual scientists who can truthfully claim to have a deep understanding of it right across the board. Ideally the IPCC should have been able to fill this gap. However its ‘Summaries for policymakers’ and ‘Technical summaries’ of 2001 and 2007 (available free online at http://www.ipcc.ch) seem to me to suffer from the same tendency to bias as do most climate blogs. It’s more what they leave out than the errors that they contain.
Until about 4 years ago I knew little about climate (and nothing about the global warming debate) but I had a bit of spare time and wanted to learn. Scienceofdoom wasn’t here then.
I spent time at the outset reading parts of the IPCC reports and educating myself from two websites at nearly opposite ends of the spectrum: the alarmist site http://www.realclimate.org and the sceptic site http://www.junkscience.com/greenhouse/index I found useful material on both these sites, mixed up in both cases with possibly more dubious statements and maybe not-so-well-founded assertions. However I reckoned that where they agreed (and I found substantial agreement) I could at least take the common ground as a starting point.
I also read a number of books….
I am now beginning to appreciate how little is really known about climate. I’m hoping scienceofdoom will enable us to sketch in some of the boundaries of that knowledge.
HR: Balance is generally a good thing, but to equate realclimate.org with junkscience.com is going too far. On one hand, you have a site operated by scientists who are active researchers on climate and who are (yes) alarmed about anthropogenic climate change because they believe it represents a significant threat to our future (and especially to our children’s future). Does being alarmed about something automatically make you an ‘alarmist’? On the other hand you have a site with links to tobacco-company funded denial of the dangers of second-hand smoke (following on from earlier efforts to deny that smoking causes cancer by demanding ever-more conclusive evidence, and to deny that smoking is addictive – something about which tobacco executives explicitly lied to Congress, as documents later revealed). Climate researchers don’t have a high profile at junkscience.com – but political and corporate interests do. Real skepticism involves a serious inquiry that actually starts from evidence and cares about a balanced approach to that evidence. —–ism [moderator’s note, please check the etiquette] (which is the right label for what goes on at junkscience.com) is all about selective skepticism and cherry-picking – for them, no evidence is ever good enough, if it supports AGW – but no evidence (short-term ‘trends’, local cold snaps, snow in winter) is ever weak enough, if it seems to count against AGW. If you really want to be a skeptic, you need to recalibrate your approach.
Bryson Brown:
Thank you for your comment. The point I am making is that given the scarcity of climate blogs which don’t have an axe to grind, it can be a good idea to cast one’s net widely so as to identify common ground. I mentioned two (out of many possible) online sources which I had found useful for a newcomer looking for an introduction to the basics of climate science. What those sources, and their authors, have or have not said or written (rightly or wrongly) in other contexts and on other topics doesn’t diminish their usefulness for that particular purpose.
hr– Thanks for your response, too. I’m sympathetic to what you’re trying to do, but I think it’s also important to recognize that having a position is not the same as having ‘an axe to grind’. The distinction isn’t always easy to draw, but if you follow the back and forth of arguments and replies it’s often clear that one side is changing the subject or ignoring the responses of the other side. Over time you can also develop a critical sense of what a good argument (of various kinds) looks like– for example, when plotting year-to-year data with a substantial variance, selecting a short period of time with a high ‘bump’ at the outset and a low point at the end is an easy way to ‘show’ that there’s a downward trend. But it’s not an honest way to deal with the data: any underlying trend can only be found if you deal with long enough periods that the trend itself is substantially larger than the variance. And when a pattern of bad practice is clear, then ‘axe grinding’ becomes a reasonable label to use. I’m persuaded (as a philosopher/historian of science) that the work presented on realclimate.org really is honest support for their position; with junkscience.com, not so much. Not that (many) of them aren’t perfectly sincere, but their work doesn’t look good when examined this way.
HR and BB: A more rigorous way to deal with trends where the beginning and endpoints appear to have been cherry-picked (such as the 1998 El Nino) is to only trust sources that tell you the slope and the 95% confidence interval around the slope. If one takes statistical uncertainty into account, there is nothing innately wrong with wanting to know the temperature trend for the past 10 or 12 years (little or no increase) and comparing it to the trend for 1975-1995 (+0.2? degC/decade) when the scientific “consensus” about climate sensitivity was coalescing. It would be even better to know the 95% confidence interval about the difference or have some other statistical measure of the significance of the difference. Unfortunately, temperature data shows significant auto-correlation – if this month’s temperature anomaly is above average (due to El Nino), next month’s will be also. From a statistical point of view, there aren’t 120 months of independent temperature data per decade, so simple statistical analysis results in 95% confidence intervals that are somewhat too small.
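The auto-correlation point can be made concrete with the common AR(1) “effective sample size” correction – a rough approximation only, not a full treatment of long-term persistence:

```python
def lag1_autocorr(x):
    """Lag-1 autocorrelation of a series."""
    n = len(x)
    mean = sum(x) / n
    num = sum((x[i] - mean) * (x[i + 1] - mean) for i in range(n - 1))
    den = sum((v - mean) ** 2 for v in x)
    return num / den

def effective_n(n, r1):
    """Effective number of independent samples under AR(1) noise:
    n_eff = n * (1 - r1) / (1 + r1)."""
    return n * (1.0 - r1) / (1.0 + r1)

# 120 monthly anomalies with lag-1 autocorrelation 0.5 behave like ~40 samples
n_eff = effective_n(120, 0.5)
```

Confidence intervals scale roughly with 1/sqrt(n_eff), so shrinking 120 months to an effective 40 widens the interval noticeably – and that is before considering the longer-memory models discussed in the next comment.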
Or perhaps confidence intervals that are a lot too small. Simple auto-correlation may not be a good model for the behavior of temperature. If there’s long-term persistence, then a better model would be ARFIMA (autoregressive fractionally integrated moving average). That drops degrees of freedom even more. It can also mean that longer time series do not produce better estimates of the climate, as the apparent standard deviation can actually increase with an increasing number of samples when the low frequency noise makes itself felt. 1/f noise, or drift, is the bane of instrumental analytical chemistry.
At some point searching the entire space of possible statistical models becomes a bit pointless– finding a trend is pretty hard if any form of mere chance hypothesis is available as an alternative. But having a credible causal hypothesis and some real physical constraints in hand makes a big difference, epistemically speaking.