In The Amazing Case of “Back-Radiation” series, which included Part Two and Part Three, someone commented that it would have been good to see more than a few days of DLR (downward longwave radiation, aka “back radiation”) data. There were some monthly summaries from a number of locations, but the BSRN (Baseline Surface Radiation Network) data that I selected and plotted was quite limited.
At the time I was using Excel to load up the data, and with values recorded every minute it wasn’t easy to plot more than a week of data. Armed with some new tools, here is the data from Darwin, Australia for 2003 from the BSRN network:
Click on the image for a larger view
The mean = 409 W/m² and the standard deviation = 27 W/m². (I don’t know what happened in July; I expect it is more likely an instrument or data-collection issue than the DLR taking a vacation for the month.)
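As a side note on the processing: once the minute data is loaded, the mean and standard deviation for each graph are simple to compute. Here is a minimal Python sketch with synthetic values standing in for the real BSRN file (the actual station-to-archive file format and the real measurements are not reproduced here):

```python
import math
import random
import statistics

# Hypothetical example: one week of minute-resolution DLR readings (W/m^2).
# We synthesize a plausible diurnal cycle plus noise purely for illustration.
random.seed(0)
minutes_per_day = 60 * 24
dlr = [
    409
    + 20 * math.sin(2 * math.pi * t / minutes_per_day)  # day-night swing
    + random.gauss(0, 15)                               # weather/instrument noise
    for t in range(7 * minutes_per_day)
]

mean = statistics.fmean(dlr)
sd = statistics.pstdev(dlr)
print(f"mean = {mean:.0f} W/m^2, sd = {sd:.0f} W/m^2")
```

The real 2003 Darwin series gives the mean of 409 W/m² and sd of 27 W/m² quoted above; the synthetic numbers here merely show the shape of the calculation.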
Here is the expanded data for January through June. The vertical axis is the same for each graph for easier comparison. Click on any of the graphs below to get a larger view.
January:
February:
March:
April:
May:
June:
The atmosphere cools down much more slowly than the land, which is why the difference between daytime and nighttime DLR is generally quite small. The way to think about any “body” heating or cooling is to consider two factors:
- its specific heat capacity (how much heat is needed to raise 1 kg of that substance by 1 K or 1°C)
- its ability to radiate (or conduct) heat
99% of the atmosphere is composed of gases that can’t radiate any significant heat – N2 and O2. As shown in CO2 – An Insignificant Trace Gas?, the absorption and emission ability of these gases is more than a billion times less than that of water vapor and CO2.
So the result is that the atmosphere takes a long time to heat up and to cool down when radiation is involved.
What is important to understand is that the DLR value measured at any one time is dependent on two important factors:
- the temperature profile of the atmosphere above the measurement location
- the concentration of gases that can radiate longwave
So lateral air movements can cause large DLR changes. A strong wind bringing colder, drier air can reduce the DLR significantly, and a hotter, moister wind can increase it significantly.
How can you prove that DLR is the back-radiated portion?
i.e. the radiation which originated at the earth’s surface, was absorbed by some GHG, and was then emitted back towards the earth’s surface?
Why do you say that the difference between daytime and nighttime DLR is quite small? Isn’t it about 40 W/m² on these graphs? It seems quite significant at first glance (I don’t know the day-night temperature differences in Darwin, though).
PS. Do you have data on local surface temperature to plot on these graphs?
gnbatt:
I don’t like the term “back radiation” myself and the general term in use in atmospheric physics is “downward longwave radiation” or DLR.
The DLR is simply a consequence of these 2 points:
1. The atmosphere is above absolute zero
2. The atmosphere includes gases that can radiate at these temperatures
So trying to track one specific measurement and identify cause and effect is not particularly worthwhile, or even possible.
The DLR from the atmosphere that we measure at the surface is around 300 W/m².
If you have a theory about the reason why the atmosphere radiates this amount of energy, you are welcome to explain it (easier than me trying to guess). And then we can review your explanation.
ok, the ‘greenhouse’ or the ‘glass house’ analogy necessarily says that… it is the ‘back-radiation’ which causes the extra warming that we see on the earth’s surface?… so somebody needs to prove that it is indeed ‘back-radiation’ and not radiation ‘trapped’ by GHGs due to any number of extra-terrestrial sources?
How well established is the sd of the DLR?
Would it be correct to say that GHGs make the atmosphere less transparent to longwave radiation, and that increasing GHG levels in the atmosphere warms it by making it capture a larger part of the upward longwave radiation?
Is the increased DLR just a consequence of the warmer atmospheric temperatures and not the ’cause’ of the warming?
Or do both parts of the process (the increased capture of ULR by the atmosphere AND its increased ability to radiate heat downward) contribute to the warming?
(I am just trying to get my mind around those radiative processes and I am interested in your insights. Thank you for the outstanding work you are doing through this website.)
hunter:
For each dataset on a graph I used the Matlab command to calculate the standard deviation.
SoD,
You said local variations of GHG can affect local DLR significantly.
Can it affect local temperatures as well or is the GHE essentially global in its effects, as a result of the global energy budget?
propater:
The last one is most correct. If a gas can absorb at a wavelength it can also radiate at that wavelength.
If the atmosphere was transparent to the longwave radiation it would not be able to absorb any heat from radiation and the only heat would come from convection/conduction from the surface (see The Hoover Incident). And it would not lose any heat by radiation.
Because the atmosphere can absorb radiation from the earth it heats up and therefore also radiates. Without the downward portion of this radiation the earth would be much cooler.
Alexandre:
It is always, first of all, a “local” effect on temperatures.
But then heat flows from higher to lower temperatures which is what redistributes heat around the globe – so, for example, the poles radiate more than they receive in radiation, while tropical regions radiate less than they receive. For a little about that last point, see Predictability? With a Pinch of Salt please.. Part One
gnbatt:
You won’t find references to “greenhouse analogies” in any articles on this site except for using “greenhouse” gases as a well-known term.
Take a look at the CO2 – An Insignificant Trace Gas? series for more detailed explanation.
s of d,
Thank you.
It would appear that the sd is much larger than the signal of warming that CO2 is creating.
How is this reconciled in the science of doom?
I wouldn’t worry about the standard deviation in that context. Most of the deviation here comes from the day-night cycle. Measuring an increase shouldn’t be much of a problem.
Mait,
That seems a bit convenient, and does not seem to actually reconcile anything.
hunter:
It’s an issue of frequency bands. The sd is from the diurnal cycle, which is easy to separate from a slow multi-decadal secular trend like CO2 increase.
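The frequency-band point can be illustrated with a toy calculation (all numbers synthetic, chosen only to resemble the scales discussed here): averaging over whole days removes the diurnal cycle exactly and leaves the slow trend visible, even though the diurnal spread is far larger than the trend.

```python
import math

# Toy signal: large diurnal cycle (sd ~ 14 W/m^2) plus a small secular
# trend (+0.2 W/m^2 per year), evaluated at minute resolution.
minutes_per_day, days = 60 * 24, 730  # two years

def value(t):
    diurnal = 20 * math.sin(2 * math.pi * t / minutes_per_day)
    trend = 0.2 * t / (minutes_per_day * 365)
    return 300.0 + diurnal + trend

# Daily means: the diurnal term sums to zero over each full day.
daily_means = [
    sum(value(d * minutes_per_day + m) for m in range(minutes_per_day)) / minutes_per_day
    for d in range(days)
]
recovered = sum(daily_means[365:]) / 365 - sum(daily_means[:365]) / 365
print(f"year-2 mean minus year-1 mean: {recovered:.3f} W/m^2")
```

The recovered difference equals the injected 0.2 W/m² per year trend, despite the diurnal sd being some seventy times larger.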
Is the second part actually true on the emission side of DLR? I would think that the DLR value wouldn’t be much different for an atmosphere with 90% CO2 concentration versus 0.0009% CO2 concentration if the temperature of the gas is the same. Only the amount emitted by a single CO2 molecule would be much higher at the lower concentration, as it would borrow energy from other gases through collisions.
Emissivity = absorptivity. Would you say that the absorption of radiation wouldn’t be much different for 90% or 0.0009% CO2 because the CO2 molecules can just transfer the energy to the other components by collision? If you did, you would be wrong. Emission will change little in the center of the band once the absorptivity gets close to 1, but the emission band itself will get wider and the total emission will increase as the concentration of CO2 increases.
You are right, I wrote that quite badly. Actually the amount radiated by a CO2 molecule would be the same, but the molecules would just occupy a larger amount of space. The DLR should still stay the same, though.
hunter:
Perhaps you are misunderstanding the meaning of “standard deviation”. I think I have confused people by putting it on the graph.
In any case I suspect you are thinking of it as some kind of error estimate?
But in this case it isn’t. It is just a measure of the “spread” of the signal.
SoD,
If the natural spread is much larger than the signal being looked for what is the significance of the difference?
As Spencer points out, small changes in cloud cover can explain the observed changes, for example.
Mait,
Only at the center of the band, where emission and absorption are saturated (emissivity = absorptivity ≈ 1). The width of the emission band would change drastically from 0.0009% to 90%. Since the total emission is the integral of emission as a function of wavelength over all wavelengths, the total DLR from an atmosphere with 90% CO2 would be much higher than for an atmosphere with 0.0009% CO2. A quick and dirty calculation using MODTRAN ( http://geoflop.uchicago.edu/forecast/docs/Projects/modtran.orig.html ) at an altitude of 0 km looking up, with the 1976 U.S. Standard Atmosphere, gives the following net increases in DLR (relative to 0 ppmv CO2) for order-of-magnitude increases in CO2 starting at 9 ppmv or 0.0009%:
concentration (ppmv)    DLR (W/m²)
9                       12.9
90                      22.2
900                     32.8
9000                    47.3
The assumptions used in the MODTRAN calculations start to break down for high concentrations of CO2 so I didn’t bother trying to calculate 9% and 90%. But I assure you that DLR would continue to increase as the concentration of CO2 increased.
True, I probably added way too many zeros to my numbers. But is the MODTRAN calculator particularly good for this kind of calculation (I think it recalculates the surface temperature as well)?
A more refined argument would be that for bandwidths where the atmosphere is nontransparent at concentration1, increasing the concentration doesn’t have any effect on DLR as long as the temperature is constant?
Disclaimer: I’m not saying that increasing CO2 concentrations on earth wouldn’t result in an increase in DLR. My hypothesis is rather that DLR at the surface is highly dependent on the temperature of the lower layers of the atmosphere. That’s why I don’t quite understand why the DLR measurements are not coupled with temperature measurements.
If the natural spread is much larger than the signal being looked for what is the significance of the difference?
sorry- hit the wrong reply button.
Mait:
Increasing the concentration does have an effect on DLR. Emission at any given wavelength is proportional to:
– the concentration of emitters at that wavelength, and
– the Planck function at that wavelength (whose integral over all wavelengths is proportional to the 4th power of temperature)
Yes, but aren’t we talking about bodies with a constant volume and mass here? If we decrease the concentration, radiation from particles higher up will reach the surface (so to speak), which means that the radiating “body” would become larger as well.
“If the atmosphere was transparent to the longwave radiation it would not be able to absorb any heat from radiation and the only heat would come from convection/conduction from the surface (see The Hoover Incident ). And it would not lose any heat by radiation.”
Doesn’t that last statement apply to infrared radiation only? As long as the temperature is above absolute zero, the atmosphere must radiate at some wavelengths, no?
Rocco:
There’s no fundamental requirement that a gas has to radiate. In practice, as far as I know, all gases do radiate even though the intensity for some is so low as to be unmeasurable.
For example, N2 has an absorption factor (which equally applies to emission) 10 billion times less than CO2 in the 4um band. And nothing outside it.
So in this area of the band (which is infrared) N2 will radiate at best at 1/5,000,000 of the intensity of CO2. (This takes into account the relative concentrations of the two gases.)
There’s nothing special about infrared/non-infrared. And a gas at atmospheric temperatures will only be able to radiate anything of significance in the infrared bands (>0.7um). Here is a log plot of blackbody spectral intensity for some typical atmospheric temperatures plus higher temperatures we never see (325 K/52°C and 350 K/77°C):
As you can see, the intensity of radiation is extremely low below 1um. For example, for 300 K (27°C) the intensity is more than a million million times lower at 1um compared with 4um.
The significance of the Planck curves is that even if a gas had strong emission lines at these low wavelengths, the intensity would still be so low as to be unmeasurable.
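The size of that factor can be checked directly from the Planck function. A short sketch using standard physical constants (nothing here comes from the measured data):

```python
import math

def planck(wavelength_m, temp_k):
    """Planck spectral radiance B(lambda, T) in W m^-2 sr^-1 m^-1."""
    h, c, k = 6.626e-34, 2.998e8, 1.381e-23
    a = 2 * h * c**2 / wavelength_m**5
    return a / (math.exp(h * c / (wavelength_m * k * temp_k)) - 1)

# At 300 K, compare the blackbody intensity at 1 um with that at 4 um:
ratio = planck(4e-6, 300) / planck(1e-6, 300)
print(f"B(4um)/B(1um) at 300 K is about {ratio:.1e}")
```

The ratio comes out at a few times 10¹² – the “million million” quoted above.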
scienceofdoom,
You’re neglecting collisional induced absorption (CIA). When, say, a nitrogen molecule collides with another molecule, the electrostatic field of the molecule can be distorted resulting in a dipole moment. A photon can be either emitted or absorbed by this process. The process results in a continuum spectrum. Peak absorption is about 100 micrometers. CIA for oxygen and nitrogen can be neglected at low altitudes where there is significant water vapor present. The CIA for water vapor, which is likely responsible for what’s called the water vapor continuum, is many orders of magnitude higher than for oxygen and nitrogen. But if you have a limb path through the upper atmosphere, CIA for oxygen and nitrogen can’t be neglected.
[…] Darwinian Selection – “Back Radiation” […]
Dear SoD.
I am a little late, but I read the site when I need information and facts about specific subjects, and back radiation was up now.
You state that the back radiation mostly depends on gas concentration and temperature.
It violates the standard explanation that half the upwelling infrared is absorbed and radiated back. According to your statement it is, but because the gases are heated by the upwelling radiation and/or other sources.
It makes sense, but then the effect of CO2 is also a “feedback” and not a direct “forcing”.
I am just missing some numbers for the W/m² versus temperature that the gases will provide.
From Fort Peck I have found that the total back radiation follows the formula:
267 + 3.84*Tcelsius W/m² – at least in 2018. The factor 3.84 is higher than expected, but partly reflects more water vapor at higher temperatures.
I noticed that point because, seen from ground level, the back radiation is a big player in the ground temperature and in global and local temperature.
I hope for some responses.
-Svend
Svend: See reply below.
Svend: Much confusion occurs because the physics of radiation traveling in our atmosphere has been oversimplified and/or misrepresented. The spectral intensity of radiation emitted (dI_e) from a thin layer of atmosphere of thickness ds is given by:
dI_e = n*o*B(lambda,T)*ds
where: “spectral intensity” is the power/unit area at a particular wavelength, n is the density of emitting molecules (for example molecules/m^3), o is the absorption cross-section at that wavelength, and B(lambda,T) is the Planck function for that local temperature T at that wavelength.
The spectral intensity of the radiation absorbed (dI_a) by the same thin layer of atmosphere of thickness ds is given by:
dI_a = n*o*I*ds
where I is the spectral intensity of the radiation of the same wavelength entering the same thin layer of atmosphere. (ds is chosen to be small enough that we can treat I as a constant, a common practice in numerical integration.)
The total change in spectral intensity, dI, is:
dI = dI_e – dI_a = n*o*[B(lambda,T) – I]*ds
Various forms of this equation are called Schwarzschild’s equation of radiative transfer. (I wrote the Wikipedia article on this subject directed toward people like you and would appreciate comments.) To obtain the total change in power as radiation travels from point A to point B, you must integrate ds over the entire path and then integrate over all wavelengths. For back radiation (DLR), that path starts with zero intensity at the top of the atmosphere and ends at the surface of the planet. (Alternatively, DLR can start with emission from the bottom of a layer of clouds.) OLR starts with blackbody emission from the surface (or clouds) and ends at the top of the atmosphere. Software at the link below will do the numerical integration for you, but you pick a location on the planet (tropics, for example) to define the temperature at various locations in the atmosphere.
http://climatemodels.uchicago.edu/modtran/
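For readers who want to see the numerical integration made concrete, here is a minimal Python sketch that marches dI = n*o*[B(lambda,T) – I]*ds layer by layer at a single wavelength. The absorption coefficient, layer thickness, and temperature profile are illustrative values only, not a real atmosphere:

```python
import math

def planck(wavelength_m, temp_k):
    """Planck spectral radiance B(lambda, T) in W m^-2 sr^-1 m^-1."""
    h, c, k = 6.626e-34, 2.998e8, 1.381e-23
    return (2 * h * c**2 / wavelength_m**5) / (
        math.exp(h * c / (wavelength_m * k * temp_k)) - 1)

def schwarzschild(wavelength_m, n_sigma, temps, ds, intensity_in=0.0):
    """Integrate dI = n*o*(B - I)*ds along the path, one thin layer at a time.
    n_sigma is the product n*o (per metre); temps gives each layer's T (K)."""
    intensity = intensity_in
    for temp in temps:
        intensity += n_sigma * (planck(wavelength_m, temp) - intensity) * ds
    return intensity

# DLR-style path: start with zero intensity at the top and integrate down
# through an isothermal 260 K column at 15 um (total optical depth = 5).
dlr_15um = schwarzschild(15e-6, n_sigma=5e-4, temps=[260.0] * 1000, ds=10.0)
blackbody = planck(15e-6, 260.0)
print(dlr_15um / blackbody)  # approaches 1: emission saturates at B(lambda, T)
```

With enough optical depth the intensity saturates at the local blackbody value, exactly the equilibrium behaviour described in the text.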
The validity of this equation has been thoroughly tested in our atmosphere. It predicts what we observe.
If radiation travels far enough through an isothermal atmosphere, eventually absorption and emission will come into equilibrium and dI will be zero. That equilibrium is reached faster when a high concentration of absorbing/emitting gas (i.e. GHG) is present, and it is reached first at the wavelengths that interact most strongly (that have the largest absorption cross-section, such as 15 um for CO2). At equilibrium, I = B(lambda,T) and the radiation has blackbody intensity. Now the intensity of the radiation does NOT depend on the density of absorbing/emitting molecules or their absorption cross-section, only on temperature. Planck derived his famous law by assuming radiation in equilibrium with quantized oscillators – and that law applies ONLY to radiation in equilibrium with the medium it is traveling through.

At some wavelengths, the thermal infrared emitted by the surface passes directly through the atmosphere to space with negligible absorption (or emission). On the other hand, at the wavelength most strongly absorbed by CO2, 90% of the photons emitted by the surface are absorbed within the first meter – and emission has replaced essentially all of those photons with newly emitted photons. At this wavelength, the OLR has blackbody intensity well into the stratosphere. At all wavelengths, if you go high enough, the atmosphere eventually gets thin enough that a negligible amount of absorption and emission occurs on the way to and from space.
People simplify this complicated physics by referring to one of two extreme situations: “optically thin” and “optically thick” layers of atmosphere. An optically thick layer emits blackbody radiation. An optically thin layer of atmosphere emits in proportion to the concentration of the GHG(s) it contains. In both cases, emission depends on temperature.
PS. Above, I ignored some complications: scattering (which is important for visible light and makes the sky blue) and the fact that the above equations only apply when the atmosphere is in “local thermodynamic equilibrium” (which is true for our atmosphere where the vast majority of emission occurs).
Svend wrote: “It violates the standard explanation that half the upwelling infrared is absorbed and radiated back.”
This way of describing DLR suggests you may believe that absorbed OLR is “re-emitted” half upward and half downward – that absorption of a photon of thermal IR (say the 15 um radiation absorbed by CO2) creates an excited state that soon emits the same energy as a photon equally likely to travel in any direction. This is wrong. 99+% of vibrationally excited states are “relaxed” by collisions long before they emit a photon. For example, it takes an average of 1 second for the vibrationally excited state of CO2 to emit a 15 um photon, but the excited CO2 collides with other molecules about 10^9 times per second near the surface. Not all collisions relax an excited state of CO2, but relaxation is vastly faster than emission. So the energy stored when a photon is absorbed is almost always converted to kinetic energy (a higher local temperature) and is not re-emitted.
So how does CO2 ever emit thermal IR? It is excited by collisions. The fraction of CO2 molecules in an excited state at any time is determined by the Boltzmann distribution of energy. A CO2 molecule can be excited and relaxed by collisions millions of times before it emits a single photon. Planck assumed a Boltzmann distribution of energy in his quantized oscillators when he derived Planck’s Law (and the same assumption is used with the Schwarzschild equation). When the fraction of CO2 in excited states is determined by the Boltzmann distribution, “local thermodynamic equilibrium” is said to exist, and emission according to Planck’s Law and the Schwarzschild equation depends only on temperature. When the local radiation field is so intense that the number of CO2 molecules in excited states is significantly increased by absorption, then local thermodynamic equilibrium doesn’t exist and these equations aren’t accurate. Local thermodynamic equilibrium exists in the troposphere and most of the stratosphere, where almost all emission and absorption of thermal IR occurs. Above the stratosphere, collisions are much less frequent and thermal infrared is not in local thermodynamic equilibrium, but there are too few GHGs at those altitudes to change OLR and DLR.
Lasers, LEDs, and fluorescent lights create excited states that emit visible light without being hot enough for collisions to create those excited states. But this isn’t what happens in the atmosphere.
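The relaxation-versus-emission competition above reduces to a one-line estimate, using the rates quoted (about 1 s to emit a 15 um photon, ~10^9 collisions per second near the surface):

```python
# Rates quoted in the comment above (rough, order-of-magnitude values):
emission_rate = 1.0        # radiative de-excitations per second (~1 s lifetime)
collision_rate = 1.0e9     # collisions per second near the surface

# Chance an excited CO2 molecule emits a photon before a collision relaxes it:
p_emit = emission_rate / (emission_rate + collision_rate)
print(f"probability of emission before relaxation ~ {p_emit:.0e}")
```

Roughly one in a billion – which is why absorbed energy is almost always thermalized rather than directly re-emitted.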
When one uses the Schwarzschild equation to calculate radiation transfer, one needs to input the temperature at every altitude. In the stratosphere, the local temperature is controlled by “radiative equilibrium”. If there is more absorption than emission at an altitude, you can use the absorbed energy to raise the local temperature and recalculate until you find the temperatures where absorption equals emission everywhere in the stratosphere.

In the troposphere, another principle applies, because the atmosphere is too opaque to let all of the energy from SWR escape as thermal IR. If radiative equilibrium controlled tropospheric temperature, the temperature would be about 350 K. In that case, however, the lapse rate (temperature decrease with altitude) would make the atmosphere extremely unstable to buoyancy-driven convection. That is why the lapse rate in the troposphere averages about 6.5 K/km. Wherever the atmosphere radiates away too much heat and the lapse rate becomes unstable, convection spontaneously increases to restore a stable lapse rate. Wherever it radiates away too little heat, convection of latent and sensible heat from below will slow. This is called radiative-convective equilibrium. So the temperatures used for radiation transfer calculations in the troposphere are those we currently observe, not temperatures we calculate. AOGCMs, but not simple radiative transfer calculations like MODTRAN, try to calculate both radiative flux and temperature. The fundamental physics of radiation transfer is thoroughly understood and calculable, but convection and precipitation can’t be calculated for grid cells and must be parameterized.
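The convective-adjustment principle described above can be sketched as a toy calculation (illustrative only; real models are vastly more elaborate): wherever the lapse rate exceeds the critical value, heat is mixed between adjacent layers, conserving energy, until the profile is stable.

```python
def convective_adjust(temps_k, dz_km=1.0, critical_lapse=6.5, sweeps=200):
    """Relax an unstable temperature profile toward the critical lapse rate.
    temps_k[0] is the lowest layer; dz_km is the layer spacing in km."""
    t = list(temps_k)
    for _ in range(sweeps):
        for i in range(len(t) - 1):
            lapse = (t[i] - t[i + 1]) / dz_km
            if lapse > critical_lapse:
                excess = (lapse - critical_lapse) * dz_km
                t[i] -= excess / 2      # cool the lower layer...
                t[i + 1] += excess / 2  # ...warm the upper one (energy conserved)
    return t

# A 10 K/km profile is unstable; adjustment relaxes it toward ~6.5 K/km
# while leaving the column's total heat content unchanged.
profile = [300.0 - 10.0 * z for z in range(10)]
adjusted = convective_adjust(profile)
```

The pairwise mixing conserves the column total exactly, and repeated sweeps drive every layer pair to the critical lapse rate.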
Thanks Frank, it helped a lot.
I have some follow up on this statement: “An optically thick layer emits blackbody radiation. An optically thin layer of atmosphere emits in proportion to the concentration of the GHG(s) it contains. In both cases, emission depends on temperature.”
A thick layer emits blackbody radiation. Does that mean it radiates all over like a blackbody, or is the intensity following a blackbody curve but restricted to the bands where it interacts? I believe it would not radiate much in the atmospheric window band.
Regarding clouds, I have pointed my IR thermometer at the sky from time to time, and noticed that a clear sky nearly always measures below −20°C (the limit of the instrument). When clouds are present I measure from −5 to 15°C. I use the result to estimate the height of the clouds: every degree of difference from the ground should give 150 m.
That is what got me speculating about back radiation and what made me investigate the SURFRAD stations.
-Svend
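Svend’s rule of thumb (each degree of difference ≈ 150 m) is just the average tropospheric lapse rate inverted: 1 K divided by ~6.5 K/km is roughly 150 m. A tiny sketch of the estimate:

```python
def cloud_base_height_m(ground_temp_c, cloud_temp_c, lapse_k_per_km=6.5):
    """Estimate cloud-base height from an IR-thermometer reading, assuming
    a uniform average lapse rate (a rough rule of thumb, not a measurement)."""
    return (ground_temp_c - cloud_temp_c) / lapse_k_per_km * 1000.0

# e.g. ground at 20 C, cloud-base reading of 7 C -> about 2 km up.
print(cloud_base_height_m(20.0, 7.0))
```

This ignores the IR thermometer’s spectral response and any emissivity corrections, so it is only a first-order estimate.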
Svend wrote: “A thick layer emits blackbody radiation. Does that mean it radiates all over like a blackbody, or is the intensity following a blackbody curve but restricted to the bands where it interacts? I believe it would not radiate much in the atmospheric window band.”
You are correct. Layers of atmosphere that are optically thick at all wavelengths aren’t found in our atmosphere. They are mostly used in simplified models and homework problems for teaching students about the greenhouse effect. Most people vehemently support the use of such models, but I’ve suffered much frustration when simplified models eventually proved inadequate and I didn’t know where to turn. No heat can be transferred through the isothermal optically thick layers found in many problems. Furthermore, no GHE exists in an atmosphere whose temperature is constant with altitude. This is the case on the average in Antarctica. Due to my frustrations with oversimplified models, I choose to start with Schwarzschild’s equation for radiation transfer and deal with the awkwardness of numerical integration with the help of MODTRAN.
IMO, the best thing one can do is ask questions of MODTRAN in addition to experimenting with your IR thermometer. I’m not sure what range of wavelengths your IR thermometer is responding to. MODTRAN will tell you at what wavelengths of thermal infrared the atmosphere is “shining down” with blackbody intensity appropriate for the local surface temperature, and tell you what GHGs are responsible. You can add clouds. There is a temperature vs altitude graph on the right, and curves showing blackbody intensity for various temperatures and wavelengths. You can look up from different heights above the surface and discover that water vapor rapidly becomes less important, and that there is a water vapor continuum that weakly emits at all wavelengths near the surface.
If you are skeptical about the accuracy of radiative transfer calculations, see this post from SoD and remember that you are looking at 1965 calculations. Scientists have collected increasingly accurate data about the absorption cross-sections of GHGs at different temperatures and pressures, and accuracy has increased to the point where the discrepancies between observation and calculation are about the same size as the estimated error in the observations. However, no one is publishing simple graphs demonstrating for amateurs how well calculations work today.
https://scienceofdoom.com/2010/11/01/theory-and-experiment-atmospheric-radiation/