On Lunar Madness and Physics Basics, one commenter asked a very good question in response to a badly phrased answer from me.
He originally asked:
You agree that if earth had 100% nitrogen atmosphere (a non greenhouse gas), the “average” temperature of earth would be different (I’m not entirely sure it wouldn’t be lower) from the 255K that blackbody radiation would suggest.
I got the meaning wrong and said “right, it would be 255K”, and wasn’t very specific about what I meant, which was a mistake, as the article in question had just explained everything that’s wrong with averages.
He responded with an interesting example of a fictional Latvia, where Latvia got all the solar energy (somehow) and the rest of the world none, and showed that the average temperature of the earth was 0.5K or almost -273°C (and Latvia was quite steamy).
And after I had written a comment as long as a post, I thought it was probably a subject worth making a post out of.
“Nice example, and I guess as I’ve been explaining everything that’s wrong with averages I should have been more careful in my explanation.
We will consider a real earth with an albedo..”
Absorbed Solar Radiation
The absorbed solar radiation has an average of 239 W/m². This is averaged over the surface of the earth, which is 5.1 × 10⁸ km² = 5.1 × 10¹⁴ m².
The incoming absorbed energy is therefore 1.2 × 10¹⁷ W.
Note: this average of 239 W/m² is a measured value of incoming minus reflected solar radiation, divided by 4 for spatial geometry reasons, see The Earth’s Energy Budget – Part One. The total of 1.2 × 10¹⁷ W also equals the TSI of 1367 W/m² × the “disc area” of the earth × (1 − albedo).
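As a quick sanity check, both routes to the same total can be sketched in a few lines of Python (the earth radius and the 0.30 albedo are standard reference figures, not taken from the text):

```python
import math

# Assumed standard values (not from the text)
R = 6.371e6                        # earth radius, m
albedo = 0.30                      # typical planetary albedo

area_sphere = 4 * math.pi * R**2   # surface area, ≈ 5.1 × 10¹⁴ m²
total_from_avg = 239 * area_sphere # average absorbed flux × surface area

area_disc = math.pi * R**2         # disc intercepting the solar beam
total_from_tsi = 1367 * area_disc * (1 - albedo)

# Both routes give ≈ 1.2 × 10¹⁷ W
```

The factor of 4 mentioned above is just the ratio of the sphere’s surface area to the intercepting disc’s area, which is why the two totals agree.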
This is also currently the measured average outgoing longwave radiation (OLR) at the top of atmosphere, within instrument error.
We’ll come to the energy-grabbing Latvia later (sorry Latvians, it wasn’t my idea).
It’s a typical earth in all other respects, but a cosmic being has just “hoovered up” the trace gases like CO₂, CH₄ and N₂O, and turned water vapor into a non-absorbing gas. (If water vapor was also hoovered up, the oceans and lakes would be a ready source of water vapor, and within a month or two the atmosphere would have the same water vapor as before.)
As a result radiation >4μm just goes right through it. That is the whole point about radiation – if nothing absorbs it, it keeps on going.
The radiation emitted from the earth just after the Hoover incident is 396 W/m². Or, putting it another way, the total radiation emitted from the earth’s surface is 2.0 × 10¹⁷ W.
This total radiation is calculated by adding up the radiation from every square meter on the earth. And the average is simply the total divided by the area.
Note: in fact this value will change across a day and a year, and even from year to year. So 2.0 × 10¹⁷ W is just the annual average across an appropriate time period. But it is appreciably higher than the absorbed solar radiation of 1.2 × 10¹⁷ W.
Energy Loss through Radiation and Climate Response
The net radiation loss from the planet starts at 2.0 × 10¹⁷ − 1.2 × 10¹⁷ = 0.8 × 10¹⁷ W.
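The imbalance just after the Hoover incident is simple arithmetic; a minimal sketch, taking the surface area as 5.1 × 10¹⁴ m² from earlier:

```python
A = 5.1e14             # earth's surface area, m²
emitted = 396 * A      # surface emission just after the Hoover incident, ≈ 2.0 × 10¹⁷ W
absorbed = 239 * A     # absorbed solar radiation, ≈ 1.2 × 10¹⁷ W
net_loss = emitted - absorbed   # net radiation loss, ≈ 0.8 × 10¹⁷ W
```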
Who knows what kind of climate response that would generate? And the time for the total response would also depend on how well-mixed the oceans were (because of their large heat capacity), but rather than trying to work out how long it will take, we can say that the earth will cool down over a period of time.
And no matter what happens to convection, lapse rates, and rainfall, this cooling will continue. That’s because these aspects of the climate only distribute the heat. Nothing can stop the radiation loss from the surface because the atmosphere is no longer absorbing radiation. They might enhance or reduce the cooling by changing the surface temperature in some way – because radiation emitted by the surface is a function of temperature (proportional to T⁴). But while energy out > energy in, the climate system would be cooling.
Clouds and Ice Sheets
It’s possible (although unlikely) that all the clouds would disappear, in which case the net incoming minus reflected radiation might increase, perhaps to 287 W/m². (This value is chosen by measuring the current climate’s solar reflection by clouds, see Clouds and Water Vapor.)
If that happened, the total absorbed energy would be about 1.5 × 10¹⁷ W.
However, as the earth cools the ice sheets will increase, and there’s no doubt that the albedo of the earth’s surface will increase, so who knows exactly what would happen.
But it seems the absorbed solar radiation would at most go from 1.2 × 10¹⁷ W to 1.5 × 10¹⁷ W, and more likely it would reduce below 1.2 × 10¹⁷ W.
A New Equilibrium
Eventually the outgoing radiation would approximately match the incoming radiation and the temperature would settle around a value – this would take centuries of course, maybe thousands of years.
And depending on the ice sheet extent, and whether any clouds still existed, the value of outgoing radiation might be around 1.0–1.5 × 10¹⁷ W. The upper value would depend on the ice sheets not growing and all the clouds disappearing, which seems impossible, but it’s just for illustration.
Remember that nothing in all this time can stop the emitted radiation from the surface making it to space. So the only changes in the energy balance can come from changes to the earth’s albedo (affecting absorbed solar radiation).
And given that when objects emit more energy than they absorb they cool down, the earth will certainly cool. The atmosphere cannot emit any radiation so any atmospheric changes will only change the distribution of energy around the climate system.
What would the temperature of the earth be?
I have no idea.
But let’s pick the 1.2 × 10¹⁷ W as our average incoming less reflected solar energy, and so when “equilibrium” is reached the earth’s total surface radiation will be at this value. (Strictly speaking equilibrium is never truly the case, but we are considering the case where, measured over a few decades, the average outgoing radiation is 1.2 × 10¹⁷ W.)
Suppose that all around the world the temperature of the surface was identical. It can’t happen, but just to put a stake in the ground, so to speak.
Average outgoing radiation / surface area = 235 W/m² (that’s because I originally wrote down 1.2 × 10¹⁷ instead of 1.22 × 10¹⁷, and the rounding has caused a change from 239 W/m² – it doesn’t particularly matter, more a note for those in the “rounding police”).
And with a longwave emissivity close to 1 for most of the earth’s surface, this would be a temperature of 254K.
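The conversion from flux to temperature here is just the Stefan–Boltzmann law rearranged; a minimal sketch:

```python
sigma = 5.67e-8              # Stefan–Boltzmann constant, W/m²K⁴
flux = 235                   # average outgoing radiation, W/m²
T = (flux / sigma) ** 0.25   # effective blackbody temperature, ≈ 254K
```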
The Energy-Grabbing Latvians
As I mentioned before, Latvians and Latvophiles, this wasn’t my idea. And anyway, it probably wasn’t your fault. I’ll adjust the commenter’s numbers slightly to account for the earth’s albedo.
In his example, mythical-Latvia has a surface area of 10,000 km² = 10¹⁰ m². And mythical-Latvia absorbs all of the energy, with none left for the rest of the earth. The rest of the earth is at a temperature of 0K and Latvia is at a temperature of about 3814K, which means it radiates 1.2 × 10⁷ W/m². Therefore the radiation from the whole earth = 1.2 × 10¹⁷ W, so the earth is in equilibrium (with the solar radiation absorbed), but the average temperature of the whole earth = 3814 × 10¹⁰ / 5.1 × 10¹⁴ ≈ 0.07K.
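Redoing the mythical-Latvia arithmetic explicitly: the equilibrium temperature follows from requiring an area of 10¹⁰ m² to radiate the full 1.2 × 10¹⁷ W, via the Stefan–Boltzmann law:

```python
sigma = 5.67e-8          # Stefan–Boltzmann constant, W/m²K⁴
total = 1.2e17           # absorbed solar radiation, W
A_latvia = 1e10          # mythical-Latvia's area, m²
A_earth = 5.1e14         # earth's surface area, m²

flux = total / A_latvia                 # 1.2 × 10⁷ W/m², all radiated from Latvia
T_latvia = (flux / sigma) ** 0.25       # equilibrium temperature of Latvia, ≈ 3814K
T_avg = T_latvia * A_latvia / A_earth   # area-weighted mean temperature, well under 0.1K
```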
And our commenter has nicely demonstrated the same point as in Lunar Madness and Physics Basics – you can have the same radiation from a surface with totally different average temperatures.
The Maximum “Average” Temperature
So we have had two approaches to calculating our equilibrium hoovered atmosphere, one with a temperature of 254K (-19°C) and one with a temperature of less than 0.1K (almost -273°C) – both quite unrealistic situations, it should be noted.
What’s the maximum “average” temperature? As already noted, at the wavelengths the earth radiates (>4μm) it is quite close to a blackbody.
In this case, the maximum “average” temperature can’t be higher than what is known as the “effective blackbody temperature”, Teff.
This value, Teff, is a common convention to denote the effective radiating temperature of a body. This is just the temperature–radiation conversion from the Stefan–Boltzmann equation, Teff = (E/σ)^¼ – the rearranged version of the more familiar E = σT⁴. It simply converts the energy radiated into a temperature. So 235 W/m² corresponds to 254K.
It doesn’t mean, as already explained, that Teff is the “average” temperature. The “average” temperature (arithmetic mean) can be quite different from Teff, demonstrating that average temperature is a troublesome value.
I created confusion by using the concept of Teff in my comment on the earlier article. By saying the earth would be at 255K without a radiatively absorbing atmosphere, I really meant that in that situation the earth’s surface would be radiating (averaged globally and annually) 239 W/m².
The Earth Without an Absorbing Atmosphere
With conventions out of the way: if the atmosphere didn’t absorb any terrestrial radiation, the radiation from the surface would slowly fall from its current annual global average of 396 W/m² to around 240 W/m².
The climate would undergo dramatic changes of course and no one can say exactly what the equilibrium “effective blackbody radiating temperature” would be as we don’t know how much solar radiation would be reflected in this new climate. Clouds, ice – and aerosols – all play a part in reflecting the solar radiation from the atmosphere and the surface, and if these change the amount of energy absorbed changes.
But without an atmosphere that absorbs longwave radiation there is no way that the radiation from the surface can be greater than the radiation from the top of the atmosphere. And that means that eventually the emission of radiation from the surface would be approximately equal to the absorbed solar radiation.
Therefore, the value of global annual average surface radiation might be 290 W/m² (unlikely), or it might be less than 239 W/m² (more likely).
The world would be much colder.
With an annual surface radiation of 239 W/m², the “average temperature” would be -18°C or colder.