I really dislike going down this road, but it’s articles like today’s Global Warming and Settled Science in American Thinker that confuse the global warming science debate and end up wasting my time as I get e-mails asking for comment.
Now, those familiar with my views KNOW I’m a huge critic of the climate models used to predict global warming. I believe there are serious biases in them. But most of the physics they involve is pretty good. Yet it only takes one component (e.g. a cloud parameterization) to change a model’s response to increasing CO2 from catastrophic warming to benign warming.
So, when author Andre Lofthus throws around some radiation physics concepts and claims that the atmosphere cannot warm from more CO2 because (basically) the CO2 absorption bands are already 100% saturated, well, I have to respond. I’ve already done that once by e-mail this morning, but I’m sure more requests are coming, so I’m going to try to nip it in the bud here. (A while back a NASA story about CO2 cooling [GASP! COOLING!] in the upper atmosphere led to a Slayer blog post that I was getting e-mails about for months afterward. As if we didn’t already know that CO2 strongly cools the upper atmosphere. Geez.)
So, here we go again…
Even if the CO2 absorption bands were 100% opaque to the transmission of IR radiation from the surface to the top of the atmosphere, it wouldn’t matter…adding more CO2 would still cause a warming tendency in the lower atmosphere (and a cooling tendency in the upper atmosphere).
There are two reasons for this. The first is that pressure broadening causes the absorption lines to influence a much wider range of wavelengths, and those line wings are not “saturated”, that is, not 100% opaque.
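If you want to see the wing effect in numbers, here’s a toy Python sketch of a single pressure-broadened (Lorentzian) absorption line. The line strength and half-width are made-up illustrative values, not real CO2 spectroscopy; the point is simply that the line center can be saturated while the wings still respond when the absorber is doubled.

```python
import numpy as np

NU0       = 0.0    # line center (arbitrary spectral units)
ALPHA_L   = 0.1    # Lorentz half-width; grows with pressure (pressure broadening)
S_TIMES_N = 50.0   # line strength times baseline absorber amount (made up)

def optical_depth(nu, scale):
    """Optical depth of one pressure-broadened (Lorentzian) absorption line."""
    return scale * S_TIMES_N * (ALPHA_L / np.pi) / ((nu - NU0)**2 + ALPHA_L**2)

nu = np.linspace(-2.0, 2.0, 9)          # sample the line center and its wings
for scale in (1.0, 2.0):                # 1x and 2x the absorber (illustrative)
    transmission = np.exp(-optical_depth(nu, scale))
    print(f"{scale:.0f}x absorber:", np.round(transmission, 3))
# Both runs show ~0 transmission at the line center (already saturated),
# but the 2x run transmits noticeably less in the wings, where absorption
# is far from saturated.
```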
The second reason is that, even with 100% opacity (which cannot actually occur for gaseous absorption), adding more and more CO2 pushes the effective radiating altitude to space ever higher, where it is colder, which means less IR radiation emitted to space, which means a warming tendency for the lower atmosphere.
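Here’s the arithmetic of that chain in a toy Python sketch, using a typical tropospheric lapse rate and hypothetical emission altitudes (the altitudes and the one-level blackbody assumption are illustrative, not a real radiative transfer calculation):

```python
SIGMA      = 5.67e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
T_SURFACE  = 288.0     # K, rough global-mean surface temperature
LAPSE_RATE = 6.5       # K per km, typical tropospheric lapse rate

def olr_from(emission_altitude_km):
    """IR emitted to space if it all came from one effective radiating level."""
    t_emit = T_SURFACE - LAPSE_RATE * emission_altitude_km
    return SIGMA * t_emit**4

before = olr_from(5.0)   # hypothetical effective radiating altitude, less CO2
after  = olr_from(5.5)   # hypothetical, slightly higher altitude with more CO2
print(f"OLR before: {before:.1f} W/m^2")
print(f"OLR after : {after:.1f} W/m^2")
print(f"Energy retained: {before - after:.1f} W/m^2 -> lower-atmosphere warming tendency")
```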
In other words, 100% opacity of the atmosphere to IR in the CO2 absorption bands — even IF it existed — would not prevent a lower atmospheric warming tendency (and upper atmospheric cooling tendency) in response to further increases in CO2. This is why the surface of Venus is hot enough to melt lead…the atmosphere is so strongly radiatively insulated against loss of IR to space that the temperatures climb until radiative energy balance is achieved. Models have quite adequately explained the temperature profile on Venus with the known atmospheric composition, just as they explain the temperature profile here on Earth.
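For the curious, here’s what “temperatures climb until radiative energy balance is achieved” looks like in the simplest possible time-dependent form: a zero-dimensional toy with Earth-like, made-up numbers (a real model does this level by level to get the whole temperature profile, but the convergence idea is the same). The epsilon parameter is just a crude stand-in for how strongly the atmosphere restricts IR loss to space.

```python
SIGMA  = 5.67e-8    # Stefan-Boltzmann constant, W m^-2 K^-4
SOLAR  = 240.0      # absorbed solar flux, W/m^2 (Earth-like)
C_HEAT = 1.0e8      # heat capacity of an ocean mixed layer, J m^-2 K^-1 (rough)
DT     = 86400.0    # time step: one day, in seconds

def equilibrate(epsilon, t_start=200.0, years=200):
    """March temperature forward until emitted IR balances absorbed sunlight.

    epsilon crudely represents how freely IR escapes to space; a smaller
    epsilon means stronger radiative insulation by the atmosphere.
    """
    t = t_start
    for _ in range(int(years * 365)):
        imbalance = SOLAR - epsilon * SIGMA * t**4   # W/m^2
        t += imbalance * DT / C_HEAT
    return t

print("weaker IR insulation  :", round(equilibrate(0.62), 1), "K")
print("stronger IR insulation:", round(equilibrate(0.58), 1), "K")
# Both runs converge: temperature rises until radiative energy balance is
# achieved, and the more strongly insulated case settles at a warmer equilibrium.
```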
And until the Slayers and Mr. DC and Mr. JP come up with an alternative time-dependent model that converges to the observed temperature profile, they are doing little more than hand-waving.
The best direct observational evidence I have seen that increasing CO2 causes less IR to escape from the Earth to space is the NASA AIRS CO2 retrievals. These retrievals quantify the IR “trapping” effect (a poor term, I admit) at various IR wavelengths from the AIRS satellite measurements and, as I understand it, agree quite well with established theory.
The real question for global warming is: what are the feedback responses of the climate system to the resulting warming tendency? I believe the net feedback is relatively strongly negative, with little net warming as a result. Maybe with some deep-ocean heat storage thrown in (it won’t matter to anyone or anything if the deep ocean warms from 40.2 deg. F to 40.3 deg. F in the next 50 years).
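To put the feedback question in its simplest terms, equilibrium warming is the forcing divided by the net feedback parameter. The toy Python sketch below uses the commonly cited 3.7 W/m^2 forcing for doubled CO2 and two purely illustrative feedback values, just to show how strongly the answer depends on the feedback:

```python
FORCING_2XCO2 = 3.7   # W/m^2, commonly cited radiative forcing for doubled CO2

def equilibrium_warming(feedback_param):
    """Warming (K) needed to radiate away a 2xCO2 forcing: dT = dF / lambda."""
    return FORCING_2XCO2 / feedback_param

cases = [
    (1.2, "net positive feedbacks (high climate sensitivity)"),
    (5.0, "strong net negative feedback (low climate sensitivity)"),
]
for lam, label in cases:
    print(f"lambda = {lam:.1f} W/m^2/K, {label}: "
          f"{equilibrium_warming(lam):.1f} K per CO2 doubling")
```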
Not all of global warming theory has to be wrong…just one weak link in the chain can cause the whole house of cards to fall. Oops…I think I just mixed my metaphors again. Oh well, that’s the way the cookie bounces.