What Causes the Large Swings in Global Satellite Temperatures?

March 16th, 2012

One of the most frequent questions I get pertains to the large amount of variability seen in the daily global-average temperatures we make available on the Discover website. From Aqua AMSU ch. 5, these temperatures can undergo wide swings every few weeks, leading to e-mail questions like, “Is the satellite broken?”

Unusually good examples of these swings have occurred over the last couple of months. In the following plot, taken from the website today, we see January and February of 2012 experiencing two full cycles of temperature variations of about 0.5 deg. C (click for large version):

We have observed this behavior ever since John Christy and I started the satellite-based global temperature monitoring business over 20 years ago.

These temperature swings are mostly the result of variations in rainfall activity. Precipitation systems, which are constantly occurring around the world, release the latent heat of condensation of water vapor which was absorbed during the process of evaporation from the Earth’s surface.

While this process is continuously occurring, there are periods when such activity is somewhat more intense or widespread. These events, called Intra-Seasonal Oscillations (ISOs), are most evident over the tropical Pacific Ocean.

During the convectively active phase of the ISO, surface winds averaged over the tropical oceans increase by as much as 1 to 2 knots, which causes faster surface evaporation, more water vapor in the troposphere, and more convective rainfall activity. This above-average release of latent heat exceeds the rate at which the atmosphere emits infrared radiation to space, and so the resulting energy imbalance causes a temperature increase (see the above plot).

During the convectively inactive phase, the opposite happens: a decrease in surface wind, evaporation, rainfall, and temperature, as the atmosphere radiatively cools more rapidly than latent heating can replenish the energy.

(As I keep emphasizing, a temperature change is caused by an imbalance between energy gain and energy loss. You can cause warming either by increasing the rate of energy gain, or by decreasing the rate of energy loss. This is how the “greenhouse effect” works, by reducing the rate of radiative energy loss by the surface).
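To make that energy-budget bookkeeping concrete, here is a minimal toy sketch (my own illustration, not the actual processing behind the Discover plots): a single atmospheric column whose temperature anomaly responds to a sinusoidal latent-heating anomaly, damped by extra infrared emission. The damping coefficient, heating amplitude, and 40-day period are assumed round numbers; with them, the model produces swings of a few tenths of a degree over several weeks, the same order of magnitude as the AMSU variations.

```python
# Toy energy-balance sketch of intraseasonal temperature swings.
# Illustrative numbers only; not the processing behind the AMSU data.
import math

CP = 1004.0          # specific heat of air, J/kg/K
PS = 101325.0        # surface pressure, Pa
G = 9.81             # gravity, m/s^2
C_ATM = CP * PS / G  # heat capacity of an atmospheric column, ~1e7 J/m^2/K

LAMBDA = 2.0         # radiative damping, W/m^2 per K of anomaly (assumed)
AMP = 3.0            # amplitude of the latent-heating anomaly, W/m^2 (assumed)
PERIOD = 40.0        # days per cycle (roughly a Madden-Julian time scale)

dt = 3600.0          # 1-hour time step, s
ndays = 120
nsteps = int(ndays * 86400 / dt)

T = 0.0              # temperature anomaly, K
for i in range(nsteps):
    t_days = i * dt / 86400.0
    q_latent = AMP * math.sin(2.0 * math.pi * t_days / PERIOD)  # heating anomaly
    # dT/dt = (energy gain - energy loss) / heat capacity
    T += (q_latent - LAMBDA * T) * dt / C_ATM
    if i % int(10 * 86400 / dt) == 0:
        print(f"day {t_days:5.0f}: temperature anomaly {T:+.2f} K")
```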

There are observed cloud and radiation budget changes associated with ISOs, as described in our 2007 paper which analyzed the average behavior of the ISO over a 6 year period (see Figs. 1, 2, and 3 in that paper). The main mode of ISO activity is called the Madden-Julian Oscillation.

How Can Rainfall Cause Warming?

I will admit, even as a fresh PhD researcher, the idea that rainfall causes heating seemed counter-intuitive. This is because we are used to the cooling effect of rain when it falls to the surface, which is where we live. But high up in the atmosphere where the rain forms, huge amounts of heat are given up in the process. Just as water evaporating from your skin absorbs heat and feels cool, that same heat is released when the water vapor condenses back into liquid.

The most vivid example of this process is the warm core of a hurricane, which is heated by the rainfall occurring around the hurricane eye. What actually happens is that the latent heat release within clouds causes localized warming which is almost immediately “relieved” by ascending motion (convective updrafts, which frequent flyers are familiar with).

Because rising air cools, there is little net temperature change in the ascending parcels of air…but all of that rising air also forces air outside of the cloud systems to sink, causing “subsidence warming”. That is where most of the actual temperature increase takes place. The hurricane eye is the most extreme example of this process, where subsiding air gets concentrated in a relatively small region and the temperature can rise many degrees. More commonly though (such as with the ISO phenomenon), the subsidence warming gets spread over hundreds or thousands of km of atmosphere, and the temperature rise is only a fraction of a degree.

Slaying the Slayers with the Alabama Two-Step

March 14th, 2012

last edited for clarity 11:05 CDT 3/14/2012

A recent article by S. Fred Singer in American Thinker entitled Climate Deniers are Giving Us Skeptics a Bad Name has caused a resurgence of attacks against those of us who believe that the Earth does indeed have a “greenhouse effect” which makes the Earth’s surface warmer than it would otherwise be.

As a result of Fred’s article, angry E-mails have been flying like arrows, and belligerent blog posts have littered the landscape. (Well, maybe that’s a bit of an exaggeration. But I enjoyed writing it.)

In the American Thinker article, Fred has taken the tack that the “denier” label (which left-leaning folk seem to love calling us) should be applied to those who deny the so-called greenhouse effect (which states that infrared absorbing gases in the atmosphere perform a surface-warming function) and should not be applied to those of us who are skeptical of how much humans have warmed the climate system by slightly enhancing the natural greenhouse effect with more carbon dioxide from the burning of carbon-based fuels. The anti-greenhouse crowd’s bible seems to be the book, Slaying the Sky Dragon – Death of the Greenhouse Gas Theory.

The arguments between us and the anti-greenhouse advocates often become technical and devolve into disputes over the 1st or 2nd Law of Thermodynamics, whether photons really exist, whether a carbon dioxide molecule which absorbs IR energy immediately releases it again, whether outer space is an ‘insulator’, etc. Lay people quickly become overwhelmed, and even some of us technical types end up feeling ill-equipped to argue outside our areas of expertise.

I believe the fact that infrared-absorbing gases warm the surface and lower atmosphere can be easily demonstrated with 2 simple steps. The first step is in the realm of everyone’s daily experience. The second step is based upon satellite measurements of the Earth giving off infrared energy, measurements which were first made over 40 years ago.

The Alabama Two-Step

STEP 1:
Temperature is determined by rates of energy gain and energy loss. It does not matter whether we are talking about the human body, a car engine, a pot of water on the stove, or the climate system. The temperature (and whether it is rising or falling) is determined by the rates of energy gain and energy loss. In the case of the climate system, the Earth receives energy from the sun (primarily at visible wavelengths of light), and loses energy to outer space (primarily at infrared wavelengths). A temperature rise can occur either from (1) increasing the rate of energy gain, or (2) decreasing the rate of energy loss. The greenhouse effect has to do with the 2nd of these possibilities.

STEP 2:
Infrared absorbing gases reduce the rate at which the Earth loses infrared energy to space. Satellite measurements of this rate of energy loss were made as early as the 1970s, from the NASA Nimbus 4 spacecraft. The following plot shows the IR intensity (vertical axis) as a function of IR wavelength (horizontal axis). The area under the jagged curve is proportional to the rate of energy loss to space. Note that at the wavelengths where water vapor, carbon dioxide, and ozone absorb and emit IR energy, the rate of energy loss by the Earth is reduced.
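The jagged curve in such plots is essentially a blackbody (Planck) curve with “bites” taken out where greenhouse gases emit to space from colder, higher layers instead of from the warm surface. A minimal sketch of why those bites reduce the energy loss (my own illustration with assumed temperatures, not the Nimbus 4 retrieval):

```python
# Planck spectral radiance at the 15-micron CO2 band: emission to space from a
# cold upper-tropospheric layer is much weaker than emission from the surface.
# The two temperatures are assumed round numbers, not retrieved values.
import math

H = 6.626e-34   # Planck constant, J s
C = 2.998e8     # speed of light, m/s
K = 1.381e-23   # Boltzmann constant, J/K

def planck(wavelength_m, temp_k):
    """Blackbody spectral radiance, W / (m^2 sr m)."""
    a = 2.0 * H * C**2 / wavelength_m**5
    b = math.exp(H * C / (wavelength_m * K * temp_k)) - 1.0
    return a / b

wl = 15e-6                      # 15 microns, center of the CO2 absorption band
b_surface = planck(wl, 288.0)   # emission if the ~288 K surface radiated freely
b_band = planck(wl, 220.0)      # emission from a cold layer high in the troposphere

print(f"radiance from 288 K surface : {b_surface:.2e} W/(m^2 sr m)")
print(f"radiance from 220 K layer   : {b_band:.2e} W/(m^2 sr m)")
print(f"reduction factor            : {b_surface / b_band:.1f}x")
```

At these assumed temperatures the emission at the center of the 15-micron band is roughly a third of what the surface would emit at that wavelength, which is the dip you see in the measured spectrum.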

Now, let’s take Steps 1 and 2 together: If you add more of these “greenhouse gases” and nothing else changes, then the rate at which the Earth, as a whole, loses energy to space is reduced. This must lead to a warming tendency, at least at the surface and lower atmosphere (the upper atmosphere will actually experience a cooling effect).
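The two steps can also be collapsed into the standard zero-dimensional “grey body” energy balance found in textbooks (not anything specific to our satellite work): in equilibrium, absorbed sunlight S(1 − albedo)/4 equals emitted infrared εσT⁴, so anything that lowers the effective emissivity ε, i.e. reduces the rate of IR loss at a given temperature, forces the equilibrium temperature upward. A sketch with illustrative round numbers:

```python
# Zero-dimensional "grey body" energy balance: absorbed solar = emitted IR.
#   S * (1 - albedo) / 4 = epsilon * sigma * T**4
# Lowering the effective emissivity epsilon (i.e., reducing the rate of IR
# energy loss to space) raises the equilibrium temperature T.
# Illustrative textbook numbers, not a climate-model calculation.
SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W/m^2/K^4
S = 1361.0        # solar constant, W/m^2
ALBEDO = 0.30     # planetary albedo

def equilibrium_temp(epsilon):
    absorbed = S * (1.0 - ALBEDO) / 4.0      # ~238 W/m^2 of solar energy gain
    return (absorbed / (epsilon * SIGMA)) ** 0.25

for eps in (1.00, 0.62, 0.61):
    print(f"effective emissivity {eps:.2f} -> equilibrium T = "
          f"{equilibrium_temp(eps):.1f} K")
```

With ε near 0.6 the formula lands near the observed ~288 K surface temperature, and each further small reduction in ε produces a warming tendency, which is the whole point of Steps 1 and 2.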

If your head is already exploding at this point, let’s use the (admittedly imperfect) analogy of a thermal IR image of a house at night, which shows how the windows and poorly insulated parts of a house are points of greater IR energy loss:

When you add insulation to your house, you reduce the rate of energy loss in the winter, which will raise the temperature inside the house (all other things being the same), while at the same time reducing the temperature of the exterior of the house. Similarly, greenhouse gases provide “radiative insulation” to the climate system, raising the temperature of the surface and lower atmosphere, while lowering the temperature of the middle and upper atmosphere.

The above analysis is, I believe, consistent with the views of MIT’s Dick Lindzen. It clearly demonstrates that IR absorbing gases (greenhouse gases) reduce the Earth’s ability to cool to outer space. No amount of obfuscation or strawman arguments in the comments section, below, will be able to get around this fact.

But HOW MUCH Warming?

The question of how much warming will result from adding carbon dioxide to the atmosphere is what we skeptics are skeptical of. The climate system is amazingly complex, and the IPCC position that elements within the climate system (especially clouds) will change in ways which amplify the resulting small warming tendency is highly questionable, to say the least. If the climate system instead acts to reduce the warming, then anthropogenic global warming (AGW) becomes for all practical purposes a non-issue.

This represents what I believe to be the simplest description of how greenhouse gases cause warming of the surface. It bypasses all of the esoteric discussions and instead deals only with observations, which I believe cannot be easily explained any other way:

FIRST, warming can be caused by a decrease in the rate of energy loss by the climate system (or any other system, for that matter).

SECOND, IR absorbing gases are observed from satellites to reduce the rate of energy loss to space.

THEREFORE, adding more IR absorbing gases will cause a warming tendency.

QED.

Again I emphasize, however, the above simple argument is necessarily true only to the extent that all other elements of the climate system remain the same, which they will not. These other changes are called ‘feedbacks’, and they can either make or break theories of global warming and associated climate change.

Regarding the Inevitable Questions…

1. Yes, the CO2 absorption bands are already mostly saturated…but the wings of those bands are not, as is evident in the above graph of satellite measurements. This saturation effect partly explains why the approximately 40% increase in atmospheric CO2 since pre-industrial times has resulted in only about a 1% decrease in the theoretically calculated rate of IR loss by the Earth to space (a rough version of this arithmetic is sketched after point 2 below). This is already accounted for in the climate models used by the IPCC.

2. Yes, convection is indeed a major mechanism of heat loss from the Earth’s surface. It greatly reduces the surface temperature of the Earth (and, by energy conservation, greatly increases the temperature of the middle and upper troposphere). And my view is that convective effects are what will cause feedbacks to minimize surface warming from CO2 increases. Climate models already contain convection…if they didn’t the modeled surface temperature of the Earth would average around 140 deg. F.
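Regarding point 1, the “40% more CO2, roughly 1% less IR loss” arithmetic can be roughed out with the widely used simplified forcing expression ΔF ≈ 5.35 ln(C/C0) W/m² (Myhre et al., 1998). That expression and the round 240 W/m² outgoing-IR figure are standard values brought in here for illustration, not numbers from this post:

```python
# Rough out the "40% more CO2 -> ~1% less IR loss" arithmetic using the
# standard simplified CO2 forcing expression dF = 5.35 * ln(C/C0) W/m^2
# (Myhre et al., 1998). The OLR value is a round global-average number.
import math

C0 = 280.0          # pre-industrial CO2, ppm
C = 1.40 * C0       # roughly 40% higher (about the 2012 value of ~392 ppm)
OLR = 240.0         # global-average outgoing longwave radiation, W/m^2

dF = 5.35 * math.log(C / C0)            # reduction in IR loss to space, W/m^2
print(f"forcing from 40% more CO2: {dF:.1f} W/m^2")
print(f"fraction of total IR loss: {100.0 * dF / OLR:.1f}%")
# ~1.8 W/m^2, i.e. on the order of 1% of the ~240 W/m^2 the Earth emits.
```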

The global warming narrative advanced by the IPCC involves a chain of physical processes which must all be true in order for their conclusions to be true. The existence of the greenhouse effect is, in my view, one of the stronger links in the chain. Feedbacks are the weakest link.

Gleick’s Watergate Too

March 11th, 2012

I don’t usually blog about posts on other blogs, but I’m making an exception for Steve McIntyre’s excellent analysis showing how similar Peter Gleick’s actions were to the Watergate burglary, primarily because both were covert attempts to obtain financial donor lists from the ‘enemy’.

Given Gleick’s expertise in water science, I think we should re-brand the event “Watergate Too”.

A Little Pollution Saves Lives

March 7th, 2012

I’ve been studying up to help out on a future John Stossel segment which will likely address the EPA’s overreach in trying to reduce pollution to vanishingly small levels.

Of course, if we could do such a thing at reasonable cost, and there were clear benefits to human health and welfare, then such a goal might make sense. But what was once a noble and achievable goal to clean up most of our air and water pollution has become an end in itself: to keep making things cleaner and cleaner, no matter the cost.

I remember driving through the Gary, Indiana industrial complex in the 1960s. The air pollution was simply unbelievable. You could not escape the stench and smoke; rolling up the car windows did not help.

I also remember the Cuyahoga River fire of 1969. There is no question that the EPA has greatly helped…but we must remember that the EPA was the result of the public’s realization that it was time to start cleaning up some of our messes.

The fundamental problem, though, is that the EPA was given what amounts to limitless power. They do not have to show that their regulations will not do more harm than good to human health and welfare. Now, I will admit they do provide periodic reports on their estimates of costs versus benefits of regulations, but the benefits are often based upon questionable statistical studies which have poor correlations, very little statistical signal, and little cause-and-effect evidence that certain pollutants are actually dangerous to humans.

And the costs of those regulations to the private sector are only the direct costs, because indirect costs to the economy as a whole are so difficult to assess.

Even if some (or even all) pollutants present some risk to human health and welfare, it makes no sense to spend more and more money to reduce that risk to arbitrarily low levels at the expense of more worthy goals.

Economists will tell you that society has unlimited wants but only limited resources, and so we must choose carefully to achieve maximum benefits with minimum cost. It is easy to come up with ideas that will “save lives”; it is not so easy to figure out how many lives your idea will cost. The same is true of government jobs programs, which only create special interest jobs at the expense of more useful (to the consumer) private sector jobs.

The indirect costs to the economy can be the greatest costs, and since we know very well that poverty kills, when we reduce economic prosperity, people — especially poor people — die.

Income is directly proportional to longevity; you cannot keep siphoning off money to chase the impossible dream of pure air and pure water, because those are simply not achievable goals. It makes absolutely no sense to give a government agency unlimited authority to regulate pollution to arbitrarily low levels with no concern over the indirect cost to society.

But that’s what we have with today’s EPA. Several years ago I gave an invited talk at a meeting of CAPCA, the Carolinas Air Pollution Control Association. There was also an EPA representative who spoke, and to everyone’s amazement stated something to the effect that “we can’t stop pushing for cleaner and cleaner air”. Eyebrows were raised because in the real world, such a stated goal ignores physical and economic realities. There is no way to totally avoid pollution, only reduce it.

How far we reduce it is the question, and so far the EPA really does not care how many people they might kill in the process. Or, they are too dumb to understand such basic economic principles.

Let’s take the example of fine particulate matter in the air (the so-called PM2.5, particles less than 2.5 microns in size), which the EPA recently decided is unsafe at any level (“no threshold” at which it is safe). Let’s take a look at a satellite estimate of the global distribution of this “pollution”:

You will note that the most “polluted” air occurs where almost no one is around to pollute: in the deserts. This is because wind blowing over bare soil causes dust particles. If you really are worried about fine particulate air pollution, do not go outside on a windy day.

Also note in the above map the blue to cyan areas, which have concentrations below 10 micrograms per cubic meter. Those are levels the World Health Organization has previously deemed safe. The western United States is largely below that level. If one examines the monitoring station data west of Denver, there is no correlation between changes in fine particulate matter pollution and deaths, but the oft-quoted Pope et al analysis of those data lumps all of the U.S. data together and finds a weak statistical relationship between deaths and changes in fine particulate pollution.

The EPA then concludes that NO level of such pollution is safe, even though the data from the western U.S. suggests there is a safe level.

As a scientist who deals in statistics and large datasets on almost a daily basis, I can tell you it is easy to fool yourself with statistical correlations. The risk levels that epidemiologists talk about these days for air pollution are around 5% or 10% increased risk compared to background. This is in the noise level compared to the increased health risk from cigarette smoking, which is more like 1,500%. When statistical signals are that small, one needs to look far and wide for confounding factors, that is, other more important variables you may not have accounted for to the accuracy needed to make any significant conclusion.
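To see how easily a small apparent risk can emerge from an unaccounted-for variable, here is a toy simulation (entirely made up, and not the Pope et al. analysis): mortality depends only on a hidden confounder, PM2.5 has zero true effect, yet a naive comparison of high-PM versus low-PM groups still shows an “increased risk” of a few percent because the confounder drives both.

```python
# Toy illustration of a confounded relative risk: PM2.5 has NO causal effect
# here, yet a naive high-vs-low comparison shows a few percent "extra risk"
# because a hidden factor drives both PM2.5 and mortality. Made-up numbers.
import numpy as np

rng = np.random.default_rng(0)
n = 2_000_000

confounder = rng.normal(size=n)                  # e.g. weather/season/region
pm25 = 10.0 + 2.0 * confounder + rng.normal(scale=2.0, size=n)  # ug/m^3
base_risk = 0.01
risk = base_risk * (1.0 + 0.05 * confounder)     # mortality depends ONLY on
risk = np.clip(risk, 0.0, 1.0)                   # the confounder, not on PM2.5
died = rng.random(n) < risk

high_pm = pm25 > np.median(pm25)
rr = died[high_pm].mean() / died[~high_pm].mean()
print(f"apparent relative risk, high vs low PM2.5: {rr:.3f}")
# Prints a value a few percent above 1.0 despite zero true PM2.5 effect.
```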

Actually establishing a mechanism of disease causation is not required for the EPA to regulate; just weak correlations and a board of “independent” scientific advisers who are themselves supported by the EPA.

Of course, when Congress makes any attempt to rein in the EPA, there are screams that decades of air pollution control progress is being “rolled back”. Bulls&!t. The new pollution regulations now being considered have reached the point of diminishing returns and greatly increased cost.

This week I was with about 250 representatives of various companies involved in different aspects of growing America’s (and a good part of the world’s) food supply. The most common complaint is the overreach of government regulation, which is increasing costs for everyone. The extra costs have been somewhat shielded from the consumer by industry, but they told me that sharply increasing food prices are now inevitable; in fact, they are already showing up in grocery stores.

And I haven’t even mentioned carbon dioxide regulations. Even if we could substantially reduce U.S. CO2 emissions in the next 20 years, which barring some new technology is virtually impossible, the resulting (theoretically computed) impact on U.S. or global temperatures would be unmeasurable…hundredths of a degree C at best.

The cost in terms of human suffering, however, will be immense.

It is time for the public to demand that Congress limit the EPA’s authority. The EPA operates outside of real world constraints, and is itself an increasing threat to human health and welfare — the very things that the EPA was created to protect.

A Technical Apology to Juliet Eilperin

March 7th, 2012

It has been close to 2 weeks since WaPo reporter Juliet Eilperin interviewed me for an article about why some climate scientists become so outspoken on climate issues. After that interview, I had a bad feeling (as I always do) about how my views would be portrayed in the mainstream media, and so I wrote a blog post which ended:

“It will be interesting to see how Juliet represents my views. I predict spin. But if she accurately and fairly represents my views, I will apologize for doubting her.”

So how did she do in her article, “In climate wars, advocacy by some researchers brings risks”?

Well, she did not actually “spin” my views, so for that unusual gift of journalistic balance, I will gladly apologize for doubting her.

On a first, quick read the article seemed more balanced than I was expecting (keeping in mind my expectations are usually pretty low in such matters). So, in response to her e-mail to me a couple mornings ago, I said my apology would be forthcoming when I get a chance (I was busy with a corn growers-related meeting in Kansas City).

Now, after re-reading the article, I realize that while she did not actually misrepresent my views…she instead simply did not bother to represent most of what I said.

On the positive side, she accurately described me as one of the scientists who questions the level to which humans are responsible for recent climate change. A great step in the right direction, since we are not “climate deniers”. In fact, we believe more in climate change than the anthropocentric scientists who believe only humans can change climate.

But that was it. The meat of what I told her was ignored, which was on the topic of why scientists on both sides of the global warming debate choose to speak out publicly.

She provided several paragraphs alluding to why scientists on the other side of the issue speak out, but nowhere could I find reasons why WE speak out.

I had told her that ill-conceived energy policies that hurt economic growth kill poor people. Was that not a sufficiently interesting thing to report on? Or, was she afraid that anyone reading a statement like that might actually start thinking for themselves about the unintended consequences of carbon dioxide regulation?

Instead, her article continues the tired tradition journalists have of making the “consensus” scientists sound like they are the only ones who have the best interests of humanity at heart, while continuing to ignore the reasons why I (and others like me) feel WE are really the ones who have the best interests of humanity at heart. I believe we are the ones who own the moral high ground.

She gave more ink to someone from the Union of Concerned Scientists, a misleading name for an organization if I ever heard one. You remember them, they are the ones who even let dogs join, as long as they have a credit card.

So, Juliet, your article was a step in the right direction. But if you are going to have a balanced article about why scientists on both sides of the global warming issue speak out, then please, represent both sides.

UAH Global Temperature Update for February 2012: -0.12 deg. C

March 2nd, 2012

The global average lower tropospheric temperature anomaly cooled a little more in February, 2012, again not unexpected for the current La Nina conditions in the tropical Pacific Ocean (click on the image for the full-size version):

The 3rd order polynomial fit to the data (courtesy of Excel) is for entertainment purposes only, and should not be construed as having any predictive value whatsoever.

Here are the monthly stats:

YR MON GLOBAL NH SH TROPICS
2011 1 -0.010 -0.055 +0.036 -0.372
2011 2 -0.020 -0.042 +0.002 -0.348
2011 3 -0.101 -0.073 -0.128 -0.342
2011 4 +0.117 +0.195 +0.039 -0.229
2011 5 +0.133 +0.145 +0.121 -0.043
2011 6 +0.315 +0.379 +0.250 +0.233
2011 7 +0.374 +0.344 +0.404 +0.204
2011 8 +0.327 +0.321 +0.332 +0.155
2011 9 +0.289 +0.304 +0.274 +0.178
2011 10 +0.116 +0.169 +0.062 -0.054
2011 11 +0.123 +0.075 +0.170 +0.024
2011 12 +0.126 +0.197 +0.055 +0.041
2012 01 -0.090 -0.057 -0.123 -0.138
2012 02 -0.116 -0.014 -0.217 -0.281
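For anyone who wants to reproduce that kind of “entertainment only” cubic curve, a sketch like the following works (the chart above fits the full satellite record back to 1979; here only the 14 tabulated months are used, purely to show the mechanics):

```python
# Fit a 3rd-order polynomial to the global anomalies tabulated above.
# The blog chart fits the entire satellite record; only the 14 months listed
# here are used below, purely to show the mechanics. No predictive value.
import numpy as np

anoms = [-0.010, -0.020, -0.101, +0.117, +0.133, +0.315, +0.374,
         +0.327, +0.289, +0.116, +0.123, +0.126, -0.090, -0.116]
months = np.arange(len(anoms))          # 0 = Jan 2011 ... 13 = Feb 2012

coeffs = np.polyfit(months, anoms, deg=3)   # cubic least-squares fit
fitted = np.polyval(coeffs, months)

for m, obs, fit in zip(months, anoms, fitted):
    print(f"month {m:2d}: observed {obs:+.3f}  cubic fit {fit:+.3f}")
```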

Progress continues on Version 6 of our global temperature dataset, which will have a better adjustment for drift of the satellites through the diurnal cycle, and an improved calibration procedure for the older MSU instruments (pre-1998).

Daubert and the Admissibility of Climate Models as Evidence in a Court of Law

February 29th, 2012

I get involved in so many things, sometimes I forget to let people know when something significant happens.

A few months ago, a paper I co-authored with attorney Brooks Harlow appeared in Energy Law Journal entitled: An Inconvenient Burden of Proof? CO2 Nuisance Plaintiffs Will Face Challenges in Meeting the Daubert Standard

I had always wondered how climate models, which are the ultimate source of “evidence” of anthropogenic global warming, could meet the rather stringent guidelines that judges use when deciding whether expert testimony should be allowed in court. Then, one day I received a phone call from an attorney (Brooks Harlow) who had been wondering the same thing, and we agreed to collaborate on a paper for submission to a law journal.

According to Wikipedia, “The Daubert standard is a rule of evidence regarding the admissibility of expert witnesses’ testimony during United States federal legal proceedings.”

While the Supreme Court’s decision in Connecticut v. AEP effectively shut the federal court doors to federal common law nuisance claims, about half of the states rely on Daubert, and CO2 litigation can be expected in the coming years in various states.

There are 5 relevant standards in the Daubert test of admissibility, which the judge does not necessarily have to follow rigorously. Again from Wikipedia:

  • 1. Empirical testing: the theory or technique must be falsifiable, refutable, and testable.
  • 2. Subjected to peer review and publication.
  • 3. Known or potential error rate.
  • 4. The existence and maintenance of standards and controls concerning its operation.
  • 5. Degree to which the theory and technique is generally accepted by a relevant scientific community.

In my opinion, two of the standards above (#1 and #3) are not met by climate models. In standard #1, climate models are not testable in any rigorous sense for their intended use: prediction of future climate change. We will not know whether they are right or wrong until the future happens.

And even then, we can’t be sure that warming which ends up being observed in nature occurred for the same reason the models predicted it. Just because they create average weather that is somewhat realistic does not mean they can tell us how average weather might change in the future.

And, contrary to what you might have heard, there is no obvious “fingerprint” of human warming. Natural warming from a slight change in cloud cover would look the same.

Also, the oft-mentioned missing “hot spot” of warming in the upper troposphere would be the theoretically expected result of ANY warming; its presence (or absence) does not demonstrate causation as to the source of the warming.

After all, the twenty-something climate models are all over the map in their predictions, and if one or two happen to be correct, what about all the other climate models which are also based upon “physical principles”, but which were wrong in their predictions? At some point, we have to admit that given enough different model predictions, one or two models are bound to be close to correct just by chance.

Standard #3, which I also believe is not met, is related to standard #1 just discussed: the methodology should have a known error rate. In other words, how likely is a climate model to be correct based upon its success in previous predictions?

“Predictions” of the past (e.g., what happened in the 20th Century) don’t really count because those are more exercises in curve fitting than prediction. What if the modelers did not know what the 20th Century temperature variations looked like, and were asked to use models to “hindcast” what happened? Would they be able to? Not likely.

And since future warming (if it occurs) is a one-of-a-kind event, we won’t know for many years whether any of the climate models are correct even once, let alone for a statistical sample of independent predictions.

If the climate system is always changing anyway, as some of us believe, you have a 50% chance of being right in a prediction of a warming trend by just flipping a coin.

While some might also argue that one or more of the other Daubert standards are not met, I thought I would address the two most obvious ones.

I yearn for the day when the “scientific consensus” defenders are cross-examined in court. No longer will they be allowed to get away with demanding they be believed just because they are experts. That kind of attitude does not get you very far in court.

Is the fight against global warming alarmism hopeless?

February 26th, 2012

NOTE: The following light-hearted and playful editorial was whipped up after seeing a new Washington Post editorial on their Post(-Normal)Opinions page entitled Is the fight against global warming hopeless? I took the text of that article (which I encourage you to read first) and added some creative modifications. Only a few of the sentences were left intact, which are their creations, not my own…although I doubt the WaPo editorial board will agree with the context I have used them in. Snicker.

Is the fight against global warming alarmism hopeless?

IS THE FIGHT against global warming alarmism hopeless? It can seem so. The long-term threat to humanity comes from fears that carbon dioxide, which is necessary for life on Earth to exist, will lead to damaging energy policies which kill perhaps millions of poor people around the world each year. Fortunately, after decades of effort, only about one-tenth of America’s energy mix comes from renewable sources that don’t produce life-enhancing carbon dioxide, and which are so expensive they reduce prosperity for all.

But two policies could allow inefficient, wealth-destroying carbon-free technologies to try to catch up to their less expensive competitors. One is aimed at greenhouse substances that clear out of the atmosphere after a few years, months or even days (as if the climate system really cares that much about them). Cutting back the emission of soot and ozone gases such as methane (sic) could reduce the world’s warming by an unmeasurable amount over the next few decades. Adding hydrofluorocarbons — another class of short-lived pollutants — to the list wouldn’t really help to delay the approach of temperature thresholds beyond which global warming could be catastrophic, since those thresholds are entirely in the realm of fanciful theories anyway.

Alarmists believe that reducing these emissions is relatively cheap, especially when the benefits to health are factored in — but to the exclusion of the dangers to health of the reduced prosperity which would also result. For example, primitive cooking stoves in developing countries produce much of the world’s soot; alarmists think using more efficient ones would prevent perhaps millions of deaths from respiratory illness, as if poverty can be alleviated by giving poor people a solar cooker.

Methane, meanwhile, is the primary component of natural gas — a commodity that pipeline or coal-mine operators could sell if they kept it from escaping into the atmosphere. Researchers have curiously concluded that global crop yields would rise if methane emissions were cut…a speculative and even hypocritical claim considering the known benefits to photosynthesis of adding more CO2 to the atmosphere from fossil fuel burning.

Coordinating an effective international effort to cut funding to long-lived climate alarmism enforcers will be the hardest task. Science institutions worldwide have spoken out on the need to address global warming, despite no scientist really knowing how much of past warming (which ended ten years ago) is natural versus manmade, and despite those institutions knowing virtually nothing about the underlying science.

Climate alarmists will waste more than just American money. Regulators in the developing world push to enforce stronger air-pollution rules, which expands the role of government and provides job security for bureaucrats, while ignoring the downside of diverting too much of the taxpayers’ money away from other, more worthy goals.

Since many of the health benefits of fossil fuels have been taken for granted by people, politicians are too eager to cut carbon dioxide emissions, without realizing there are very good reasons that we use carbon-based fuels.

One development that promises to provide abundant energy without the meddling of environmental activists — America’s natural gas boom — faces a challenge of a very different sort: the environmentalists themselves. Innovative drilling techniques have made huge amounts of fuel deep below Americans’ feet retrievable at low cost. Most of it is methane, a greenhouse gas that produces only about half as much carbon as coal after combustion. Environmentalists should be cheering: Cheap gas transported for the most part in existing pipelines can start the United States on a wealth-enhancing path with minimal added cost.

That path will be followed naturally, based upon market forces and the ever-present consumer demand for energy. This might well eventually steer us away from fossil fuels, if only because they will gradually be depleted and so their price will by necessity rise.

There is reason for hope – but not for complacency – over the coming years that the ill-conceived policy fantasies of climate alarmists can be fended off so that the poor of the world have a chance to prosper, with continuing access to our most abundant and least expensive energy sources – carbon-based fuels.

This editorial represents the views of Dr. Roy W. Spencer as a professional climate scientist and semi-professional economist wannabe, as determined through debate among the various voices in his head.

Ten Years After the Warming

February 26th, 2012

The version of global warming theory being pushed by the IPCC is that anthropogenic emissions of greenhouse gases are causing a radiative energy imbalance of the climate system, leading to warming.

The radiative forcing history being used in the latest IPCC climate models looks something like the following, with red areas representing times when the climate system’s “stove is turned up”, that is, with heat accumulating in the system.

(Actually, the correct analogy would be that the stove setting remains the same, but the lid partially covering the pot is covering it a little more over time…but that’s too hard to explain.)

As can be seen, in the last 10 years the estimated forcing has been the strongest. Yet, most if not all temperature datasets show little or no global-average warming recently, either in the atmosphere, at the surface, or in the upper 700 meters of the ocean. For example, here are the tropospheric temperatures up through a few days ago:

So what is happening? You cannot simply say a lack of warming in 10 years is not that unusual, and that there have been previous 10-year periods without warming, too. No, we are supposedly in uncharted territory with a maximum in radiative forcing of the climate system. One cannot compare on an equal basis the last 10 years with any previous decades without warming.

The five most-discussed possibilities for the recent cessation of warming are:

1) cooling from anthropogenic aerosols has been cancelling out warming from more greenhouse gases

2) natural cooling from internal climate fluctuations or the sun is cancelling out the GHG warming

3) increased ocean mixing is causing the extra energy to be distributed into the deep ocean

4) the temperature ‘sensitivity’ of the climate system is not as large as the IPCC assumes

5) there is something fundamentally wrong with the GHG warming theory itself

Of course, some combination of the above five explanations is also possible.

The 1st possibility (aerosol cooling is cancelling out GHG forcing) is one of the more popular explanations with the climate modelers, and especially with NASA’s James Hansen. The uncertain strength (and even sign) of aerosol forcing allows the climate modelers to use aerosols as a tuning knob (aka fudge factor) in making their models produce warming more-or-less consistent with past observations. Using an assumed large aerosol cooling to cancel out the GHG warming allows the modelers to retain high climate sensitivity, and thus the fear of strong future warming if those aerosols ever dissipate.

The 2nd possibility (natural cooling) is a much less desirable explanation for the IPCC crowd because it opens the door to Mother Nature having as much or more influence on the climate system than do humans. We can’t have that, you know. Then you would have to consider the possibility that most of the warming in the last 50 years was natural, too. Goodbye, AGW funding.

The 3rd possibility (increased ocean mixing) is one of the more legitimate possibilities, at least theoretically. It’s popular with NCAR’s Kevin Trenberth. But one would need more observational evidence this is happening before embracing the idea. Unfortunately, how vertical mixing in the ocean naturally varies over time is poorly understood; the different IPCC models have widely varying strengths of mixing, and so ocean mixing is a huge wild card in the global warming debate, as is aerosol cooling. I believe much of past climate change on time scales of decades to many centuries might be due to such variations in ocean mixing, along with their likely influence on global cloud cover changing the amount of solar input into the climate system.

The 4th possibility (the climate system is relatively insensitive to forcing) is the top contender in the opinion of myself, Dick Lindzen, and a few other climate researchers who work in this field.

The 5th possibility (increasing GHGs don’t really cause warming) is total anathema to the IPCC. Without GHG warming, the whole AGW movement collapses. This kind of scientific finding would normally be Nobel Prize territory…except that the Nobel Prize has become more of a socio-political award in recent years, with only politically correct recipients. The self-flagellating elites don’t like the idea that humans might not be destroying the Earth.

The longer we go without significant warming, the more obvious it will become that there is something seriously wrong with current AGW theory. I don’t think there is a certain number of years – 5, 10, 20, etc. – which will disprove the science of AGW…unless the climate system cools for the next 10 years. Eek! But I personally doubt that will happen.

As long as strong warming does not resume, the heat-hiding-in-the-deep-ocean explanation will provide refuge for many years to come, and will be difficult to convincingly rule out, since it takes so long for the deep ocean to warm by even a tiny amount.

Instead, there probably will be a tipping point (sooner rather than later) in popular perception, when the public and Congress decide the jig is up and they are no longer interested in hearing how we ‘might’ be headed for Armageddon.

The public already knows how awful scientists are at forecasting the future…especially a future of doom, which curiously seems to be the only future scientists know how to predict.

My Interview with WaPo Reporter Juliet Eilperin

February 24th, 2012

Yesterday I was called by Washington Post reporter Juliet Eilperin, who was doing a follow-up story on the Peter Gleick “Fakegate” release of Heartland Institute documents.

Normally, I would not blog about talking to a reporter, but in my experience talking with a high-profile reporter like Juliet invariably leads to regret. Reporters will listen to my points, openly “agree” with me on many of them, and then take one poorly phrased snippet from a 20 minute conversation to make it appear I support their narrative.

If this does not happen when Juliet’s article appears, the strain might be too much for me to bear. My world would be shaken.

I’m just sayin’.

So, I thought I would recount to the best of my recollection what we discussed. Then we will see how well her article represents my views. If she ignores them completely, well, that would be better than what usually happens.

First, I pointed out that on the science of global warming I have been tagged a “lukewarmer”, sometimes attacked by both sides. I said that my position on the science of global warming causation is that I don’t think we know with any level of confidence how much of the warming in the last 50 years was due to humans versus nature.

Next, she asked why (a relative few) scientists on both sides of the issue choose to enter the public fray over global warming, even though they know they will be attacked by the other side. I said I think the primary motivation is about the same on both sides: we know the science has policy implications, and those of us who take a stand do so based upon our understanding of how those policies might help (or hurt) people or nature.

In the case of the outspoken IPCC-supporting ‘consensus’ scientists, they believe mankind is destroying nature and will cause the climate system to be increasingly hostile to life as we know it. (I did not venture into the quasi-religious motivation of some who believe humans should have zero impact on the natural world).

In the case of skeptics like me, I believe the science supports a climate system which is not so fragile, and that premature energy policies which circumvent free market forces end up killing poor people.

Next, Juliet seemed like she wanted a statement regarding the moral equivalence of the Climategate e-mail releases and Peter Gleick obtaining the confidential Heartland Institute documents. I said that if the Climategate e-mails were indeed obtained by a hacker, then both appeared to me to be illegal.

But I *also* emphasized that what the Climategate e-mails revealed was behavior which was clearly crossing the line for scientists…such as collusion to interfere with the peer review process.

In the case of the Heartland documents, in contrast, there was nothing there to see. Someone (Gleick?) had to forge an additional document to make it look like Heartland had nefarious motives.

I said I have not publicly condemned Gleick’s unauthorized access to the Heartland documents because doing so would be somewhat hypocritical on my part, since I did not also condemn the Climategate e-mail releases. Instead, I choose to evaluate what the documents from both sources contain and what they say about the behavior of ‘activists’ on both sides of the global warming issue.

On that basis, there is no moral equivalence.

Climategate reveals taxpayer-funded scientists behaving badly, and possibly illegally. The Heartland documents reveal the mundane activities of a privately-funded organization trying to bring some balance to the coverage of climate change science.

I don’t recall right now whether any other issues of significance were discussed in the interview, although there might have been.

It will be interesting to see how Juliet represents my views. I predict spin.

But if she accurately and fairly represents my views, I will apologize for doubting her.