Cap and Trade and the Illusion of the New Green Economy

July 1st, 2009

I don’t think Al Gore in his wildest dreams could have imagined how successful the “climate crisis” movement would become. It is probably safe to assume that this success is not so much the result of Gore’s charisma as it is humanity’s spiritual need to be involved in something transcendent – like saving the Earth.

After all, who wouldn’t want to Save the Earth? I certainly would. If I really believed that manmade global warming was a serious threat to life on Earth, I would be actively campaigning to ‘fix’ the problem.

But there are two practical problems with the theory of anthropogenic global warming: (1) global warming is (or at least was) likely to be a mostly natural process; and (2) even if global warming is manmade, it will be immensely difficult to avoid further warming without new energy technologies that do not currently exist.

On the first point, since the scientific evidence against global warming being anthropogenic is what most of the rest of this website is about, I won’t repeat it here. But on the second point…what if the alarmists are correct? What if humanity’s burning of fossil fuels really is causing global warming? What is the best path to follow to fix the problem?

Cap-and-Trade

The most popular solution today is carbon cap-and-trade legislation. The European Union has had hands-on experience with cap-and-trade since 2005, and it isn’t pretty. Over there it is called the Emissions Trading Scheme (ETS). Here in the U.S., the House of Representatives last Friday narrowly passed the Waxman-Markey bill. The Senate plans on taking up the bill as early as the fall of 2009.

Under cap-and-trade, the government institutes “caps” on how much carbon dioxide can be emitted, and then allows companies to “trade” carbon credits so that the market rewards those companies that find ways to produce less CO2. If a company ends up with more credits than it needs, it can sell those credits to other companies.

While it’s advertised as a “market-based” approach to pollution reduction, it really isn’t, since the market did not freely choose cap-and-trade…it was imposed upon the market by the government. The ‘free market’ aspect of it merely helps to reduce the economic damage done by the government regulations.

The Free Market Makes Waxman-Markey Unnecessary

There are several serious problems with cap-and-trade. In the big picture, as Europe has found out, it will damage the economy. This is simply because there are as yet no large-scale, practical, and cost-competitive replacements for fossil fuels. As a result, if you punish fossil fuel use with either taxes or by capping how much energy is allowed to be used, you punish the economy.

Now, if you are under the illusion that cap-and-trade will result in the development of high-tech replacements for fossil fuels, you do not understand basic economics. No matter how badly you might want it, you cannot legislate a time-travel machine into existence. Space-based solar power might sound really cool, but the cost of it would be astronomical (no pun intended), and it could provide only the tiniest fraction of our energy needs. Wind power goes away when the wind stops, and is only practical in windy parts of the country. Land-based solar power goes away when the sun sets, and is only practical in the sunny Southwest U.S. While I personally favor nuclear power, it takes forever to license and build a nuclear power plant, and it would take 1,000 1-gigawatt nuclear power plants to meet electricity demand in the United States.

And no one wants any of these facilities near where they live.

Fortunately, cap-and-trade legislation is not necessary anyway because incentives already exist – right now – for anyone to come up with alternative technologies for energy generation and energy efficiency. Taxpayers and consumers already pay billions of dollars for both government research (through taxes) and private research (through the cost of goods and services) to develop new energy technologies.

Whoever succeeds in these efforts stands to make a lot of money simply because everything we do requires energy. And I do mean everything…even sitting there and thinking. Using your brain requires energy, which requires food, which requires fossil fuels to grow, distribute, refrigerate and cook that food.

Economic Competitiveness in the Global Marketplace

Second, when instituted unilaterally by a country, cap-and-trade legislation makes that country less competitive in the global economy. Imports and trade deficits increase as prices at home rise, while companies or whole industries close and move abroad to countries where they can be more competitive.

The Obama administration and Congress are trying to minimize this problem by imposing tariffs on imports, but this then hurts everyone in all of the countries involved. Remember, two countries willingly engage in trade with each other only because it economically benefits both countries by reducing costs, thus raising the standard of living in those countries.

The Green Mafia

Third, cap-and-trade is a system that is just begging for cheating, bribing, and cooking the books. How will a company’s (or a farm’s) greenhouse gas emissions be gauged, and then monitored over time? A massive new bureaucracy will be required, with a litany of rules and procedures that have little basis in science or previous experience.

And who will decide how many credits will initially be given by the government to each company/farm/industry? Does anyone expect that these decisions will be impartial, without political favoritism shown toward one company over another, or one industry over another? This is one reason why some high-profile corporations are now on the global warming bandwagon. They (or at least a few of their executives) are trying to position themselves more favorably in what they see to be an inevitable energy-rationed economic system.

Big Oil and Big Coal Will Not Pay for Cap-and-Trade

Fourth, it is the consumer – the citizen – who will pay for all of this, either in the form of higher prices, or reduced availability, or reduced economic growth. Companies have no choice but to pass increased costs on to consumers, and decreased profits to investors. You might think that “Big Business” will finally be paying their “fair share”, but Big Business is what provides jobs. No Big Business, no jobs.

The Green Jobs Illusion

Fifth, the allure of “green jobs” might be strong, but the economic benefit of those jobs is an illusion. The claim that many thousands of new green jobs will be created under such a system is probably true. But achieving low unemployment through government mandates does not create wealth – it destroys wealth.

Let me illustrate. We could have full employment with green jobs today if we wanted to. We could pay each other to dig holes in the ground and then fill the holes up again, day after day, month after month. (Of course, we’ll use shovels rather than backhoes to reduce fossil fuel use.) How’s that for a green jobs program?

My point is that it matters a LOT what kinds of jobs are created. Let’s say that today 1,000 jobs are required to create 1 gigawatt of coal-fired electricity. Now, suppose we require that electricity to come from a renewable source instead. If 5,000 jobs are needed to create the same amount of electricity with windmills that 1,000 jobs created with coal, then efficiency falls and wealth is destroyed.

Sure, you can create as many green jobs as you want, but the comparative productivity of those jobs is what really matters. In the end, when the government manipulates the economy in such a fashion, the economy suffers.
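The comparative-productivity point can be made with a few lines of arithmetic. The job counts below are the hypothetical ones from the example above, not real data:

```python
# Toy productivity comparison using the hypothetical numbers above:
# the same 1 gigawatt of electricity from 1,000 coal jobs vs 5,000 wind jobs.
gigawatts = 1.0
coal_jobs = 1_000
wind_jobs = 5_000

coal_mw_per_job = gigawatts * 1000 / coal_jobs  # megawatts produced per worker
wind_mw_per_job = gigawatts * 1000 / wind_jobs

ratio = coal_mw_per_job / wind_mw_per_job
print(f"coal: {coal_mw_per_job:.1f} MW/job, wind: {wind_mw_per_job:.1f} MW/job, "
      f"ratio = {ratio:.0f}x")
```

Five times as many workers for the same output means each green job, in this hypothetical scenario, produces one-fifth the electricity of the job it replaced.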

And even if a market for green equipment (solar panels, windmills, etc.) does develop, there is little doubt that countries like China will be able to manufacture that equipment at lower cost than the United States, especially considering all of our laws, regulations, limits, and restrictions.

So, What’s the Alternative?

If anthropogenic global warming does end up being a serious problem, then what can be done to move away from fossil fuels? I would say: Encourage economic growth, and burn fossil fuels like there is no tomorrow! Increased demand will lead to higher prices, and as long as the free market is allowed to work, new energy technologies will be developed.

As long as demand exists for energy (and it always will), there will be people who find ways to meet that demand. There is no need for silly awards for best inventions, etc., because the market value of those inventions will far exceed the value of any gimmicky, government-sponsored competitions.

Why are Politicians so Enamored by Cap-and-Trade?

Given the pain (and public backlash) the EU has experienced with its Emissions Trading Scheme, why would our politicians ignore that foreign experience, as well as popular sentiment against cap-and-trade here at home, and run full-steam with eyes closed into this regulatory quagmire?

The only answer I can come up with is: more money and more power for government. As a former government employee, I am familiar with the mindset. While the goal of a private sector job is to create wealth, the government employee’s main job is to spend as much of that wealth as possible. A government agency’s foremost goal is self preservation, which means perpetuating a public need for the agency. The idea that our government exists to help enable a better life for its citizens might have been true 100 years ago, but today it is hopelessly naïve.

All Pain, No Gain

And finally, let’s remember what the whole purpose of carbon cap-and-trade is: to reduce future warming of the climate system. Even some prominent environmentalists are against Waxman-Markey because they do not believe it will substantially reduce carbon dioxide emissions here at home. To the extent that provisions are added to the bill to make it more palatable to politicians from agricultural states or industrial states, it then accomplishes even less of what it is intended to accomplish: reductions in carbon dioxide emissions.

And even if cap-and-trade does what is intended, the reduction in CO2 emissions as a fraction of global CO2 emissions will moderate future warming by, at most, around one tenth of a degree C by late in this century. That is probably not even measurable.

Of course, this whole discussion assumes that the climate system is very sensitive to our carbon dioxide emissions. But if the research we are doing is correct, then manmade global warming is being overestimated by about a factor of 5, and it is the climate system itself that causes climate change…not humans.

If that is the case, then nothing humanity does is going to substantially affect climate one way or the other. Indeed, given the fact that life on Earth depends upon the tiny amount of CO2 in the atmosphere, I continue to predict that more atmospheric CO2 will, in the end, be a good thing for life on Earth.

Yet, many politicians are so blinded by the additional political power and tax revenue that will come from a cap-and-trade system that they do not want to hear any good news from the science. For instance, in my most recent congressional testimony, the good news I presented was met with an ad hominem insult from Senator Barbara Boxer.

I can only conclude that some politicians actually want global warming to be a serious threat to humanity. I wonder why?


EPA Endangerment Finding: My Submitted Comments

June 23rd, 2009

1. ISSUE SUMMARY

1.1 Evidence that Climate Models Produce Far Too Much Warming

The issue I will address is appropriately introduced by a quote from a leading cloud expert, Dr. Robert Cess, made over ten years ago:

the [climate models] may be agreeing now simply because they’re all tending to do the same thing wrong. It’s not clear to me that we have clouds right by any stretch of the imagination.
– Dr. Robert Cess, quoted in Science (May 16, 1997, p. 1040)

Nowhere in climate models is there greater potential for error than in the treatment of clouds. This is especially true of low clouds, which cool the climate system, and which the IPCC has admitted are the largest source of uncertainty in global warming projections (IPCC, 2007).

Research published by us since the IPCC 2007 4th Assessment Report (IPCC AR4) suggests that a major problem exists with most, if not all, of the IPCC models’ cloud parameterizations. Cloud parameterizations are greatly simplified methods for creating clouds in climate models. Their simplicity is necessary since the processes controlling clouds are too complex to include in climate models, and yet those same parameterizations are critical to model projections of future global temperatures and climate since clouds determine how much sunlight is allowed into the climate system. Significantly, all 21 IPCC climate models today decrease global average cloud cover in response to any warming influence, such as that from anthropogenic carbon dioxide emissions, thus amplifying the small, direct warming effect of more CO2.

In stark contrast, though, new analyses of our latest and best NASA satellite data suggest that the real climate system behaves in exactly the opposite manner. This error, by itself, could mean that future warming projected by these models has been overstated by anywhere from a factor of 2 to 6.

How could such a serious error be made by so many climate experts? In a nutshell, when previous researchers have looked at how clouds and temperature have varied together in the real climate system, they have assumed that the observed temperature changes caused the observed cloud changes – but not the other way around.

As I will demonstrate, by assuming causation in only one direction they have biased their interpretation of cloud behavior in the direction of positive feedback (that is, high climate sensitivity). The existence of the problem was first published by us on 1 November 2008 in the Journal of Climate (Spencer and Braswell, 2008). It supported our previously published research showing that weather systems in the tropics also behave in the opposite manner from climate models (Spencer et al., 2007); that is, they reduce any warming influence rather than magnify it.

My claim on the direction of causation is more recently supported by evidence that one can distinguish cause from effect under certain conditions, in both satellite observations of the real climate system, and in climate models themselves. The result is that true negative feedback in the climate system has been obscured by the dominating influence of natural cloud changes causing temperature changes, which has produced the illusion of a sensitive climate system.

1.2 Natural Cloud Variations Might Have Caused “Global Warming”

If feedbacks in the climate system are indeed negative rather than positive, not only does this mean anthropogenic global warming might well be lost in the noise of natural climate variability, it also means that CO2 emissions have been insufficient to cause most of the warming seen in the past 50 years or so. Just as researchers have been misled about climate sensitivity by ignoring clouds-causing-temperature change, they have also neglected natural cloud variations as a potential source of climate change itself.

While the IPCC claims they can only explain late 20th Century warming when they include anthropogenic CO2 emissions in the models, this is mostly because sufficiently accurate long-term global cloud observations simply do not exist with which one might look for natural sources of climate change. A persistent change of only 1% or 2% in global-average low cloud cover would be sufficient to cause global warming – or cooling. Our ability to measure long-term cloud changes to this level of precision has existed only since the launch of NASA’s Terra satellite in 2000, and so it is not possible to directly determine whether there are natural cloud changes that might have caused most of the climate variability seen over the last 50 to 100 years.
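A rough back-of-envelope calculation shows why a 1% or 2% cloud change matters. The numbers below are my own assumed round values (a global-mean shortwave cloud effect near -50 W/m2 and a cloud fraction near 0.65 are commonly cited figures, not values taken from this text):

```python
# Back-of-envelope: radiative impact of a small absolute change in cloud cover.
# Assumed round numbers, for illustration only:
sw_cloud_effect = -50.0  # W/m^2, global-mean shortwave cooling by clouds
cloud_fraction = 0.65    # global-mean cloud fraction

# Warming "forcing" per 1% (absolute) decrease in cloud cover, scaling linearly:
per_percent = -sw_cloud_effect / (cloud_fraction * 100.0)

for change in (1.0, 2.0):
    print(f"{change:.0f}% less cloud cover ~ {per_percent * change:+.1f} W/m^2")
```

That works out to roughly +0.8 to +1.5 W/m2, the same order of magnitude as the +1.6 W/m2 total anthropogenic forcing quoted from the TSD below.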

But even though such long-term observations do not exist, one could instead study known natural climate indices, such as the Pacific Decadal Oscillation (PDO; Mantua et al., 1997), during the roughly ten-year satellite record to look for evidence that known natural modes of climate variability modulate global average cloud cover. This kind of research should be required before one even begins to discuss ruling out natural climate variability as a source of climate change. Unfortunately, it has never been performed.

The failure to sufficiently investigate natural, internal modes of climate variability by the climate research and modeling community is a major failing of the IPCC and the CCSP processes. Under the Federal Information Quality Act, the U.S. science process must be held to a higher standard of objectivity and utility.

The discussion below demonstrates why EPA cannot use either the IPCC or CCSP conclusions as a basis for its Endangerment Finding since both depend on the same flawed models.

2. SPECIFIC ERRORS IN THE EF/TSD

I will be addressing the following endangerment finding (EF) and technical support document (TSD) statements, specifically challenging the portions in bold italics, below.

Endangerment Finding:

EF-18896.2: Most of the observed increase in global average temperatures since the mid-20th century is very likely due to the observed increase in anthropogenic greenhouse gas concentrations. Global observed temperatures over the last century can be reproduced only when model simulations include both natural and anthropogenic forcings, that is, simulations that remove anthropogenic forcings are unable to reproduce observed temperature changes. Thus, most of the warming cannot be explained by natural variability, such as variations in solar activity.

TSD Executive Summary:

Observed Effects Associated with Global Elevated Concentrations of GHGs:

[OE 2] The global average net effect of the increase in atmospheric GHG concentrations, plus other human activities (e.g., land use change and aerosol emissions), on the global energy balance since 1750 has been one of warming. This total net heating effect, referred to as forcing, is estimated to be +1.6 (+0.6 to +2.4) Watts per square meter (W/m2), with much of the range surrounding this estimate due to uncertainties about the cooling and warming effects of aerosols. The combined radiative forcing due to the cumulative (i.e., 1750 to 2005) increase in atmospheric concentrations of CO2, CH4, and N2O is estimated to be +2.30 (+2.07 to +2.53) W/m2. The rate of increase in positive radiative forcing due to these three GHGs during the industrial era is very likely to have been unprecedented in more than 10,000 years.

[OE 4] Most of the observed increase in global average temperatures since the mid-20th century is very likely due to the observed increase in anthropogenic GHG concentrations. Climate model simulations suggest natural forcing alone (e.g., changes in solar irradiance) cannot explain the observed warming.

Projections of Future Climate Change with Continued Increases in Elevated GHG Concentrations:

[PF 2] Future warming over the course of the 21st century, even under scenarios of low emissions growth, is very likely to be greater than observed warming over the past century.

[PF 3] All of the U.S. is very likely to warm during this century, and most areas of the U.S. are expected to warm by more than the global average.

2.1 Comments

What follows is evidence against the familiar IPCC AR4 claim, which also appears in both the EF and TSD:

Most of the observed increase in global average temperatures since the mid-20th century is very likely due to the observed increase in anthropogenic greenhouse gas concentrations.

As we will see, this claim is premature at best. I begin with a discussion of feedbacks (which determine climate sensitivity) because the IPCC’s belief in a sensitive climate system is central to their claim that global warming is mostly anthropogenic, and not natural.

2.1.1 The Importance of Climate Sensitivity to Demonstrating Causation in Global Warming

The central issue of causation in global warming is closely related to ‘climate sensitivity’, which can be defined as the amount of warming the Earth experiences in response to a radiative forcing (global average imbalance in sunlight gained versus thermally emitted infrared radiation lost).

If climate sensitivity is relatively high, as the IPCC claims, then I agree that anthropogenic greenhouse gas emissions might well be the main reason for most of the warming experienced in the last 50 years. This is because the small amount of radiative forcing from anthropogenic greenhouse gases would be sufficient to explain past warming if that small warming is amplified by positive feedback, which is what produces high climate sensitivity.

But if climate sensitivity is low, then anthropogenic emissions would be too weak to cause substantial warming. Some stronger, natural mechanism would need to be mostly responsible for the warming we have experienced. Low climate sensitivity would additionally mean that any source of anthropogenic forcings (greenhouse gases, aerosols, etc.) would not substantially affect climate, and therefore a reduction in emissions would have little effect on climate.

This is partly why the IPCC is not motivated to find natural sources of climate change. If the climate system is quite sensitive, then the extra carbon dioxide in the atmosphere alone is sufficient to explain past warming.

So, what determines feedbacks, and thus climate sensitivity? Simply put, climate sensitivity depends upon whether clouds (and other elements of the climate system) respond to the small amount of warming caused by the 1+ W/m2 radiative forcing from anthropogenic greenhouse gases by either amplifying it through ‘positive feedbacks’, or by reducing it through ‘negative feedbacks’. Depending upon climate sensitivity, the long-term warming from increasing atmospheric greenhouse gas concentrations could theoretically be anywhere from unmeasurable to catastrophic. Climate sensitivity thus becomes the main determinant of the level of future anthropogenic global warming, and how it then compares in magnitude to natural sources of climate variability.

The IPCC has admitted that feedbacks from low clouds (which have a large impact on how much sunlight reaches the Earth’s surface) are the most uncertain of all the feedbacks that determine climate sensitivity, and therefore constitute the largest source of uncertainty in projections of future global warming (IPCC, 2007). Indeed, Trenberth & Fasullo (2009) found that most of the differences between the IPCC climate models’ warming projections can be traced to how they change low and middle cloud cover with warming. Recent work by Caldwell and Bretherton (2009) with a sophisticated cloud resolving model has produced results supporting negative, not positive, low cloud feedback.

I now believe that the low cloud feedback issue is even more serious than the IPCC has admitted. Next we will examine why I believe there has been so much uncertainty over cloud feedback in the climate system.

2.1.2 Why Previous Observational Estimates of Climate Sensitivity have been Inconclusive

Previous estimates of climate sensitivity from observational data have led to confusing and inconclusive results (Knutti and Hegerl, 2008). The most recently published satellite estimates of climate sensitivity (Forster and Gregory, 2006, hereafter FG06) and IPCC AR4 climate model sensitivity (Forster and Taylor, 2006, hereafter FT06) led FT06 to conclude that the satellite-based results could not be trusted. At face value, their best estimate of sensitivity from satellite observations was lower than that of all of the IPCC models, but they concluded that the uncertainty in the satellite estimate was so large that it was of little value anyway.

But we have determined that the reason researchers have been unable to pin down a reasonable estimate of climate sensitivity is because cause and effect have not been accounted for when measuring the co-variations between cloud cover (or radiative flux) and temperature. We have evidence that the true signature of feedback in the data has been mostly obscured by natural cloud variations forcing temperature variations.

The problem can be illustrated with the following example: If global average cloudiness is observed to decrease in warmer years, this would normally be interpreted as warming causing a cloud decrease, which would be positive feedback, which would mean higher climate sensitivity. But what if the warming was the result of the decrease in cloud coverage, rather than the cause of it?

As demonstrated by Spencer and Braswell (2008) with a simple model of global-average climate, the presence of something as simple as daily random variations in clouds will cause feedbacks diagnosed from measurements of temperature and cloudiness (actually, the radiative imbalance of the Earth) to be biased in the direction of positive feedback (high climate sensitivity). That paper was reviewed by two IPCC experts: Piers Forster, and Isaac Held, who both agreed our paper raised a valid and potentially important issue.

What follows is quantitative evidence that suggests why this mix-up between cause and effect creates the illusion of high climate sensitivity. (The paper describing this evidence is under revision for resubmission to the Journal of Geophysical Research after we address questions raised by its three reviewers.) The causation issue can be illustrated with the following graph of 7.5 years of recent satellite-measured global ocean average variations in tropospheric temperature versus top-of-atmosphere total radiative flux (‘LW’ is emitted infrared, ‘SW’ is reflected sunlight). Both datasets are publicly available.

Fig. 1. Global oceanic 3-month averages of net radiative flux variations (reflected sunlight + thermally emitted infrared) from the CERES radiation budget instrument flying on NASA’s Terra satellite, plotted against corresponding tropospheric temperature variations estimated from channel 5 of the Advanced Microwave Sounding Unit (AMSU) flying on the NOAA-15 satellite.

The slope of the dashed line, fit to the data with statistical ‘regression’, would traditionally be assumed to provide an estimate of feedback, and therefore of climate sensitivity. Regression line slopes are what FG06 used to diagnose feedbacks in satellite data, and what FT06 used to diagnose feedbacks in climate models. The greater the slope of the regression line, the less sensitive the climate system; the shallower the slope of the line, the greater the climate sensitivity. Since a line slope of 3.3 Watts per sq. meter per degree C represents the border between positive and negative feedback, the slope of the line in Fig. 1 (1.9 Watts per sq. meter per deg. C) would, at face value, correspond to moderate positive feedback.
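The slope-fitting step itself is straightforward. Here is a minimal sketch with synthetic (not satellite) data, using the 3.3 W/m2 per deg. C no-feedback border described above:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic temperature/flux anomaly pairs with a built-in slope of
# 1.9 W/m^2 per deg C plus large scatter, mimicking Fig. 1 (not real data).
built_in_slope = 1.9
temp_anom = rng.normal(0.0, 0.3, 500)   # deg C
flux_anom = built_in_slope * temp_anom + rng.normal(0.0, 1.0, 500)  # W/m^2

slope, intercept = np.polyfit(temp_anom, flux_anom, 1)

# Slopes below ~3.3 W/m^2 per deg C imply net positive feedback (higher
# sensitivity); slopes above it imply net negative feedback.
verdict = "positive" if slope < 3.3 else "negative"
print(f"fitted slope = {slope:.2f} W/m^2 per deg C -> net {verdict} feedback")
```

With 500 noisy points the fit recovers something close to the built-in slope, and the classification follows directly from which side of 3.3 it falls on.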

But note the huge amount of scatter in the data. This is an example of why researchers have not trusted previous satellite estimates of feedback. If the data points happened to cluster nicely along a line, then we would have more confidence in the diagnosed feedback. But instead we see data scattered all over. This scatter translates directly into uncertainty in the slope of the line, which means uncertainty in feedback, which means uncertainty in climate sensitivity.

This is the point at which other researchers have stopped in their analysis of the satellite data, concluding that satellite estimates of feedbacks are too uncertain to be of much use. But we have determined that researchers have not dug deep enough in their data analysis. It turns out that the scatter in the data is mostly the result of cloud variations causing temperature variations, which is causation in the opposite direction as feedback. If the data in Fig. 1 are plotted as running averages, rather than independent averages, and the successive data points in time are connected by lines, certain patterns begin to emerge, as is shown in Fig. 2.

Fig. 2. As in Fig 1, but now 3-month averages are computed every day and connected by lines, revealing the time history of how the climate system evolves.

This method of plotting is called phase space analysis, and it can “easily elucidate qualities of [a] system that might not be obvious otherwise” (Wikipedia entry on “Phase space”). What we now see, instead of a seemingly random scatter of points, is a series of linear striations and looping or spiraling patterns.
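The mechanics of the Fig. 2 plotting method can be sketched as follows: compute a 3-month average every day, rather than once per quarter, so that successive points overlap heavily and trace out a connected trajectory (the series here are synthetic stand-ins for the satellite data):

```python
import numpy as np

def running_mean(x, window):
    """Running average over `window` samples (valid region only)."""
    return np.convolve(x, np.ones(window) / window, mode="valid")

rng = np.random.default_rng(1)
days = 2000
temp = np.cumsum(rng.normal(0.0, 0.02, days))      # slowly wandering T anomaly
flux = 2.0 * temp + rng.normal(0.0, 1.0, days)     # noisy daily flux anomaly

# Independent 90-day averages: one scattered point per quarter (Fig. 1 style).
n = days // 90
indep_T = temp[:n * 90].reshape(n, 90).mean(axis=1)

# Running 90-day averages computed every day: successive points barely move,
# so connecting them reveals a smooth path in (T, flux) phase space.
run_T = running_mean(temp, 90)
run_R = running_mean(flux, 90)
steps = np.hypot(np.diff(run_T), np.diff(run_R))
print(f"{n} independent points vs {len(run_T)} trajectory points; "
      f"median day-to-day step = {np.median(steps):.3f}")
```

Because each daily running average shares 89 of its 90 days with its neighbor, the day-to-day steps are tiny, which is what turns a cloud of points into a traceable trajectory.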

A very simple forcing-feedback model widely used in climate studies (e.g. Spencer and Braswell, 2008) can be used to show that the linear features are temperature changes causing cloud-induced radiative changes (that is, feedback); while the looping features are from causation in the opposite direction: cloud variations causing temperature variations.

Significantly, it is the natural cloud variations causing temperature variations that de-correlates the data, leading to a regression line slope biased in the direction of high climate sensitivity (positive feedback) like that seen in Fig. 1.
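A toy version of this forcing-feedback argument can be coded directly. The model below is my own minimal sketch of the kind of simple model described, with assumed parameter values (not the actual Spencer and Braswell, 2008 code): temperature is driven purely by red-noise “cloud” forcing under a strongly negative true feedback of 6 W/m2 per deg C, yet regressing the measured flux against temperature returns a far smaller slope:

```python
import numpy as np

rng = np.random.default_rng(42)

# Minimal global-average energy balance model (all parameters assumed):
#   C * dT/dt = F(t) - lam * T
lam = 6.0        # true feedback parameter, W/m^2 per deg C (negative feedback)
C = 2.0e8        # mixed-layer heat capacity, J/m^2/K (~50 m of ocean)
dt = 86400.0     # one-day time step, in seconds
days = 36500     # a long run, to beat down sampling noise

F = np.zeros(days)  # internal (cloud-induced) radiative forcing, W/m^2
T = np.zeros(days)  # temperature anomaly, deg C
for i in range(1, days):
    F[i] = 0.9 * F[i - 1] + rng.normal(0.0, 1.0)        # red-noise clouds
    T[i] = T[i - 1] + dt * (F[i] - lam * T[i - 1]) / C  # energy balance step

# What a satellite would measure: net outgoing flux anomaly =
# feedback response minus the internal cloud forcing.
R = lam * T - F

diagnosed = np.polyfit(T, R, 1)[0]
print(f"true feedback = {lam:.1f}, regression gives {diagnosed:.2f} W/m^2/deg C")
```

Because the temperature variations here are caused by the cloud forcing, the diagnosed slope falls far below the true value of 6, toward the appearance of positive feedback, which is the bias being described.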

We also find these same spiral and linear features in the climate models tracked by the IPCC, for instance in the GFDL CM2.1 model shown in Fig. 3.

Fig. 3. As in Fig. 2, but for yearly global averages plotted every month from the GFDL CM2.1 climate model. The dashed line is a regression fit to the data; the slope of the solid line represents the model’s long-term feedback in response to anthropogenic radiative forcing from greenhouse gases as diagnosed by FT06.

It is important to note that the linear striations in Fig. 3 are approximately parallel to the ‘true’ long-term feedback as diagnosed for this model by FT06, which is indicated by the solid line. This means that the slope of the short-term linear striations is indeed an indication of the long-term feedback in the model, and therefore of the climate sensitivity of that model. We find obvious striations (‘feedback stripes’) in five of the IPCC climate models, and in all five cases their average slope is very close to the long-term feedback in those models as diagnosed by FT06.

While the above analysis might seem a little technical, it is merely a way to quantitatively demonstrate how a mix-up between cause and effect between clouds and temperature can lead to the illusion of a sensitive climate system.

2.1.3 Feedbacks Revealed In Satellite Data

Using this new insight, if we now return to the linear striations seen in the satellite data plotted in Fig. 2, we find their slope to be about 6 Watts per sq. meter per degree C, which would correspond to strongly negative feedback. This is about the same feedback value that Spencer et al. (2007) found for a composite of tropical weather systems over a multi-year period. If this is the feedback operating in the real climate system on the long time scales involved with manmade global warming, then the amount of warming from a doubling of carbon dioxide would only be about 0.6 deg. C, which is at least a factor of 4 less than the IPCC’s best estimate for the future. This casts serious doubt upon the projections of future climate change mentioned in the EF and TSD:

[PF 2] Future warming over the course of the 21st century, even under scenarios of low emissions growth, is very likely to be greater than observed warming over the past century.

Similarly, all projections of substantial future regional climate change, such as:

[PF 3] All of the U.S. is very likely to warm during this century, and most areas of the U.S. are expected to warm by more than the global average.

also depend upon high climate sensitivity, and so are similarly called into question.
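The sensitivity arithmetic behind the 0.6 deg. C figure quoted above can be checked in a few lines. This is only a back-of-envelope sketch; the ~3.7 Watts per sq. meter forcing for doubled CO2 and the logarithmic forcing formula are standard textbook values assumed here, not numbers taken from the satellite analysis itself:

```python
import math

# Standard logarithmic approximation for CO2 radiative forcing:
# dF = 5.35 * ln(C/C0), giving ~3.7 W/m^2 for a doubling of CO2.
f_2xco2 = 5.35 * math.log(2.0)

# Equilibrium warming is forcing divided by the net feedback parameter.
def warming_per_doubling(feedback_wm2_per_degc):
    return f_2xco2 / feedback_wm2_per_degc

print(warming_per_doubling(6.0))   # ~0.6 deg. C (slope of the striations)
print(warming_per_doubling(3.3))   # ~1.1 deg. C (no-feedback reference value)
print(warming_per_doubling(1.25))  # ~3.0 deg. C (near the IPCC best estimate)
```

The same division shows why the feedback parameter is the whole ballgame: a factor-of-five difference in the assumed feedback produces a factor-of-five difference in projected warming.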

As far as I know, we are the only research group performing this kind of research. I believe that much greater exploitation of our satellite data resources is required to understand what the climate system is trying to tell us about climate sensitivity before we can place any level of confidence in the climate model projections relied upon by the IPCC, and thus by the EPA. The above evidence suggests that previous tests of climate models with observational data have not been sufficiently detailed to validate the feedbacks (climate sensitivity) in those models. At a minimum, the models need to be adjusted to mimic the behavior seen in the real climate system, such as that described above, with methods (e.g. phase space analysis) that can separate the signature of temperature-forcing-clouds (feedback) from that of clouds-forcing-temperature (internal radiative forcing).

This is an important point, and it is worth repeating: none of the previous comparisons of climate model output to satellite data have been sufficient to test the climate sensitivity of those models. Unless one accounts in some way for the direction of causation when comparing temperature variations to cloud (or radiative flux) variations, any observational estimates of feedback are likely to be spuriously biased in the direction of high climate sensitivity.

2.1.4 The Potential Role of Natural Cloud Variations in Causing Climate Change

The existence of negative feedback in the climate system would have two important consequences: (1) future anthropogenic climate change can be expected to be small, possibly even unmeasurable in the face of natural climate variability; and (2) increasing greenhouse gas concentrations are insufficient to have caused past warming of the climate system. One or more natural sources of climate change would need to be involved.

And this is where natural cloud variations once again enter the picture. While the IPCC only mentions “external” forcings (radiative imbalances) on the climate system due to volcanoes, anthropogenic pollution, and output from the sun, it is also possible for “internal” forcings such as natural cloud changes to cause climate change. This could come about simply through small natural changes in the general circulation of the ocean and atmosphere, for instance from the Pacific Decadal Oscillation (Mantua et al., 1997), or other known (or even unknown) modes of natural climate variability. Spencer and Braswell (2008) showed that even daily random cloud variations over the ocean can lead to substantial decadal time scale variability in ocean temperatures.
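A toy model in the spirit of Spencer and Braswell (2008) illustrates the point. The specific numbers here (a 50 m ocean mixed layer, a 3.3 W/m² per deg. C restoring feedback, and the size of the daily random cloud forcing) are illustrative assumptions, not values from that paper:

```python
import random

random.seed(42)

RHO_CP = 4.0e6    # approx. volumetric heat capacity of sea water, J/(m^3 K)
DEPTH = 50.0      # assumed ocean mixed-layer depth, m
HEAT_CAP = RHO_CP * DEPTH   # areal heat capacity, J/(m^2 K)
LAMBDA = 3.3      # restoring (feedback) parameter, W/(m^2 K)
DT = 86400.0      # one-day time step, s

def simulate(days, cloud_noise_wm2):
    """Integrate dT/dt = (F_random - LAMBDA*T) / HEAT_CAP."""
    temps = []
    t = 0.0
    for _ in range(days):
        # zero-mean daily random cloud forcing
        forcing = random.gauss(0.0, cloud_noise_wm2)
        t += DT * (forcing - LAMBDA * t) / HEAT_CAP
        temps.append(t)
    return temps

temps = simulate(days=30 * 365, cloud_noise_wm2=5.0)
```

Even though the forcing averages to zero, the ocean's large heat capacity turns daily white noise into slow, multi-year temperature excursions: adjacent days in the output are almost perfectly correlated, which is exactly the kind of internally generated low-frequency variability being described.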

It is critical to understand that the IPCC assumes, either explicitly or implicitly, that such natural cloud variations do not occur on the time scales involved in climate change. Quoting EF-18896.2,

Global observed temperatures over the last century can be reproduced only when model simulations include both natural and anthropogenic forcings, that is, simulations that remove anthropogenic forcings are unable to reproduce observed temperature changes. Thus, most of the warming cannot be explained by natural variability, such as variations in solar activity.

This statement misleadingly implies that the IPCC knows all of the important natural sources of climate change – but they do not. The IPCC specifically ignores any potential internally-generated sources of climate change…what the public would simply call “natural cycles” in climate. The IPCC can get away with this because we do not have global measurements over a long enough period of time to detect small changes in global average cloud cover. Fluctuations of only 1% or 2% would be enough to cause the kinds of variability in global temperatures exhibited by the following plot (Fig. 4) of global temperature proxy measurements over the last 2,000 years (Loehle, 2007).

Fig. 4. Non-treering proxy reconstruction of global temperature variations over the past 2,000 years (Loehle, 2007).

In fact, if this reconstruction of past temperature variations is anywhere close to being realistic, it suggests there is no such thing as “average climate”. As can be seen, substantial temperature changes on 50 to 100 year time scales, such as occurred over the 20th Century, have been the rule, not the exception. Periods of steady temperatures are actually quite unusual.

The cause of such natural changes is still unknown to science. Some will argue sunspot activity has modulated global cloud amounts, which is one possibility for external forcing. But another possibility, as alluded to above, is that the climate system causes its own climate change. For instance, we know that “chaos” happens in weather (e.g. Lorenz, 1963), the result of complex nonlinear interactions within, and between, weather systems. But given the much longer (e.g. decadal to centennial) timescales involved in the ocean circulation, there is no reason why chaotic variations cannot also occur in climate (e.g. Tsonis et al., 2007).

Therefore, the claim by the IPCC that warming can only be produced by climate models when anthropogenic greenhouse gases are included misleadingly implies that climate does not change naturally. But this is merely an assumption enabled by a lack of data to demonstrate otherwise – not an inference from analysis of existing data.

And again, the easiest way for such changes to occur would be for small changes in atmospheric and oceanic circulation systems to cause small changes in global average cloud cover. For instance, if a small decrease in cloud cover occurs over the ocean, then the ocean will respond by warming. This will, in turn, cause warming and humidifying of oceanic air masses. These warmer and moister air masses then flow across the continents, where even greater warming can result from the natural greenhouse effect of more water vapor contained in the air masses. This was recently demonstrated by Compo and Sardeshmukh (2009), and it shows that “global warming” can indeed be generated naturally. Because of this, there probably is no reliable ‘fingerprint’ of anthropogenic climate change.

Therefore, the TSD statement that “The rate of increase in positive radiative forcing due to these three GHGs during the industrial era is very likely to have been unprecedented in more than 10,000 years”, is nothing more than a statement of faith, and completely ignores the possibility that there have been much larger natural changes in greenhouse gases in the past — specifically, in water vapor…the Earth’s main greenhouse gas.

Again, past changes in atmospheric temperature and water vapor – and thus in the Earth’s greenhouse effect – can be caused by natural changes in oceanic cloud cover modulating the amount of sunlight that is absorbed by the ocean. This possibility is ignored by the IPCC, who simply assume that the climate system was in a perpetual state of radiative energy balance until humans came along and upset that balance.

3. FINAL REMARKS

As we have seen, there is considerable evidence – both theoretical and observational — to distrust the IPCC’s claim that global warming is mostly anthropogenic in origin. The work described above — some published in the peer reviewed scientific literature by us and by others, some in the process of being published — strongly suggests the IPCC AR4 climate models produce too much global warming, possibly by a wide margin.

As discussed above, the most important reason why climate models are too sensitive is, in my view, the neglect of the effect that natural cloud changes in the climate system have on (1) climate sensitivity estimates, and on (2) past climate change, such as the global-average warming which occurred in the 20th Century.

The failure to investigate all natural processes by the climate research and modeling communities is a major failing of the IPCC and the CCSP processes. Under the Federal Information Quality Act, the U.S. science process must be held to higher standards of objectivity and utility. Clearly, there is as yet little scientific basis for making an “Endangerment Finding” related to carbon dioxide.


REFERENCES CITED

Caldwell, P., and C. S. Bretherton, 2009. Response of a subtropical stratocumulus-capped mixed layer to climate and aerosol changes. Journal of Climate, 22, 20-38.

Compo, G.P., and P. D. Sardeshmukh, 2009. Oceanic influences on recent continental warming, Climate Dynamics, 32, 333-342.

Forster, P. M., and J. M. Gregory, 2006. The climate sensitivity and its components diagnosed from Earth Radiation Budget data, J. Climate, 19, 39-52.

Forster, P.M., and K.E. Taylor, 2006. Climate forcings and climate sensitivities diagnosed from coupled climate model integrations, J. Climate, 19, 6181-6194.

Intergovernmental Panel on Climate Change, 2007. Climate Change 2007: The Physical Science Basis, report, 996 pp., Cambridge University Press, New York City.

Knutti, R., and G. C. Hegerl, 2008. The equilibrium sensitivity of the Earth’s temperature to radiation changes, Nature Geoscience, 1, 735-743.

Loehle, C., 2007. A 2,000-year global temperature reconstruction based on non-treering proxy data. Energy & Environment, 18, 1049-1058.

Lorenz, E.N., 1963: Deterministic non-periodic flow, Journal of the Atmospheric Sciences, 20, 130–141.

Mantua, N. J., S.R. Hare, Y. Zhang, J.M. Wallace, and R.C. Francis, 1997: A Pacific interdecadal climate oscillation with impacts on salmon production. Bulletin of the American Meteorological Society, 78, 1069-1079.

Spencer, R.W., W. D. Braswell, J. R. Christy, and J. Hnilo, 2007. Cloud and radiation budget changes associated with tropical intraseasonal oscillations, Geophys. Res. Lett., 34, L15707, doi:10.1029/2007GL029698.

Spencer, R.W., and W.D. Braswell, 2008. Potential biases in cloud feedback diagnosis: A simple model demonstration, J. Climate, 21, 5624-5628.

Trenberth, K.E., and J.T. Fasullo, 2009. Global warming due to increasing absorbed solar radiation. Geophysical Research Letters, 36, L07706, doi:10.1029/2009GL037527.

Tsonis, A. A., K. Swanson, and S. Kravtsov, 2007. A new dynamical mechanism for major climate shifts. Geophysical Research Letters, 34, L13705, doi:10.1029/2007GL030288.

Ice Ages or 20th Century Warming, It All Comes Down to Causation

June 17th, 2009

Musings on the Vostok Ice Core Record

(edited 11 July 2009 for clarity in discussion of lead-lag relationship between Vostok temperature and CO2)

Since I get asked so often what I think of the Vostok ice core data that James Hansen uses as ‘proof’ that CO2 drives temperature, I decided to spend a few days analyzing that data, as well as the Milankovitch forcings calculated by Huybers and Denton (2008) which (allegedly) help drive the global temperature variations seen in the Vostok data.

The following graph shows most of the 400,000 year Vostok record of retrieved temperature and CO2 content. Assuming that these estimates really are what they are claimed to be, what can they tell us — if anything — about the role of carbon dioxide in climate change?

[Figure: Vostok ice core record of temperature and CO2]

First off, you need to realize there has been a long history of debate in the climate community about the 3 Milankovitch cycles in solar forcing being the driving force for the Ice Ages. As far as I can discern, there have been at least three main problems with the Milankovitch theory.

First, the huge Vostok cycle at about 100,000 years remains unexplained, because the Milankovitch cycles show no amplification for those events, just a regular series of small forcings which tend to be correlated with the smaller bumps on the curves of Vostok temperature and CO2 in the above graph.

A second problem has been that the positive correlation with Vostok (at the South Pole) came from NORTHERN Hemispheric forcing, not from the Southern Hemisphere. In the south, the Milankovitch forcings were out of phase with the temperature response in the Vostok ice core record. This has presented the problem of how Northern Hemispheric changes in solar forcing could cause such huge changes in Antarctic climate.

The third problem is that the Milankovitch forcings are so small it was difficult to see how they could cause even the smaller temperature changes in the ice core record – UNLESS climate sensitivity is very high (that is, feedbacks are strongly positive). This appears to be James Hansen’s view: the small Milankovitch forcings caused small increases in temperature, which in turn caused increases in carbon dioxide, which then took over as the forcing mechanism.

The recent paper by Huybers and Denton claims to have alleviated the 2nd and 3rd problems. If one assumes it is the length of summer in the Southern Hemisphere (rather than average yearly solar intensity) which is the main driving force for changing the Antarctic ice sheet, and also accounts for the fact that it takes many thousands of years for the ice sheet (and thus the temperature) to change in response to that summertime forcing, then the Southern Hemisphere Milankovitch cycles really do line up pretty well in time with the smaller Vostok events (but still not the huge 100,000 year events), and the “forcing” involved also becomes larger.

The Role of CO2 in the Vostok Record

So, where does CO2 fit into all of this? Well, at face value Hansen’s theory does require that temperature drives CO2…at least to begin with. Hansen claims the Milankovitch cycle in Earth-sun geometry caused a small temperature change, which then led to a CO2 change, and then the CO2 took over as the forcing. Or, maybe you could say the CO2 provided a strong positive feedback mechanism on temperature.

But it has often been noted that the Vostok CO2 record lags the temperature record by an average of 800 years, which is somewhat of a problem for Hansen’s theory. After all, if CO2 “took over” as the main driver of temperature, why do the CO2 changes tend to come after the temperature changes? But then there have also been uncertainties in dating the CO2 record due to assumptions that have to be made about how far and how fast the CO2 migrates through the ice core, giving the appearance of a different age for the CO2. So the lead-lag relationship still has some uncertainty associated with it.
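The lead-lag question itself is just a cross-correlation problem. Here is a minimal sketch of how such a lag is estimated, using synthetic series (the sine-wave "temperature" and built-in 8-sample delay below are purely illustrative, not the actual Vostok data):

```python
import math

def pearson_r(a, b):
    """Pearson correlation of two equal-length sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a)
    vb = sum((y - mb) ** 2 for y in b)
    return cov / math.sqrt(va * vb)

def best_lag(temp, co2, max_lag):
    """Return the lag (in samples) at which co2 best correlates with temp."""
    scores = {}
    for lag in range(0, max_lag + 1):
        # co2 shifted back by 'lag' samples, compared against temp
        scores[lag] = pearson_r(temp[:len(temp) - lag], co2[lag:])
    return max(scores, key=scores.get)

# Synthetic record: CO2 follows temperature with a known 8-sample delay
# (think of one sample as ~100 years, so roughly an 800-year lag).
temp = [math.sin(2 * math.pi * t / 100.0) for t in range(1000)]
co2 = [temp[t - 8] if t >= 8 else 0.0 for t in range(1000)]

print(best_lag(temp, co2, max_lag=20))  # recovers the built-in lag of 8
```

Of course, the real difficulty noted above is not the statistics but the ice-core dating assumptions, which shift where the CO2 series sits on the time axis in the first place.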

But even if the CO2 and the temperature record lined up perfectly, would that mean Hansen is correct? No. It all hinges on the assumption that there was no forcing other than CO2 that caused the temperature changes. Hansen’s theory still requires that the temperature variations cause the CO2 changes to begin with, which raises the question: What started the temperature change? And what if the mechanism that started the temperature change is actually responsible for most of the temperature change? In that case, the climate sensitivity inferred from the co-variations between temperature and CO2 becomes less. The fact that even the recent work of Huybers and Denton does not address the major question of what caused the huge 100,000 year cycle in the Vostok data suggests there might be a forcing mechanism that we still don’t know about.

If Hansen is correct, and CO2 is the main forcing in the Vostok record, then it takes only about a 10 ppm increase in CO2 to cause a 1 deg. C temperature change. The full range of CO2 forcing in the Vostok record amounts to 1.6 to 2 Watts per sq. meter, and if that caused the full range of temperature variations, then today we still have as much as 10 deg. C of warming “in the pipeline” from the CO2 we have put in the atmosphere from fossil fuel burning. That’s because 1.6 Watts per sq. meter is about the same amount of manmade forcing that supposedly exists in today’s atmosphere.
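The arithmetic behind this “in the pipeline” claim can be sketched as follows. The glacial and interglacial CO2 concentrations of roughly 190 and 280 ppm, the ~10 deg. C Vostok temperature swing, and the standard logarithmic forcing formula are illustrative values I am assuming here, not numbers stated above:

```python
import math

# Standard logarithmic approximation for CO2 radiative forcing.
def co2_forcing(c_ppm, c0_ppm):
    return 5.35 * math.log(c_ppm / c0_ppm)

# Full glacial-interglacial CO2 swing in the Vostok record (illustrative).
glacial_forcing = co2_forcing(280.0, 190.0)  # ~2 W/m^2, near the 1.6-2 cited
temp_range = 10.0                            # ~10 deg. C swing in temperature

# IF that forcing alone caused the full temperature swing, the implied
# sensitivity is enormous:
sensitivity = temp_range / glacial_forcing   # ~5 deg. C per W/m^2

# Applied to today's ~1.6 W/m^2 of manmade forcing, this gives the
# 'warming in the pipeline' figure:
pipeline = sensitivity * 1.6
print(round(glacial_forcing, 2), round(pipeline, 1))
```

Depending on the exact CO2 values assumed, this works out to roughly 8 to 10 deg. C of committed warming, which is the reductio ad absurdum being made here.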

But if some other forcing is responsible for the temperature change, that necessarily implies lower climate sensitivity. And the greater the unknown forcing involved, the smaller the climate sensitivity you will calculate.

The Central Question of Causation

I believe that the interpretation of the Vostok ice core record of temperature and CO2 variations has the same problem that the interpretation of warming and CO2 increase in the last century has: CAUSATION. In both cases, Hansen’s (and others’) inference of high climate sensitivity (which would translate into lots of future manmade warming) depends critically on there not being another mechanism causing most of the temperature variations. If most of the warming in the last 100 years was due to CO2, then that (arguably) implies a moderately sensitive climate. If CO2 caused the temperature variations in the ice core record, it implies a catastrophically sensitive climate.

But the implicit assumption that science knows what the forcings were of past climate change even 50 years ago, let alone 100,000 years ago, strikes me as hubris. In contrast to the “consensus view” of the IPCC that only “external” forcing events like volcanoes, changes in solar output, and human pollution can cause climate change, forcing of temperature change can also be generated internally. I believe this largely explains what we have seen for climate variability on all time scales. A change in atmospheric and oceanic circulation patterns could easily accomplish this with a small change in low cloud cover over the ocean. In simple terms, global warming might well be mostly the result of a natural cycle.

The IPCC simply assumes this internally-generated forcing of climate never occurs. If they allowed for it, they would have to admit they have no clue how much warming in the last 50 years is natural versus anthropogenic.

Fears of substantial manmade warming are critically dependent upon the assumptions that the climate system never changes all by itself, naturally, and that there were no other external forcing mechanisms at work that we are unaware of. Given the complex nonlinear behavior of the climate system, the assumption that things like global average low cloud cover always stay the same seems to me unwarranted. In the end, in scientific research it’s the scientific assumptions that come back to bite you.

Where Does this Leave Carbon Dioxide?

About the only thing that seems like a safe inference from the Vostok record is that temperature drives CO2 variations – even Hansen’s theory requires that much — but the view that CO2 then caused most of the temperature variations seems exceedingly speculative.

And, if you are thinking of using the Vostok record to support warming driving today’s CO2 increase, you can forget it. In the Vostok record, warming produced only about 8 to 10 ppm of CO2 increase per deg. C, whereas our ~1 deg. C of warming in the last 100 years has been accompanied by about 10 times that much CO2 increase.

The bottom line is that fears of substantial manmade climate change are ultimately based upon the assumption that we know what caused past climate change. For if past climate changes were caused by tiny forcings (too tiny for us to know about), then the climate system is very sensitive. Of course, those forcings might have been quite large – even self-imposed by the climate system itself — and we still might not know about them.

Finally, as a side note, it is interesting that the modern belief that our carbon emissions have caused the climate system to rebel is not that different from the ancient practice of making sacrifices to the gods of nature in an attempt to get nature to cooperate. Technology might have changed over time, but it seems human nature has remained the same.


IPCC Admits they Could be Wrong about Humans Causing Global Warming

June 13th, 2009

If that headline surprises you, then it is a good indication of how successful the United Nations Intergovernmental Panel on Climate Change (IPCC) has been in their 20-year long effort to pin the rap for global warming on humanity. I’m not reporting anything really new here. I’m just stating what is logically consistent with, and a necessary inference from, one of the most recognizable claims contained in the Summary for Policymakers in the IPCC’s 2007 report:

Most of the observed increase in global average temperatures since the mid-20th century is very likely due to the observed increase in anthropogenic greenhouse gas concentrations.

They assign a probability of 90% to the term “very likely”. This is a curious use of statistical probability, since warming over the last 50 years is either mostly due to anthropogenic emissions, or it isn’t. Probabilities do not apply to past events like this.

What the IPCC is really using probabilities for is to attach some scientific value to their level of faith. I would be very surprised if there weren’t vigorous objections to the use of probabilities in this way from members of the IPCC…but the IPCC leadership really needed a scientific-sounding way to help push their political agenda, so I’m sure any objections were overruled.

But I digress. My main point here is that the IPCC is admitting that they might be wrong. That doesn’t sound to me like “the science is settled”, as Al Gore is fond of saying. If it is “very likely” that “most” of the observed warming was due to mankind, then they are admitting that it is possible that the warming was mostly natural, instead.

So, let’s play along with their little probability game. Given the extreme cost of greatly reducing our greenhouse gas emissions, wouldn’t you say that it would be important to actively investigate the 10% possibility that warming is mostly natural, as the IPCC readily admits?

Where is the 10% of government research dollars looking into this possibility? I suspect it is going to environmental NGOs who are finding new ways to package “global warming” so that it doesn’t sound like a liberal issue. Or maybe they are working on more clever names to call researchers like me other than “deniers”, which is getting a little tiresome.

For many years the Department of Defense has had “Red Team” reviews devoted to finding holes in the consensus of opinion on weapons systems, and those systems cost a whole lot less than punishing the use of our most abundant and affordable sources of energy will. It seems like a no-brainer to do something similar for a policy this expensive, and this destructive to the lives of poor people all over the world.

One could almost get the impression that there is more than just science that determines what climate science gets funded, and how it gets reported by the news media. Oh, that’s right, I forgot…the United Nations is in charge of this effort. Well, I’m sure they know what they are doing.


May 2009 Global Temperature Update +0.04 deg. C

June 4th, 2009

YR     MON   GLOBE    NH      SH      TROPICS
2009    1    0.304    0.443   0.165   -0.036
2009    2    0.347    0.678   0.016    0.051
2009    3    0.206    0.310   0.103   -0.149
2009    4    0.090    0.124   0.056   -0.014
2009    5    0.043    0.043   0.043   -0.168
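As a quick sanity check on the table, note that the global anomaly in each row is (to rounding) simply the average of the two hemispheric values, since each hemisphere covers half the globe:

```python
# (year, month, globe, NH, SH, tropics) from the table above
data = [
    (2009, 1, 0.304, 0.443, 0.165, -0.036),
    (2009, 2, 0.347, 0.678, 0.016,  0.051),
    (2009, 3, 0.206, 0.310, 0.103, -0.149),
    (2009, 4, 0.090, 0.124, 0.056, -0.014),
    (2009, 5, 0.043, 0.043, 0.043, -0.168),
]

for _, _, globe, nh, sh, _ in data:
    # The global anomaly should be the simple mean of the hemispheric ones,
    # to within the 0.001 deg. C rounding of the published values.
    assert abs(globe - (nh + sh) / 2.0) < 0.001
```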

1979-2009 Graph

May 2009 saw another drop in the global average temperature anomaly, from +0.09 deg. C in April to +0.04 deg. C in May, originating mostly from the Northern Hemisphere and the tropics.

A reminder for those who are monitoring the daily progress of global-average temperatures here:

(1) Only use channel 5 (“ch05”), which is what we use for the lower troposphere and middle troposphere temperature products.
(2) Compare the current month to the same calendar month from the previous year (which is already plotted for you).
(3) The progress of daily temperatures should only be used as a rough guide for how the current month is shaping up because they come from the AMSU instrument on the NOAA-15 satellite, which has a substantial diurnal drift in the local time of the orbit. Our ‘official’ results presented above, in contrast, are from AMSU on NASA’s Aqua satellite, which carries extra fuel to keep it in a stable orbit. Therefore, there is no diurnal drift adjustment needed in our official product.


A Layman’s Explanation of Why Global Warming Predictions by Climate Models are Wrong

May 29th, 2009

(last edited 8:30 a.m. 1 June 2009 for clarity)

[Figure: the 23 IPCC climate models]
I occasionally hear the complaint that some of what I write is too technical to understand, which I’m sure is true. The climate system is complex, and discussing the scientific issues associated with global warming (aka “climate change”) can get pretty technical pretty fast.

Fortunately, the most serious problem the climate models have (in my view) is one which is easily understood by the public. So, I’m going to make yet another attempt at explaining why the computerized climate models tracked by the U.N.’s Intergovernmental Panel on Climate Change (IPCC) – all 23 of them – predict too much warming for our future. The basic problem I am talking about has been peer reviewed and published by us, and so cannot be dismissed lightly.

But this time I will use no graphs (!), and I will use only a single number (!!) which I promise will be a small one. 😉

I will do this in three steps. First, I will use the example of a pot of water on the stove to demonstrate why the temperature of things (like the Earth) rises or falls.

Second, I will describe why so many climate model “experts” believe that adding CO2 to the atmosphere will cause the climate system to warm by a large, possibly catastrophic amount.

Finally, I will show how Mother Nature has fooled those climate experts into programming climate models to behave incorrectly.

Some of this material can be found scattered through other web pages of mine, but here I have tried to create a logical progression of the most important concepts, and minimized the technical details. It might be edited over time as questions arise and I find better ways of phrasing things.

The Earth’s Climate System Compared to a Pot of Water on the Stove

Before we discuss what can alter the global-average temperature, let’s start with the simple example of a pot of water placed on a stove. Imagine it’s a gas stove, and the flame is set on its lowest setting, so the water will become warm but will not boil. To begin with, the pot does not have a lid.
[Figure: two pots of water on a stove]
Obviously, the heat from the flame will warm the water and the pot, but after about 10 minutes the temperature will stop rising. The pot stops warming when it reaches a point of equilibrium where the rate of heat loss by the pot to its cooler surroundings equals the rate of heat gained from the stove. The pot warmed as long as an imbalance in those two flows of energy existed, but once the magnitude of heat loss from the hot pot reached the same magnitude as the heat gain from the stove, the temperature stopped changing.

Now let’s imagine we turn the flame up slightly. This will result in a temporary imbalance once again between the rate of energy gain and energy loss, which will then cause the pot to warm still further. As the pot warms, it loses energy even more rapidly to its surroundings. Finally, a new, higher temperature is reached where the rate of energy loss and energy gain are once again in balance.

But there’s another way to cause the pot to warm other than to add more heat: We can reduce its ability to cool. If next we place a lid on the pot, the pot will warm still more because the rate of heat loss is then reduced below the rate of heat gain from the stove. In this case, loosely speaking, the increased temperature of the pot is not because more heat is added, but because less heat is being allowed to escape.
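The energy-balance reasoning above can be written as a one-line differential equation, dT/dt = (heat in minus heat out) divided by heat capacity, and integrated numerically. All of the numbers below (flame power, heat-loss coefficient, heat capacity) are made up purely for illustration:

```python
def equilibrate(power_in, loss_coeff, t_env=20.0, heat_cap=4000.0,
                dt=1.0, steps=50000):
    """Integrate dT/dt = (power_in - loss_coeff*(T - t_env)) / heat_cap
    until the pot temperature settles at equilibrium."""
    t = t_env
    for _ in range(steps):
        t += dt * (power_in - loss_coeff * (t - t_env)) / heat_cap
    return t

# Low flame, no lid: warms until heat loss matches heat gain.
print(round(equilibrate(power_in=100.0, loss_coeff=2.0), 1))   # 70.0 deg C

# Turn the flame up: a higher equilibrium temperature.
print(round(equilibrate(power_in=150.0, loss_coeff=2.0), 1))   # 95.0 deg C

# Same low flame, but with a lid (heat escapes less easily): warmer again.
print(round(equilibrate(power_in=100.0, loss_coeff=1.25), 1))  # 100.0 deg C
```

Note that the equilibrium is just t_env + power_in/loss_coeff: you can raise the temperature either by adding heat (bigger numerator) or by reducing the ability to cool (smaller denominator), which is exactly the pot-versus-lid distinction made above.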

Global Warming

The pot of water on the stove illustrates the same fundamental situation that exists with climate change in general, and global warming theory in particular. A change in the energy flows into or out of the climate system will, in general, cause a temperature change. The average temperature of the climate system (atmosphere, ocean, and land) will remain about the same only as long as the rate of energy gain from sunlight equals the rate of heat loss by infrared radiation to outer space. This is illustrated in the following cartoon:

[Figure: the Earth’s global energy balance]
Again, the average temperature of the Earth (like a pot of water on the stove) will only change when there is an imbalance between the rates of energy gained and energy lost.

What this means is that anything that can change the rates of energy flow illustrated above — in or out of the climate system — can cause global warming or global cooling.

In the case of manmade global warming, the extra carbon dioxide in the atmosphere is believed to be reducing the rate at which the Earth cools to outer space. This already occurs naturally through the so-called “greenhouse effect” of the atmosphere, a process in which water vapor, clouds, carbon dioxide and methane act as a ‘radiative blanket’, insulating the lower atmosphere and the surface, and raising the Earth’s average surface temperature by about 33 deg. C (close to 60 deg. F).

The Earth’s natural greenhouse effect is like the lid on our pot of water on the stove. The lid reduces the pot’s ability to cool and so makes the pot of water, on average, warmer than it would be without the lid. (I don’t think you will find the greenhouse effect described elsewhere in terms of an insulator — like a blanket — but I believe that is the most accurate analogy.) Similarly, the Earth’s natural greenhouse effect keeps the lower atmosphere and surface warmer than if there was no greenhouse effect. So, more CO2 in the atmosphere slightly enhances that effect.

And also like the pot of water, the other basic way to cause warming is to increase the rate of energy input — in the case of the Earth, sunlight. Note that this does not necessarily require an increase in the output of the sun. A change in any of the myriad processes that control the Earth’s average cloud cover can also do this. For instance, the IPCC talks about manmade particulate pollution (“aerosols”) causing a change in global cloudiness…but they never mention the possibility that the climate system can change its own cloud cover!

If the amount of cloud cover reflecting sunlight back to space decreases from, say, a change in oceanic and atmospheric circulation patterns, then more sunlight will be absorbed by the ocean. As a result, there will then be an imbalance between the infrared energy lost and solar energy gained by the Earth. The ocean will warm as a result of this imbalance, causing warmer and more humid air masses to form and flow over the continents, which would then cause the land to warm, too.

The $64 Trillion Question: By How Much Will the Earth Warm from More CO2?

Now for a magic number that we will refer to later: how much more energy is lost to outer space as the Earth warms. It can be calculated theoretically that for every 1 deg. C the Earth warms, it gives off an average of about 3.3 Watts per square meter more infrared energy to space. Just as you feel more infrared (heat) radiation coming from a hot stove than from a warm stove, the Earth gives off more infrared energy to space the warmer it gets.

This is part of the climate system’s natural cooling mechanism, and all climate scientists agree with this basic fact. What we don’t agree on is how the climate system responds to warming by either enhancing, or reducing, this natural cooling mechanism. The magic number — 3.3 Watts per sq. meter — represents how much extra energy the Earth loses if ONLY the temperature is increased by 1 deg. C and nothing else is changed. In the real world, however, we can expect that the rest of the climate system will NOT remain the same in response to a warming tendency.
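The origin of a number like this can be sketched from the Stefan-Boltzmann law: differentiating F = sigma*T^4 gives dF/dT = 4*sigma*T^3, which at the Earth’s effective emitting temperature of about 255 K works out to roughly 3.8 Watts per sq. meter per deg. C. (The commonly quoted ~3.3 figure comes from more detailed radiative calculations; this is only the back-of-envelope version, and the 255 K value is a standard assumption not stated in the text.)

```python
SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W/(m^2 K^4)

def planck_response(t_kelvin):
    """Extra emission per degree of warming: d(sigma*T^4)/dT = 4*sigma*T^3."""
    return 4.0 * SIGMA * t_kelvin ** 3

# The Earth's effective emitting temperature is about 255 K (-18 deg. C).
print(round(planck_response(255.0), 2))  # 3.76 W/m^2 per deg. C
```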

Thus, the most important debate in global warming research today is the same as it was 20 years ago: How will clouds (and to a lesser extent other elements in the climate system) respond to warming, thereby enhancing or reducing the warming? These indirect changes that further influence temperature are called feedbacks, and they determine whether manmade global warming will be catastrophic, or just lost in the noise of natural climate variability.

Returning to our example of the whole Earth warming by 1 deg. C, if that warming causes an increase in cloud cover, then the 3.3 Watts of extra infrared loss to outer space gets augmented by a reduction in solar heating of the Earth. The result is a smaller temperature rise. This is called negative feedback, and it can be illustrated conceptually like this:

[Figure: conceptual illustration of negative cloud feedback]

If negative feedback exists in the real climate system, then manmade global warming will become, for most practical purposes, a non-issue.

But this is not how the IPCC thinks nature works. They believe that cloud cover of the Earth decreases with warming, which would let in more sunlight and cause the Earth to warm to an even higher temperature. (The same is true if the water vapor content of the atmosphere increases with warming, since water vapor is our main greenhouse gas.) This is called positive feedback, and all 23 climate models tracked by the IPCC now exhibit positive cloud and water vapor feedback. The following illustration shows conceptually how positive feedback works:

[Figure: conceptual illustration of positive cloud feedback]

In fact, the main difference between models that predict only moderate warming versus those that predict strong warming has been traced to the strength of their positive cloud feedbacks.
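The arithmetic of feedbacks can be sketched with the numbers already introduced (an illustrative calculation assuming a CO2-doubling forcing of 3.8 Watts per sq. meter and the 3.3 figure from above; the feedback values chosen are hypothetical):

```python
# Equilibrium warming dT = F / (lambda0 - f), where lambda0 is the
# no-feedback cooling response (W/m^2 per deg C) and f is the net
# feedback (positive f amplifies warming, negative f damps it).
F = 3.8        # assumed radiative forcing from doubled CO2, W/m^2
LAMBDA0 = 3.3  # no-feedback infrared cooling response, W/m^2 per deg C

for f in (-2.0, 0.0, +2.0):  # hypothetical feedback strengths
    dT = F / (LAMBDA0 - f)
    kind = "negative" if f < 0 else "positive" if f > 0 else "no"
    print(f"{kind:>8} feedback (f = {f:+.1f}): dT = {dT:.2f} deg C")
```

With these assumptions, negative feedback (f = -2) yields about 0.7 deg. C of equilibrium warming, no feedback about 1.2 deg. C, and positive feedback (f = +2) about 2.9 deg. C.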

How Mother Nature Fooled the World’s Top Climate Scientists

Obviously, the question of how clouds in the REAL climate system respond to a warming tendency is of paramount importance, because that guides the development and testing of the climate models. Ultimately, the models must be based upon the observed behavior of the atmosphere.

So, what IS observed when the Earth warms? Do clouds increase or decrease? While the results vary with which years are analyzed, it has often been found that warmer years have less cloud cover, not more.

And this has led to the ‘scientific consensus’ that cloud feedbacks in the real climate system are probably positive, although by an uncertain amount. And if cloud feedbacks end up being too strongly positive, then we are in big trouble from manmade global warming.

But at this point an important question needs to be asked that no one asks: When the climate system experiences a warm year, what caused the warming? By definition, cloud feedback cannot occur unless the temperature changes…but what if that temperature change was caused by clouds in the first place?

This is important because if decreasing cloud cover caused warming, and this has been mistakenly interpreted as warming causing a decrease in cloud cover, then positive feedback will have been inferred even if the true feedback in the climate system is negative.

As far as I know, this potential mix-up between cause and effect — and the resulting positive bias in diagnosed feedbacks — had never been studied until we demonstrated it in a peer-reviewed paper in the Journal of Climate. Unfortunately, because climate research covers such a wide range of specialties, most climate experts are probably not even aware that our paper exists.

So how do we get around this cause-versus-effect problem when observing natural climate variations in our attempt to identify feedback? Our very latest research, now in peer review for possible publication in the Journal of Geophysical Research, shows that one can separate, at least partially, the effects of clouds-causing-temperature-change (which “looks like” positive feedback) versus temperature-causing-clouds to change (true feedback).
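The bias being described can be reproduced with a toy energy-balance simulation (a hypothetical sketch of the cause-versus-effect problem, not the actual analysis in the paper): random cloud variations both warm the system and contaminate the measured radiative flux, so a naive regression of flux against temperature underestimates the true feedback parameter, i.e., it looks like more positive feedback than actually exists.

```python
import random

random.seed(42)
LAM = 3.3   # true feedback parameter, W/m^2 per deg C (net cooling)
C = 50.0    # heat capacity (arbitrary units)

T = 0.0
temps, fluxes = [], []
for _ in range(5000):
    S = random.gauss(0.0, 1.0)   # non-radiative forcing (e.g., ocean mixing)
    N = random.gauss(0.0, 1.0)   # non-feedback radiative (cloud) forcing
    T += (S + N - LAM * T) / C   # simple energy balance, one step per "month"
    temps.append(T)
    fluxes.append(LAM * T - N)   # what a satellite sees: net loss to space

# Naive feedback diagnosis: regression slope of flux on temperature.
n = len(temps)
mt = sum(temps) / n
mf = sum(fluxes) / n
slope = (sum((t - mt) * (q - mf) for t, q in zip(temps, fluxes))
         / sum((t - mt) ** 2 for t in temps))
# The diagnosed value comes out much smaller than the true 3.3,
# i.e., biased toward positive feedback.
print(f"true feedback = {LAM:.1f}, naively diagnosed = {slope:.2f}")
```

The size of the bias depends on the chosen parameters, but the direction is the point: cloud-caused temperature change masquerades as positive feedback.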

We analyzed 7.5 years of our latest and best NASA satellite data and discovered that, when the effect of clouds-causing-temperature-change is accounted for, cloud feedbacks in the real climate system are strongly negative. The negative feedback was so strong that it more than cancelled out the positive water vapor feedback we also found. It was also consistent with evidence of negative feedback we found in the tropics and published in 2007.

In fact, the resulting net negative feedback was so strong that, if it exists on the long time scales associated with global warming, it would result in only 0.6 deg. C of warming by late in this century.

Natural Cloud Variations: The Missing Piece of the Puzzle?

In this critical issue of cloud feedbacks – one which even the IPCC has admitted is their largest source of uncertainty — it is clear that the effect of natural cloud variations on temperature has been ignored. In simplest of terms, cause and effect have been mixed up. (Even the modelers will have to concede that clouds-causing-temperature change exists because we found clear evidence of it in every one of the IPCC climate models we studied.)

But this brings up another important question: What if global warming itself has been caused by a small, long-term, natural change in global cloud cover? Our observations of global cloud cover have not been long enough or accurate enough to document whether any such cloud changes have happened or not. Some indirect evidence that this has indeed happened is discussed here.

Even though they never say so, the IPCC has simply assumed that the average cloud cover of the Earth does not change, century after century. This is a totally arbitrary assumption, and given the chaotic variations that the ocean and atmosphere circulations are capable of, it is probably wrong. Little more than a 1% change in cloud cover up or down, and sustained over many decades, could cause events such as the Medieval Warm Period or the Little Ice Age.

As far as I know, the IPCC has never discussed their assumption that global average cloud cover always stays the same. The climate change issue is so complex that most experts have probably not even thought about it. But we meteorologists by training have a gut feeling that things like this do indeed happen. In my experience, a majority of meteorologists do not believe that mankind is mostly to blame for global warming. Meteorologists appreciate how complex cloud behavior is, and most tend to believe that climate change is largely natural.

Our research has taken this gut feeling and demonstrated with both satellite data and a simple climate model, in the language that climate modelers speak, how potentially serious this issue is for global warming theory.

And this cause-versus-effect issue is not limited to just clouds. For instance, there are processes, mainly complex precipitation processes, that can cause the water vapor content of the atmosphere to change, which will then change global temperatures. Precipitation is what limits how much of our main greenhouse gas, water vapor, is allowed to accumulate in the atmosphere, thus preventing a runaway greenhouse effect. For example, a small change in wind shear associated with a change in atmospheric circulation patterns could slightly change the efficiency with which precipitation systems remove water vapor, leading to global warming or global cooling. This has long been known, but again, climate change research covers such a wide range of disciplines that very few of the experts have recognized the importance of obscure published studies like this one.

While there are a number of other potentially serious problems with climate model predictions, the mix-up between cause and effect when studying cloud behavior, by itself, has the potential to mostly deflate all predictions of substantial global warming. It is only a matter of time before others in the climate research community realize this, too.


White Roofs and Global Warming: A More Realistic Perspective

May 28th, 2009

There has been quite a lot of buzz in the last day or so about Energy Secretary Steven Chu claiming that making the roofs of buildings white (and brightening paved surfaces such as roads and parking lots) would “offset” a large part of the warming effect of humanity’s carbon dioxide emissions. The basis for this idea – which is not new — is that brighter surfaces reflect more of the sunlight reaching the Earth’s surface back to outer space. I found a presentation of the study he is basing his comments on by Hashem Akbari of Lawrence Berkeley National Lab here. A less technical summary of those results fit for public consumption is here.

I found some of the Akbari presentation calculations to be needlessly complicated, maybe because he was expressing the offset in terms of billions of tons of CO2, or in millions of cars’ worth of emissions. I dislike such numbers since they give no indication of what they mean relative to the whole atmosphere or its warming effect. Such big numbers lead people to buy hybrid cars while thinking they are helping to save the planet. As I have pointed out before, it takes 5 years of CO2 emissions by humanity to add just 1 molecule of CO2 to each 100,000 molecules of atmosphere. This obviously gives a very different impression than “billions of tons of CO2”.

So, we need to dispense with numbers that mislead people about what is really of interest: What fraction of the impact of more CO2 on the climate system can be alleviated through certain actions?

So, I thought I would make some calculations of my own to determine just what fraction of the ‘extra CO2 warming effect’ can be canceled out with the cooling effect of brighter roofs and pavement, on a global basis. Here are the pertinent numbers, along with some parenthetical comments.

The Goal: Offset the Radiative Forcing from More CO2 in the Atmosphere
Doubling of the CO2 content of the atmosphere has been estimated to cause 3.8 Watts per sq. meter of radiative forcing of the climate system, a value which will presumably be reached sometime late in this century. The current CO2 content of the atmosphere is about 40% higher than that estimated in pre-industrial times. Note this does NOT say anything about how much temperature change the radiative forcing would cause; that is a function of climate sensitivity – feedbacks – which I believe has been greatly overestimated anyway. For if climate sensitivity is low, then the extra CO2 in the atmosphere will have little impact on global temperatures, as will any of our ‘geoengineering’ attempts to offset it.

But let’s use that number as a basis for our goal: How much of the extra 3.8 Watts per sq. meter of radiative forcing by late in this century (or about 40% of that number today) can be offset by brightening these manmade surfaces?
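For reference, these forcing numbers are consistent with the widely used logarithmic approximation F = 5.35 ln(C/C0) Watts per sq. meter (a standard simplified expression, not taken from Akbari’s presentation; note that it puts today’s 40% CO2 increase at slightly less than half of the doubling forcing):

```python
import math

def co2_forcing(ratio):
    """Approximate radiative forcing (W/m^2) for a CO2 ratio C/C0."""
    return 5.35 * math.log(ratio)

print(f"CO2 doubling: {co2_forcing(2.0):.2f} W/m^2")  # ~3.7, close to the 3.8 used here
print(f"+40% (today): {co2_forcing(1.4):.2f} W/m^2")  # ~1.8
```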

This calculation is somewhat simpler than what Akbari presented, but assumes the intermediate calculations he made are basically correct. I will be multiplying together the following five numbers to estimate the Watts per sq. meter of radiative cooling (more sunlight reflected to outer space) that we might be able to achieve. Again, these numbers are the same or consistent with the numbers Akbari uses.

(1) The radiative forcing from a change in the Earth’s albedo (reflectivity): Akbari uses 1.27 Watts per sq. meter for a 0.01 change in albedo, which is the same as 127 Watts per sq. meter per unit change (1.0) in albedo. I think this might actually be an underestimate, but I will use the same value he uses.

(2) The fraction of global land area covered by urban areas: Akbari claims 1% (0.01) of land is now urbanized, a statistic I have seen elsewhere. I think this is an overestimate, but I will use it anyway.

(3) The fraction of the Earth covered by land: I’ll assume 30% (0.3).

(4) The fraction of urban areas that are modifiable: Akbari assumes 60% (25% roofs + 35% pavement). I think this number is high because Akbari is implicitly assuming the sun is always directly overhead during the daytime, whereas throughout the day the sides of buildings are illuminated by the sun more, and the roofs (and shadowed pavement) illuminated less, but I will use 60% (0.6) anyway.

(5) The assumed increase in albedo of the modified (brightened) surfaces: Akbari assumes an albedo increase of 0.4 for roofs and 0.15 for pavement, which when combined with his assumed urban coverage of 25% roofs and 35% pavement, results in a weighted average albedo increase of about 0.25.

When you multiply these 5 numbers together (127 x 0.01 x 0.3 x 0.6 x 0.25), you get about 0.06 Watts per sq. meter increase in reflected sunlight off the Earth from modifying 60% of all urban surfaces in the world. Comparing this to our goal — offsetting 3.8 Watts per sq. meter for a doubling of CO2 — we find that only about 1.6% of the CO2 radiative forcing would be offset.

Or, if today we could magically transform all of these surfaces instantly, we will have offset (1.6/0.4=) only 4% of the radiative forcing that has been accumulated from the last 100+ years of carbon dioxide emissions.
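The five-factor calculation can be laid out explicitly (same inputs as above; the small differences from the quoted 0.06 Watts per sq. meter, 1.6%, and 4% are rounding):

```python
ALBEDO_FORCING = 127.0  # W/m^2 per unit (1.0) albedo change
URBAN_FRACTION = 0.01   # fraction of land area that is urbanized
LAND_FRACTION = 0.30    # fraction of the Earth covered by land
MODIFIABLE = 0.60       # fraction of urban area that can be brightened
ALBEDO_INCREASE = 0.25  # weighted average albedo increase of those surfaces

cooling = (ALBEDO_FORCING * URBAN_FRACTION * LAND_FRACTION
           * MODIFIABLE * ALBEDO_INCREASE)
CO2_DOUBLING = 3.8      # W/m^2 forcing from doubled CO2

print(f"extra reflected sunlight: {cooling:.3f} W/m^2")                          # ~0.057
print(f"share of doubling forcing offset: {cooling / CO2_DOUBLING:.1%}")         # ~1.5%
print(f"share of today's forcing offset: {cooling / (0.4 * CO2_DOUBLING):.1%}")  # ~3.8%
```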

Of course, getting the whole world to ‘permanently’ recoat 60% of the surface area of their urban areas would be no small task, even to achieve such a small offset (1.6% by late in this century, or 4% today) of the warming effect of more CO2 in the atmosphere.

But this discussion does not address the positive energy savings from white roofs reducing the cooling load on buildings, something Akbari mentions as another benefit. From poking around on the internet I find that for single-story buildings, about 20% of the cooling load comes from the roof — not all of which is recoverable from a brighter roof. Energy savings would diminish for multi-story buildings, such as those in urban (as opposed to suburban) areas. [This source of energy savings can be estimated more easily. From other sources I find that about 5% of U.S. energy use is for air conditioning. If we could reduce that by, say, ten percent through brighter roofs, then 0.5% (one half of one percent) of our energy demand could be alleviated.]

And as I mentioned at the outset, if climate sensitivity is low, then the warming effect of more CO2 (as well as the cooling effect of any geoengineering ‘fixes’ to the problem) will have little effect on global temperatures, anyway.

The lesson here is that when put in the context of the total CO2 ‘problem’ — rather than just throwing big numbers around — the brightening of roofs and paved surfaces would offset only a tiny fraction of the warming effect of our CO2 emissions. The remaining question is just how much of this ambitious goal could be achieved, and at what cost.


The MIT Global Warming Gamble

May 23rd, 2009

[Image: slightly retouched photo of the MIT research group]

Climate science took another step backward last week as a new study from the Massachusetts Institute of Technology was announced which claims global warming by 2100 will probably be twice as bad as the United Nations Intergovernmental Panel on Climate Change (IPCC) has predicted.

The research team examined a range of possible climate scenarios which combined various estimates of the sensitivity of the climate system with a range of possible policy decisions to reduce greenhouse gas emissions which (presumably) cause global warming. Without policy action, the group’s model runs “indicate a median probability of surface warming of 5.2 degrees Celsius by 2100, with a 90% probability range of 3.5 to 7.4 degrees”.

Since that average rate of warming (about 0.5 deg. C per decade) is at least 2 times the observed rate of global-average surface temperature rise over the last 30 years, this would require our current spate of no warming to change into very dramatic and sustained warming in the near future.

And the longer Mother Nature waits to comply with the MIT group’s demands, the more severe the warming will have to be to meet their projections.

Of course, as readers of this web site will know, the MIT results are totally dependent upon the climate sensitivity that was assumed in the climate model runs that formed the basis for their calculations. And climate modelers can get just about any level of warming they want by simply making a small change in the processes controlling climate sensitivity – especially cloud feedbacks — in those models.

So, since the sensitivity of the climate system is uncertain, these researchers followed the IPCC’s lead of using ‘statistical probability’ as a way of treating that uncertainty.

But as I have mentioned before, the use of statistical probabilities in this context is inappropriate. There is a certain climate sensitivity that exists in the real climate system, and it is true that we do not know exactly what that sensitivity is. But this does not mean that our uncertainty over its sensitivity can be translated into some sort of statistical probability.

The use of statistical probabilities by the IPCC and the MIT group does two misleading things: (1) it implies scientific precision where none exists, and (2) it implies the climate system’s response to any change is a “roll of the dice”.

We know what the probability of rolling a pair of sixes with dice is, since it is a random event which, when repeated a sufficient number of times, will reveal that probability (1 in 36). But in contrast to this simple example, there is instead a particular climate sensitivity that exists out there in the real climate system. The endless fascination with playing computer games to figure out that climate sensitivity, in my opinion, ends up wasting a lot of time and money.

True, there are many scientists who really do think our tinkering with the climate system through our greenhouse gas emissions is like playing Russian roulette. But the climate system tinkers with itself all the time, and the climate has managed to remain stable. There are indeed internal, chaotic fluctuations in the climate system that might appear to be random, but their effect on the whole climate system is constrained to operate within a certain range. If the climate system really were that sensitive, it would have forced itself into oblivion long ago.

The MIT research group pays lip service to relying on “peer-reviewed science”, but it looks like they treat peer-reviewed scientific publications as random events, too. If 99 papers have been published which claim the climate system is VERY sensitive, but only 1 paper has been published that says the climate system is NOT very sensitive, is there then a 99-in-100 (99%) chance that the climate system is very sensitive? NO. As has happened repeatedly in all scientific disciplines, it is often a single research paper that ends up overturning what scientists thought they knew about something.

In climate research, those 99 papers typically will all make the same assumptions, which then pretty much guarantees they will end up arriving at the same conclusions. So, those 99 papers do not constitute independent pieces of evidence. Instead, they might be better described as evidence that ‘group think’ still exists.

It turns out that the belief in a sensitive climate is not because of the observational evidence, but in spite of it. You can start to learn more about the evidence for low climate sensitivity (negative feedbacks) here.

As the slightly-retouched photo of the MIT research group shown above suggests, I predict that it is only a matter of time before the climate community placing all its bets on the climate models is revealed to be a very bad gamble.


Global Warming Causing Carbon Dioxide Increases: A Simple Model

May 11th, 2009

Edited May 15, 2009 to correct percentage of anthropogenic contribution to model, and minor edits for clarity.

Global warming theory assumes that the increasing carbon dioxide concentration in the atmosphere comes entirely from anthropogenic sources, and it is that CO2 increase which is causing global warming.

But it is indisputable that the amount of extra CO2 showing up at the monitoring station at Mauna Loa, Hawaii each year (first graph below) is strongly affected by sea surface temperature (SST) variations (second graph below), which are in turn mostly a function of El Nino and La Nina conditions (third graph below):
[Graph: yearly CO2 accumulation rate at Mauna Loa]
[Graph: sea surface temperature anomaly variations]
[Graph: El Nino/La Nina (ENSO) index]

During a warm El Nino year, more CO2 is released by the ocean into the atmosphere (and less is taken up by the ocean from the atmosphere), while during cool La Nina years just the opposite happens. (A graph similar to the first graph also appeared in the IPCC report, so this is not new.) Just how much of the Mauna Loa variation in the first graph is due to the “Coke-fizz” effect is not clear, because there is now strong evidence that biological activity also plays a major (possibly dominant) role (Behrenfeld et al., 2006). Cooler SST conditions during La Nina are associated with more upwelling of nutrient-rich waters, which stimulates plankton growth.

The direction of causation between carbon dioxide and SST is obvious since the CO2 variations lag the sea surface temperature variations by an average of six months, as shown in the following graph:
[Graph: lag correlation between SST and Mauna Loa CO2 variations]
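The lag behind a graph like this comes from a standard lagged cross-correlation. Here is a minimal sketch with synthetic stand-in series (hypothetical data; the real analysis would use the Mauna Loa CO2 growth rate and observed SST anomalies):

```python
import math
import random

def lag_correlation(x, y, lag):
    """Pearson correlation of x(t) with y(t + lag)."""
    if lag > 0:
        x, y = x[:-lag], y[lag:]
    elif lag < 0:
        x, y = x[-lag:], y[:lag]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

# Synthetic stand-ins: the CO2 growth rate echoes SST six months later
random.seed(0)
sst = [math.sin(i / 8.0) + random.gauss(0, 0.2) for i in range(240)]
co2_growth = [sst[i - 6] if i >= 6 else 0.0 for i in range(240)]

best = max(range(13), key=lambda lag: lag_correlation(sst, co2_growth, lag))
print(f"CO2 growth lags SST by {best} months")  # 6, by construction
```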

So, I keep coming back to the question: If warming of the oceans causes an increase in atmospheric CO2 on a year-to-year basis, is it possible that long-term warming of the oceans (say, due to a natural change in cloud cover) might be causing some portion of the long-term increase in atmospheric CO2?

I decided to run a simple model in which the change in atmospheric CO2 with time is a function of sea surface temperature anomaly. The model equation looks like this:

delta[CO2]/delta[t] = a*SST + b*Anthro

Which simply says that the change in atmospheric CO2 with time is proportional to some combination of the SST anomaly and the anthropogenic (manmade) CO2 source. I then ran the model in an Excel spreadsheet and adjusted the “a” and “b” coefficients until the model response looked like the observed record of yearly CO2 accumulation rate at Mauna Loa.
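The spreadsheet exercise can be sketched in a few lines (a hypothetical reconstruction with made-up input series; the actual fit used the observed Mauna Loa and SST records):

```python
import random

random.seed(1)
YEARS = 50

# Hypothetical stand-ins for the two input series (illustration only):
sst = [0.01 * i - 0.25 + random.gauss(0, 0.1) for i in range(YEARS)]  # SST anomaly, deg C
anthro = [1.0 + 0.05 * i for i in range(YEARS)]                       # emissions, ppm/yr

def model_growth(a, b):
    """Yearly CO2 growth rate: delta[CO2]/delta[t] = a*SST + b*Anthro."""
    return [a * s + b * e for s, e in zip(sst, anthro)]

# The article's fitted coefficients:
a, b = 4.6, 0.1   # ppm/yr per deg C, and a dimensionless fraction

# Integrate the yearly growth rates into a concentration curve
co2 = [315.0]     # assumed starting concentration, ppm
for g in model_growth(a, b):
    co2.append(co2[-1] + g)
print(f"modeled CO2 after {YEARS} years: {co2[-1]:.1f} ppm")
```

In practice one would vary a and b and compare the modeled growth-rate curve against the Mauna Loa record, exactly as described above.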

It didn’t take long to find a model that did a pretty good job (a = 4.6 ppm/yr per deg. C; b=0.1), as the following graph shows:
[Graph: model (a = 4.6, b = 0.1) versus observed yearly CO2 accumulation rate]

Since the long term rise in atmospheric CO2 has been averaging about 50% of human emissions, the 0.1 value for “b” means that 20% of long-term rise is anthropogenic, while the other 80% is natural for that particular model fit.
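The 20% figure follows from simple division (a sketch of the reasoning: the model retains the fraction b of each year’s emissions, while observations show about half of emissions remaining airborne):

```python
# Share of the long-term CO2 rise the fitted model attributes to humans.
airborne_fraction = 0.5  # observed long-term rise is ~50% of human emissions
b = 0.1                  # fitted fraction of emissions retained by the model

anthro_share = b / airborne_fraction
print(f"anthropogenic share of the long-term rise: {anthro_share:.0%}")  # 20%
```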

The peak correlation between the modeled and observed CO2 fluctuations is now at a zero-month time lag, supporting the model’s realism. The model explained 50% of the variance of the Mauna Loa observations.

The best model fit assumes that the temperature anomaly at which the ocean switches between a sink and a source of CO2 for the atmosphere is -0.2 deg. C, indicated by the bold line in the SST graph (the second graph in this article). In the context of longer-term changes, it would mean that the ocean became a net source of more atmospheric CO2 around 1930.

A graph of the resulting model versus observed CO2 concentration as a function of time is shown next:
[Graph: modeled versus observed CO2 concentration over time]

If I increase the value of b from 0.1 to 0.2 (40% of the long term CO2 rise being anthropogenic), the following graph shows a somewhat different model fit that works better in the middle of the 50-year record, but then over-estimates the atmospheric CO2 concentration late in the record:
[Graph: model fit with b = 0.2 versus observed CO2 concentration]

There will, of course, be vehement objections to this admittedly simple model. One will be that “we know the atmospheric CO2 increase is manmade because the C13 carbon isotope concentration in the atmosphere is decreasing, which is consistent with a fossil fuel source.” But as has been discussed elsewhere, a change in ocean biological activity (or vegetation on land) has a similar signature…so the C13 change is not a unique signature of a fossil fuel source.

My primary purpose in presenting all of this is simply to stimulate debate. Are we really sure that ALL of the atmospheric increase in CO2 is from humanity’s emissions? After all, the natural sources and sinks of CO2 are about 20 times the anthropogenic source, so all it would take is a small imbalance in the natural flows to rival the anthropogenic source. And it is clear that there are natural imbalances of that magnitude on a year-to-year basis, as shown in the first graph.

What could be causing long-term warming of the oceans? My first choice for a mechanism would be a slight decrease in oceanic cloud cover. There is no way to rule this out observationally because our measurements of global cloud cover over the last 50 to 100 years are nowhere near good enough.

And just how strenuous and vehement the resulting objections are to what I have presented above will be a good indication of how politicized the science of global warming has become.


REFERENCES
Michael J. Behrenfeld et al., “Climate-Driven Trends in Contemporary Ocean Productivity,” Nature 444 (2006): 752-755.

Why America Does Not Care About Global Warming

May 6th, 2009

It is clear that concern in the United States over global warming is diminishing. In a Washington Whispers editorial, Paul Bedard quotes a Gallup Poll editor as saying Al Gore’s campaign to raise awareness of the “climate crisis” has failed. In one recent Pew Research Center survey, global warming rated at the bottom of a list of 20 domestic issues that Americans are concerned about.

Why is there so much apathy in the U.S. over something that threatens to transform the world by killing off thousands of species, flooding coastal areas, and making the world as much as 10 degrees F hotter? Do we just not care about the environment? Has the global warming message been oversold? Is the public experiencing ‘global warming fatigue’? Does the global warming problem seem so insurmountable to people that they just want to ignore it and hope that it goes away?

From my travels around the country and talking to people, the largest source of apathy is none of these. In my experience, people simply do not believe the ‘scientific consensus’ is correct. Most people do believe the Earth has warmed, but they think that warming has been largely natural. This was recently supported by a Rasmussen Reports poll which showed only 1/3 of American voters now believe global warming is caused by humans.

How can non-experts question the opinion of scientific experts? I believe it is because the public seems to have a better appreciation than the scientists do of a fundamental truth: There are some problems that science does not yet understand. There have been predictions of environmental doom before, and those have all failed. This has made people suspicious of spectacular scientific claims. As I have mentioned before, even Mark Twain over 100 years ago made fun of the predilection scientists have for making grand extrapolations and pronouncements:

There is something fascinating about science. One gets such wholesale returns of conjecture from such a trifling investment of fact.
-Mark Twain, Life on the Mississippi (1883)

The scientific community has brought this problem upon themselves by not following their own rules and procedures. Scientists have ‘cried wolf’ too many times, and some day that might end up hurting all of us when some unusual and dangerous scientific concern does arise. I think that people intuitively understand that spectacular scientific claims require spectacular evidence. Just saying something ‘might’ happen ‘if current trends continue’ does not impress the public. They have heard it all before.

The analogy I sometimes think about is our understanding of the human brain. What if there were a group of researchers who built a computer model of how the brain works, and they claimed that they could take some measurements of your brain and tell you what you would be thinking 24 hours from now. Would you believe them?

No, even though you are not an expert regarding the operation of the human mind, you probably would not believe them. From your daily experience you would suspect that those experts were probably overreaching, and claiming they knew more than they really did.

Of course, if the experts had performed such experiments before and succeeded, then you might be more inclined to believe them. But in the climate business, we have no previous forecast successes that are relevant to the theory of manmade global warming. We can’t even forecast natural climate variations, because we do not understand them. Simply forecasting long-term warming probably has a 50/50 chance of being correct just by accident, since it seems to be more common for warming or cooling to occur than for the temperature to remain constant, year after year, decade after decade.

And coming up with possible explanations for what has happened in the past (‘hindcasts’) does not really count, either. It is too easy to happen upon the wrong explanation which can be made to fit the data, a technique scientists call (rather pejoratively) “curve-fitting”.

In weather forecasting, MANY forecasts are required before one can confidently determine, based upon the number of successes and failures, whether those forecasts had any real skill beyond what might be expected just based upon chance. Climate forecasting is nowhere near being able to demonstrate forecast skill to the level of confidence that is routinely discussed in weather forecasting.

So, for those of you in the environmental community who think the global warming message needs to be repackaged, or rephrased, or have a change in terminology …well…I think you are wasting your time. The people have gotten the message, loud and clear: Global warming is manmade, and it is only going to get worse.

The trouble is that the people simply don’t think you know what you are talking about. And if global warming is largely a natural process, cutting down on our greenhouse gas emissions is going to have no measurable effect on future global temperatures.

Now, if you think you might succeed through a different kind of deception of the public…well, that indeed might work.