My Favorite Renewable Energy Concept: The Solar Updraft Tower

August 5th, 2009

There are many different ways to extract usable energy from sunlight, and they all have their advantages and disadvantages. Historically, the biggest disadvantage has been cost when compared to more traditional sources of energy, such as coal-fired power plants. If solar power were an economical and practical alternative to other forms of energy generation today, it would already be deployed on a wide scale.

I think it is only a matter of time before renewable energy sources become more cost competitive. The question is which methods make the most sense. My favorite idea is the ‘Solar Tower’ (or ‘solar updraft tower’, or ‘solar chimney’), an artist’s rendering of which is shown below.

Solar-Tower

While most people have never heard of it, the Solar Tower design was implemented on a small scale in Spain years ago to test the concept. More recently, a privately-funded company called EnviroMission has been working toward the construction of one or more 200 megawatt power plants in the Australian Outback. The company has also been actively pursuing plans to build power plants in China and Nevada.

The design appeals to me because it harnesses the weather, albeit on a small scale. Specifically, it collects the daily production of warm air that forms near the ground, and funnels all of that warm air into a chimney where turbines are located to extract energy from the rising air. It’s a little like wind tower technology, but rather than just extracting energy from whatever horizontally-flowing wind happens to be passing by, the Solar Tower concentrates all of that warm air heated by the ground into the central tower, or chimney, where the air naturally rises. Even on a day with no wind, the solar tower will be generating electricity while conventional wind towers are sitting there motionless.

The total amount of energy that can be generated by a Solar Tower depends upon two main factors: (1) how much land area is covered by the clear canopy, and (2) the total height of the tower. EnviroMission’s baseline design has included a glass canopy covering up to several square miles of desert land, and a tower 1,000 meters tall. Such a tower would be the tallest manmade structure of any kind in the world, although more recently EnviroMission has been talking about several smaller-scale power plants as a more cost-effective approach. Since the design is proprietary, details have remained secret.

Since the Solar Tower is based upon physical processes that people like me deal with routinely in our research, I can immediately see ways in which the efficiency of the design can be maximized. For instance, a third major factor that also determines how much energy would be generated is the temperature difference between the power plant’s surroundings and the air underneath the canopy. After all, it is that temperature difference which provides the energy source, since warm air is less dense than cool air, and so ‘wants’ to rise.

This means that you could increase the power plant’s output by building it where the sand is quite reflective (bright), and then covering the ground under the canopy area with black rock — say crushed lava rock — which would then get the hottest when the sun shines on it.
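Just to give a feel for how these factors combine, here is a back-of-the-envelope estimate in Python. It uses the standard solar-chimney scaling in which the ideal chimney efficiency is roughly g·H/(cp·T); all of the numbers (solar flux, collector and turbine efficiencies, canopy size) are my own illustrative assumptions, not EnviroMission’s proprietary figures:

```python
# Back-of-the-envelope solar updraft tower output (illustrative numbers only).
G = 9.81        # gravitational acceleration, m/s^2
CP = 1005.0     # specific heat of air at constant pressure, J/(kg K)

def tower_power_mw(canopy_area_km2, tower_height_m, solar_flux=1000.0,
                   collector_eff=0.5, turbine_eff=0.8, t_ambient=293.0):
    """Rough power estimate using the standard solar-chimney scaling,
    where the ideal chimney efficiency is ~ g*H / (cp * T_ambient)."""
    chimney_eff = G * tower_height_m / (CP * t_ambient)   # ~3.3% for H = 1000 m
    area_m2 = canopy_area_km2 * 1e6
    watts = area_m2 * solar_flux * collector_eff * chimney_eff * turbine_eff
    return watts / 1e6

# A 1000 m tower under a ~10 sq. km canopy in strong sunshine:
print(round(tower_power_mw(10, 1000)))   # on the order of 100 MW
```

Note that the chimney efficiency grows in proportion to tower height, which is why such extremely tall towers are attractive despite their construction cost.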

One of the advantages of a Solar Tower over using photovoltaic cells to generate electricity is that the Solar Tower keeps generating electricity even after the sun goes down. Because the ground under the canopy stays warm at night, it continues to warm the air while the land around the canopy cools much more rapidly. This maintains a temperature difference between the canopy-covered air and the plant’s surroundings, which translates into continued energy generation at night. Additionally, the Solar Tower does not require the huge volume of water that coal-fired plants use.

From what I’ve read, the Solar Tower is potentially cost-competitive with coal-fired power plants, but the investment in infrastructure is large, and there is still some uncertainty (and therefore investment risk) involved in just how efficient Solar Towers would be.

If the government insists on providing subsidies for renewable energy, I think Solar Towers might be one of the best investments of the public’s money. But the last I knew, the U.S. Department of Energy was not actively pursuing the Solar Tower technology. I have no idea whether government involvement would help or hurt the private efforts of EnviroMission. Ultimately, the technology needs to be sustainable from a cost standpoint, and when that point is reached it is best if the government stays out of the way.

But just from a political standpoint, I think the Obama administration would benefit by pursuing Solar Towers as a national goal to reduce our dependence on foreign energy. If electricity in the sunnier parts of the country were cheap enough, then plug-in hybrid cars would become more popular there as well, further reducing our dependence on foreign oil.

A very cool computer-generated video tour of an EnviroMission Solar Tower design can be viewed here (be sure to turn the sound up!). As the video shows, a Solar Tower 1,000 meters tall would also provide quite a tourist attraction.


STILL No Tropical Storms? Must Be Global Warming

August 3rd, 2009

So, where are all of the news stories about the fact we’ve had no tropical storms yet this year? As can be seen in the following graphic, as of this date in 2005 we already had 8 named storms in the Atlantic basin. And tomorrow, August 4, that number will increase to 9. In 2005 we were even told to expect more active hurricane seasons from now on because of global warming.

named-storms-climatology

Of course, even though it is interesting that the 2009 tropical season is off to such a slow start, it may well have no significance in terms of long-term trends. But the lack of news coverage on the subject does show the importance of unbiased reporting when it comes to global warming. Let me explain.

Let’s say we really were in a slow, long-term cooling trend. What if the media decided they would only do news stories when there are record high temperatures or heat waves, ignoring record cold, and would then attribute those events to human-caused global warming? This would end up making the public fearful of global warming, even if the real threat was from global cooling.

The public expects – or used to expect – the media to report on all sides of important issues, so that we can be better informed on the state of the world. There have always been high temperature records set, and there have always been heat waves. In some sense, unusual weather is normal. It might not happen every day, but you can be assured, it will happen. But reporting on heat-related events, while ignoring cold temperature records and other events that do not support the claims of global warming theorists, will lead to a bias in the way the public views climate change.

Of course, someone might come along and claim that global warming has disrupted tropical storm activity this year, and so an unusually quiet season will also be claimed as evidence of global warming. This has already happened to some extent with cold weather and more snow being blamed on global warming.

But when the natural climate cycle deniers reach that level of desperation, they only appear that much more ridiculous to those of us who have not yet lost our ability to reason.


Rise of the Natural Climate Cycle Deniers

July 29th, 2009

Those who promote the theory that mankind is responsible for global warming have been working for the past 20 years on a revisionist climate history. A history where climate was always in a harmonious state of balance until mankind came along and upset that balance.

The natural climate cycle deniers have tried their best to eliminate the Medieval Warm Period and the Little Ice Age from climate data records by constructing the uncritically acclaimed and infamous “hockey stick” of global temperature variations (or non-variations) over the last one- to two-thousand years.

Before being largely discredited by a National Academies review panel, this ‘poster child’ for global warming was heralded as proof of the static nature of the climate system, and that only humans had the power to alter it.

While the panel was careful to point out that the hockey stick might be correct, they said that the only thing science could say for sure is that it has been warmer lately than at any time in the last 400 years. Since most of those 400 years were during the Little Ice Age, I would say this is a good thing. It’s like saying this summer has been warmer than any period since…last fall.

These deniers claim that the Medieval Warm Period was only a regional phenomenon, restricted to Europe. Same for the Little Ice Age. Yet when a killer heat wave occurred in France in 2003, they hypocritically insisted that this event had global significance, caused by anthropogenic ‘global’ warming.

The strong warming that occurred up until 1940 is similarly a thorn in the side of the natural climate cycle deniers, since atmospheric carbon dioxide increases from fossil fuel burning before 1940 were too meager to have caused it. So, the ‘experts’ are now actively working on reducing the magnitude of that event by readjusting some ship measurements of ocean temperatures from that era.

Yet, they would never dream of readjusting the more recent thermometer record, which clearly has localized urban heat island effects that have not yet been removed (e.g., see here and here). As Dick Lindzen of MIT has pointed out, it is highly improbable that every adjustment the climate revisionists ever make to the data should always just happen to be in the direction of agreeing with the climate models.

Of course, global warming has indeed occurred…just as global cooling has occurred before, too. While the global warming ‘alarmists’ claim we ‘skeptics’ have our heads stuck in the sand about the coming climate catastrophe, they don’t realize their heads are stuck in the sand about natural climate variability. Their repeated references to skeptics’ beliefs as “denying global warming” are evidence of either their dishonesty, or their stupidity.

The climate modelers’ predictions of the coming global warming Armageddon are of a theoretical event in the distant future, created by mathematical climate models, and promoted by scientists and politicians who have nothing to lose since it will be decades before they are proved wrong. They profess the utmost confidence in these theoretical predictions, yet close their eyes and ears to the natural rhythms exhibited by nature, both in the living and non-living realms, in the present, and in the previously recorded past.

They readily admit that cycles exist in weather, but can not (or will not) entertain the possibility that cycles might occur in climate, too. Every change the natural cycle deniers see in nature is inevitably traced to some evil deed done by humans. They predictably prognosticate such things as, “If this trend continues, the Earth will be in serious trouble”. To them, the behavior of nature is simple, static, always in balance – if not sacred…in a quasi-scientific sort of way, of course.

They can not conceive of nature changing all by itself, even though evidence of that change is all around us. Like the more activist environmentalists, their romantic view of a peaceful, serene natural world ignores the stark reality that most animals on the Earth are perpetually locked in a life-or-death struggle for existence. The balances that form in nature are not harmonious, but unsteady and contentious stalemates — like the Cold War between the United States and the former Soviet Union.

Meanwhile, humans are doing just what the other animals are doing: modifying and consuming their surroundings in order to thrive. The deniers curiously assert that all other forms of life on the planet have the ‘right’ to do this – except humans.

And when the natural cycle deniers demand changes in energy policy, most of them never imagine that they might personally be inconvenienced by those policies. Like Al Gore, Robert F. Kennedy, Jr., and Leonardo DiCaprio, they scornfully look down upon the rest of humanity for using up the natural resources that they want for themselves.

And the few who freely choose to live such a life then want to deny others the freedom to choose, by either regulating or legislating everyone else’s behavior to conform to their own behavior.

The natural climate cycle deniers’ supposedly impartial science is funded by government research dollars that would mostly dry up if the fears of manmade global warming were to evaporate. With contempt they point at the few million dollars that Exxon-Mobil spent years ago to support a few scientists who maintained a healthy skepticism about the science, while the scientific establishment continues to spend tens of billions of your tax dollars.

So, who has the vested financial interest here?

Even the IPCC in its latest (2007) report admits that most of the warming in the last 50 years might be natural in origin — although they consider it very unlikely, with (in their minds) less than 10% probability. So, where is the 10% of the global warming research budget to study that possibility? It doesn’t exist, because — as a few politicians like to remind us — “the science is settled”.

The natural climate cycle deniers claim to own the moral high ground, because they are saving future generations from the ravages of (theoretical) anthropogenic climate change. A couple of them have called for trials and even executions of scientists who happen to remain skeptical of humanity being guilty of causing climate change.

Yet the energy policies they advocate are killing living, breathing poor people around the world, today. Those who are barely surviving in poverty are being pushed over the edge by rising corn prices (because of ethanol production), and decimated economies from increasing regulation and taxation of carbon based fuels in countries governed by self-righteous elites.

But the tide is turning. As the climate system stubbornly refuses to warm as much as 95% of the climate models say it should be warming, the public is turning skeptical as well. Only time will tell whether our future is one of warming, or of cooling. But if the following average of 18 proxies for global temperatures over the last 2,000 years is any indication, it is unlikely that global temperatures will remain constant for very long.

The above graph shows an average of 18 non-tree ring proxies of temperature from 12 locations around the Northern Hemisphere, published by Craig Loehle in 2007, and later revised in 2008, clearly showing that natural climate variability happens with features that coincide with known events in human history.

As Australian geologist Bob Carter has been emphasizing, we shouldn’t be worrying about manmade climate change. We should instead fear that which we know occurs: natural climate change. Unfortunately, it is the natural climate cycle deniers who are now in control of the money, the advertising, the news reporting, and the politicians.


New Study in Science Magazine: Proof of Positive Cloud Feedback?

July 26th, 2009

(edited 12:15 p.m. 7/26/09 for clarity)
(3:25 p.m. DOH! Hawaii IS part of the U.S…)

I’m getting a lot of e-mails asking about a new study by Clement et al. published last week in Science, which shows that since the 1950s, periods of warmth over the northeastern Pacific Ocean have coincided with less cloud cover. The authors cautiously speculate that this might be evidence of positive cloud feedback.

This would be bad news for the Earth and its inhabitants since sufficiently strong positive cloud feedbacks would have the potential of amplifying the small amount of direct warming from our carbon dioxide emissions to disastrous proportions.

The authors are appropriately cautious about the interpretation of their results, which are indeed interesting. The very fact that the only 2 IPCC climate models that behaved in a manner similar to the observations were the most sensitive AND the least sensitive models shows that interpretation of the study results as proof of positive cloud feedback would be very premature.

But how could such a dichotomy exist? How could what seems to be clear evidence of positive feedback in the observations agree with both the climate model that predicts the MOST global warming for our future, as well as the model that predicts the LEAST warming for our future?

In my view, the interpretation of their results in terms of cloud feedback has the same two problems that previous studies have had. These problems have to do with (1) the regional character of the study, and (2) the issue of causation when analyzing cloud and temperature changes.

Problem #1: Interpretation of Feedbacks from a Regionally-Limited Study
Back in 1991, Ramanathan and Collins published a paper in Nature which indirectly argued for negative cloud feedbacks based upon the observation that regional warming in the deep tropics is accompanied by more convective cloud (thunderstorm) activity, which then shades the ocean from solar heating. This was the basis for what became known as the “Thermostat Hypothesis”, an unfortunate name since there are many potential thermostatic mechanisms in the climate system.

But as subsequently pointed out by other researchers, cloud feedbacks can not be deduced based upon the behavior of only one branch of vertical atmospheric circulation systems — in their case the ascending branches of the tropical and subtropical atmospheric circulation system known as the Hadley and Walker circulations. This is because a change in the ascending branch of these circulations, which occurs over the warmest waters of the deep tropics, is always accompanied by a change in the descending branch, and the two changes usually largely cancel out.

The new Clement et al. study has the same problem, but in their case they studied changes in the strength of one portion of the descending branch of an atmospheric circulation system, generally between Hawaii and Mexico, where there is little precipitation and relatively sunny conditions prevail. So, even if the regional cloud response they measured was indeed feedback in origin (the ‘causation’ issue which I address as Problem #2, below), it must be lumped in with whatever regional changes occurred elsewhere in concert with it before one can meaningfully address cloud feedbacks.

Unfortunately, further complicating feedback diagnosis is the fact that these atmospheric circulation cells are interconnected all around the world. And since cloud feedbacks are, strictly speaking, most meaningfully addressed only when the whole circulation system is included — both ascending and descending branches — we need global measurements. Except for our relatively recent satellite monitoring capabilities, though, we do not have sufficiently accurate cloud measurements over all regions of the Earth to do this over any extended period of time, such as the period back to the 1950s covered by the ship observations used in the Clement et al. study.

But there is a bigger – and less well appreciated — problem in the inference of positive cloud feedback from studies like that of Clement et al.: that of causation.

Problem #2: The Importance of Causation in Determining Cloud Feedbacks
I am now convinced that the causation issue is at the heart of most misunderstandings over feedbacks. It is the issue that I spend most of my research time on, and we have been studying it with output from 18 of the fully coupled climate models tracked by the IPCC, a simple climate model we developed, and with satellite data.

[As an aside, as a result of reviews of our extensive paper on the subject that was submitted to the Journal of Geophysical Research, we are revamping the paper at the request of the journal editor, and working on a resubmission. So, I am hopeful that it will eventually be published in some form in JGR.]

Using the example of the new Clement et al. study (the observation that periods of unusual warmth on the northeast Pacific coincided with periods of less cloud cover)…what if most of that warming was actually caused by the decrease in clouds, rather than the decrease in clouds being caused by the warming (which would be, by definition, positive cloud feedback)? In other words, what if causation is actually working in the opposite direction to feedback?

It turns out that, when a circulation-induced change in clouds causes a temperature change, it can almost totally obscure the signature of feedback — even if the feedback is strongly negative. This is easily demonstrated with a simple forcing-feedback model, which is what one portion of our new JGR paper submission deals with. It was also demonstrated theoretically in our 2008 Journal of Climate paper.

In other words, a cloud change causing a temperature change gives the illusion of positive feedback – even if negative feedback is present.
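For readers who want to see this effect for themselves, here is a minimal sketch in the spirit of the simple forcing-feedback model discussed above: random, non-feedback cloud forcing drives temperature in a system whose true feedback is strongly negative (stabilizing), yet the standard regression of radiative flux against temperature diagnoses a feedback parameter near zero. The parameter values are illustrative choices of mine, not those from our papers:

```python
import numpy as np

rng = np.random.default_rng(42)

lam_true = 3.0      # true feedback parameter, W/m^2/K (strongly stabilizing)
C = 20.0            # ocean mixed-layer heat capacity, W*yr/m^2/K
dt = 1.0 / 12.0     # monthly time step, in years
n = 1200            # 100 years of monthly data

# Red-noise "internal" cloud forcing, by construction NOT a feedback on T
F = np.zeros(n)
for i in range(1, n):
    F[i] = 0.9 * F[i - 1] + rng.normal(0.0, 1.0)

# Integrate the simple model  C * dT/dt = F - lam_true * T
T = np.zeros(n)
for i in range(n - 1):
    T[i + 1] = T[i] + dt * (F[i] - lam_true * T[i]) / C

# "Observed" net radiative flux anomaly, and the usual regression diagnosis
R = F - lam_true * T
lam_diag = -np.polyfit(T, R, 1)[0]

print(f"true feedback:      {lam_true:.2f} W/m^2/K")
print(f"diagnosed feedback: {lam_diag:.2f} W/m^2/K")
# The diagnosed value comes out far smaller than the true one: the
# cloud-caused temperature change masquerades as positive feedback.
```

Because the forcing causes the temperature change, flux and temperature are correlated in a way that biases the regression slope toward zero, no matter how strongly negative the true feedback is.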

In the case of the new Science magazine study, one of the major changes seen in temperature and cloudiness (and other weather parameters) occurred during the Great Climate Shift of 1977, an event which is known to have been accompanied by changes in atmospheric circulation patterns, such as immediate and prolonged warming in Alaska and the Arctic. And since circulation changes can cause cloud changes, this is one example of a situation where one can be fooled regarding the direction of causation.

This issue of the direction of causation is not easy to get around. While I’ve found some researchers who think this is only a matter of semantics, and who claim that all we need to know is how clouds and temperature vary together, our 2008 J. Climate paper demonstrated that this is definitely not the case.

The bottom line is that it is very difficult to infer positive cloud feedback from observations of warming accompanying a decrease in clouds, because a decrease in clouds causing warming will always “look like” positive feedback.

Based upon the few conversations I have had with other researchers in the field on this subject, the issue of causation remains a huge source of confusion among climate experts on the subject of feedbacks. Even though our 2008 Journal of Climate paper on the subject outlined the basic issue, the climate research community has still failed to grasp its significance. Hopefully, the more thorough treatment we provide in our JGR resubmission will help the community better understand the problem — if it ever gets published.


How Do Climate Models Work?

July 13th, 2009

Since fears of manmade global warming — and potential legislation or regulations of carbon dioxide emissions — are based mostly upon the output of climate models, it is important for people to understand the basics of what climate models are, how they work, and what their limitations are.

Climate Models are Computer Programs

Generally speaking, a climate model is a computer program mostly made up of mathematical equations. These equations quantitatively describe how atmospheric temperature, air pressure, winds, water vapor, clouds, and precipitation all respond to solar heating of the Earth’s surface and atmosphere. Also included are equations describing how the so-called “greenhouse” elements of the atmosphere (mostly water vapor, clouds, carbon dioxide, and methane) keep the lower atmosphere warm by providing a radiative ‘blanket’ that partly controls how fast the Earth cools by loss of infrared to outer space.

The equation computations are made at individual gridpoints on a three-dimensional grid covering the Earth (see image below).
climate-model-1

In “coupled” climate models, there are also equations describing the three-dimensional oceanic circulation, how it transports absorbed solar energy around the Earth, and how it exchanges heat and moisture with the atmosphere. Modern coupled climate models also include a land model that describes how vegetation, soil, and snow or ice cover exchange energy and moisture with the atmosphere.
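To make the “computer program made of equations on a grid” idea concrete, here is a drastically simplified toy: a single surface energy balance equation stepped forward in time on a one-dimensional grid of latitude bands. Real models solve many coupled three-dimensional equations on millions of gridpoints; every parameter value here is an illustrative assumption:

```python
import numpy as np

nlat = 18
lat = np.linspace(-85.0, 85.0, nlat)                  # band-center latitudes, deg
coslat2 = np.cos(np.deg2rad(lat)) ** 2
S = 340.0 * (1.0 + 0.5 * (coslat2 - coslat2.mean()))  # crude insolation shape, W/m^2
albedo = 0.30
A, B = 203.3, 2.09    # linearized infrared loss: OLR = A + B*T (T in deg C)
D = 3.8               # W/m^2/K, crude "transport": relax toward the global mean
C = 4.0               # heat capacity, W*yr/m^2/K
dt = 0.05             # time step, years

T = np.zeros(nlat)    # temperature of each latitude band, deg C
for _ in range(2000):                 # step the equation forward ~100 years
    absorbed = (1.0 - albedo) * S     # solar heating at each gridpoint
    olr = A + B * T                   # infrared cooling at each gridpoint
    transport = D * (T.mean() - T)    # heat exchange between bands
    T += dt * (absorbed - olr + transport) / C
```

Even this toy produces the qualitative basics (a warm equator and cold poles) from nothing more than one energy balance equation repeated at every gridpoint.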

You can make computer visualizations of how these processes evolve as the model is run on the computer, such as the nice example shown below produced by Gary Strand at the National Center for Atmospheric Research (NCAR). This particular image shows sea surface temperatures, near-surface winds, and sea ice concentrations in one of the NCAR models at some point during a run of the model on a supercomputer. [Note the model does not have an actual physical shape in the computer…it is just a long series of computations.]
climate-model-2

If you want to see how a climate model simulation evolves over time, a striking YouTube video of the NCAR CCSM climate model is shown here.

The Importance of Energy Balance in Climate Models

Climate models are usually used to study how the Earth’s climate might respond to small changes in either the intensity of sunlight being absorbed by the Earth, or in the case of anthropogenic global warming, the addition of manmade greenhouse gases that further reduce the atmosphere’s ability to cool to outer space.

For it is the balance between these two flows of radiant energy – solar energy in, and infrared energy out of the climate system — that is believed to control the average temperature of the climate system over the long run. If the two radiant energy flows are in balance, then the average temperature of the climate system remains pretty constant. If they are out of balance, the average temperature of the climate system can be expected to change.

If this fundamental concept of “energy balance” sounds foreign to you, it shouldn’t because it is part of your everyday experience. For instance, the temperature of a pot of water warming on the stove will only increase as long as the rate of energy gain from the stove is greater than the rate of energy loss by the pot to its surroundings. Once the pot warms to the point where the rate of energy loss equals the rate of energy gain, its temperature then will remain constant.

Similarly, the temperature of the inside of a car sitting in the sun will increase only until the rate at which sunlight is absorbed by the car equals the rate at which heat is lost by the hot car to its surroundings.

In the same manner, any imbalance in the average flows of energy in and out of the climate system can be expected to cause a temperature change. When averaged over the whole Earth, the rates of absorbed solar energy and infrared energy lost are estimated to be about 235 or 240 Watts per square meter. I say “estimated” because our satellite system for measuring the radiative energy budget of the Earth is still not quite good enough to measure it to this level of absolute accuracy.
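The pot-of-water analogy can even be written down as a one-line energy balance and stepped forward in time; the burner power, loss coefficient, and water volume below are illustrative guesses:

```python
# C * dT/dt = P_in - k * (T - T_env): temperature rises only until the
# loss term k*(T - T_env) grows to match the energy input P_in.
P_in = 1000.0   # W delivered by the burner (illustrative)
k = 20.0        # W/K heat-loss coefficient to the surroundings (illustrative)
T_env = 20.0    # deg C room temperature
C = 4184.0      # J/K, heat capacity of about 1 liter of water
dt = 1.0        # time step, seconds

T = T_env
for _ in range(10000):   # simulate a bit under 3 hours
    T += dt * (P_in - k * (T - T_env)) / C

# At equilibrium, P_in = k * (T - T_env), i.e. T = T_env + P_in/k = 70 deg C
print(round(T, 1))   # → 70.0
```

The pot stops warming not because the burner shuts off, but because the losses have grown to match the input, which is exactly the energy balance idea applied to the whole Earth.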

A variety of adjustable parameters in the model are tuned until the model approximates the average seasonal change in weather patterns around the world, and also absorbs sunlight and emits infrared energy to space at a global-average rate of about 235 or 240 Watts per sq. meter. The modelers tend to assume that if the model does a reasonably good job of mimicking these basic features of the climate system, then the model will be able to predict global warming. This might or might not be a good assumption – no one really knows.

It is also important to understand that even if a climate model handled 95% of the processes in the climate system perfectly, this does not mean the model will be 95% accurate in its predictions. All it takes is one important process to be wrong for the models to be seriously in error. For instance, how the model alters cloud cover with warming can make the difference between anthropogenic global warming being catastrophic, or just lost in the noise of natural climate variability.

Anthropogenic Global Warming in Climate Models

Our addition of carbon dioxide to the atmosphere by burning fossil fuels is estimated to have caused an imbalance of about 1.5 Watts per sq. meter between the 235 to 240 Watts per sq. meter of average absorbed sunlight and emitted infrared radiation. The extra CO2 makes the infrared greenhouse blanket covering the Earth slightly thicker. This energy imbalance is too small to be measured from satellites; it must be computed based upon theory.

So, if the Earth was initially in a state of energy balance, and the rate of sunlight being absorbed by the Earth was exactly 240 Watts per sq. meter, then the rate of infrared loss to outer space would have been reduced from 240 Watts per sq. meter to 238.5 Watts per sq. meter (240 minus 1.5).

This energy imbalance causes warming in the climate model. And since a warmer Earth (just like any warmer object) loses infrared energy faster than a cool object, the modeled climate system will warm up until energy balance is once again restored. At that point, the rate at which infrared energy is lost to space once again equals the rate at which sunlight is absorbed by the Earth, and the temperature will once again remain fairly constant.
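This balance-restoration arithmetic can be sketched with the Stefan-Boltzmann law applied to the Earth’s effective emitting temperature, a standard textbook simplification rather than anything a real climate model does:

```python
SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W/m^2/K^4

absorbed = 240.0                   # W/m^2 of absorbed sunlight
Te = (absorbed / SIGMA) ** 0.25    # effective emitting temperature, K
forcing = 1.5                      # W/m^2 reduction in infrared loss from extra CO2

# Linearizing sigma*T^4 about Te, the warming needed to radiate away the
# extra 1.5 W/m^2 is dTe = forcing / (4 * sigma * Te^3):
dTe = forcing / (4.0 * SIGMA * Te ** 3)

print(round(Te), round(dTe, 2))   # → 255 0.4
```

This simple calculation already lands close to the ~0.5 deg. C temperature-only response discussed in the next section.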

What Determines How Much the Model will Warm?

The largest source of uncertainty in climate modeling is this: will the climate system act to reduce, or enhance, the small amount of CO2 warming?

The climate model (as well as the real climate system) has different ways in which an energy imbalance like that from adding CO2 to the atmosphere can be restored. The simplest response would be for the temperature alone to increase. For instance, it can be calculated theoretically that the ~40% increase in atmospheric CO2 humans are believed to have caused in the last 150 years would only cause about 0.5 deg. C of warming to restore energy balance. This theoretical response is called the “no feedback” case because nothing other than temperature changed.

But a change in temperature can be expected to change other elements of the climate system, like clouds and water vapor. These other, indirect changes are called feedbacks, and they can either amplify the CO2-only warming, or reduce it. As shown in the following figure, all 20+ climate models currently tracked by the United Nations’ Intergovernmental Panel on Climate Change (IPCC) now amplify the warming.
21-ipcc-climate-models

This amplification occurs mostly through an increase in water vapor — Earth’s main greenhouse gas — and through a decrease in low- and middle-altitude clouds, the primary effect of which is to let more sunlight into the system and cause further warming. In other words, the models amplify the CO2-only warming with positive water vapor feedback and positive cloud feedback.
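In the standard linear-feedback picture, the amplified warming follows from the no-feedback warming dT0 and a net feedback factor f via dT = dT0/(1 - f). A quick sketch (the f values are arbitrary illustrations, not taken from any particular model):

```python
def amplified_warming(dT0, f):
    """Equilibrium warming given no-feedback warming dT0 and feedback factor f < 1."""
    assert f < 1.0, "f >= 1 would imply a runaway response"
    return dT0 / (1.0 - f)

dT0 = 0.5   # deg C, roughly the no-feedback response mentioned earlier
for f in (-0.5, 0.0, 0.5, 0.8):
    print(f"f = {f:+.1f}  ->  warming = {amplified_warming(dT0, f):.2f} deg C")
# Negative f damps the warming below 0.5 deg C; positive f amplifies it,
# and the amplification grows rapidly as f approaches 1.
```

The wide spread in model predictions comes down to where each model’s net f falls on this curve.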

But is this the way that the real climate system operates?

Uncertainties in Climate Model Cloud and Water Vapor Processes

The climate model equations are only approximations of the physical processes that occur in the atmosphere. While some of those approximations are highly accurate, some of the most important ones from the standpoint of climate change are unavoidably crude. This is because the real processes they represent are either (1) too complex to include in the model and still have the model run fast on a computer, or (2) still too poorly understood to be accurately modeled with equations.

This is especially true for cloud formation and dissipation, which in turn has a huge impact on how much sunlight is absorbed by the climate system. The amount of cloud cover generated in the model in response to solar heating helps control the Earth’s temperature, so the manner in which clouds change with warming is of huge importance to global warming predictions.

Climate modelers are still struggling to get the models to produce cloud cover amounts and types like those seen in different regions, and during different seasons. The following NASA MODIS image of the western U.S. and eastern Pacific Ocean shows a wide variety of cloud types which are controlled by a variety of different processes.
blue-marble-west-us

The complexity of clouds is intuitively understood by everyone, experts and non-experts alike. It is probably safe to say that all climate modelers recognize that accurately modeling cloud behavior is very difficult, and is something which has not yet been achieved in global climate models.

All of the IPCC climate models reduce low- and middle-altitude cloud cover with warming, a positive feedback. This is the main reason for the differences in warming produced by different climate models (Trenberth and Fasullo, 2009). I predict that this kind of model behavior will eventually be shown to be incorrect. And while the authors were loath to admit it, there is already some evidence showing up in the peer-reviewed scientific literature that this is the case (Spencer et al., 2007; Caldwell and Bretherton, 2009).

I believe that the modelers have mistakenly interpreted decreased cloud cover with warming in the real climate system as positive cloud feedback (warming causing a cloud decrease), when in reality it was actually the decrease in clouds that mostly caused the warming. This is basically an issue of causation: one direction of causation has been ignored when trying to estimate causation in the opposite direction (Spencer and Braswell, 2008).
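This causation problem can be illustrated with a toy mixed-layer energy balance model. The heat capacity, feedback parameter, and red-noise "cloud forcing" below are arbitrary illustrative values; this is a sketch of the general idea, not an attempt to reproduce the Spencer and Braswell (2008) calculations.

```python
import numpy as np

# Toy energy balance: C * dT/dt = F - lam * T, where F is random
# cloud-induced forcing and lam is the TRUE feedback parameter.
# All numbers are illustrative, not tuned to any published result.
rng = np.random.default_rng(0)
C, lam, dt, n = 10.0, 3.0, 0.1, 200_000   # heat capacity, W/m^2/K, time step, samples

F = np.zeros(n)
T = np.zeros(n)
for t in range(1, n):
    F[t] = 0.95 * F[t - 1] + rng.normal()          # red-noise cloud forcing
    T[t] = T[t - 1] + dt / C * (F[t] - lam * T[t - 1])

N = F - lam * T                                     # "observed" TOA net flux anomaly
diagnosed = -np.cov(N, T)[0, 1] / np.var(T)         # feedback from regressing N on T
print(f"true feedback: {lam}, regression-diagnosed: {diagnosed:.2f}")
```

Because the cloud forcing itself is driving the temperature changes, the regression slope gets pulled toward zero: the diagnosed feedback comes out much weaker than the true value of 3 W/m²/K, making the toy system look far more sensitive than it actually is. That is the one-direction-of-causation bias in miniature.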

The fundamental issue of causation in climate modeling isn’t restricted to just clouds. While warming will, on average, cause an increase in low-level water vapor, precipitation systems control the water vapor content of most of the rest of the atmosphere. As shown in the following illustration, while evaporation over most of the Earth’s surface is continuously trying to enhance the Earth’s natural greenhouse effect by adding water vapor, precipitation is continuously reducing the greenhouse effect by converting that water vapor into clouds, then into precipitation.
evaporation-precipitation

But while the physics of evaporation at the Earth’s surface is understood pretty well, the processes controlling the conversion of water vapor into precipitation in clouds are complex and remain rather mysterious. And it is the balance between these two processes — evaporation and precipitation — that determines atmospheric humidity.

Even in some highly complex ‘cloud resolving models’ – computer models that use much more complex computations to actually ‘grow’ clouds in the models – the point at which a cloud starts precipitating in the model is given an ad hoc constant value. I consider this to be a huge source of uncertainty, and one that is not appreciated even by most climate modelers. The modelers tune the models to approximate the average relative humidity of the atmosphere, but we still do not understand from ‘first principles’ why the average humidity has its observed value. We would have to thoroughly understand all of the precipitation processes, which we don’t.

In the end, many of the approximations in climate models will probably end up being not very important for forecasting climate change…but it takes only one critical process to be wrong for model projections of warming to be greatly in error. The IPCC admits that their largest source of uncertainty is low cloud feedback, that is, how low cloud cover will change with warming. And, as just mentioned, I believe how precipitation efficiency might change with temperature is also a wild card in climate model predictions.

Sources of Global Warming: Humans or Nature?

At this point hopefully you understand that climate modelers think global warming is the result of humans ‘upsetting’ the Earth’s radiative energy balance. And I agree with them that adding more carbon dioxide to the atmosphere must have some effect…but how large is this change in comparison to the energy imbalances the climate system imposes upon itself?

It turns out that the modelers have made a critical assumption that leads to their conclusion that the climate system is very sensitive to our greenhouse gas emissions: that the climate system was in a state of energy balance in the first place.

There is a pervasive, non-scientific belief in the Earth sciences that nature is in a fragile state of balance. Whether the subject is ecosystems or the climate system, you will hear or read scientists’ claims about the supposed fragility of nature.

But this is a subjective concept, not a scientific one. Still, it makes its way into the scientific literature (read the abstract to this seminal paper on the first satellite measurements of the Earth’s energy budget…look for “delicately balanced”). Just because nature tends toward a balance does not mean that balance is in any way ‘fragile’. And what does ‘fragile’ even mean when nature itself is always upsetting that balance anyway?

Why is this important to climate modeling? Because if climate researchers ignore naturally-induced climate variability, and instead assume that most climate changes are due to the activity of humans, they will inevitably come to the conclusion that the climate system is fragile: that is, that feedbacks are positive. It’s a little like some ancient tribe of people believing that severe weather events are the result of their moral transgressions.

If the warming observed during the 20th Century was due to human greenhouse gas emissions, then the climate system must be pretty sensitive (positive feedbacks). But if the warming was mostly due to a natural change in cloud cover, then the climate system is more likely to be insensitive (negative feedbacks). And there is no way to know whether natural cloud changes occurred during that time simply because our global cloud observations over the last century are nowhere near accurate enough.

So, climate modelers simply assume that there are no natural long-term changes in clouds, water vapor, etc. But they do not realize that, in the process, they will necessarily conclude that the climate system is very sensitive (that feedbacks are positive). As a result, they program climate models so that they are sensitive enough to produce the warming of the last 50 years from increasing carbon dioxide concentrations. They then point to this as ‘proof’ that the CO2 caused the warming, but this is simply circular reasoning.

Climate modelers have simply assumed that the Earth’s climate system was in a state of energy balance before humans started using fossil fuels. But as is evidenced by the following temperature reconstruction for the last 2,000 years (from Loehle, 2007), continuous changes in temperature necessarily imply continuous changes in the Earth’s energy balance.
2000-years-of-global-temperatures-small

And while changes in solar activity are one possible explanation for these events, it is also possible that there are long-term, internally-generated fluctuations in global energy balance brought about by natural cloud or water vapor fluctuations. For instance, a change in cloud cover will change the amount of sunlight being absorbed by the Earth, thus changing global temperatures. Or, a change in precipitation processes might alter how much of our main greenhouse gas — water vapor — resides in the atmosphere. Changes in either of these will cause global warming or global cooling.

But just like the ancient tribe that did not understand the physical processes at work in nature that cause storms, climate modelers tend to view climate change as something that is largely human in origin – presumably the result of our immoral burning of fossil fuels.

Faith-Based Climate Modeling

There is no question that much expense and effort has gone into the construction and improvement of climate models. But that doesn’t mean those models can necessarily predict climate 20, 50, or 100 years from now. Ultimately, the climate researcher (and so the politician) must take as a matter of faith that today’s computerized climate models contain all of the important processes necessary to predict global warming.

This is why validating the predictions of any theory is so important to the progress of science. The best test of a theory is to see whether the predictions of that theory end up being correct. Unfortunately, we have no good way to rigorously test climate models in the context of the theory that global warming is manmade. While some climate modelers will claim that their models produce the same “fingerprint” of manmade warming as seen in nature, there really is no such fingerprint. This is because warming due to more carbon dioxide is, for all practical purposes, indistinguishable from warming due to, say, a natural increase in atmospheric water vapor.

The modeler will protest, “But what could cause such a natural change in water vapor?” Well, how about just a small change in atmospheric circulation patterns causing a decrease in low cloud cover over the ocean? That would cause the oceans to warm, which would then warm and humidify the global atmosphere (Compo and Sardeshmukh, 2009). Or how about the circulation change causing a change in wind shear across precipitation systems? This would lead to a decrease in precipitation efficiency, leading to more water vapor in the atmosphere, also leading to a natural ‘greenhouse’ warming (Renno et al., 1994).

To reiterate, just because we don’t understand all of the ways in which nature operates doesn’t mean that we humans are responsible for the changes we see in nature.

The natural changes in climate I am talking about can be thought of as ‘chaos’. Even though all meteorologists and climate researchers agree that chaos occurs in weather, climate modelers seem to not entertain the possibility that climate can be chaotic as well (Tsonis et al., 2007). If they did believe that was possible, they would then have to seriously consider the possibility that most of the warming we saw in the 20th Century was natural, not manmade. But the IPCC remains strangely silent on this issue.

The modelers will claim that their models can explain the major changes in global average temperatures over the 20th Century. While there is some truth to that, it is (1) not likely that theirs is a unique explanation, and (2) not an actual prediction, since the answer (the actual temperature measurements) was known beforehand.

If instead the modelers were NOT allowed to see the temperature changes over the 20th Century, and were then asked to produce a ‘hindcast’ of global temperatures, that would have been a valid prediction. But instead, years of considerable trial-and-error work have gone into getting the climate models to reproduce the 20th Century temperature history, which was already known to the modelers. Some of us would call this just as much an exercise in statistical ‘curve-fitting’ as ‘climate model improvement’.

The point is that, while climate models currently offer one possible explanation for climate change (humanity’s greenhouse gas emissions), it is by no means the only possible one. And any modeler who claims they have found the only possible cause of global warming is being either disingenuous, or they have let their faith overpower their ability to reason.

Even the IPCC (2007) admits there is a 10% chance that they are wrong about humans being responsible for most of the warming observed in the last 50 years. That, by itself, shows that anyone who says “the science is settled” doesn’t know what they are talking about.

Conclusion

There is no question that great progress has been made in climate modeling. I consider computer modeling to be an absolutely essential part of climate research. After all, without running numbers through physical equations in a theoretically-based model, you really cannot claim to understand very much about how the climate works.

But given all of the remaining uncertainties, I do not believe we can determine — with any objective level of confidence — whether any of the current model projections of future warming can be believed. Any scientist who claims otherwise either has political or other non-scientific motivations, or they are simply being sloppy.


REFERENCES CITED:

Caldwell, P., and C. S. Bretherton, 2009. Response of a subtropical stratocumulus-capped mixed layer to climate and aerosol changes. Journal of Climate, 22, 20-38.

Compo, G.P., and P. D. Sardeshmukh, 2009. Oceanic influences on recent continental warming, Climate Dynamics, 32, 333-342.

Intergovernmental Panel on Climate Change, 2007. Climate Change 2007: The Physical Science Basis, report, 996 pp., Cambridge University Press, New York City.

Loehle, C., 2007. A 2000-year global temperature reconstruction based on non-tree ring proxies. Energy & Environment, 18, 1049-1058.

Renno, N.O., K.A. Emanuel, and P.H. Stone, 1994. A radiative-convective model with an explicit hydrologic cycle: 1. Formulation and sensitivity to model parameters. Journal of Geophysical Research, 99, 14,429-14,441.

Spencer, R.W., W.D. Braswell, J.R. Christy, and J. Hnilo, 2007. Cloud and radiation budget changes associated with tropical intraseasonal oscillations. Geophysical Research Letters, 34, L15707, doi:10.1029/2007GL029698.

Spencer, R.W., and W.D. Braswell, 2008. Potential biases in cloud feedback diagnosis: A simple model demonstration. Journal of Climate, 21, 5624-5628.

Trenberth, K.E., and J.T. Fasullo, 2009. Global warming due to increasing absorbed solar radiation. Geophysical Research Letters, 36, L07706, doi:10.1029/2009GL037527.

Tsonis, A. A., K. Swanson, and S. Kravtsov, 2007. A new dynamical mechanism for major climate shifts. Geophysical Research Letters, 34, L13705, doi:10.1029/2007GL030288.

June 2009 Global Temperature Anomaly Update: 0.00 deg. C

July 3rd, 2009

(edit at 3:10 pm CDT: changed “tropics” to “Southern Hemisphere”)

YR MON GLOBE   NH   SH   TROPICS
2009   1   0.304   0.443   0.165   -0.036
2009   2   0.347   0.678   0.016   0.051
2009   3   0.206   0.310   0.103   -0.149
2009   4   0.090   0.124   0.056   -0.014
2009   5   0.045   0.046   0.044   -0.166
2009   6   0.001   0.032   -0.030   -0.003

1979-2009 Graph

June 2009 saw another — albeit small — drop in the global average temperature anomaly, from +0.04 deg. C in May to 0.00 deg. C in June, with the coolest anomaly (-0.03 deg. C) in the Southern Hemisphere. The decadal temperature trend for the period December 1978 through June 2009 remains at +0.13 deg. C per decade.
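As a rough check on the cooling noted above, one can fit a least-squares line through the six monthly global anomalies in the table. This is only a short-term fit over half a year and says nothing about the long-term climate trend; it just quantifies how the first half of 2009 evolved.

```python
# Least-squares slope through the Jan-Jun 2009 global anomalies listed above.
# A six-month fit is weather, not climate; it merely summarizes the table.
months = [1, 2, 3, 4, 5, 6]
globe  = [0.304, 0.347, 0.206, 0.090, 0.045, 0.001]   # deg C anomalies

n = len(months)
mx = sum(months) / n
my = sum(globe) / n
slope = sum((x - mx) * (y - my) for x, y in zip(months, globe)) / \
        sum((x - mx) ** 2 for x in months)
print(f"short-term slope: {slope:.3f} deg C per month")   # about -0.07 deg C/month
```

The fitted slope of roughly -0.07 deg C per month simply reflects the steady decline from February's peak to June's 0.00 deg C anomaly.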

NOTE: A reminder for those who are monitoring the daily progress of global-average temperatures here:

(1) Only use channel 5 (“ch05”), which is what we use for the lower troposphere and middle troposphere temperature products.
(2) Compare the current month to the same calendar month from the previous year (which is already plotted for you).
(3) The progress of daily temperatures (the current month versus the same calendar month from one year ago) should only be used as a rough guide for how the current month is shaping up because they come from the AMSU instrument on the NOAA-15 satellite, which has a substantial diurnal drift in the local time of the orbit. Our ‘official’ results presented above, in contrast, are from AMSU on NASA’s Aqua satellite, which carries extra fuel to keep it in a stable orbit. Therefore, there is no diurnal drift adjustment needed in our official product.


Cap and Trade and the Illusion of the New Green Economy

July 1st, 2009

I don’t think Al Gore in his wildest dreams could have imagined how successful the “climate crisis” movement would become. It is probably safe to assume that this success is not so much the result of Gore’s charisma as it is humanity’s spiritual need to be involved in something transcendent – like saving the Earth.

After all, who wouldn’t want to Save the Earth? I certainly would. If I really believed that manmade global warming was a serious threat to life on Earth, I would be actively campaigning to ‘fix’ the problem.

But there are two practical problems with the theory of anthropogenic global warming: (1) global warming is (or at least was) likely to be a mostly natural process; and (2) even if global warming is manmade, it will be immensely difficult to avoid further warming without new energy technologies that do not currently exist.

On the first point, since the scientific evidence against global warming being anthropogenic is what most of the rest of this website is about, I won’t repeat it here. But on the second point…what if the alarmists are correct? What if humanity’s burning of fossil fuels really is causing global warming? What is the best path to follow to fix the problem?

Cap-and-Trade

The most popular solution today is carbon cap-and-trade legislation. The European Union has hands-on experience with cap-and-trade over the last couple of years, and it isn’t pretty. Over there it is called their Emissions Trading Scheme (ETS). Here in the U.S., the House of Representatives last Friday narrowly passed the Waxman-Markey bill. The Senate plans on taking up the bill as early as the fall of 2009.

Under cap-and-trade, the government institutes “caps” on how much carbon dioxide can be emitted, and then allows companies to “trade” carbon credits so that the market rewards those companies that find ways to produce less CO2. If a company ends up having more credits than they need, they can then sell those credits to other companies.

While it’s advertised as a “market-based” approach to pollution reduction, it really isn’t since the market did not freely choose cap-and-trade…it was imposed upon the market by the government. The ‘free market’ aspect of it just helps to reduce the economic damage done as a result of the government regulations.

The Free Market Makes Waxman-Markey Unnecessary

There are several serious problems with cap-and-trade. In the big picture, as Europe has found out, it will damage the economy. This is simply because there are as yet no large-scale, practical, and cost-competitive replacements for fossil fuels. As a result, if you punish fossil fuel use with either taxes or by capping how much energy is allowed to be used, you punish the economy.

Now, if you are under the illusion that cap-and-trade will result in the development of high-tech replacements for fossil fuels, you do not understand basic economics. No matter how badly you might want it, you can not legislate a time-travel machine into existence. Space-based solar power might sound really cool, but the cost of it would be astronomical (no pun intended), and it could only provide the tiniest fraction of our energy needs. Wind power goes away when the wind stops, and is only practical in windy parts of the country. Land-based solar power goes away when the sun sets, and is only practical in the sunny Southwest U.S. While I personally favor nuclear power, it takes forever to license and build a nuclear power plant, and it would take 1,000 1-gigawatt nuclear power plants to meet electricity demand in the United States.

And no one wants any of these facilities near where they live.

Fortunately, cap-and-trade legislation is not necessary anyway because incentives already exist – right now — for anyone to come up with alternative technologies for energy generation and energy efficiency. Taxpayers and consumers already pay for billions of dollars in both government research (through taxes) and private research (through the cost of goods and services) to develop new energy technologies.

Whoever succeeds in these efforts stands to make a lot of money simply because everything we do requires energy. And I do mean everything…even sitting there and thinking. Using your brain requires energy, which requires food, which requires fossil fuels to grow, distribute, refrigerate and cook that food.

Economic Competitiveness in the Global Marketplace

Secondly, when instituted unilaterally by a country, cap-and-trade legislation makes that country less competitive in the global economy. Imports and trade deficits increase as prices at home rise, while companies or whole industries close and move abroad to countries where they can be more competitive.

The Obama administration and Congress are trying to minimize this problem by imposing tariffs on imports, but this then hurts everyone in all of the countries involved. Remember, two countries willingly engage in trade with each other only because it economically benefits both countries by reducing costs, thus raising the standard of living in those countries.

The Green Mafia

Third, cap-and-trade is a system that is just begging for cheating, bribing, and cooking the books. How will a company’s (or a farm’s) greenhouse gas emissions be gauged, and then monitored over time? A massive new bureaucracy will be required, with a litany of rules and procedures which have limited basis in science and previous experience.

And who will decide how many credits will initially be given by the government to each company/farm/industry? Does anyone expect that these decisions will be impartial, without political favoritism shown toward one company over another, or one industry over another? This is one reason why some high-profile corporations are now on the global warming bandwagon. They (or at least a few of their executives) are trying to position themselves more favorably in what they see to be an inevitable energy-rationed economic system.

Big Oil and Big Coal Will Not Pay for Cap-and-Trade

Fourth, it is the consumer – the citizen – who will pay for all of this, either in the form of higher prices, or reduced availability, or reduced economic growth. Companies have no choice but to pass increased costs on to consumers, and decreased profits to investors. You might think that “Big Business” will finally be paying their “fair share”, but Big Business is what provides jobs. No Big Business, no jobs.

The Green Jobs Illusion

Fifth, the allure of “green jobs” might be strong, but the economic benefit of those jobs is an illusion. The claim that many thousands of new green jobs will be created under such a system is probably true. But achieving low unemployment through government mandates does not create wealth – it destroys wealth.

Let me illustrate. We could have full employment with green jobs today if we wanted to. We could pay each other to dig holes in the ground and then fill the holes up again, day after day, month after month. (Of course, we’ll use shovels rather than backhoes to reduce fossil fuel use.) How’s that for a green jobs program?

My point is that it matters a LOT what kinds of jobs are created. Let’s say that today 1,000 jobs are required to create 1 gigawatt of coal-fired electricity. Now, suppose we require that electricity to come from a renewable source instead. If 5,000 jobs are needed to create the same amount of electricity with windmills that 1,000 jobs created with coal, then efficiency and wealth generation will be destroyed.
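The productivity argument in the preceding paragraph reduces to simple division. The job counts below are the hypothetical round numbers from the text, not actual industry figures.

```python
# Electricity output per worker under the hypothetical job counts above.
# 1,000 jobs/GW (coal) and 5,000 jobs/GW (wind) are illustrative numbers
# from the text, not real industry statistics.
GW = 1.0                                   # gigawatt of generating capacity
coal_jobs, wind_jobs = 1_000, 5_000

coal_mw_per_job = GW * 1_000 / coal_jobs   # megawatts per worker
wind_mw_per_job = GW * 1_000 / wind_jobs

print(f"coal: {coal_mw_per_job:.1f} MW/job, wind: {wind_mw_per_job:.1f} MW/job")
print(f"labor productivity ratio: {coal_mw_per_job / wind_mw_per_job:.0f}x")
```

Under these assumed numbers, each coal worker supports five times the electricity output of each wind worker, which is the sense in which mandating the extra jobs destroys rather than creates wealth.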

Sure, you can create as many green jobs as you want, but the comparative productivity of those jobs is what really matters. In the end, when the government manipulates the economy in such a fashion, the economy suffers.

And even if a market for green equipment (solar panels, windmills, etc.) does develop, there is little doubt that countries like China will be able to manufacture that equipment at lower cost than the United States, especially considering all of our laws, regulations, limits, and restrictions.

So, What’s the Alternative?

If anthropogenic global warming does end up being a serious problem, then what can be done to move away from fossil fuels? I would say: Encourage economic growth, and burn fossil fuels like there is no tomorrow! Increased demand will lead to higher prices, and as long as the free market is allowed to work, new energy technologies will be developed.

As long as demand exists for energy (and it always will), there will be people who find ways to meet that demand. There is no need for silly awards for best inventions, etc., because the market value of those inventions will far exceed the value of any gimmicky, government-sponsored competitions.

Why are Politicians so Enamored by Cap-and-Trade?

Given the pain (and public backlash) the EU has experienced from two years’ experience with its Emissions Trading Scheme, why would our politicians ignore that foreign experience, as well as popular sentiment against cap-and-trade here at home, and run full-steam with eyes closed into this regulatory quagmire?

The only answer I can come up with is: more money and more power for government. As a former government employee, I am familiar with the mindset. While the goal of a private sector job is to create wealth, the government employee’s main job is to spend as much of that wealth as possible. A government agency’s foremost goal is self preservation, which means perpetuating a public need for the agency. The idea that our government exists to help enable a better life for its citizens might have been true 100 years ago, but today it is hopelessly naïve.

All Pain, No Gain

And finally, let’s remember what the whole purpose of carbon cap-and-trade is: to reduce future warming of the climate system. Even some prominent environmentalists are against Waxman-Markey because they do not believe it will substantially reduce carbon dioxide emissions here at home. To the extent that provisions are added to the bill to make it more palatable to politicians from agricultural states or industrial states, it then accomplishes even less of what it is intended to accomplish: reductions in carbon dioxide emissions.

And even if cap-and-trade does what is intended, the reduction in CO2 emissions as a fraction of global CO2 emissions will moderate future warming by, at most, around one tenth of a degree C by late in this century. That is probably not even measurable.

Of course, this whole discussion assumes that the climate system is very sensitive to our carbon dioxide emissions. But if the research we are doing is correct, then manmade global warming is being overestimated by about a factor of 5, and it is the climate system itself that causes climate change…not humans.

If that is the case, then nothing humanity does is going to substantially affect climate one way or the other. Indeed, given the fact that life on Earth depends upon the tiny amount of CO2 in the atmosphere, I continue to predict that more atmospheric CO2 will, in the end, be a good thing for life on Earth.

Yet, many politicians are so blinded by the additional political power and tax revenue that will come from a cap-and-trade system that they do not want to hear any good news from the science. For instance, in my most recent congressional testimony, the good news I presented was met with an ad hominem insult from Senator Barbara Boxer.

I can only conclude that some politicians actually want global warming to be a serious threat to humanity. I wonder why?


EPA Endangerment Finding: My Submitted Comments

June 23rd, 2009

1. ISSUE SUMMARY

1.1 Evidence that Climate Models Produce Far Too Much Warming

The issue I will address is appropriately introduced by a quote from a leading cloud expert, Dr. Robert Cess, made over ten years ago:

the [climate models] may be agreeing now simply because they’re all tending to do the same thing wrong. It’s not clear to me that we have clouds right by any stretch of the imagination.
– Dr. Robert Cess, quoted in Science (May 16, 1997, p. 1040)

Nowhere in climate models is there greater potential for error than in the treatment of clouds. This is especially true of low clouds, which cool the climate system, and which the IPCC has admitted are the largest source of uncertainty in global warming projections (IPCC, 2007).

Research published by us since the IPCC 2007 4th Assessment Report (IPCC AR4) suggests that a major problem exists with most, if not all, of the IPCC models’ cloud parameterizations. Cloud parameterizations are greatly simplified methods for creating clouds in climate models. Their simplicity is necessary since the processes controlling clouds are too complex to include in climate models, and yet those same parameterizations are critical to model projections of future global temperatures and climate since clouds determine how much sunlight is allowed into the climate system. Significantly, all 21 IPCC climate models today decrease global average cloud cover in response to any warming influence, such as that from anthropogenic carbon dioxide emissions, thus amplifying the small, direct warming effect of more CO2.

In stark contrast, though, new analyses of our latest and best NASA satellite data suggest that the real climate system behaves in exactly the opposite manner. This error, by itself, could mean that future warming projected by these models has been overstated by anywhere from a factor of 2 to 6.

How could such a serious error be made by so many climate experts? In a nutshell, when previous researchers have looked at how clouds and temperature have varied together in the real climate system, they have assumed that the observed temperature changes caused the observed cloud changes – but not the other way around.

As I will demonstrate, by assuming causation in only one direction they have biased their interpretation of cloud behavior in the direction of positive feedback (that is, high climate sensitivity). The existence of the problem was first published by us on 1 November 2008 in the Journal of Climate (Spencer and Braswell, 2008). It supported our previously published research showing that weather systems in the tropics also behave in the opposite manner from climate models (Spencer et al., 2007), that is, they reduce any warming influence rather than magnify it.

My claim on the direction of causation is more recently supported by evidence that one can distinguish cause from effect under certain conditions, in both satellite observations of the real climate system, and in climate models themselves. The result is that true negative feedback in the climate system has been obscured by the dominating influence of natural cloud changes causing temperature changes, which has produced the illusion of a sensitive climate system.

1.2 Natural Cloud Variations Might Have Caused “Global Warming”

If feedbacks in the climate system are indeed negative rather than positive, not only does this mean anthropogenic global warming might well be lost in the noise of natural climate variability, it also means that CO2 emissions have been insufficient to cause most of the warming seen in the past 50 years or so. Just as researchers have been misled about climate sensitivity by ignoring clouds-causing-temperature change, they have also neglected natural cloud variations as a potential source of climate change itself.

While the IPCC claims they can only explain late 20th Century warming when they include anthropogenic CO2 emissions in the models, this is mostly because sufficiently accurate long-term global cloud observations simply do not exist with which one might look for natural sources of climate change. A persistent change of only 1% or 2% in global-average low cloud cover would be sufficient to cause global warming – or cooling. Our ability to measure long-term cloud changes to this level of precision has existed only since the launch of NASA’s Terra satellite in 2000, and so it is not possible to directly determine whether there are natural cloud changes that might have caused most of the climate variability seen over the last 50 to 100 years.
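The leverage of small cloud changes claimed above can be checked with a back-of-the-envelope calculation. The 342 W/m² average insolation, 0.30 planetary albedo, and the assumption that clouds account for roughly two-thirds of that albedo are standard textbook round numbers, not satellite-derived values.

```python
# Back-of-envelope: how much does a small change in cloud reflection alter
# the global energy budget? All inputs are textbook round numbers.
S_avg  = 342.0   # W/m^2, globally averaged incoming sunlight (solar constant / 4)
albedo = 0.30    # planetary albedo
cloud_share = 2.0 / 3.0   # rough fraction of the albedo attributable to clouds

reflected_by_clouds = S_avg * albedo * cloud_share   # roughly 68 W/m^2
for pct in (0.01, 0.02):
    forcing = reflected_by_clouds * pct
    print(f"{pct:.0%} cloud change -> ~{forcing:.1f} W/m^2 forcing")
```

Under these assumptions a persistent 1% to 2% change in cloud reflection amounts to roughly 0.7 to 1.4 W/m² of forcing, the same order of magnitude as the +1.6 W/m² total anthropogenic forcing quoted in the TSD excerpt at the end of this post.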

But even though such long-term observations do not exist, one could instead study known natural climate indices such as the Pacific Decadal Oscillation (PDO, Mantua et al., 1997) during that recent 10-year period of satellite observations to look for evidence that known natural modes of climate variability modulate global average cloud cover. This kind of research should be required before one even begins to discuss ruling out natural climate variability as a source of climate change. Unfortunately, this type of research has never been performed by anyone.

The failure to sufficiently investigate natural, internal modes of climate variability by the climate research and modeling community is a major failing of the IPCC and the CCSP processes. Under the Federal Information Quality Act, the U.S. science process must be held to a higher standard of objectivity and utility.

The discussion below demonstrates why EPA cannot use either the IPCC or CCSP conclusions as a basis for its Endangerment Finding since both depend on the same flawed models.

2. SPECIFIC ERRORS IN THE EF/TSD

I will be addressing the following endangerment finding (EF) and technical support document (TSD) statements, specifically challenging the portions quoted below.

Endangerment Finding:

EF-18896.2: Most of the observed increase in global average temperatures since the mid-20th century is very likely due to the observed increase in anthropogenic greenhouse gas concentrations. Global observed temperatures over the last century can be reproduced only when model simulations include both natural and anthropogenic forcings, that is, simulations that remove anthropogenic forcings are unable to reproduce observed temperature changes. Thus, most of the warming cannot be explained by natural variability, such as variations in solar activity.

TSD Executive Summary:

Observed Effects Associated with Global Elevated Concentrations of GHGs:

[OE 2] The global average net effect of the increase in atmospheric GHG concentrations, plus other human activities (e.g., land use change and aerosol emissions), on the global energy balance since 1750 has been one of warming. This total net heating effect, referred to as forcing, is estimated to be +1.6 (+0.6 to +2.4) Watts per square meter (W/m2), with much of the range surrounding this estimate due to uncertainties about the cooling and warming effects of aerosols. The combined radiative forcing due to the cumulative (i.e., 1750 to 2005) increase in atmospheric concentrations of CO2, CH4, and N2O is estimated to be +2.30 (+2.07 to +2.53) W/m2. The rate of increase in positive radiative forcing due to these three GHGs during the industrial era is very likely to have been unprecedented in more than 10,000 years.

[OE 4] Most of the observed increase in global average temperatures since the mid-20th century is very likely due to the observed increase in anthropogenic GHG concentrations. Climate model simulations suggest natural forcing alone (e.g., changes in solar irradiance) cannot explain the observed warming.

Projections of Future Climate Change with Continued Increases in Elevated GHG Concentrations:

[PF 2] Future warming over the course of the 21st century, even under scenarios of low emissions growth, is very likely to be greater than observed warming over the past century.

[PF 3] All of the U.S. is very likely to warm during this century, and most areas of the U.S. are expected to warm by more than the global average.

2.1 Comments

What follows is evidence against the familiar IPCC AR4 claim, which also appears in both the EF and TSD:

Most of the observed increase in global average temperatures since the mid-20th century is very likely due to the observed increase in anthropogenic greenhouse gas concentrations.

As we will see, this claim is premature at best. I begin with a discussion of feedbacks (which determine climate sensitivity) because the IPCC’s belief in a sensitive climate system is central to their claim that global warming is mostly anthropogenic, and not natural.

2.1.1 The Importance of Climate Sensitivity to Demonstrating Causation in Global Warming

The central issue of causation in global warming is closely related to ‘climate sensitivity’, which can be defined as the amount of warming the Earth experiences in response to a radiative forcing (global average imbalance in sunlight gained versus thermally emitted infrared radiation lost).

If climate sensitivity is relatively high, as the IPCC claims, then I agree that anthropogenic greenhouse gas emissions might well be the main reason for most of the warming experienced in the last 50 years. This is because the small amount of radiative forcing from anthropogenic greenhouse gases would be sufficient to explain past warming if that small warming is amplified by positive feedback, which is what produces high climate sensitivity.

But if climate sensitivity is low, then anthropogenic emissions would be too weak to cause substantial warming. Some stronger, natural mechanism would need to be mostly responsible for the warming we have experienced. Low climate sensitivity would additionally mean that any source of anthropogenic forcings (greenhouse gases, aerosols, etc.) would not substantially affect climate, and therefore a reduction in emissions would have little effect on climate.

This is partly why the IPCC is not motivated to find natural sources of climate change. If the climate system is quite sensitive, then the extra carbon dioxide in the atmosphere alone is sufficient to explain past warming.

So, what determines feedbacks, and thus climate sensitivity? Simply put, climate sensitivity depends upon whether clouds (and other elements of the climate system) respond to the small amount of warming caused by the 1+ W/m2 radiative forcing from anthropogenic greenhouse gases by either amplifying it through ‘positive feedbacks’, or by reducing it through ‘negative feedbacks’. Depending upon climate sensitivity, the long-term warming from increasing atmospheric greenhouse gas concentrations could theoretically be anywhere from unmeasurable to catastrophic. Climate sensitivity thus becomes the main determinant of the level of future anthropogenic global warming, and how it then compares in magnitude to natural sources of climate variability.

The IPCC has admitted that feedbacks from low clouds (which have a large impact on how much sunlight reaches the Earth’s surface) are the most uncertain of all the feedbacks that determine climate sensitivity, and therefore constitute the largest source of uncertainty in projections of future global warming (IPCC, 2007). Indeed, Trenberth & Fasullo (2009) found that most of the differences between the IPCC climate models’ warming projections can be traced to how they change low and middle cloud cover with warming. Recent work by Caldwell and Bretherton (2009) with a sophisticated cloud resolving model has produced results supporting negative, not positive, low cloud feedback.

I now believe that the low cloud feedback issue is even more serious than the IPCC has admitted. Next we will examine why I believe there has been so much uncertainty over cloud feedback in the climate system.

2.1.2 Why Previous Observational Estimates of Climate Sensitivity have been Inconclusive

Previous estimates of climate sensitivity from observational data have led to confusing and inconclusive results (Knutti and Hegerl, 2008). The most recently published satellite estimates of climate sensitivity (Forster and Gregory, 2006, hereafter FG06) and IPCC AR4 climate model sensitivities (Forster and Taylor, 2006, hereafter FT06) led FT06 to conclude that the satellite-based results could not be trusted. At face value, their best estimate of sensitivity from satellite observations was less sensitive than all of the IPCC models, but they concluded that the uncertainty in the satellite estimate was so large that it was of little value anyway.

But we have determined that the reason researchers have been unable to pin down a reasonable estimate of climate sensitivity is that cause and effect have not been accounted for when measuring the co-variations between cloud cover (or radiative flux) and temperature. We have evidence that the true signature of feedback in the data has been mostly obscured by natural cloud variations forcing temperature variations.

The problem can be illustrated with the following example: If global average cloudiness is observed to decrease in warmer years, this would normally be interpreted as warming causing a cloud decrease, which would be positive feedback, which would mean higher climate sensitivity. But what if the warming was the result of the decrease in cloud coverage, rather than the cause of it?

As demonstrated by Spencer and Braswell (2008) with a simple model of global-average climate, the presence of something as simple as daily random variations in clouds will cause feedbacks diagnosed from measurements of temperature and cloudiness (actually, the radiative imbalance of the Earth) to be biased in the direction of positive feedback (high climate sensitivity). That paper was reviewed by two IPCC experts, Piers Forster and Isaac Held, both of whom agreed that our paper raised a valid and potentially important issue.
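The bias mechanism can be reproduced with a toy forcing-feedback model of the kind just described. To be clear, this is a hypothetical sketch and not the actual Spencer and Braswell (2008) model: the mixed-layer heat capacity, the daily noise amplitudes, and the assumed 'true' feedback of 6 Watts per sq. meter per deg. C are all illustrative values.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- illustrative (assumed) parameters, not those of the published model ---
LAM_TRUE = 6.0        # true feedback parameter, W/m^2 per deg C (negative feedback)
CP = 2.1e8            # heat capacity of a ~50 m ocean mixed layer, J/m^2/K
DT = 86400.0          # one-day time step, seconds
N_DAYS = 365 * 100    # long run so the regression is well sampled
WINDOW = 91           # ~3-month averaging, as in the satellite analysis

sigma_cloud = 8.0     # daily random cloud (radiative) forcing, W/m^2
sigma_other = 4.0     # daily non-radiative (e.g. evaporative) forcing, W/m^2

T = 0.0
temps, fluxes = [], []
for _ in range(N_DAYS):
    n = rng.normal(0.0, sigma_cloud)   # cloud-induced radiative forcing
    s = rng.normal(0.0, sigma_other)   # non-radiative heating noise
    # Measured TOA radiative loss: feedback response minus cloud forcing.
    fluxes.append(LAM_TRUE * T - n)
    temps.append(T)
    T += (n + s - LAM_TRUE * T) * DT / CP

# Diagnose "feedback" the traditional way: regress 3-month-mean flux
# against 3-month-mean temperature.
nwin = N_DAYS // WINDOW
t_avg = np.asarray(temps)[: nwin * WINDOW].reshape(nwin, WINDOW).mean(axis=1)
r_avg = np.asarray(fluxes)[: nwin * WINDOW].reshape(nwin, WINDOW).mean(axis=1)
diagnosed = np.polyfit(t_avg, r_avg, 1)[0]

print(f"true feedback: {LAM_TRUE:.1f}; regression-diagnosed: {diagnosed:.2f} W/m^2 per deg C")
```

With these illustrative numbers the diagnosed slope falls well short of the true value of 6, showing how clouds-causing-temperature-change masquerades as positive feedback in the regression.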

What follows is quantitative evidence that suggests why this mix-up between cause and effect causes the illusion of high climate sensitivity. (The paper describing this evidence is under revision to be resubmitted to Journal of Geophysical Research after we address questions raised by the three reviewers of the paper). The causation issue can be illustrated with the following graph of 7.5 years of recent satellite-measured global ocean average variations in tropospheric temperature versus top-of-atmosphere total radiative flux (‘LW’ is emitted infrared, ‘SW’ is reflected sunlight). Both of these datasets are publicly available.

Fig. 1. Global oceanic 3-month averages of net radiative flux variations (reflected sunlight + thermally emitted infrared) from the CERES radiation budget instrument flying on NASA’s Terra satellite, plotted against corresponding tropospheric temperature variations estimated from channel 5 of the Advanced Microwave Sounding Unit (AMSU) flying on the NOAA-15 satellite.

The slope of the dashed line, fit to the data with statistical ‘regression’, would traditionally be assumed to provide an estimate of feedback, and therefore of climate sensitivity. Regression line slopes are what FG06 used to diagnose feedbacks in satellite data, and what FT06 used to diagnose feedbacks in climate models. The greater the slope of the regression line, the less sensitive the climate system; the shallower the slope of the line, the greater the climate sensitivity. Since a line slope of 3.3 Watts per sq. meter per degree C represents the border between positive and negative feedback, the slope of the line in Fig. 1 (1.9 Watts per sq. meter per deg. C) would, at face value, correspond to moderate positive feedback.
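The translation from regression slope to climate sensitivity can be made concrete with a little arithmetic. The sketch below is illustrative only: the 3.7 Watts per sq. meter forcing for a doubling of CO2 is a standard assumed value, not a number taken from the satellite analysis.

```python
# Translate a diagnosed feedback parameter (the regression slope, in
# W/m^2 per deg C) into equilibrium warming for doubled CO2: dT2x = F2x / slope.
F2X = 3.7  # W/m^2, standard (assumed) radiative forcing for doubled CO2

def warming_per_doubling(feedback_param):
    """Equilibrium warming (deg C) implied by a given feedback parameter."""
    return F2X / feedback_param

# 1.9 is the slope of the dashed line in Fig. 1; 3.3 marks the border
# between positive and negative feedback; 6.0 is the slope of the linear
# striations discussed later in these comments.
for slope in (1.9, 3.3, 6.0):
    print(f"slope {slope:.1f} W/m^2/degC -> {warming_per_doubling(slope):.2f} deg C per doubling")
```

A steeper slope means less warming per doubling; the 6.0 slope yields the roughly 0.6 deg. C figure quoted in Section 2.1.3.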

But note the huge amount of scatter in the data. This is an example of why researchers have not trusted previous satellite estimates of feedback. If the data points happened to cluster nicely along a line, then we would have more confidence in the diagnosed feedback. But instead we see data scattered all over. This scatter translates directly into uncertainty in the slope of the line, which means uncertainty in feedback, which means uncertainty in climate sensitivity.

This is the point at which other researchers have stopped in their analysis of the satellite data, concluding that satellite estimates of feedbacks are too uncertain to be of much use. But we have determined that researchers have not dug deeply enough in their data analysis. It turns out that the scatter in the data is mostly the result of cloud variations causing temperature variations, which is causation in the direction opposite to that of feedback. If the data in Fig. 1 are plotted as running averages, rather than independent averages, and the successive data points in time are connected by lines, certain patterns begin to emerge, as is shown in Fig. 2.

Fig. 2. As in Fig 1, but now 3-month averages are computed every day and connected by lines, revealing the time history of how the climate system evolves.

This method of plotting is called phase space analysis, and it can “easily elucidate qualities of (a) system that might not be obvious otherwise” (Wikipedia entry on “Phase Space”). What we now see, instead of a seemingly random scatter of points, is a series of linear striations and looping or spiraling patterns.
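The data preparation behind Fig. 2 (3-month averages computed every day, i.e. overlapping running means, then connected point-to-point) is straightforward to reproduce. The sketch below uses made-up synthetic series as stand-ins for the real AMSU temperature and CERES flux data:

```python
import numpy as np

def running_mean(series, window):
    """Overlapping running mean: e.g. 91-day (~3-month) averages computed every day."""
    kernel = np.ones(window) / window
    return np.convolve(series, kernel, mode="valid")

# Synthetic stand-ins for daily tropospheric temperature and TOA net flux.
rng = np.random.default_rng(1)
days = np.arange(730)
temp = np.sin(2 * np.pi * days / 365.25) + 0.1 * rng.normal(size=days.size)
flux = 1.9 * temp + 0.5 * rng.normal(size=days.size)

t_run = running_mean(temp, 91)
f_run = running_mean(flux, 91)

# The phase-space plot is then simply flux against temperature with
# successive days connected by line segments (e.g. plt.plot(t_run, f_run)).
print(len(t_run), len(f_run))
```

Each smoothed series loses window-minus-one points off the ends; everything else about the plot is just connecting the daily points in time order.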

A very simple forcing-feedback model widely used in climate studies (e.g. Spencer and Braswell, 2008) can be used to show that the linear features are temperature changes causing cloud-induced radiative changes (that is, feedback); while the looping features are from causation in the opposite direction: cloud variations causing temperature variations.

Significantly, it is the natural cloud variations causing temperature variations that de-correlates the data, leading to a regression line slope biased in the direction of high climate sensitivity (positive feedback) like that seen in Fig. 1.

We also find these spiral and linear features in the climate models tracked by the IPCC, for instance in the GFDL CM2.1 model shown in Fig. 3.

Fig. 3. As in Fig. 2, but for yearly global averages plotted every month from the GFDL CM2.1 climate model. The dashed line is a regression fit to the data; the slope of the solid line represents the model’s long-term feedback in response to anthropogenic radiative forcing from greenhouse gases as diagnosed by FT06.

It is important to note that the linear striations in Fig. 3 are approximately parallel to the ‘true’ long-term feedback as diagnosed for this model by FT06, which is indicated by the solid line. This means that the slope of the short-term linear striations is indeed an indication of the long-term feedback in the model, and therefore of the climate sensitivity of that model. We find obvious striations (‘feedback stripes’) in five of the IPCC climate models, and in all five cases their average slope is very close to the long-term feedback in those models as diagnosed by FT06.

While the above analysis might seem a little technical, it is merely a way to quantitatively demonstrate how a mix-up between cause and effect between clouds and temperature can lead to the illusion of a sensitive climate system.

2.1.3 Feedbacks Revealed In Satellite Data

Using this new insight, if we now return to the linear striations seen in the satellite data plotted in Fig. 2, we find their slope to be about 6 Watts per sq. meter per degree C, which would correspond to strongly negative feedback. This is about the same feedback value that Spencer et al. (2007) found for a composite of tropical weather systems over a multi-year period. If this is the feedback operating in the real climate system on the long time scales involved with manmade global warming, then the amount of warming from a doubling of carbon dioxide would only be about 0.6 deg. C, which is at least a factor of 4 less than the IPCC’s best estimate for the future. This casts serious doubt upon the projections of future climate change mentioned in the EF and TSD:

[PF 2] Future warming over the course of the 21st century, even under scenarios of low emissions growth, is very likely to be greater than observed warming over the past century.

Similarly, all projections of substantial future regional climate change, such as:

[PF 3] All of the U.S. is very likely to warm during this century, and most areas of the U.S. are expected to warm by more than the global average.

also depend upon high climate sensitivity, and so are similarly called into question.

As far as I know, we are the only research group performing this kind of research. I believe that much greater exploitation of our satellite data resources is required to understand what the climate system is trying to tell us about climate sensitivity before we can place any level of confidence in the climate model projections relied upon by the IPCC, and thus by the EPA. The above evidence suggests that previous tests of climate models with observational data have not been sufficiently detailed to validate the feedbacks (climate sensitivity) in those models. At a minimum, the models need to be adjusted to mimic the behavior seen in the real climate system, such as that described above, with methods (e.g. phase space analysis) that can reveal the separate signatures of temperature-forcing-clouds (feedback) and clouds-forcing-temperature (internal radiative forcing).

This is an important point, and it is worth repeating: none of the previous comparisons of climate model output to satellite data have been sufficient to test the climate sensitivity of those models. Unless one accounts in some way for the direction of causation when comparing temperature variations to cloud (or radiative flux) variations, any observational estimates of feedback are likely to be spuriously biased in the direction of high climate sensitivity.

2.1.4 The Potential Role of Natural Cloud Variations in Causing Climate Change

The existence of negative feedback in the climate system would have two important consequences: (1) future anthropogenic climate change can be expected to be small, possibly even unmeasurable in the face of natural climate variability; and (2) increasing greenhouse gas concentrations are insufficient to have caused past warming of the climate system. One or more natural sources of climate change would need to be involved.

And this is where natural cloud variations once again enter the picture. While the IPCC only mentions “external” forcings (radiative imbalances) on the climate system due to volcanoes, anthropogenic pollution, and output from the sun, it is also possible for “internal” forcings such as natural cloud changes to cause climate change. This could come about simply through small natural changes in the general circulation of the ocean and atmosphere, for instance from the Pacific Decadal Oscillation (Mantua et al., 1997), or other known (or even unknown) modes of natural climate variability. Spencer and Braswell (2008) showed that even daily random cloud variations over the ocean can lead to substantial decadal time scale variability in ocean temperatures.

It is critical to understand that the IPCC assumes, either explicitly or implicitly, that such natural cloud variations do not occur on the time scales involved in climate change. Quoting EF-18896.2,

Global observed temperatures over the last century can be reproduced only when model simulations include both natural and anthropogenic forcings, that is, simulations that remove anthropogenic forcings are unable to reproduce observed temperature changes. Thus, most of the warming cannot be explained by natural variability, such as variations in solar activity.

This statement misleadingly implies that the IPCC knows all of the important natural sources of climate change – but it does not. The IPCC specifically ignores any potential internally-generated sources of climate change…what the public would simply call “natural cycles” in climate. Though they do not mention it, the IPCC can make this assumption only because we do not have global measurements over a long enough period of time to detect small changes in global average cloud cover. Fluctuations of only 1% or 2% would be sufficient to cause the kinds of variability in global temperatures exhibited by the following plot (Fig. 4) of global temperature proxy measurements over the last 2,000 years (Loehle, 2007).

Fig. 4. Non-treering proxy reconstruction of global temperature variations over the past 2,000 years (Loehle, 2007).

In fact, if this reconstruction of past temperature variations is anywhere close to being realistic it suggests there is no such thing as “average climate”. As can be seen, substantial temperature changes on 50 to 100 year time scales such as what was observed over the 20th Century have been the rule, not the exception. Periods of steady temperatures are actually quite unusual.

The cause of such natural changes is still unknown to science. Some will argue that sunspot activity has modulated global cloud amounts, which is one possibility for external forcing. But another possibility, as alluded to above, is that the climate system causes its own climate change. For instance, we know that “chaos” happens in weather (e.g. Lorenz, 1963), the result of complex nonlinear interactions within, and between, weather systems. But given the much longer (e.g. decadal to centennial) time scales involved in the ocean circulation, there is no reason why chaotic variations cannot also occur in climate (e.g. Tsonis et al., 2007).

Therefore, the claim by the IPCC that warming can only be produced by climate models when anthropogenic greenhouse gases are included misleadingly implies that climate does not change naturally. But this is merely an assumption enabled by a lack of data to demonstrate otherwise – not an inference from analysis of existing data.

And again, the easiest way for such changes to occur would be for small changes in atmospheric and oceanic circulation systems to cause small changes in global average cloud cover. For instance, if a small decrease in cloud cover occurs over the ocean, then the ocean will respond by warming. This will, in turn, cause warming and humidifying of oceanic air masses. These warmer and moister air masses then flow across the continents, where even greater warming can result from the natural greenhouse effect of the additional water vapor contained in the air masses. This was recently demonstrated by Compo and Sardeshmukh (2009), and it shows that “global warming” can indeed be generated naturally. Because of this, there probably is no reliable ‘fingerprint’ of anthropogenic climate change.

Therefore, the TSD statement that “The rate of increase in positive radiative forcing due to these three GHGs during the industrial era is very likely to have been unprecedented in more than 10,000 years”, is nothing more than a statement of faith, and completely ignores the possibility that there have been much larger natural changes in greenhouse gases in the past — specifically, in water vapor…the Earth’s main greenhouse gas.

Again, past changes in atmospheric temperature and water vapor – and thus in the Earth’s greenhouse effect – can be caused by natural changes in oceanic cloud cover modulating the amount of sunlight that is absorbed by the ocean. This possibility is ignored by the IPCC, who simply assume that the climate system was in a perpetual state of radiative energy balance until humans came along and upset that balance.

3. FINAL REMARKS

As we have seen, there is considerable evidence – both theoretical and observational — to distrust the IPCC’s claim that global warming is mostly anthropogenic in origin. The work described above — some published in the peer reviewed scientific literature by us and by others, some in the process of being published — strongly suggests the IPCC AR4 climate models produce too much global warming, possibly by a wide margin.

As discussed above, the most important reason why climate models are too sensitive is, in my view, the neglect of the effect that natural cloud changes in the climate system have on (1) climate sensitivity estimates, and on (2) past climate change, such as the global-average warming which occurred in the 20th Century.

The failure to investigate all natural processes by the climate research and modeling communities is a major failing of the IPCC and the CCSP processes. Under the Federal Information Quality Act, the U.S. science process must be held to higher standards of objectivity and utility. Clearly, there is as yet little scientific basis for making an “Endangerment Finding” related to carbon dioxide.


REFERENCES CITED

Caldwell, P., and C. S. Bretherton, 2009. Response of a subtropical stratocumulus-capped mixed layer to climate and aerosol changes. Journal of Climate, 22, 20-38.

Compo, G.P., and P. D. Sardeshmukh, 2009. Oceanic influences on recent continental warming, Climate Dynamics, 32, 333-342.

Forster, P. M., and J. M. Gregory, 2006. The climate sensitivity and its components diagnosed from Earth Radiation Budget data, J. Climate, 19, 39-52.

Forster, P.M., and K.E. Taylor, 2006. Climate forcings and climate sensitivities diagnosed from coupled climate model integrations, J. Climate, 19, 6181-6194.

Intergovernmental Panel on Climate Change, 2007. Climate Change 2007: The Physical Science Basis, report, 996 pp., Cambridge University Press, New York.

Knutti, R., and G. C. Hegerl, 2008. The equilibrium sensitivity of the Earth’s temperature to radiation changes, Nature Geoscience, 1, 735-743.

Loehle, C., 2007. A 2,000-year global temperature reconstruction based on non-treering proxy data. Energy & Environment, 18, 1049-1058.

Lorenz, E.N., 1963: Deterministic non-periodic flow, Journal of the Atmospheric Sciences, 20, 130–141.

Mantua, N. J., S.R. Hare, Y. Zhang, J.M. Wallace, and R.C. Francis, 1997: A Pacific interdecadal climate oscillation with impacts on salmon production. Bulletin of the American Meteorological Society, 78, 1069-1079.

Spencer, R.W., W. D. Braswell, J. R. Christy, and J. Hnilo, 2007. Cloud and radiation budget changes associated with tropical intraseasonal oscillations, Geophys. Res. Lett., 34, L15707 doi:10.1029/2007GL029698.

Spencer, R.W., and W.D. Braswell, 2008. Potential biases in cloud feedback diagnosis: A simple model demonstration, J. Climate, 21, 5624-5628.

Trenberth, K.E., and J.T. Fasullo, 2009. Global warming due to increasing absorbed solar radiation. Geophysical Research Letters, 36, L07706, doi:10.1029/2009GL037527.

Tsonis, A. A., K. Swanson, and S. Kravtsov, 2007. A new dynamical mechanism for major climate shifts. Geophysical Research Letters, 34, L13705, doi:10.1029/2007GL030288.

Ice Ages or 20th Century Warming, It All Comes Down to Causation

June 17th, 2009

Musings on the Vostok Ice Core Record

(edited 11 July 2009 for clarity in discussion of lead-lag relationship between Vostok temperature and CO2)

Since I get asked so often what I think of the Vostok ice core data that James Hansen uses as ‘proof’ that CO2 drives temperature, I decided to spend a few days analyzing those data, as well as the Milankovitch forcings calculated by Huybers and Denton (2008) which (allegedly) help drive the global temperature variations seen in the Vostok data.

The following graph shows most of the 400,000 year Vostok record of retrieved temperature and CO2 content. Assuming that these estimates really are what they are claimed to be, what can they tell us — if anything — about the role of carbon dioxide in climate change?

[Graph: the 400,000-year Vostok record of reconstructed temperature and CO2]

First off, you need to realize there has been a long history of debate in the climate community about whether the three Milankovitch cycles in solar forcing are the driving force for the Ice Ages. As far as I can discern, there have been at least three main problems with the Milankovitch theory.

First, the huge Vostok cycles at about 100,000 years remain unexplained, because the Milankovitch cycles show no amplification for those events – just a regular series of small forcings which tend to be correlated with the smaller bumps on the curves of Vostok temperature and CO2 in the above graph.

A second problem has been that the positive correlation with Vostok (at the South Pole) came from NORTHERN Hemispheric forcing, not from the Southern Hemisphere. In the south, the Milankovitch forcings were out of phase with the temperature response in the Vostok ice core record. This has presented the problem of how Northern Hemispheric changes in solar forcing could cause such huge changes in Antarctic climate.

The third problem is that the Milankovitch forcings are so small it was difficult to see how they could cause even the smaller temperature changes in the ice core record – UNLESS climate sensitivity is very high (that is, feedbacks are strongly positive). This appears to be James Hansen’s view: the small Milankovitch forcings caused small increases in temperature, which in turn caused increases in carbon dioxide, which then took over as the forcing mechanism.

The recent paper by Huybers and Denton claims to have alleviated the 2nd and 3rd problems. If one assumes it is the length of summer in the Southern Hemisphere (rather than average yearly solar intensity) which is the main driving force for changing the Antarctic ice sheet, and also accounts for the fact that it takes many thousands of years for the ice sheet (and thus the temperature) to change in response to that summertime forcing, then the Southern Hemisphere Milankovitch cycles really do line up pretty well in time with the smaller Vostok events (but still not the huge 100,000 year events), and the “forcing” involved also becomes larger.

The Role of CO2 in the Vostok Record

So, where does CO2 fit into all of this? Well, at face value Hansen’s theory does require that temperature drives CO2…at least to begin with. Hansen claims the Milankovitch cycle in Earth-sun geometry caused a small temperature change, which then led to a CO2 change, and then the CO2 took over as the forcing. Or, maybe you could say the CO2 provided a strong positive feedback mechanism on temperature.

But it has often been noted that the Vostok CO2 record lags the temperature record by an average of 800 years, which is somewhat of a problem for Hansen’s theory. After all, if CO2 “took over” as the main driver of temperature, why do the CO2 changes tend to come after the temperature changes? But then there have also been uncertainties in dating the CO2 record due to assumptions that have to be made about how far and how fast the CO2 migrates through the ice core, giving the appearance of a different age for the CO2. So the lead-lag relationship still has some uncertainty associated with it.

But even if the CO2 and the temperature record lined up perfectly, would that mean Hansen is correct? No. It all hinges on the assumption that there was no forcing other than CO2 that caused the temperature changes. Hansen’s theory still requires that the temperature variations cause the CO2 changes to begin with, which raises the question: What started the temperature change? And what if the mechanism that started the temperature change is actually responsible for most of the temperature change? In that case, the climate sensitivity inferred from the co-variations between temperature and CO2 becomes smaller. The fact that even the recent work of Huybers and Denton does not address the major question of what caused the huge 100,000 year cycle in the Vostok data suggests there might be a forcing mechanism that we still don’t know about.

If Hansen is correct, and CO2 is the main forcing in the Vostok record, then it takes only about 10 ppm increase in CO2 to cause 1 degree C temperature change. The full range of CO2 forcing in the Vostok record amounts to 1.6 to 2 Watts per sq. meter, and if that caused the full range of temperature variations, then today we still have as much as 10 deg. C of warming “in the pipeline” from the CO2 we have put in the atmosphere from fossil fuel burning. That’s because 1.6 Watts per sq. meter is about the same amount of manmade forcing that supposedly exists in today’s atmosphere.
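The arithmetic in this paragraph can be checked in a few lines. This sketch simply re-runs the numbers quoted above (1.8 Watts per sq. meter is taken as the midpoint of the quoted 1.6 to 2 range):

```python
# If CO2 were the main forcing in the Vostok record, the implied climate
# sensitivity and the warming still "in the pipeline" today follow directly.
vostok_temp_range = 10.0      # deg C, full glacial-interglacial range
vostok_co2_forcing = 1.8      # W/m^2, midpoint of the 1.6-2 range quoted above

implied_sensitivity = vostok_temp_range / vostok_co2_forcing   # deg C per W/m^2

todays_manmade_forcing = 1.6  # W/m^2, roughly today's total manmade forcing
pipeline_warming = implied_sensitivity * todays_manmade_forcing

print(f"implied sensitivity: {implied_sensitivity:.1f} deg C per W/m^2")
print(f"equilibrium warming from today's forcing: {pipeline_warming:.1f} deg C")
```

This reproduces the "as much as 10 deg. C in the pipeline" figure; the exact number depends on which end of the 1.6 to 2 Watts per sq. meter range is used.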

But if some other forcing is responsible for the temperature change, that necessarily implies lower climate sensitivity. And the greater the unknown forcing involved, the smaller the climate sensitivity you will calculate.

The Central Question of Causation

I believe that the interpretation of the Vostok ice core record of temperature and CO2 variations has the same problem as the interpretation of warming and CO2 increase in the last century: CAUSATION. In both cases, Hansen’s (and others’) inference of high climate sensitivity (which would translate into lots of future manmade warming) depends critically on there not being another mechanism causing most of the temperature variations. If most of the warming in the last 100 years was due to CO2, then that (arguably) implies a moderately sensitive climate. If CO2 caused the temperature variations in the ice core record, it implies a catastrophically sensitive climate.

But the implicit assumption that science knows what the forcings were of past climate change even 50 years ago, let alone 100,000 years ago, strikes me as hubris. In contrast to the “consensus view” of the IPCC that only “external” forcing events like volcanoes, changes in solar output, and human pollution can cause climate change, forcing of temperature change can also be generated internally. I believe this largely explains what we have seen for climate variability on all time scales. A change in atmospheric and oceanic circulation patterns could easily accomplish this with a small change in low cloud cover over the ocean. In simple terms, global warming might well be mostly the result of a natural cycle.

The IPCC simply assumes that this internally-generated forcing of climate never occurs. If they acknowledged that it does, they would have to admit they have no clue how much of the warming in the last 50 years is natural versus anthropogenic.

Fears of substantial manmade warming are critically dependent upon the assumptions that the climate system never changes all by itself, naturally, and that there were no other external forcing mechanisms at work that we are unaware of. Given the complex nonlinear behavior of the climate system, it seems to me that the assumption that things like global average low cloud cover always stay the same is unwarranted. In the end, in scientific research it's the scientific assumptions that come back to bite you.

Where Does this Leave Carbon Dioxide?

About the only thing that seems like a safe inference from the Vostok record is that temperature drives CO2 variations (even Hansen's theory requires that much), but the view that CO2 then caused most of the temperature variations seems exceedingly speculative.

And, if you are thinking of using the Vostok record to support the idea that warming is driving today's CO2 increase, you can forget it. In the Vostok record, temperature-driven CO2 change amounts to only about 8 to 10 ppm per deg. C, whereas our ~1 deg. C of warming in the last 100 years has been accompanied by about 10 times that much CO2 increase.
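The mismatch can be made explicit with a quick calculation. The inputs are the approximate values quoted above (~9 ppm per deg. C from the ice cores, ~1 deg. C of modern warming, a modern CO2 rise of roughly 90 ppm), chosen for illustration:

```python
# Sketch of the comparison above; all inputs are the rough
# round numbers quoted in the text, not precise measurements.

vostok_ppm_per_deg = 9.0    # ~8-10 ppm of CO2 per deg C in the ice core record
modern_warming_c = 1.0      # ~1 deg C of warming over the last 100 years
modern_co2_rise_ppm = 90.0  # approx. observed modern CO2 increase (ppm)

# CO2 rise the Vostok relationship would "predict" from modern warming:
expected_ppm = vostok_ppm_per_deg * modern_warming_c  # ~9 ppm

# How much larger the observed modern rise actually is:
ratio = modern_co2_rise_ppm / expected_ppm  # ~10x

print(f"expected ~{expected_ppm:.0f} ppm, observed ~{modern_co2_rise_ppm:.0f} ppm "
      f"({ratio:.0f}x larger)")
```

The order-of-magnitude gap is the point: warming alone, at the ice-core rate, cannot account for the modern CO2 rise.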

The bottom line is that fears of substantial manmade climate change are ultimately based upon the assumption that we know what caused past climate change. For if past climate changes were caused by tiny forcings (too tiny for us to know about), then the climate system is very sensitive. Of course, those forcings might instead have been quite large (even generated by the climate system itself), and we still might not know about them.

Finally, as a side note, it is interesting that the modern belief that our carbon emissions have caused the climate system to rebel is not that different from the beliefs of ancient civilizations that made sacrifices to the gods of nature in their attempts to get nature to cooperate. Technology might have changed over time, but it seems human nature has remained the same.


IPCC Admits they Could be Wrong about Humans Causing Global Warming

June 13th, 2009

If that headline surprises you, then it is a good indication of how successful the United Nations Intergovernmental Panel on Climate Change (IPCC) has been in their 20-year-long effort to pin the rap for global warming on humanity. I'm not reporting anything really new here; I'm just stating what is logically consistent with, and a necessary inference from, one of the most recognizable claims in the Summary for Policymakers of the IPCC's 2007 report:

Most of the observed increase in global average temperatures since the mid-20th century is very likely due to the observed increase in anthropogenic greenhouse gas concentrations.

They assign a probability of 90% to the term "very likely". This is a curious use of statistical probability, since the warming over the last 50 years either was mostly due to anthropogenic emissions, or it wasn't. Probabilities do not apply to past events like this.

What the IPCC is really using probabilities for is to attach some scientific value to their level of faith. I would be very surprised if there weren’t vigorous objections to the use of probabilities in this way from members of the IPCC…but the IPCC leadership really needed a scientific-sounding way to help push their political agenda, so I’m sure any objections were overruled.

But I digress. My main point here is that the IPCC is admitting that they might be wrong. That doesn’t sound to me like “the science is settled”, as Al Gore is fond of saying. If it is “very likely” that “most” of the observed warming was due to mankind, then they are admitting that it is possible that the warming was mostly natural, instead.

So, let’s play along with their little probability game. Given the extreme cost of greatly reducing our greenhouse gas emissions, wouldn’t you say that it would be important to actively investigate the 10% possibility that warming is mostly natural, as the IPCC readily admits?

Where is the 10% of government research dollars looking into this possibility? I suspect it is going to environmental NGOs who are finding new ways to package "global warming" so that it doesn't sound like a liberal issue. Or maybe they are working on more clever names to call researchers like me, other than "deniers", which is getting a little tiresome.

For many years the Department of Defense has conducted "Red Team" reviews devoted to finding holes in the "consensus of opinion" on weapons systems, systems that cost a whole lot less than punishing the use of our most abundant and affordable sources of energy would. It seems like a no-brainer to do something similar for a policy this expensive, and this destructive to the lives of poor people all over the world.

One could almost get the impression that there is more than just science determining what climate research gets funded, and how it gets reported by the news media. Oh, that's right, I forgot…the United Nations is in charge of this effort. Well, I'm sure they know what they are doing.