Archive for the ‘Blog Article’ Category

Misunderstood Basic Concepts and the Greenhouse Effect

Tuesday, January 1st, 2013

If you Google the phrase ‘greenhouse effect definition’, you get the following at the top of the returned results:

green•house ef•fect
Noun
The trapping of the sun’s warmth in a planet’s lower atmosphere due to the greater transparency of the atmosphere to visible radiation from the sun than to infrared radiation emitted from the planet’s surface

Actually, the greenhouse effect would still operate even if the atmosphere absorbed just as much solar as it does infrared. When even Google gets the definition so wrong, how can mere mortals be expected to understand it?

The so-called greenhouse effect, which is an infrared effect, is admittedly not as intuitively obvious to us as solar heating. Every layer of the atmosphere becomes both a “source” and a “sink” of IR energy, a complication we do not face in understanding solar heating, where the sun is the single source.

To understand the greenhouse effect’s impact on surface temperature and the atmospheric temperature profile, there are some basic concepts which I continue to see misunderstandings about. If we can’t agree on these basics, then there really is no reason to continue the discussion because we are speaking different languages, with no way to translate between them.

The following list is not meant to be exhaustive, just the issues which people most commonly misunderstand and which therefore represent stumbling blocks to an understanding of how the greenhouse effect operates. (I have phrased them the way I believe things really work.)

1) The greenhouse effect does not necessarily require solar heating. If the climate system was heated by intense geothermal energy rather than the sun, the greenhouse effect would still operate.

2) Temperatures in the climate system are the result of energy fluxes gained versus lost. An equilibrium temperature is reached only after the rate of energy absorbed by a layer (of atmosphere, soil, or water) equals the rate of energy loss. This is contrary to the common misconception that energy input alone determines temperature.

3) The greenhouse effect does not violate the 2nd Law of Thermodynamics. Just because the greenhouse effect (passively) makes the surface of the Earth warmer than if only (active) solar heating was operating does not violate the 2nd Law, any more than insulating your house more can raise its interior temperature in the winter, given the same energy input for heating. Very high temperatures in a system can be created with relatively small energy fluxes into that system *if* the rate of energy loss can be reduced (see #2, above). Again, energy input into a system does not alone determine what the temperature in the system will be.

4) The rate of IR absorption by an atmospheric layer almost never equals the rate of IR emission. IR emission is very dependent upon the temperature of that layer, increasing approximately as the 4th power of the temperature. But IR absorption is much less dependent on the temperature of the layer. So, for example, if you irradiated a very cold layer of air with intense IR radiation, that layer would warm until the rate of IR emission equaled the rate of absorption (a toy calculation of this equilibration appears after this list). But in the real atmosphere, other kinds of energy fluxes are involved, too, and so in general IR emission and absorption for a layer are almost never equal.

5) Each layer of the atmosphere does not emit as much IR upward as it does downward. There are people who try to attach some sort of cosmic significance to their claim that the atmosphere supposedly emits as much IR energy upward as it does downward, which is only approximately true for thin atmospheric layers. But the claim is false. Ground-based, upward-viewing IR radiometers measure much stronger downward atmospheric emission than space-based, downward-viewing radiometers measure upward atmospheric emission. The reason is mostly related to the tropospheric temperature lapse rate. If the atmosphere was isothermal (vertically uniform in temperature) then upward and downward emission would be the same. But it’s not. Even if you restrict the analysis to very thin atmospheric layers, the upward emission will be slightly less than the downward emission, because it originates from an average altitude which is slightly higher, and thus colder (except in the stratosphere). (As an interesting aside, many models actually do make the approximation that their individual layers emit as much IR radiation upward as downward, yet they still successfully create an atmospheric temperature profile which is realistic).

6) The tropospheric temperature lapse rate would not exist without the greenhouse effect. While it is true that convective overturning of the atmosphere leads to the observed lapse rate, that convection itself would not exist without the greenhouse effect constantly destabilizing the lapse rate through warming the lower atmosphere and cooling the upper atmosphere. Without the destabilization provided by the greenhouse effect, convective overturning would slow and quite possibly cease altogether. The atmosphere would eventually become isothermal, as the full depth of the atmosphere would reach the same temperature as the surface through thermal conduction; without IR emission, the middle and upper troposphere would have no way to cool itself in the face of this heating. This scenario is entirely theoretical, though, and depends upon the atmosphere absorbing/emitting absolutely no IR energy, which does not happen in the real world.
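To make #2 and the cold-layer example in #4 concrete, here is a minimal sketch (Python, with purely illustrative values assumed for the absorbed flux, emissivity, and heat capacity; they are not meant to represent any real atmospheric layer). A cold layer exposed to a fixed IR input warms until its emission, which grows roughly as the 4th power of temperature, catches up with what it absorbs: the equilibrium temperature is set by the balance of energy gained versus lost, not by the input alone.

```python
# Toy energy balance for a single absorbing/emitting layer.
# All values are illustrative assumptions, not a real atmospheric layer.
SIGMA = 5.67e-8      # Stefan-Boltzmann constant, W/m^2/K^4
absorbed = 240.0     # assumed constant rate of IR absorption, W/m^2
emissivity = 0.8     # assumed broadband emissivity of the layer
heat_cap = 1.0e5     # assumed heat capacity per unit area, J/m^2/K
dt = 3600.0          # one-hour time step, s

T = 200.0            # start the layer off very cold (K)
for step in range(1, 200001):
    emitted = emissivity * SIGMA * T**4          # loss grows ~ T^4
    T += (absorbed - emitted) / heat_cap * dt    # warm while gain exceeds loss
    if abs(absorbed - emitted) < 1e-3:           # equilibrium: gain equals loss
        break

print(f"Equilibrium after {step} steps: T = {T:.1f} K, "
      f"emitting {emissivity * SIGMA * T**4:.1f} W/m^2 vs {absorbed:.1f} W/m^2 absorbed")
```

Double the assumed absorbed flux and the equilibrium temperature rises by only about 19% (the 4th root of 2), which is another way of seeing how strongly the T^4 loss term constrains the answer.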

I’ll be happy to post corrections/additions to the above list as warranted. I also apologize in advance for the inevitable snarky comments you will find a few people posting. I believe in letting people speak their mind.

Our Chaotic Climate System

Friday, December 14th, 2012

Over the last quarter century, mainstream climate science has changed dramatically, from a paradigm where climate changes naturally to one where climate forever remains the same unless humans meddle with it.

The reasons for this paradigm shift are clearly not based on science. Sure, you can always analyze some dataset in such a way that it gives the appearance of climate stasis (e.g. the hockey stick), but there is plenty of published research over the last 50 years supporting the view that climate changes naturally, and on all time scales…decadal, centennial, millennial, etc.

The claim that the Medieval Warm Period and Little Ice Age were only regional in extent is contradicted by considerable published evidence. Besides…why is it that the pundits who claim these historic events were only regional in extent are the same people who place global significance on a U.S. drought or a heat wave in France? Hmmm?

No, the reasons for this paradigm shift are mostly political. Scientists play along for a variety of reasons which would take a series of blog posts to cover.

But they have been pretty successful at convincing the science-savvy public that climate will only change when we fire up our SUV, or turn on our incandescent light bulbs. The scientists say things like, “We tried putting natural forcings in our models, but we can get the models to produce the observed warming only when we include anthropogenic greenhouse gas emissions.”

Well, they only put in a few forcings which they know about: total solar irradiance changes, ozone depletion, and maybe a couple others.

But what about changes which are not “forced”?

Our Chaotic Climate System
Chaos theory was originally developed by Ed Lorenz during early experiments with computerized weather prediction models, the forerunners of today’s climate models. Lorenz found that, for example, even tiny changes in the initial state of the atmosphere can completely change how weather patterns evolve in the coming weeks. Chaos is what limits the predictability of weather to 10 days or so.
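As a concrete illustration of that sensitivity, here is a short sketch (Python) that integrates two copies of Lorenz’s famous three-variable system, using his original parameter choices, from initial states differing by one part in a million. The trajectories track each other for a while and then separate completely, which is the essence of why weather forecasts lose their skill beyond a week or two. (Simple Euler time-stepping is used here for brevity; it is crude but adequate for the demonstration.)

```python
# Lorenz-63: the classic demonstration of sensitive dependence on initial
# conditions, with Lorenz's original parameter values.
def lorenz_step(state, dt=0.005, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = state
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return (x + dx * dt, y + dy * dt, z + dz * dt)

a = (1.0, 1.0, 1.0)          # reference "weather" state
b = (1.000001, 1.0, 1.0)     # the same state, perturbed by one part per million

for step in range(1, 6001):
    a, b = lorenz_step(a), lorenz_step(b)
    if step % 1000 == 0:
        sep = sum((u - v) ** 2 for u, v in zip(a, b)) ** 0.5
        print(f"t = {step * 0.005:5.1f}   separation = {sep:.6f}")
```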

Chaotic behavior is a characteristic of most nonlinear dynamical systems, that is, systems which evolve over time and are governed by rather complex physical processes. We usually think of chaos in the atmosphere operating on time scales of days to weeks.

But the ocean is also a nonlinear dynamical system. And it has time scales ranging from years up to hundreds or even thousands of years…time scales we associate with climate change.

El Nino and La Nina can, for example, be thought of as a chaotic fluctuation in the climate system. Like the famous butterfly-shaped Lorenz Attractor, El Nino and La Nina are the two wings of the butterfly, and the climate system during Northern Hemisphere winter tends to alternate between El Nino and La Nina, sometimes getting “stuck” in a multi-year pattern of more frequent El Ninos or La Ninas.

Now, while El Nino and La Nina are the best known (and most frequently occurring) ocean-based climate phenomena, what other longer-term modes of climate variability might there be which are “unforced”? By unforced, I mean they are not caused by some external forcing mechanism (like the sun), but are just the natural result of how the system varies all by itself. Well, we really don’t know, partly because so little research is funded to study the problem.

But How Can Chaos Cause “Global Warming”?
It is my belief that most climate variability and even climate change could simply be the result of chaos in the climate system. But how would changing ocean and atmospheric circulation patterns cause “global warming”?

One potential mechanism is through the impact of those circulation changes on cloud formation.

Clouds are the Earth’s natural sunshade, and very small (but persistent) changes in cloud cover can cause either warming or cooling trends. I know that scientists like Trenberth and Dessler like to claim that “clouds don’t cause climate change”…well, chaotic changes in ocean and atmospheric circulation patterns can change clouds, and so in that sense clouds act as an intermediary. Of course clouds don’t change all by themselves, which is how some people disingenuously characterize my position on this.

Unfortunately, our long-term measurements of global cloud cover are not yet good enough to determine with a high level of confidence just how much recent warming was caused by climate chaos. Our experiments with a simple 1D energy budget model suggest that more frequent El Ninos since the late 1970s caused some of the warming we have seen (a position also taken by Bob Tisdale), but just how much of the warming remains uncertain.

Part of the El Nino warming seems to be through reduced cloud cover, which precedes peak warming by 7 to 9 months. But it is also through a decrease in the rate at which the ocean mixes heat vertically. Chaotic changes in ocean mixing alone can cause global warming or cooling, even without any cloud changes, because most of the depth of the ocean is very cold and only the near-surface is relatively warm. If the ocean was vertically uniform in temperature, changes in ocean mixing would have little effect on climate.
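The role of vertical mixing can be illustrated with a very crude two-layer ocean sketch (Python). The layer depths, mixing strengths, and linearized radiation numbers below are illustrative assumptions for this post only, not the 1D model mentioned above. Weakening the heat exchange between a warm, shallow surface layer and a cold, deep layer lets the surface warm; strengthening it cools the surface, with no change at all in the energy entering the system. And if the two layers started at the same temperature, changing the mixing strength would do essentially nothing, which is the point about a vertically uniform ocean.

```python
# Crude two-layer ocean: a thin, warm surface layer over a cold, deep layer.
# All numbers are illustrative assumptions for this sketch.
RHO_CP = 4.1e6                   # volumetric heat capacity of seawater, J/m^3/K
H_SFC, H_DEEP = 50.0, 3650.0     # assumed layer depths, m
DT = 86400.0                     # one-day time step, s

def run(mixing, years=30, feedback=3.3):
    """Surface/deep temperatures for a given surface-to-deep mixing strength (W/m^2/K)."""
    t_sfc, t_deep = 15.0, 3.0                        # initial temperatures, deg C
    c_sfc, c_deep = RHO_CP * H_SFC, RHO_CP * H_DEEP  # heat capacities per unit area
    for _ in range(int(years * 365)):
        absorbed = 240.0                             # energy input never changes
        olr = 240.0 + feedback * (t_sfc - 15.0)      # outgoing IR, linearized
        flux_down = mixing * (t_sfc - t_deep)        # heat mixed into the deep layer
        t_sfc += (absorbed - olr - flux_down) * DT / c_sfc
        t_deep += flux_down * DT / c_deep
    return t_sfc, t_deep

for k in (0.5, 1.0, 2.0):        # weaker vs stronger vertical mixing
    sfc, deep = run(k)
    print(f"mixing = {k:.1f} W/m^2/K  ->  surface {sfc:.2f} C, deep {deep:.2f} C")
```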

This is the basis for Trenberth’s “missing heat” argument. If recent warming has indeed been caused by our greenhouse gas emissions, but there has also been an increase in the rate of overturning of the oceans, then surface warming will be reduced as colder deep water is brought to the surface and the deep ocean is slightly warmed from the warm surface waters being mixed deeper than usual. Unfortunately, since the oceans are SO deep, the deep ocean warming we would be talking about verges on being unmeasurable…thousandths of a degree.
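The “thousandths of a degree” point is easy to check with round numbers. In the sketch below (Python; the 0.5 W/m² imbalance and 3,700 m mean ocean depth are assumed round values, not measurements, and the land/ocean area distinction is ignored), a sustained planetary energy imbalance mixed through the full depth of the ocean warms the ocean-average temperature by only about a thousandth of a degree per year.

```python
# Back-of-the-envelope: how fast would the full-depth ocean warm if a sustained
# energy imbalance were mixed all the way down? (Assumed round numbers.)
imbalance = 0.5              # assumed global energy imbalance, W/m^2
depth = 3700.0               # rough mean ocean depth, m
rho, cp = 1025.0, 3990.0     # seawater density (kg/m^3) and specific heat (J/kg/K)
seconds_per_year = 3.156e7

column_heat_capacity = rho * cp * depth                                # J/m^2 per K
warming_per_year = imbalance * seconds_per_year / column_heat_capacity
print(f"Full-depth ocean warming: about {warming_per_year:.4f} deg C per year")
```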

While such a “missing heat” explanation for a lack of recent warming is theoretically possible, I find it rather unsatisfying to base an unwavering belief in eventual catastrophic global warming on a deep-ocean mechanism so weak we can’t even measure it. Larger changes in individual ocean basins might be measurable, but it is the global average deep-ocean temperature that we need to know very accurately.

The Need for Natural Climate Change Research
This issue of natural mechanisms of climate change is so important it boggles my mind that the U.S. Government has had almost zero interest in funding it. But I don’t see how we will ever confidently determine just how much of recent warming is human-induced without determining how much was natural.

If, say, 50% of the warming in the last 50 to 100 years has been natural, then this profoundly impacts our projections of human-caused warming in the future, slashing them by about 50%.

In my talks to groups around the country over the years, I find widespread public support for the idea that climate does indeed change naturally. If the scientists whom the public supports financially continue to largely ignore the issue, I fear there will eventually be a public backlash which ends up hurting taxpayer support of climate research.

Unless global warming researchers start behaving a little more like objective scientists, I predict they are living on borrowed time.

UAH v5.5 Global Temperature Update for November 2012: +0.28 deg. C

Wednesday, December 12th, 2012

After my extended trip to the West Coast, I am finally posting the global temperature update (sorry for the delay).

Our Version 5.5 global average lower tropospheric temperature (LT) anomaly for November, 2012 is +0.28 deg. C (click for large version):

The hemispheric and tropical LT anomalies from the 30-year (1981-2010) average for 2012 are:

YR MON GLOBAL NH SH TROPICS
2012 1 -0.134 -0.065 -0.203 -0.256
2012 2 -0.135 +0.018 -0.289 -0.320
2012 3 +0.051 +0.119 -0.017 -0.238
2012 4 +0.232 +0.351 +0.114 -0.242
2012 5 +0.179 +0.337 +0.021 -0.098
2012 6 +0.235 +0.370 +0.101 -0.019
2012 7 +0.130 +0.256 +0.003 +0.142
2012 8 +0.208 +0.214 +0.202 +0.062
2012 9 +0.339 +0.350 +0.327 +0.153
2012 10 +0.333 +0.306 +0.361 +0.109
2012 11 +0.281 +0.301 +0.262 +0.172

UAH v5.5 Global Temp Update for October 2012: +0.33 deg. C

Tuesday, November 6th, 2012

Our Version 5.5 global average lower tropospheric temperature (LT) anomaly for October, 2012 is +0.33 deg. C (click for large version):

The hemispheric and tropical LT anomalies from the 30-year (1981-2010) average for 2012 are:

YR MON GLOBAL NH SH TROPICS
2012 1 -0.134 -0.065 -0.203 -0.256
2012 2 -0.135 +0.018 -0.289 -0.320
2012 3 +0.051 +0.119 -0.017 -0.238
2012 4 +0.232 +0.351 +0.114 -0.242
2012 5 +0.179 +0.337 +0.021 -0.098
2012 6 +0.235 +0.370 +0.101 -0.019
2012 7 +0.130 +0.256 +0.003 +0.142
2012 8 +0.208 +0.214 +0.202 +0.062
2012 9 +0.339 +0.350 +0.327 +0.153
2012 10 +0.331 +0.302 +0.361 +0.106

Differences with RSS over the Last 2 Years
Many people don’t realize that the LT product produced by Carl Mears and Frank Wentz at Remote Sensing Systems has anomalies computed from a different base period for the average annual cycle (1979-1998) than we use (1981-2010). The two sets of anomalies should not be compared unless they are computed relative to the same annual cycle.
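Re-baselining is straightforward: compute each dataset’s mean annual cycle over the common 1981-2010 period and subtract it, so that both sets of anomalies are departures from the same climatology. A minimal sketch of the operation (Python with pandas; the file name and column names are hypothetical placeholders, not the actual layout of either the RSS or UAH files):

```python
import pandas as pd

# Hypothetical file and column names -- not the real RSS or UAH file formats.
df = pd.read_csv("rss_lt_monthly.csv")    # columns assumed: year, month, anomaly

# To shift the base period, subtract the 1981-2010 mean of each calendar month
# from the existing anomaly series.
base = df[(df["year"] >= 1981) & (df["year"] <= 2010)]
monthly_clim = base.groupby("month")["anomaly"].mean()
df["anomaly_1981_2010"] = df["anomaly"] - df["month"].map(monthly_clim)
```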

If the anomalies for both datasets are computed using the same base period (1981-2010), the comparison between UAH and RSS over the last couple of years looks like this:

Note that the UAH anomalies have been running, on average, a little warmer than the RSS anomalies for the last couple of years.

What Is Making Frankenstorm Sandy Exceptional?

Monday, October 29th, 2012

NASA MODIS image of Hurricane Sandy on 10/28/2012
As Hurricane Sandy completes its transformation into a strong nor’easter (“Frankenstorm Sandy”), I think it will likely break a record for the lowest barometric pressure for a “winter” cyclone in the Northeast U.S. There will also be widespread heavy rain, which almost always leads to some local record rainfall amounts, depending upon just where rain bands happen to sit for an extended period of time.

But I predict it won’t break any records for wind speeds or snowfall. A few locations might see record storm surges, but again those have a lot to do with the chance superposition of wind direction, long fetch, and lunar tides.

So what has made Sandy so exceptional?

It is basically the “perfect storm” scenario of the chance timing of a tropical cyclone merging with an extra-tropical winter-type storm. Without Hurricane Sandy off the coast, the strong trough over the eastern U.S. (caused by cold Canadian air plunging southward) would have still led to a nor’easter type storm forming somewhere along the east coast of the U.S. But since Hurricane Sandy just happens to be in the right place at the right time to merge with that cyclone, we are getting a “superstorm”.

This merger of systems makes the whole cyclone larger in geographical extent than it normally would be. And this is what will make the surface pressures so low at the center of the storm.

This type of event is certainly not unprecedented, and something like it happens just about every year…just not over the Northeast U.S. These events are somewhat more common in the northwest Pacific Ocean or farther north in the Atlantic Ocean. We did an internal study of these events about ten years ago (never published) using QuikScat, AMSU, and buoy data, and it is amazing just how strong some of these hybrid winter storms can get.

So, while Frankenstorm Sandy will indeed have great local significance, it is premature to claim it has any global significance, such as a response to global warming. We would need to see more of these events occurring over many years, on a quasi-global basis…and even then the increase would need to be shown to be unrelated to natural climate modes of variability, such as the Pacific Decadal Oscillation or Arctic Oscillation.

Frankenstorm Sandy Approaches

Friday, October 26th, 2012

Hurricane Sandy’s interaction with an approaching trough and cold front from the west looks like it will provide a near repeat of the infamous Perfect Storm event, almost exactly 21 years to the day after the Andrea Gail was lost on October 28, 1991.

Hurricane Sandy’s “attempted” path would be to recurve off the East Coast and head out to sea, as most East Coast hurricanes do. But the approaching upper-level trough and cold front, whose chance timing would favor some coastal storm development anyway, will merge with Sandy and drag her back to a landfall somewhere (it appears at this point) between Chesapeake Bay and Long Island.

The instability caused by the clashing cold air mass and the warm, moist tropical air associated with Sandy will create a massive nor’easter, and the resulting storm has already been dubbed Frankenstorm Sandy since it will ruin Halloween trick-or-treating for many in the mid-Atlantic states and New England.

The high wind field will cover a large area, with highest winds not necessarily near the low center, since this will not be a hurricane per se. Large, probably record-setting amounts of rain can be expected in some localized areas wherever atmospheric lifting becomes maximized and persistent.

Wet snow can be expected on the western periphery of the storm, which right now looks to be from West Virginia through western Pennsylvania and extreme western New York state.

A 100-Year Storm?
Some are already saying this will be a 100-year storm. But this is a rather meaningless term. If it means the worst storm to hit New England in 100 years, I would doubt that will be the case, since there have been many epic nor’easters.

It is more likely that a few locations will see 100-year class precipitation totals, or maybe record low barometric pressures, and maybe even record high wind speeds.

Nevertheless, the formation of this particular storm will indeed be unusual, owing to the rare combination of a pre-existing tropical hurricane with extratropical cyclogenesis. There is little doubt that the resulting storm will indeed be historic, if not a record-setter, in its geographic extent and intensity.

Climate of Doubt about PBS’s Objectivity

Wednesday, October 24th, 2012

UPDATE (10/25/2012 2:30 p.m. CDT): As can be seen from the comments below, Catherine at PBS/Frontline says they did not interview me for this show. If this is the case, then the video of me used in the trailer was taken from some other interview. While John Christy and I seemed to recall a visit from PBS, we get a lot of film crews coming through our offices (almost always freelancers hired for the job), so I will have to check my old calendars (which are at home) to make sure whether someone claiming to represent PBS was here or not.

Another commenter noted that my words could be taken as either supporting the skeptics or the alarmists, depending on whether they are viewed in the context of being mixed in with Monckton & Ebell, or in the context of the whole show. Granted.

A saddened Big Bird leaves PBS studios after learning of journalistic malpractice.

This is just one more reason to defund PBS. If a “journalistic” organization cannot even provide some level of objectivity, why should the taxpayer be forced to support it?

From 0:18 to 0:21 in this trailer for the show “A Climate of Doubt”, I am seen talking about the U.S. government funding only research which supports global warming alarmism:

…yet, the viewer of the entire show will come away with the mistaken impression that I was instead talking about skeptics of manmade global warming being funded by shady organizations.

Note that PBS only showed my mouth in the above clip…why? Maybe because if my face was shown, people would find out that it was skeptic Roy Spencer speaking, and not some informant supposedly revealing the dastardly deeds of skeptics.

As I recall, I spent at least an hour with the PBS film crew outlining the skeptics’ case and why we speak out. None of it was used…except the small clip above, with the apparent intent to deceive the viewer.

Shame on PBS. They have now joined the BBC on my blacklist of news organizations never to do an interview with again. Fool me once….

PBS’s “Climate of Doubt”, tomorrow night

Monday, October 22nd, 2012

Since my “talking mouth” is in the trailer for Frontline’s Climate of Doubt, airing tomorrow evening (October 23), I suspect I’ll be in this show.

Here’s how the website introduces the special:

Four years ago, climate change was a hot issue and politicians from both sides seemed poised to act. Today public opinion on the climate issue has cooled considerably. Politicians either ignore it or proclaim their skepticism. What’s behind this massive reversal? On Oct 23, FRONTLINE goes inside the organizations that fought the scientific establishment to shift the direction of the climate debate.

Now, I’ve been giving public talks all over the country for a lot longer than four years, and I can tell you that mainstream America has always been skeptical of the theory that humans are killing the planet with our CO2 emissions. Nevertheless, politicians and the popular press tend to control the narrative, and for a long time the public was misled about the strength of the science behind global warming theory.

Note I said the public was misled about “strength of the science”, not “strength of scientists’ beliefs”.

And may I remind you that we let politicians, especially Al Gore, tell us what the scientists believed? Many scientists didn’t like the way Gore presented his case, with such certainty, and making connections that couldn’t really be made (like hurricanes and global warming).

Anyway, I fear this will be a hit piece against skeptics in general, that we are a bunch of Big Oil or Koch brother-funded hacks. I hope I’m wrong. (Personally, I am still waiting for some of that Big Oil or Koch money to come my way.)

Finally, I wonder whether PBS will even address what really killed public concern over anthropogenic global warming: even if Al Gore was right, it’s the economics that ends up killing global warming policy. As it is, current “green” policy is driving up petroleum prices and helping Big Oil make even more money, since we can’t live without petroleum. We’re “addicted to oil” in the same way we are “addicted to food”.

So, the “Climate of Doubt” isn’t as much doubt over the science (which the PBS special will emphasize) as it is doubt over the ability of any energy policy change to produce a net beneficial outcome.

It’s Time for the 99% to Start Supporting the 1%

Wednesday, October 17th, 2012

A persistent misconception about our economy is that the same amount of stuff is going to be produced, no matter what government policies are implemented. If that was indeed true, then the political debate only becomes one over how all that stuff is divided up. And that is indeed what many people spend their time debating.

But economic productivity can vary tremendously between countries, and even within a country over time. In fact, there are many poor countries with much lower unemployment than the United States…yet they remain poor.

What really matters for a prosperous nation is what is produced for a given amount of labor. We could have near-zero percent unemployment tomorrow if the government mandated that half the people should dig holes in the ground and the other half fill them up again. But we would be a very poor country, with a very low standard of living.

A high standard of living requires efficiency of production of goods and services that the people want, which in turn requires large investments in facilities, machinery, raw materials, etc. In a competitive free market economy, those investments involve risk…risk that your investment will be lost if someone else figures out a more efficient way to build 10 million smartphones than you figured out.

Now, why would anyone choose to invest large sums of money? Only if they have some hope of receiving much more in return if they are successful. If that incentive provided by the hope for profit is lost, then they will not invest in new business enterprises. No business enterprise for them means no jobs for you.

Our number one priority should be to ensure that producers are allowed to produce, and that they are not penalized for their success. Jobs happen from the top-down (not from the middle-out) when businesses with the money to hire people are allowed the opportunity to succeed.

Yes, a few of them will become rich in the process…but their riches pale in comparison to the greater riches enjoyed by society as a whole through the higher standard of living the good ideas of the rich have enabled. And those profits aren’t kept under a mattress…they are reinvested in the economy, either through expanding the business, hiring more people, or even just buying more stuff which supports other businesses.

Demonizing the rich is demonizing the driving force which elevates the standard of living of the whole country. If you want prosperity, allow the producers to produce. Make it easier for them, not harder.

Not only does this raise our standard of living, it also increases tax revenue, because revenue is a percent of the action, and the more economic activity there is, the greater the tax revenue which is collected to support government services.

And this is how the budget “arithmetic” really works. Balancing the federal budget is not a matter of either (1) increasing tax rates or (2) decreasing spending. That erroneous view mistakenly equates tax rates with tax revenue. Tax revenue (the total number of dollars taken in by the government) is the tax rate multiplied by economic activity. Lowering tax rates, especially on businesses, stimulates economic activity, which then increases tax revenue.

You Don’t Really Want to Play by the Same Rules

For those who like the mantra “everyone should play by the same rules”, let me tell you: you don’t really want to play by the same rules as business. Business owners typically don’t take their share until all of their employees are paid and all of their other business bills are paid.

For every successful rich person, there were many more who tried to become rich but lost everything. Why is it that so many people want a greater share from those who have succeeded, but don’t want to share in the losses of those who failed to become rich?

What if the business you work for fails? How would you like to pay back all of the salary you earned? You got to keep the money, but the business owner or his/her investors lost that money. Do you really want to play by those rules?

And how would you like to work 12+ hours per day trying to abide by all of the regulations increasingly heaped upon businesses by the government?

It’s time for the 99% to start supporting the 1% a little better, because in the end it is the 1% who enables the 99% to maximize their standard of living.
_____________________________

You can learn more about basic economics from my book Fundanomics: The Free Market Simplified.

UAH V5.5 Global Temp. Update for Sept. 2012: +0.34 deg. C

Friday, October 5th, 2012

As discussed in my post from yesterday, the spurious warming in Aqua AMSU channel 5 has resulted in the need for revisions to the UAH global lower tropospheric temperature (LT) product.

Rather than issuing an early release of Version 6, which has been in the works for about a year now, we decided to do something simpler: remove Aqua AMSU after a certain date, and replace it with the average of NOAA-15 and NOAA-18 AMSU data. Even though the two NOAA satellites have experienced diurnal drifts in their orbits, we have found that those drifts are in opposite directions and approximately cancel. (The drifts will be corrected for in Version 6.0).
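In terms of the data handling, the interim fix amounts to swapping one source for the average of two after a chosen date. A sketch of that step is below (Python with pandas); the file name, column names, and cutoff date are hypothetical placeholders for illustration, not our actual processing code.

```python
import pandas as pd

# Hypothetical monthly LT anomalies per satellite -- illustrative only,
# not the actual UAH processing code. Columns assumed: date, aqua, noaa15, noaa18.
df = pd.read_csv("lt_by_satellite.csv", parse_dates=["date"])

cutoff = pd.Timestamp("2011-07-01")   # illustrative cutoff date, not the one actually used

# Before the cutoff, keep Aqua AMSU; afterward, use the NOAA-15/NOAA-18 average,
# whose opposite-signed diurnal drifts approximately cancel.
df["lt_v55"] = df["aqua"].where(df["date"] < cutoff,
                                (df["noaa15"] + df["noaa18"]) / 2.0)
```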

The new interim dataset, Version 5.5, has a September, 2012 global lower tropospheric temperature anomaly of +0.34 deg. C (click for large version):

Note that the new v5.5 dataset brings our monthly anomalies over the last few years somewhat more in line with those from RSS, which have been running significantly cooler than ours. The long-term trend, however, decreases by only 0.001 deg. C/decade from v5.4 to v5.5. This is partly because the time series is now almost 34 years in length, and adjusting the last several months by 0.1 deg or so is not going to affect the long-term trend substantially.
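To see why a late-record tweak barely moves a nearly 34-year trend, here is a small sketch (Python with numpy; the series is synthetic) that pulls the last few months of a 34-year monthly record down by 0.1 deg. C and compares the least-squares trends. The difference comes out at roughly a thousandth of a degree per decade; the exact value depends on how many months are adjusted and by how much.

```python
import numpy as np

# Synthetic 34-year monthly anomaly series; the noise level and underlying
# trend are arbitrary -- only the *difference* between the two fits matters.
months = 34 * 12
t = np.arange(months) / 12.0                              # time in years
rng = np.random.default_rng(0)
series = 0.014 * t + 0.1 * rng.standard_normal(months)

adjusted = series.copy()
adjusted[-3:] -= 0.1                                      # pull the last few months down 0.1 C

trend = np.polyfit(t, series, 1)[0] * 10.0                # deg C per decade
trend_adj = np.polyfit(t, adjusted, 1)[0] * 10.0
print(f"trend change: {trend_adj - trend:+.4f} deg C per decade")
```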

Evidence of the divergence of Aqua from the two NOAA satellites during 2012 is shown in the next plot:

The global monthly differences between v5.5 and v5.4 are shown next, which reveal the rapid divergence in the last couple of months of Aqua AMSU from the average of NOAA-15 and NOAA-18 AMSUs:

The Version 5.5 hemispheric and tropical LT anomalies from the 30-year (1981-2010) average since January 2010 are:

YR MON GLOBAL NH SH TROPICS
2010 01 0.581 0.747 0.415 0.660
2010 02 0.542 0.623 0.461 0.738
2010 03 0.577 0.721 0.434 0.665
2010 04 0.416 0.609 0.223 0.596
2010 05 0.449 0.593 0.306 0.679
2010 06 0.376 0.430 0.321 0.464
2010 07 0.343 0.455 0.232 0.303
2010 08 0.376 0.480 0.273 0.216
2010 09 0.430 0.351 0.510 0.114
2010 10 0.278 0.232 0.324 -0.053
2010 11 0.208 0.316 0.100 -0.270
2010 12 0.141 0.207 0.075 -0.441
2011 01 0.022 0.036 0.007 -0.382
2011 02 -0.003 0.005 -0.011 -0.350
2011 03 -0.066 -0.013 -0.120 -0.336
2011 04 0.083 0.132 0.034 -0.233
2011 05 0.101 0.082 0.120 -0.061
2011 06 0.260 0.292 0.229 0.183
2011 07 0.343 0.290 0.396 0.169
2011 08 0.300 0.247 0.353 0.143
2011 09 0.290 0.280 0.301 0.128
2011 10 0.073 0.140 0.006 -0.152
2011 11 0.084 0.072 0.096 -0.060
2011 12 0.066 0.119 0.012 -0.033
2012 01 -0.134 -0.060 -0.203 -0.256
2012 02 -0.135 0.018 -0.289 -0.320
2012 03 0.051 0.119 -0.017 -0.238
2012 04 0.232 0.351 0.114 -0.242
2012 05 0.179 0.337 0.021 -0.098
2012 06 0.235 0.370 0.101 -0.019
2012 07 0.130 0.256 0.003 0.142
2012 08 0.208 0.214 0.202 0.062
2012 09 0.338 0.349 0.327 0.155

Again, Version 5.5 is only meant as an interim solution until our Version 6 is ready, which has new corrections for diurnal drift and an improved calibration strategy for the old MSU instruments.

Our reluctance to make these changes sooner is partly due to the flak we get when we are accused of adjusting temperatures downward for no good reason. There is now sufficient evidence (alluded to above) to make such adjustments.