UPDATED: Fixed Bureau of Reclamation study link, added Colorado River basin snowpack graph and discussion.
Today’s news brings yet another article claiming that the record-low water levels in Lake Mead (a man-made water reservoir) are due to human-caused climate change. In fact, to make the problem even more sinister, the Mafia is also part of the story:
While it is true that recent years have seen somewhat less water available from the Colorado River basin watershed (which supplies 97% of Lake Mead’s water), this is after years of above-average water inflow from mountain snowpack. Those decadal time-scale changes are mostly the result of stronger El Nino years (more mountain snows) giving way to stronger La Nina years (less snow).
The result is record-low water levels:
Lake Mead water levels since the construction of Hoover Dam (source: NBC News)
But the real problem isn’t natural water availability. It’s water use.
The following graph shows the fundamental problem (click for full resolution). Since approximately 2000, water use by 25 million people (who like to live in a semi-desert area where the sun shines almost every day) has increased to the point that more water is now being taken out of the Lake Mead reservoir than nature can re-supply it.
This figure is from a detailed study by the U.S. Bureau of Reclamation. As long as that blue line (water supply) stayed above the red line (water use), there was more than enough water to please everyone.
But now, excessive demand for water means Lake Mead water levels will probably continue to decline unless water use is restricted in some way. The study’s projection for the future in the above figure, which incorporates climate model output, shows little future change in water supply compared to natural variability over the last century.
The real problem is that too much water is being taken out of the reservoir.
As long as the red line stays above the blue line, Lake Mead water levels will continue to fall.
But to blame this on climate change, whether natural or anthropogenic, ignores the thirsty elephant in the room.
UPDATE: Since it was pointed out in comments (below) that the latest Bureau of Reclamation study is rather dated (2012), and supposedly the drought has worsened since then, here’s a plot of the Colorado River basin April (peak month) snowpack, which provides about 50% of the water to Lake Mead. The rest comes from the non-mountainous areas of the river basin, which should be highly correlated with the mountainous regions. I see no evidence for reduced snowpack due to “climate change”… maybe the recent “drought” conditions are less a matter of reduced natural supply than of increased demand from 25 million water consumers?
SUMMARY: A simple time-dependent CO2 budget model shows that comparing yearly anthropogenic emissions to Mauna Loa CO2 measurements yields a declining CO2 sink rate, which, if it continued, would increase atmospheric CO2 concentrations and presumably anthropogenic climate change. But accounting for ENSO (El Nino/La Nina) activity during 1959-2021 removes the decline. This is contrary to multiple previous studies that claimed to account for ENSO. A preprint of my paper (not yet peer reviewed) describing the details is at ENSO Impact on the Declining CO2 Sink Rate | Earth and Space Science Open Archive (essoar.org).
I decided that the CO2 model I developed a few years ago, and recently reported on here, was worthy of publication, so I started going through the published literature on the subject. This is a necessary first step if you want to publish a paper and not be embarrassed by reinventing the wheel or claiming something others have already “disproved”.
The first thing I found was that my idea that Nature each year removes a set fraction of the difference between the observed CO2 concentration and some baseline value is not new. That idea was first published in 2013 (see my preprint link above for details), and it’s called the “CO2 sink rate”.
The second thing I found was that the sink rate has (reportedly) been declining, by as much as 0.54% (relative) per year, even after accounting for ENSO activity. But I get a decline of only 0.33% per year (1959-2021) before accounting for ENSO activity, and, importantly, no decline at all (0.0% per year) after accounting for ENSO.
This last finding will surely be controversial, because it could mean CO2 in the atmosphere will not rise as much as global carbon cycle modelers say it will. So, I am posting the model and the datasets used along with the paper preprint at ENSO Impact on the Declining CO2 Sink Rate | Earth and Space Science Open Archive (essoar.org). The analysis is quite simple and I believe defensible. The 2019 paper that got -0.54% per year decline in the sink rate uses complex statistical gymnastics, with a professional statistician as a primary author. My analysis is much simpler, easier to understand, and (I believe) at least as defensible.
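For readers who want to see the bookkeeping, the yearly sink-rate calculation can be sketched in a few lines. This is a minimal illustration under stated assumptions, not the preprint’s actual analysis: the CO2 and emissions arrays are made-up placeholder values (in ppm), and the 294 ppm baseline is the value my budget model calibration produces, assumed here for illustration.

```python
# Minimal sketch of the yearly CO2 sink-rate bookkeeping: the fraction
# of the atmospheric CO2 excess (above an assumed baseline) removed by
# nature each year, inferred as emissions minus the observed CO2 rise.
# The input arrays below are made-up illustrative values, NOT the
# actual Mauna Loa record or emissions estimates.

def yearly_sink_rate(co2_ppm, emissions_ppm, baseline=294.0):
    """Inferred removal fraction for each year after the first."""
    rates = []
    for i in range(1, len(co2_ppm)):
        # CO2 removed by nature = what was emitted minus what stayed
        removed = emissions_ppm[i] - (co2_ppm[i] - co2_ppm[i - 1])
        excess = co2_ppm[i - 1] - baseline
        rates.append(removed / excess)
    return rates

# Illustrative only: CO2 rising ~2 ppm/yr with ~2.5 ppm/yr of emissions
co2 = [410.0, 412.0, 414.1, 416.0]
emissions = [2.5, 2.5, 2.6, 2.5]
print([round(r, 4) for r in yearly_sink_rate(co2, emissions)])
```

A declining trend in these yearly rates is what the cited studies report; the preprint’s claim is that the trend disappears once an ENSO term is included.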
The paper will be submitted to Geophysical Research Letters for peer review in the next couple days. In the meantime, I will be inviting the researchers who live and breathe this stuff to poke holes in my analysis.
The Version 6.0 global average lower tropospheric temperature (LT) anomaly for July, 2022 was +0.36 deg. C, up from the June, 2022 value of +0.06 deg. C.
The linear warming trend since January, 1979 still stands at +0.13 C/decade (+0.11 C/decade over the global-averaged oceans, and +0.18 C/decade over global-averaged land).
Various regional LT departures from the 30-year (1991-2020) average for the last 19 months are:
The full UAH Global Temperature Report, along with the LT global gridpoint anomaly image for July, 2022 should be available within the next several days here.
The global and regional monthly anomalies for the various atmospheric layers we monitor should be available in the next few days at the following locations:
The simple CO2 budget model I introduced in 2019 is updated with the latest Mauna Loa measurements of atmospheric CO2 and with new Energy Information Administration estimates of global CO2 emissions through 2050. The model suggests that atmospheric CO2 will barely double pre-industrial levels by 2100, with a total radiative forcing of the climate system well below the most extreme scenario (RCP8.5) used in alarmist literature (and the U.S. national climate assessment), with the closest match to RCP4.5. The model also clearly shows the CO2 reducing effect of the Mt. Pinatubo eruption of 1991.
The Model
As described here, the simple CO2 budget model uses yearly sources and sinks of atmospheric CO2 to compute how much the atmospheric CO2 concentration changes from one year to the next.
The sink (removal) of “excess” atmospheric CO2 assumes that all of the biological and geophysical processes that remove CO2 from the atmosphere do so at a net rate proportional to the excess of the CO2 value above some ‘equilibrium’ value. When the model is calibrated with the yearly Mauna Loa CO2 data from 1959 through 2021, this rate of removal is 2.02% of the atmospheric excess above 294 ppm. So, for example, at the current CO2 concentration of 417 ppm, the biological and geophysical removal processes are removing 0.0202 x [417 – 294] = 2.48 ppm per year for 2022 (preliminary estimate).
The long-term source of CO2 increase is assumed to be anthropogenic. There are various estimates of yearly CO2 emissions, some from energy use alone, some including cement production and land use. I’ve used the Boden et al. (2017) and Our World in Data yearly estimates for 1750 through 2009, and EIA.gov estimates of yearly emissions growth rates from 2010 to 2050, and then assumed their 2050 growth rate is constant to 2100.
I also have included an ENSO term (El Nino and La Nina) to empirically account for CO2 rising during El Nino and decreasing during La Nina. This term amounts to 0.45 times the Multivariate ENSO Index (MEI) value averaged from May of the previous year through April of the current year. For example, the latest yearly-average MEI value is -1.29 (La Nina conditions), so 0.45 x [-1.29] = -0.58 ppm CO2 decrease in 2022 from La Nina activity.
The model is initialized in 1750. The MEI data are included starting in 1958-59.
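The yearly model update described above can be sketched as follows. The constants (2.02% removal rate, 294 ppm baseline, 0.45 ppm per MEI unit) are the values quoted in this post; the function itself is a simplified illustration of the yearly budget step, not the actual model code.

```python
# Sketch of the yearly CO2 budget update described in the text.
# Constants are the post's calibrated values; the emissions_ppm and
# mei inputs would come from the datasets described above.

REMOVAL_RATE = 0.0202   # fraction of excess CO2 removed per year
BASELINE_PPM = 294.0    # assumed 'equilibrium' CO2 concentration
ENSO_COEFF = 0.45       # ppm change per unit of yearly-averaged MEI

def step_co2(co2_ppm, emissions_ppm, mei=0.0):
    """Advance atmospheric CO2 by one model year."""
    sink = REMOVAL_RATE * (co2_ppm - BASELINE_PPM)  # natural removal
    return co2_ppm + emissions_ppm - sink + ENSO_COEFF * mei

# Example using the post's 2022 numbers: 417 ppm current CO2 and a
# yearly-average MEI of -1.29 (La Nina conditions).
sink_2022 = REMOVAL_RATE * (417.0 - BASELINE_PPM)
enso_2022 = ENSO_COEFF * -1.29
print(round(sink_2022, 2))   # ~2.48 ppm removed by natural sinks
print(round(enso_2022, 2))   # ~-0.58 ppm from La Nina activity
```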
Results
The model fit to Mauna Loa CO2 data is shown in Fig. 1. Note that the largest discrepancies between model and observations are due to major volcanic eruptions, especially Mt. Pinatubo in 1991.
Fig. 1. Model versus observed CO2 concentrations at Mauna Loa, HI.
Contrary to popular perception, these eruptions actually remove CO2 from the atmosphere. This is likely the result of increased photosynthesis: volcanic aerosols scatter sunlight, increasing the diffuse solar radiation from the sky, which penetrates deeper into vegetation canopies.
When we run the model using 2021 EIA estimates of yearly CO2 emissions increases from 2010 through 2050, and then assuming the 2050 increase remains the same to 2100, the resulting atmospheric CO2 scenario is closest to the IPCC RCP4.5 scenario. The model CO2 concentration barely reaches the 2XCO2 level, a doubling of the pre-industrial CO2 level.
Fig. 2. As in Fig. 1, but extended to 2100, with the various IPCC radiative forcing scenarios used in recent IPCC reports.
Note the model is well below the RCP8.5 scenario, which is the one most often used to promote alarmist projections of sea level rise, temperature increase, etc. The weaker the future radiative forcing from increasing CO2, the weaker resulting climate change will be.
Discussion
Climate model projections depend critically upon how much atmospheric CO2 will rise in the future. That, in turn, depends upon (1) future anthropogenic emissions, and (2) how fast nature removes “excess” CO2 from the atmosphere.
A simple budget model of the atmospheric CO2 concentration very accurately matches the Mauna Loa CO2 data during 1959-2021. It uses yearly estimates of global anthropogenic CO2 emissions as a source, and as a sink the observed average rate of removal of CO2 by biological and physical processes, which is proportional to the “excess” of atmospheric CO2 over a baseline of 294 ppm. An empirical factor to account for El Nino and La Nina activity is also included, which mostly affects year-to-year fluctuations in CO2.
The resulting model projection produces atmospheric CO2 concentrations late this century well below the IPCC RCP8.5 scenario, and even below the RCP6.0 scenario. This suggests that the most dire climate change impacts the public hears about will not happen. Note that this likely reduction in future global warming impacts is in addition to the evidence that the climate system is not as sensitive to increasing CO2 as is claimed by the IPCC. In other words, future climate change will likely be much weaker than projected due not only to (1) lower climate sensitivity, but also (2) weaker anthropogenic forcing, and it is the combination of the two that determines the outcome.
There was no way to tell from yesterday’s White House press conference release of the first JWST “sea of galaxies” image whether it was any better or different from Hubble Space Telescope (HST) views of the same region.
In my opinion, this was a missed opportunity to wow the public.
But since then, amateur astronomer Nicholas Eggleston has stepped up with an overlay of the two telescopes’ images, aligned to view the same region. We all know how wonderful the HST views have been, even resolving individual stars in the distant Andromeda galaxy. Well, the James Webb Space Telescope (in addition to being infrared) has now demonstrated that its resolution blows HST’s away. Check out the light arcs due to gravitational lensing (click on his page link so you can use the slider functionality, which probably won’t work on smart phones):
If you cannot see the 2 images with a slider separating them, here is an animated GIF I put together:
If you want higher resolution, right-click the animated GIF and download it (“Save image as…”) to your desktop. Then you can view it at full resolution.
The Version 6.0 global average lower tropospheric temperature (LT) anomaly for June, 2022 was +0.06 deg. C, down (again) from the May, 2022 value of +0.17 deg. C.
Tropical Coolness
The tropical (20N-20S) anomaly for June was -0.36 deg. C, which is the coolest monthly anomaly in over 10 years, the coolest June in 22 years, and the 9th coolest June in the 44-year satellite record.
The linear warming trend since January, 1979 still stands at +0.13 C/decade (+0.11 C/decade over the global-averaged oceans, and +0.18 C/decade over global-averaged land).
Various regional LT departures from the 30-year (1991-2020) average for the last 18 months are:
The full UAH Global Temperature Report, along with the LT global gridpoint anomaly image for June, 2022 should be available within the next several days here.
The global and regional monthly anomalies for the various atmospheric layers we monitor should be available in the next few days at the following locations:
…But the per-mile price of an EV’s energy use is cheaper for the time being ($2 per gallon of gas equivalent)
Photo credit: Insideevs.com.
Most of the electricity generated in the U.S. continues to come from fossil fuels (61% in 2021). This is not likely to change much in the future, since electricity demand is increasing faster than renewables (20% of the total in 2020, 20.1% in 2021) can close the gap with fossil fuels. Given that fact, it is interesting to ask the question:
Which uses fossil fuels more efficiently, an EV or ICE (internal combustion engine) vehicle?
Most of what you will read about EVs versus ICE vehicles discusses how EVs are more efficient at converting the energy they carry into motion (e.g., here, here, here, and here), but this is only part of the equation. The generation, transmission, and battery storage of electricity is very inefficient compared to the refining and transport of gasoline, and each year those inefficiencies add up to more energy than all of the gasoline consumed in the U.S.
EV Energy Usage per Mile
The average energy consumption of an EV is about 0.35 kWh per mile. At the U.S. average electricity price of $0.145 per kWh in June 2022, and assuming the 2021 average new car fuel economy of 39 mpg, this makes the ICE-equivalent fuel price of an EV $1.98 per gallon of gasoline. With the U.S. average price of gas now over $5.00 a gallon, this by itself (ignoring the many other considerations, discussed below) makes the EV attractive for month-to-month savings on fuel purchases.
But since most of this electricity still comes from fossil fuels, we must factor in the efficiency with which electricity is generated and transmitted and stored in the EV’s battery. This is how we can answer the question, Which uses fossil fuels more efficiently, an EV or ICE (internal combustion engine) vehicle?
How does the internal combustion engine stack up against the EV in terms of efficiency of fossil fueled energy use?
A gallon of gas contains 33.7 kWh of energy. But like the generation of electricity, it takes energy to produce that gallon of gas from petroleum. However, the refining process is very energy efficient (about 90%), so it takes (33.7/0.9 =) 37.44 kWh of energy to obtain the 33.7 kWh of energy in a gallon of gas. At the 39 mpg gas mileage of 2021 cars, this gives an energy economy number of 0.96 kWh per mile driven, which is just below the approximately 1.0 kWh of fossil-fuel energy per mile an EV uses once generation, transmission, and battery-charging losses are included. With advertised fuel economy of 48 to 60 mpg, hybrid vehicles (which are gasoline powered) would thus have an advantage over EVs.
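The arithmetic in the last two sections can be collected into a short script. The input figures are the ones quoted in this post; note that the roughly 1.0 kWh-per-mile fossil-energy figure for an EV is the post’s number, and the ~35% overall generation/transmission/charging efficiency it implies is an inference, not a stated value.

```python
# Sketch of the per-mile comparisons worked through above, using the
# post's figures (June 2022 electricity price, 2021 new-car mpg).

EV_KWH_PER_MILE = 0.35    # EV consumption at the plug, kWh/mile
ELEC_PRICE = 0.145        # U.S. average electricity price, $/kWh
ICE_MPG = 39.0            # 2021 average new-car fuel economy
GAS_KWH_PER_GAL = 33.7    # energy content of a gallon of gasoline
REFINING_EFF = 0.90       # approximate gasoline refining efficiency

# EV fuel cost per mile, expressed as an equivalent gasoline price
ev_cost_per_mile = EV_KWH_PER_MILE * ELEC_PRICE
gas_equiv_price = ev_cost_per_mile * ICE_MPG
print(round(gas_equiv_price, 2))   # ~$1.98 per gallon-equivalent

# Fossil energy per mile for an ICE car, including refining losses
ice_kwh_per_mile = (GAS_KWH_PER_GAL / REFINING_EFF) / ICE_MPG
print(round(ice_kwh_per_mile, 2))  # ~0.96 kWh per mile driven
```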
Other Considerations
Of course, the main reason EVs are being pushed on the American people (through subsidies and stringent CAFE standards) is the reduction in CO2 emissions that will occur, assuming more of our electricity comes from non-fossil fuel sources in the future. I personally have no interest in owning one because I want the flexibility of travelling long distances in a single day.
There is also the issue of the large amount of additional natural resources, and associated pollution, required to make millions of EV batteries.
Furthermore, the electrical grid will need to be expanded to provide the increase in electricity needed. This greater electricity demand, along with the high cost of wind and solar energy, might well make the fuel cost advantage of the EV disappear in the coming years.
Finally, a portion of the true price of a new EV is hidden through subsidies (which the taxpayer pays for) and high CAFE fuel economy regulations, which require auto manufacturers not meeting the standard to purchase credits from companies like Tesla, a cost which is passed on to the consumer through higher prices on ICE cars and (especially) trucks.
As early as today at noon EDT an Astra rocket will launch the first two of six small CubeSats into tropical orbits at 550 km altitude from Cape Canaveral (video coverage here). Those satellites, named the TROPICS mission, will carry microwave radiometers operating at relatively high frequencies, from 90 to 205 GHz. They will measure precipitation-size ice particles in the upper reaches of tropical storms (at 90 and 205 GHz), and provide temperature (118 GHz channels) and humidity profiles (183 GHz channels) in the environment surrounding, and inside the warm cores of those storms.
History
The use of microwave radiometers in space to observe the Earth was first proposed by German meteorologist Konrad Buettner in 1963. By the late 1960s aircraft missions were being flown by NASA and the U.S. Weather Bureau to demonstrate the technology.
In the early 1970s NASA/Goddard was the leader in the construction and flight of the first spaceborne microwave radiometers to demonstrate the measurement of precipitation and sea ice with the single-frequency ESMR instruments operating at 19.35 and 37 GHz.
Later, JPL developed the SMMR instrument that provided measurements from 6.6 to 37 GHz starting in late 1978 on the Nimbus-7 satellite. That instrument allowed me as a post-doc at UW-Madison to demonstrate the ability to measure precipitation over land by isolating the ice scattering signature at 37 GHz, which gave me my first peer-reviewed paper (a cover article in Nature) and led to several more papers describing severe thunderstorm detection and rainfall measurement. It is this ice scattering signature that will be exploited as part of the TROPICS mission.
The field of researchers in satellite passive microwave remote sensing up to the Nimbus-7 SMMR period was pretty small. It was the 1987 launch of the first SSM/I instrument on a DoD DMSP satellite that got many more researchers involved in using satellite microwave measurements. The unusual inclusion of higher-frequency 85.5 GHz channels on SSM/I exploited the ice signature that Dick Savage (UW-Madison) also documented along with Jim Weinman.
Meanwhile, the JPL-built MSU instruments had been providing atmospheric temperature profile data since late 1978 in the 60 GHz oxygen absorption complex, mostly for input to weather forecast models. Later, NOAA and NASA/GSFC developed the higher-resolution AMSU instruments, first launched in 1998, which still provide global atmospheric temperature information.
It was around 1990 that John Christy and I demonstrated that the MSU/AMSU series of temperature-profiling instruments could provide a relatively stable long-term record of global atmospheric temperatures for monitoring of climate change. While I had helped with the early planning of NASA’s TRMM satellite to monitor tropical rainfall, I chose to change my career focus from precipitation monitoring to temperature monitoring, and declined to become part of the official TRMM Team. I would still become the U.S. Science Team leader for the AMSR-E instrument (built by Japan), which like SSM/I measured a wide variety of parameters, but my time would be increasingly devoted to the global temperature monitoring effort.
The radical TROPICS approach to tropical cyclone monitoring
One of the main advantages of passive microwave measurements is the ability to see through many clouds (especially cirrus). Unfortunately, satellite passive microwave radiometry has always struggled with poor spatial resolution. The beamwidth of a diffraction-limited microwave antenna increases with the wavelength of radiation being measured, so large antennas are required to obtain high resolution on the Earth’s surface. Or, shorter-wavelength (higher frequency) channels need to be utilized. The trouble with high frequencies, though, is that clouds become more opaque as the frequency increases. If you increase the frequency too high you move into the infrared, and we already have geostationary satellites providing those data on a continuous basis.
The TROPICS approach employs both a lower altitude (550 km, compared to 700 to 850 km for other satellites) and higher frequencies (90 to 205 GHz) to improve spatial resolution. Microwave circuit electronics have improved in sensitivity and noise performance and have been reduced in size to the point that the very small and lightweight CubeSat architecture can be used. The TROPICS satellites use 3 CubeSat modules, making the satellite bus only (approximately) 4 x 4 x 12 inches in size. The mission’s Principal Investigator, William Blackwell at MIT’s Lincoln Lab, has been spearheading this new, smaller microwave radiometer concept. The temperature sounding utilizes the 118 GHz oxygen absorption complex, rather than 60 GHz (as with the AMSUs), which improves spatial resolution by a factor of two. Such small satellites can be launched in batches with smaller rockets, which reduces cost. It also aligns with NASA’s overriding interest in satellite technology advancement. With six of these satellites in a low-inclination (30 deg) orbit, quasi-hourly coverage (on average) of tropical cyclones is anticipated.
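As a rough illustration of why lower altitude and higher frequency both shrink the footprint, here is a back-of-the-envelope diffraction-limited resolution estimate. The 0.1 m antenna aperture is a hypothetical placeholder, not a TROPICS specification; only the scaling with frequency and altitude matters here.

```python
# Back-of-envelope diffraction-limited footprint estimate. Beamwidth
# scales with wavelength / aperture (Rayleigh criterion), and the
# ground footprint scales with altitude, so higher frequencies and
# lower orbits both improve resolution. The 0.1 m aperture below is
# a hypothetical placeholder, not an actual TROPICS antenna size.

C = 3.0e8  # speed of light, m/s

def footprint_km(freq_ghz, altitude_km, aperture_m):
    """Approximate nadir ground footprint of a diffraction-limited antenna."""
    wavelength = C / (freq_ghz * 1e9)             # meters
    beamwidth = 1.22 * wavelength / aperture_m    # radians
    return beamwidth * altitude_km                # km on the ground

# Doubling the sounding frequency roughly halves the footprint for
# the same antenna, consistent with the 118 GHz vs 60 GHz comparison:
r60 = footprint_km(60.0, 550.0, 0.1)
r118 = footprint_km(118.0, 550.0, 0.1)
print(round(r60 / r118, 2))   # ~1.97, i.e. about a factor of two
```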
What will forecasters do with the data?
Many years ago I visited both the National Hurricane Center (Florida) and the Joint Typhoon Warning Center (Hawaii) to promote the use of passive microwave measurements of tropical storms. Since then, several researchers (e.g. Chris Velden and Mark DeMaria) have worked tirelessly with these centers to establish procedures for passive microwave monitoring of tropical cyclones.
I suspect the measurements will mostly be used to better isolate where a tropical depression or storm might be forming, since the microwave measurements penetrate most cirrus cloud cover and can better reveal the low-level swirl of clouds marking the circulation center. Regarding the monitoring of a tropical cyclone’s warm core (what causes the intense low pressure at the surface), it isn’t until the storm approaches hurricane intensity (65 knot maximum sustained surface winds) that the warm core can be reliably measured by TROPICS satellites. Also, the 27 km (best case) spatial resolution of the 118 GHz temperature sounding channels is still a little too coarse to avoid precipitation contamination (a cold signature) of the warm core signature that typifies tropical cyclones. Nevertheless, I’m sure researchers will find clever ways to isolate the warm core signature, just as we did in Spencer et al. (2001) using AMSU satellite data at 50 km resolution.
There will no doubt be some new capabilities that emerge as data are gathered from these satellites. For example, a large hurricane with frequent coverage by TROPICS satellites might reveal rapid deepening of the storm through a stronger warm core signature. In the absence of Hurricane Hunter flights into the storms, storm intensity is still largely based upon weather satellite visible and infrared cloud features, which is a rather indirect (but surprisingly accurate) technique that has a long history. TROPICS offers the chance to have a more physics-based measurement of hurricane intensity than through cloud appearance alone. Also, since data will continuously be collected over the entire tropical latitude belt, a wide variety of other applications will arise.
Let’s hope the Astra launch (maybe today) will be successful. So far, there have only been two successful Astra launches out of eight attempts (one rather dramatic failure is shown below). Fingers crossed.
The Version 6.0 global average lower tropospheric temperature (LT) anomaly for May, 2022 was +0.17 deg. C, down from the April, 2022 value of +0.26 deg. C.
The linear warming trend since January, 1979 still stands at +0.13 C/decade (+0.12 C/decade over the global-averaged oceans, and +0.18 C/decade over global-averaged land).
Various regional LT departures from the 30-year (1991-2020) average for the last 17 months are:
The full UAH Global Temperature Report, along with the LT global gridpoint anomaly image for May, 2022 should be available within the next several days here.
The global and regional monthly anomalies for the various atmospheric layers we monitor should be available in the next few days at the following locations:
The Version 6.0 global average lower tropospheric temperature (LT) anomaly for April, 2022 was +0.26 deg. C, up from the March, 2022 value of +0.15 deg. C.
The linear warming trend since January, 1979 still stands at +0.13 C/decade (+0.12 C/decade over the global-averaged oceans, and +0.18 C/decade over global-averaged land).
Various regional LT departures from the 30-year (1991-2020) average for the last 16 months are:
The full UAH Global Temperature Report, along with the LT global gridpoint anomaly image for April, 2022 should be available within the next several days here.
The global and regional monthly anomalies for the various atmospheric layers we monitor should be available in the next few days at the following locations: