Systemic Error in Global Temperature Measurement


Moritz Büsing

Some time ago I stumbled upon a curiosity in temperature measurement publications of the last 30 years:

When you convert the published temperature anomaly curves into absolute temperature curves, the past keeps getting colder.

The decade 1880 to 1890 is 0.3°C (0.5°F) colder in recent publications than it was in publications from 15 years ago, and 0.5°C (0.9°F) colder than in publications from 30 years ago. The weather station database for land surface temperatures of this period has not changed much in the last 30 years, but the analysis methods have.

Therefore, I went down the rabbit hole and tried to understand how the data from thousands of weather stations at many different locations, with changing technologies over time, is analyzed. There I found a systematic error in one of the most important analysis steps: homogenization.

Homogenization consists of removing stepwise breaks and trends in the data series that result from non-climate-related sources. For example, relocating a weather station from the top of a mountain to the valley can cause a permanent offset in temperature measurements. Using a new type of thermometer, or a new type of thermometer housing, can also permanently change the measured temperatures. These changes lead to stepwise breaks in the data series. Other changes, such as urbanization, lead to non-climate-related trend changes in the data series that are also permanent. These permanent errors are corrected by increasing or decreasing all the past data at a stepwise break such that the temperature curve becomes continuous (this process is not trivial, and I will not elaborate on it here).
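The adjustment step described above can be illustrated with a minimal sketch. This is a toy version under my own assumptions, not the actual NCEI or GISS algorithm: the `adjust_break` helper, the comparison-window size, and the synthetic station data are all illustrative.

```python
import numpy as np

def adjust_break(series, break_idx, window=10):
    """Shift all data before a known breakpoint so the series becomes
    continuous across it. The offset is estimated as the difference of
    the mean values in a window on either side of the break."""
    before = series[max(0, break_idx - window):break_idx]
    after = series[break_idx:break_idx + window]
    offset = after.mean() - before.mean()
    adjusted = series.copy()
    adjusted[:break_idx] += offset  # raise or lower the entire past
    return adjusted

# Hypothetical station series with a -0.5 °C step at index 50
# (e.g. the station was relocated; the old site read 0.5 °C warmer).
rng = np.random.default_rng(0)
temps = 15.0 + 0.1 * rng.standard_normal(100)
temps[:50] += 0.5
homogenized = adjust_break(temps, 50)
```

Note that the entire past segment is shifted, which is exactly why any small, unrecognized drift absorbed into the estimated offset propagates backward through the whole record.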

Here I discovered the error:

Not all non-climate related changes are permanent.

In particular, the ageing effects of the paint or plastic of a weather station housing are removed when the housing is repainted or replaced. But after the ageing effects have been removed, the new paint or plastic starts to age again. A study by a team at the Istituto Nazionale di Ricerca Metrologica in Turin, Italy, "Comparative analysis of the influence of solar radiation screen ageing on temperature measurements by means of weather stations", confirms that this ageing effect is real.

This alone would not be a big issue. The ageing effect only reaches a 0.1-0.2°C (0.18-0.36°F) difference, which would be negligible, and indeed undetectable by the homogenization algorithms: they can neither detect such a small warming trend from ageing nor the tiny downward stepwise break from renewal. However, when other sources of larger stepwise breaks (a change in location, new instrumentation) coincide with repainting, replacing, or at least cleaning of the housing, a systematic error occurs in which these small steps are added up each time.

While the ageing effect is too small to detect in individual weather stations due to the noisy data, it is large enough to detect in the changes of temperature trends in a statistical analysis of thousands of weather stations. So I compared the homogenized data sets from the National Centers for Environmental Information (NCEI) with the non-homogenized data sets. There I was indeed able to identify and quantify the ageing effects.
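One simple way to compare the two data sets in this spirit is to average, over many stations, the difference between the linear trend of the homogenized series and that of the raw series. This is only a sketch under my own assumptions (the `mean_trend_difference` helper and the synthetic station data are illustrative, not the author's actual analysis):

```python
import numpy as np

def mean_trend_difference(raw, homogenized, years):
    """Average, over many stations, the difference between the linear
    trend (°C per year) of the homogenized series and the raw series."""
    diffs = []
    for r, h in zip(raw, homogenized):
        raw_slope = np.polyfit(years, r, 1)[0]
        hom_slope = np.polyfit(years, h, 1)[0]
        diffs.append(hom_slope - raw_slope)
    return float(np.mean(diffs))

# Synthetic example: 1000 stations, 140 years; homogenization is assumed
# here to add a ~0.5 °C/century trend relative to the raw data.
rng = np.random.default_rng(2)
years = np.arange(140)
raw = 14.0 + 0.3 * rng.standard_normal((1000, 140))
homog = raw + 0.005 * years
est = mean_trend_difference(raw, homog, years)  # ≈ 0.005 °C/year
```

Averaging over a thousand stations is what makes a per-station drift of hundredths of a degree per decade, invisible in any single noisy record, statistically visible.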

On average, a stepwise break is corrected once every 19 years of weather station data. Therefore, there are on average roughly 7 "corrections" over the last 140 years of weather station data. Even a small ageing effect of 0.1°C would then lead to roughly 0.7°C of erroneously recorded global warming!
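The back-of-the-envelope arithmetic above can be written out explicitly (using the article's own numbers; the variable names are mine):

```python
# Assumptions taken from the article: one homogenization correction every
# 19 years, 140 years of records, and an ageing-related drift of ~0.1 °C
# (the low end of the stated 0.1-0.2 °C range) absorbed at each break.
years_of_record = 140
years_per_break = 19
aging_drift_per_cycle = 0.1  # °C

corrections = years_of_record / years_per_break        # ≈ 7.4
spurious_warming = corrections * aging_drift_per_cycle  # ≈ 0.74 °C
print(f"~{corrections:.0f} corrections -> ~{spurious_warming:.1f} °C spurious trend")
```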

This is only a rough estimate, so I looked at the global land surface temperature calculation GISTEMP from the Goddard Institute for Space Studies (GISS). They did a really good job of making their methods transparent and offer all their tools for download online, so everybody can reproduce their results. I corrected the ageing effect in the homogenized data set and ran this corrected data set through the tool from the GISTEMP team. The result is a reduction of the temperature change between the decades 1880-1890 and 2010-2020 from 1.43°C to 0.83°C (95% CI: [0.46°C, 1.19°C]).

This result also fits the satellite data provided by the University of Alabama in Huntsville (UAH) better.

I collected all the sources and wrote a paper about my findings: "Correction of Systematic Error in Global Temperature Analysis Related to Aging Effects". I tried to publish this paper in four different peer-reviewed journals, but it was always rejected with canned answers ("…our readers would not be interested…") before it was even reviewed by a peer.

My methods were very careful, and I made several conservative assumptions. In the paper I also quantify a less conservative analysis, which leads to only 0.41°C of global warming within 140 years.

One more interesting finding is that the corrected temperature curve is a worse fit to the CO2 concentrations. The R² values (the coefficient of determination, a statistic indicating how well one data set predicts another) between the resulting temperature curves and the base-2 logarithm of the CO2 concentration (i.e., temperature change per doubling of CO2) are the following:

– GISTEMP:  up to 92%

– Corrected conservative mean: up to 73%

– Estimate of the corrected mean without conservatisms: up to 36%
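How such an R² value against log2(CO2) is computed can be sketched as follows. The series here are synthetic stand-ins under my own assumptions (rough CO2 range, an assumed sensitivity, invented noise level); the real analysis would use the actual GISTEMP and corrected temperature curves.

```python
import numpy as np

def r_squared(x, y):
    """Coefficient of determination for a simple linear fit of y on x."""
    slope, intercept = np.polyfit(x, y, 1)
    residuals = y - (slope * x + intercept)
    ss_res = np.sum(residuals**2)
    ss_tot = np.sum((y - y.mean())**2)
    return 1 - ss_res / ss_tot

# Illustrative only: synthetic CO2 and temperature series, not the real data.
co2 = np.linspace(290, 415, 140)   # ppm, rough 1880-2020 range (assumption)
forcing = np.log2(co2 / co2[0])    # doublings of CO2 relative to the start
rng = np.random.default_rng(1)
temp = 2.0 * forcing + 0.1 * rng.standard_normal(140)  # assumed 2 °C/doubling + noise
r2 = r_squared(forcing, temp)
print(f"R² = {r2:.2f}")
```

The fitted slope in such a regression is an empirical temperature change per doubling of CO2, which is why the base-2 logarithm is the natural predictor here.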

This means that a smaller fraction of global warming is caused by CO2. For the conservative case, up to 73% of 0.83°C of global warming, i.e. at most 0.61°C, is caused by CO2. For the less conservative case, only up to 36% of 0.41°C, i.e. at most 0.15°C, is caused by CO2.
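The attribution arithmetic in the paragraph above is simply the product of each corrected warming figure and the corresponding R² upper bound (treating R² as an upper bound on the attributable fraction, as the article does):

```python
# (warming in °C, R² upper bound) for each case, from the article's numbers
cases = {
    "conservative": (0.83, 0.73),
    "less conservative": (0.41, 0.36),
}
for name, (warming, r2) in cases.items():
    print(f"{name}: at most {warming * r2:.2f} °C attributable to CO2")
```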

These temperature data curves are the basic input data for many other studies and the calibration targets of many climate models. If my findings are confirmed, this will revolutionize climate science.

via Watts Up With That?

August 30, 2022