Contributed by Robert Lyman © 2023. Robert Lyman’s bio can be read here.
ClimateData.ca is a climate data portal supported financially by Environment and Climate Change Canada (ECCC) and allegedly produced by Canada’s “leading climate organizations”. The stated goal of the portal is to “support decision makers across a broad spectrum of sectors and locations by providing the most up-to-date climate data in easy to use formats and visualizations.”
The unsuspecting reader might think that the portal focuses its reporting on the past and current changes in Canada’s temperatures, weather and climate. That would be factual reporting. Instead, if you open the portal for the city in which you live, as I did for Ottawa, you mostly find a set of alarming projections about what the area’s temperatures and precipitation will be for the 2021-2050, 2051-2080 and 2071-2100 time periods. In other words, it is a set of forecasts based upon computer modelling, using scenarios developed for the Intergovernmental Panel on Climate Change (IPCC).
The purpose of this note is to explain why those projections may seriously mislead any person really interested in understanding what the future may hold. No one, of course, knows that. To project it first requires a fair-minded and accurate establishment of the “base case” – in other words, an understanding of what has happened to climate in the past and what the continuation of past trends would indicate. Then, one has to imagine and quantify what will happen in future.
All attempts to predict the future are laden with uncertainties and often tainted by the political agendas of the forecasters. The best that anyone can do is to assemble what the best current studies show are the relationships between natural and human influences on the climate; the comparative influences of different economic, scientific and population trends; and the range of possibilities about how these things may change in future. Part of this guessing process is to experiment with different possibilities, packaged together as “scenarios”.
Understanding the Past
Estimates of temperatures before about a century ago are based on “proxy” data, as there were no direct measurements. The only direct and reliable measures of global average temperatures are those provided since satellites began collecting data in 1979, so any claims to precision before that time are false. There is, however, a general consensus that, to the extent it is possible to measure “average global temperature”, it has risen about 1.1 degrees Celsius over the period 1870 to 2020. Temperatures were as warm as, or warmer than, today in the Minoan Period 3,000 years ago, the Roman Period 2,000 years ago and the Medieval Warm Period 1,000 years ago.
It is difficult to determine what has driven past changes in temperatures, especially in the period since the mid-nineteenth century. Climate and temperatures respond to many influences, notably solar activity, the oceans and other natural cycles, plus the effects of human activities. The earth is still warming as part of the recovery from the Little Ice Age, which reached its depth in the seventeenth century. The measurement of changes in temperatures is greatly complicated by the urban heat island effect, which contaminates the instrument temperature records, and by the decline in operating stations (their number dropped from 6,000 globally in the 1970s to 2,600 by 1997). A study by McKitrick and Michaels in 2007 indicated that over half of the warming over land since 1980 in instrument data sets is due to the urban heat island effect.
An understanding of the past is important for assessing the relationship, if any, between increases in the concentration of greenhouse gases in the atmosphere (measured in terms of carbon dioxide equivalent) and changes in global temperatures. These are inputs into economic computer programs called integrated assessment models (IAMs), of which the most commonly used are PAGE, DICE and FUND. An IAM links projections (i.e. educated guesses) of future population, technology and economic activity with emissions scenarios. Most of the public reporting that people see is based on the DICE and PAGE models. The FUND model, however, is arguably superior because the other two do not take into account the significant benefits to the world of carbon dioxide fertilization of crops and forests.
A key parameter in this modelling is the Equilibrium Climate Sensitivity (ECS): the change in global average surface temperature that would ultimately result from a doubling of carbon dioxide concentrations. “Equilibrium” refers to the point at which the oceans reach a new temperature balance, a process that takes about 1,500 years.
The IPCC, in its AR5 report, put the ECS at 3.0, meaning that doubling CO2 concentrations by about the year 2100 would increase average global temperatures by 3.0 degrees C, with an uncertainty range of 1.5 to 4.5 degrees C. Actual temperature changes could be higher or lower than that due to the effects of natural cycles.
Climatologists Nicholas Lewis and Judith Curry published a paper in 2018 that used an observationally based energy-balance method to estimate the ECS; they arrived at a value of 1.50. More recent analyses, taking into account natural warming and the effects of urban heat islands, have lowered the estimated ECS to 1.04. How to determine the most likely ECS remains one of the central controversies in the climate change debate.
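The arithmetic connecting an assumed ECS to a concentration pathway can be sketched with the standard logarithmic forcing approximation (a simplified illustration, not the method of any study cited here; the 5.35 W/m² coefficient is the commonly used Myhre et al. approximation, and the function names are my own):

```python
import math

def co2_forcing(c_ppm, c0_ppm=280.0):
    """Radiative forcing from CO2 relative to a baseline concentration,
    using the common logarithmic approximation (W/m^2)."""
    return 5.35 * math.log(c_ppm / c0_ppm)

def equilibrium_warming(c_ppm, ecs, c0_ppm=280.0):
    """Equilibrium warming (deg C) for a given concentration, scaling the
    ECS (warming per doubling) by the forcing relative to one doubling."""
    f_doubling = co2_forcing(2 * c0_ppm, c0_ppm)  # ~3.7 W/m^2 per doubling
    return ecs * co2_forcing(c_ppm, c0_ppm) / f_doubling

# A doubling (560 ppm) returns the ECS itself, by construction:
print(equilibrium_warming(560, ecs=3.0))           # 3.0 (AR5 central value)
print(equilibrium_warming(560, ecs=1.5))           # 1.5 (Lewis and Curry)
# RCP8.5's roughly 900 ppm under the lower observational estimate:
print(round(equilibrium_warming(900, ecs=1.5), 2))
```

On these simplified assumptions, even RCP8.5’s 900 ppm implies equilibrium warming of about 2.5 degrees C when the lower, observationally based ECS estimates are used, which illustrates why the choice of ECS dominates the projections.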
During the preparation of its Fifth Assessment Report, published in 2014, the Intergovernmental Panel on Climate Change (IPCC) decided to use four “Representative Concentration Pathways” (RCPs), meaning scenarios developed by modelers regarding carbon dioxide-equivalent concentrations in the earth’s atmosphere. The pathways were labelled RCP2.6, RCP4.5, RCP6.0 and RCP8.5, the numbers referring to the radiative forcing (in watts per square metre) that each pathway produces by 2100, not to estimates of the ECS. They were initially developed with the understanding that the scenarios, and the models that used them, were simply scientific tools aimed at exploring a variety of conditions as a way to test hypotheses and researchers’ understanding of the climate system.
The IPCC, however, associated the RCP scenarios with likelihoods of happening when it labeled the scenario leading to the largest amount of climate change, RCP8.5, as the single “business-as-usual” scenario of the set. As Roger Pielke has observed, “In so doing, the IPCC identified RCP8.5 as the most likely future in the absence of further policy intervention, which gave it special status among not only the RCPs but among the hundreds of baseline scenarios of the broader IPCC scenario database.”
The RCP8.5 scenario was very different from the others. It was perceived as depicting the situation that might prevail “without climate policy”; in other words, if the countries of the world took no measures to reduce GHG emissions beyond those in place in 2014. Under RCP8.5, annual carbon dioxide emissions more than triple by century’s end, the concentration of carbon dioxide in the atmosphere soars to more than 900 parts per million, and the radiative forcing (a measure used to quantify and compare the external drivers of change to Earth’s energy balance) is more than triple what it is today.
Some of the most prestigious experts, such as Dr. Judith Curry, have characterized RCP8.5 as clearly “implausible”, implying that it has less than a 2% chance of occurring. It was undoubtedly intended by the modelers to be a “worst case”.
Without the attention and credibility given to RCP8.5, it is doubtful that a persuasive case could be made for high carbon dioxide taxes or “net-zero by 2050” policies.
RCP8.5 is a perverse corruption of climate science and an extraordinarily damaging propaganda tool used by climate campaigners. It does not even remotely provide a credible basis for projecting the future; its probability ranges between the implausible and the impossible. However, by the magic of misrepresentation and skillful public relations, the worst case became in the public’s mind the “most likely” case.
IPCC Modelling Performance
The climate models used by the IPCC have failed even to reproduce the temperature changes that have occurred since 1990; they are “running too hot”. The average modelled trend in global lower-troposphere temperature since 1979 is roughly double the measured trend. (The troposphere is the lowest layer of the atmosphere, the part in which we live.)
Nowhere in any of the IPCC science reports will you find a claim that the world is facing a climate catastrophe or that humanity’s future is in peril.
What ClimateData.ca Does
ClimateData.ca explicitly relies upon the IPCC’s modelling and scenarios. Indeed, it specifically cites RCP8.5 as the basis for its projections. In other words, it perpetuates the myth that the worst case is the most likely case.
In the case of Ottawa, it notes that for the 1971-2000 period, the annual average temperature was 6.0 degrees C and the average annual precipitation was 933 mm.
It then warns that under a high emissions scenario (i.e. RCP8.5):
- Annual average temperatures will be 8.7 degrees C in the 2021-2050 period (i.e. 45% higher)
- Annual average temperatures will be 10.8 degrees C in the 2051-2080 period (i.e. 80% higher)
- Annual average temperatures will be 12.6 degrees C for the last 30 years of the century (i.e. 110% higher)
- Average annual precipitation will be 13% higher for the 2051-2080 period
- Average annual precipitation will be 17% higher for the last 30 years of the century.
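For reference, the percentage figures for temperature can be recomputed directly from the portal’s baseline and projected values (a quick arithmetic check; note that a percentage of a Celsius temperature depends on the scale’s arbitrary zero point, which is one reason such comparisons can mislead):

```python
# Ottawa figures as reported by ClimateData.ca: 1971-2000 baseline,
# projections under the RCP8.5 scenario (degrees C, annual average).
baseline = 6.0
projections = {"2021-2050": 8.7, "2051-2080": 10.8, "2071-2100": 12.6}

for period, temp in projections.items():
    pct = 100.0 * (temp - baseline) / baseline
    print(f"{period}: {temp} C ({pct:.0f}% above the 6.0 C baseline)")
```

Running this yields increases of 45%, 80% and 110% over the 6.0 degree C baseline for the three periods.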
This is not data reporting. It is propaganda intended to serve a political purpose.