THIS POST IS A TRANSCRIPT OF A YOUTUBE LECTURE BY STEVE KOONIN ON WHAT HE CALLS THE CONSENSUS SCIENCE OF CLIMATE CHANGE.

STEVE KOONIN is a well-known skeptic of what he calls the "consensus science of climate change". He is a theoretical physicist and Director of the Center for Urban Science and Progress at New York University. He is also a professor in the Department of Civil and Urban Engineering at NYU's Tandon School of Engineering.

TRANSCRIPT OF THE LECTURE BY KOONIN

I have been involved in the discussion about climate and energy for 30 years. It is unusual for science to have a consensus, but there is a consensus in climate science. That consensus is found in the series of periodic reports put out by the UN IPCC. There were also a "science report" and an "impacts report" published by the US government in 2017 as part of the Fourth US National Climate Assessment. The objective of these reports is to summarize the state of climate science and the impacts of a "changing climate" under a set of human and natural influences. A summary was offered by one of the primary authors, Katharine Hayhoe, who tweeted: "It's real, it's us, it's serious, and the window of time to prevent dangerous impacts is closing fast" (detection, attribution, projection, and response). However, there is a disconnect between what's in the scientific literature and the reports on one hand, and what we hear in political rhetoric and media coverage of climate change on the other.

First of all, let's talk about the consensus on the science, covering the topics of detection, attribution (and climate models), and projection, along with some of what I think are the unresolved issues in climate science. We will then go on to observed and projected impacts, drawn almost exclusively from the reports themselves, and then to the possible responses to the changing climate, again with a set of buts. Let me insert a caveat. This is a complex and nuanced subject; I could spend a whole term talking about it. So what I can present in an hour will sacrifice some completeness and keep things pretty short. But I will try to give you a sense of what I think is really going on and what the reports actually say.

DETECTION: So let's start with detection. The reports say that we have seen an unusual rise in global mean surface temperature (GMST) in recent years. Shown on the left in the image below is a figure from the US government report showing GMST rising steeply, by about 1C, since the early part of the 20th century. We can see in the chart that the warming is not monotonic; these variations derive from natural forcings that tend to be more random. For example, GMST actually dropped from 1950 to 1970. So there are other things involved, not just human influence.

The figure on the right side shows that the warming is fairly uniform over the globe, with some regional differences; for example, the polar regions are warming faster than the tropics, as we would expect. Non-temperature indications of the warming are found in rising sea levels, melting glaciers, and shrinking polar ice sheets.

We can also see that it is not just human influence in the chart below, where the pre-human-influence chart on the right looks a lot like the chart on the left, taken from a period of human influence. In fact, if you calculate the slope of the 30-year trailing trends, you find that the 30-year rate of warming was just about as high in 1940 as it is in recent times. That is not to say that there is no human influence on the warming, but that there are also natural influences.
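The trailing-trend calculation described above can be sketched as follows. This is a minimal illustration on a synthetic anomaly series, not the actual data or anything from the lecture; the function and variable names are mine.

```python
import numpy as np

def trailing_trend(years, temps, window=30):
    """Least-squares warming rate (deg C per decade) over the trailing
    `window` years ending at each year. Returns (end_years, trends)."""
    end_years, trends = [], []
    for i in range(window, len(years) + 1):
        y = years[i - window:i]
        t = temps[i - window:i]
        slope = np.polyfit(y, t, 1)[0]      # deg C per year
        end_years.append(years[i - 1])
        trends.append(slope * 10.0)         # convert to deg C per decade
    return np.array(end_years), np.array(trends)

# Illustration with a made-up anomaly series (trend plus a slow wiggle):
years = np.arange(1900, 2021)
temps = 0.008 * (years - 1900) + 0.1 * np.sin((years - 1900) / 11.0)
ends, rates = trailing_trend(years, temps)
```

On a series like this, the trailing trend rises and falls as the wiggle reinforces or opposes the underlying warming, which is the kind of behavior the 1940-versus-recent comparison turns on.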

A chart of temperatures over geological time is presented in the image below. Note that in the prior interglacial, the Eemian, about 130,000 years ago, the earth was much warmer than it is today without any human influence. What this chart shows is that there are powerful natural forces that change the earth's temperature and climate in significant ways, and these must be taken into account in attributing climate change to humans.

The US climate report says that humans are responsible for at least half the warming since 1951, and then goes on to say that humans are responsible for all the warming we have seen since 1951. The basic argument for pinning it all on humans over the last 40 years is that there is no other cause we can think of that could be responsible for the warming that we see.

So let's look at these human influences: how big they are and how they change with time. The chart below shows the record of atmospheric CO2 concentration over the last 2,000 years, and we see a spike in the last century or century and a half that is clearly human. We are burning fossil fuels, and that is causing atmospheric CO2 to go up from 280 ppm to the current value above 400 ppm, rising at a rate of 2 to 2.5 ppm per year.
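The figures quoted above admit some simple back-of-envelope arithmetic. The sketch below uses 410 ppm as an assumed stand-in for "above 400 ppm" and treats the growth rate as constant, which is a rough linear extrapolation only, not a projection from the lecture.

```python
# Back-of-envelope CO2 arithmetic using the figures quoted above:
# 280 ppm pre-industrial, a bit above 400 ppm today, rising at
# 2 to 2.5 ppm per year.
PREINDUSTRIAL_PPM = 280.0
CURRENT_PPM = 410.0                    # assumed "above 400 ppm" value
DOUBLED_PPM = 2 * PREINDUSTRIAL_PPM    # 560 ppm, a common benchmark

increase_so_far = CURRENT_PPM - PREINDUSTRIAL_PPM   # 130 ppm added
for rate in (2.0, 2.5):                # ppm per year, as quoted
    yrs_to_double = (DOUBLED_PPM - CURRENT_PPM) / rate
    print(f"at {rate} ppm/yr: ~{yrs_to_double:.0f} years to 560 ppm")
```

At the quoted rates, a doubling of pre-industrial CO2 lies several decades out under this naive constant-rate assumption.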

So what does that growing concentration of CO2 in the atmosphere do to the climate system? To answer that question we need to look at how energy flows in the climate system. As we can see in the diagram below, it is a complicated system with lots of arrows indicating flows of energy. The big ones are these: solar energy comes in; about a third of it gets reflected; the rest is absorbed by the surface; the surface then re-radiates that energy as infrared; some of that gets trapped in the sky and radiated down again; and in the end, whatever is needed to balance the input leaves the system as infrared radiation. There are lots of pieces, with numbers on the order of a couple hundred watts per square meter. Carbon dioxide influences this system by adding about 2 to 2.5 watts per square meter to the energy balance. So it's a relatively small perturbation (about 1%) on this complicated system that we are trying to understand.
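The "about 1%" claim can be checked in one line. The 240 W/m2 base flux below is my assumed round number for the absorbed-solar / outgoing-infrared balance (the "couple hundred watts per square meter" in the text), not a figure from the lecture.

```python
# Rough scale of the CO2 perturbation relative to the energy flows
# it sits on top of.
BASE_FLUX_W_M2 = 240.0       # assumed round number for the background flux
CO2_FORCING_W_M2 = 2.5       # CO2 forcing quoted above

fraction = CO2_FORCING_W_M2 / BASE_FLUX_W_M2
print(f"perturbation is roughly {100 * fraction:.1f}% of the background flux")
```

With these round numbers the perturbation comes out at roughly one percent, which is the point of the passage: a small forcing on a large, complicated energy budget.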

The graphic below shows the different human influences in terms of both warming and cooling: CO2 warms, aerosols cool. The biggest human influence is CO2, which accounts for about 2.5 watts/m2, followed by other greenhouse gases such as methane and halocarbons that add up to about 1.5 watts/m2 of forcing; then there are negative influences such as aerosols, which cool by about 1 watt/m2. The net is about 2.5 watts/m2 of warming, with large uncertainties. We need to understand how this 2.5 watts/m2 is changing the climate system in order to attribute observed changes to human influences and to project changes out to the end of the century.

Here is how it goes quantitatively over the last 250 years. The human influence is the orange line, and there are periodic volcanic eruptions that cool the planet with aerosols. The important information in this chart is that prior to 1950 the human influence was about 0.5 watts/m2, and currently it is 2.5 watts/m2. Therefore, the climate before 1950 can be described as essentially natural, but after 1950 the climate has at least a human component, if not human dominance.

The chart below shows these radiative forcings over time, from 1750 to 2011, where we see that the steep rise begins after 1950. Anything before 1950 we can assume is natural, and anything after 1950 has a human component of some kind.

Let me tell you a little bit about climate models now, as presented in the graphic below. To model the climate you have to model both the atmosphere and the ocean. The way that is done is to cut each of them into grid boxes, typically 100 km on a side, with many layers going up into the atmosphere and down into the ocean, say 20 each. Then you use the basic laws of physics to move air, water, and energy (radiation as well) through the system, time step by time step, say 6 hours at a time, over a period of 200 years: about 300,000 six-hour steps. And you do this twice: first under natural variability alone, and then with the external influence being tested, in our case humans burning fossil fuels. This modeling exercise is probably the most challenging problem in computational science. The consensus reports say that the models are useful but imperfect. This tension is an unresolved issue in computational science and a subject of dispute between climate science and its skeptics. But these models are the best we have, and they are improving all the time. The real question in the climate action issue is whether the models are good enough to usefully understand and project the climate, and on that basis to demand a costly overhaul of the world's energy infrastructure.
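The sizes quoted above can be checked with order-of-magnitude arithmetic. The constants below (Earth's surface area, the grid and layer counts) are the round numbers from the description, not values from any particular model.

```python
# Order-of-magnitude bookkeeping for the model setup described above:
# ~100 km grid boxes, ~20 vertical layers each for atmosphere and
# ocean, 6-hour time steps over a 200-year run.
EARTH_SURFACE_KM2 = 510e6          # Earth's surface, ~510 million km^2
CELL_AREA_KM2 = 100 * 100          # one 100 km x 100 km grid box

columns = EARTH_SURFACE_KM2 / CELL_AREA_KM2   # ~51,000 surface columns
cells = columns * (20 + 20)                   # atmosphere + ocean layers
steps = 200 * 365.25 * (24 // 6)              # six-hour steps in 200 years

print(f"{columns:,.0f} columns, {cells:,.0f} cells, {steps:,.0f} steps")
```

The step count lands at about 292,000, consistent with the "about 300,000 6-hour steps" in the text, and the cell count (around two million) gives a feel for why each of those steps is expensive.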

The claim that the models are right "because they are physics" does not take sub-grid-scale phenomena into consideration. Consider the satellite photo of the Gulf Stream shown below. Color coded here is the sea surface temperature (SST), and you can see all of that wonderful structure at scales of less than 100 km. There are other phenomena in the climate system that happen at physical scales much smaller than the 100 km grid box, so you have to make assumptions about them because you can't treat them explicitly in the computer. For example, given the average temperature and humidity in a grid box, what are the clouds like? How many clouds are there, and at what altitude? What fraction of the sunlight do they reflect? You need a lot of assumptions to make sense of a climate model, and so much depends on the user's expertise that it is as much art as it is science. Representing these small-scale phenomena is absolutely crucial, yet it is not entirely in the model's control but something in which the user plays a role; it is a big source of uncertainty. Interestingly, the models fail to capture important aspects of the climate at the very scale of the phenomena we are trying to understand. Another issue, in addition to clouds, is convection. The grid boxes aren't really cubes but more like pancakes: these 100 km x 100 km squares are only about half a kilometer thick, and that makes it very difficult to describe the vertical convection that is so important in the atmosphere for transferring heat and water vapor.

Then there are the ocean initial conditions. We didn't really know what the state of the ocean was 30 years ago; it is only in the last 10 years, with all those buoys that dive down to 2 km depth and measure temperature, currents, and salinity, that we have had good data. As I showed you, greenhouse gases warm the planet but aerosols tend to cool it, and you can build a model that is very sensitive to greenhouse gases and very sensitive to aerosols, or not very sensitive to either one, and they'll give you the same answer for current conditions but very different projections of what will happen over the next hundred years as greenhouse gases accumulate.

And finally, the climate system has long-term coherences that the models do not reproduce very well. Let me show you just how badly the models do on some aspects of the climate. Shown in the charts below is the global temperature modeled over the last 100-plus years. The black line is the observations. The spaghetti lines are all of the different models used by the IPCC. And this is really misleading, because in fact the average temperature differs among the models by as much as 3C, but they have been adjusted to line up at zero for the period 1950-1980. When the models differ by 3C, it should give you pause on whether we've got it right.

Multi-decadal variability is really important, but it's not in the models. These phenomena are part of the climate's natural variability. That the models can claim to explain and describe the climate while missing these important drivers is an unexplained puzzle.

Here's an example of one: the Pacific Decadal Oscillation. This is a real phenomenon. You can see the two different modes of the PDO and their temperature differences; the chart below shows the actual temperature record from observations. This is a long-term mode in the system, having to do with energy sloshing around in the Pacific Ocean. There is a similar mode in the Atlantic called the AMO. The models do not have these modes at all, and yet they are somehow able to reproduce the climate system without these known climate drivers. That raises some questions about how real our understanding of the climate is.

In the graphic below, the chart on the left is the consensus global mean temperature time series over the period of anthropogenic global warming, with both natural and anthropogenic forcings, and it shows a good match between climate model projections and observations. The chart on the right is the same but with the CO2 forcing removed, and it shows that without the CO2 forcing there is no match between the models and observations.

This comparison is presented as evidence that the current warming is a creation of fossil fuel emissions. Yet the real reason for the close match between models and observations in the left chart is that the models have been "tuned" to the observations, meaning that the forcings were interpreted in the context of the observations.

The graphic below gets around the tuning by comparing 30-year trends in a moving window, shown in the right frame, with data in blue and model predictions in brown. This comparison does not support the thesis that the model results match observations.

CMIP5 Model results versus Lower Troposphere data:

Forecasting future climate change with RCPs (representative concentration pathways)

The model ensembles use a large number of models (40 to 50) and a number of runs of each model.

These climate forecasts rely on forecasts of global population and GDP over the time period of the RCP, going out to 2100.

When you do that for different rates of fossil fuel emissions, you get RCP forecasts that look like this. In some scenarios emissions continue to rise, at different rates, while in others they peak early and decline later. These curves are consistent with historical observations but diverge in their projections of the future, based on climate models that have been verified against the observational data of the past. As expected, high-emission scenarios show higher future temperatures than low-emission scenarios. The lower curves in the chart, the ones that rise at first and then flatten out, are the scenarios in which later climate action reduces emissions to zero and the temperature curve flattens, with no further warming. The plausibility of the assumptions behind these stabilized curves is another matter altogether, given the projections for population and economic growth in the Global South and in countries such as China and India.

Now we move on to climate change impacts. The consensus is that climate impacts are bad, that they will only get worse in the future in the absence of effective climate action, and that they can be eliminated with effective climate action. These forecasts contain extreme detail about the type and extent of impacts as a function of temperature, as shown in the graphic below. Essentially, the global mean surface temperature (GMST) is used as a proxy for impacts, where GMST determines not only the type of impact but its intensity. These are the projections used to determine the "safe" amount of warming since pre-industrial times. That safe amount used to be 5C but has been gradually lowered over the years to 4C, then 3C, then 2C, and finally, in 2018, to 1.5C, of which we have already used up a little more than 1C. This threshold, though a critical determinant of impacts in climate science, is ill defined and not well understood. For example, when the 2C threshold was announced about a decade ago, no reason was given for that determination; it was just a number, validated only by the fact that it had been announced by the IPCC. Yet the same IPCC later lowered it to 1.5C, which served as an admission that the 2C figure was in error. The arbitrariness of these changing thresholds does not inspire confidence in their reliability. When asked about the 2C figure, the response was, "Well, it's a number that's easy to remember, and it's about right, or maybe on the safe side of right." These "do not cross" warming targets are fuzzy.

The bottom line on impacts is this: In the reports, what the climate science consensus says is muddled.

This muddled logic goes something like this. For some climate observable x, the data show that x has changed in recent decades. The climate has also changed in recent decades. Humans have influenced the climate. Therefore, humans have influenced x. Therefore, our understanding of x has improved (an unsupported claim). If the human impact on x is uncertain or not yet directly detected, it is because of insufficient data, natural variability, confounding influences, or models not in agreement, but it just has to be there, or at least we can't risk assuming that it isn't. Therefore, as GMST rises, the human impact on x will grow, although there are uncertainties in the details; what we know for sure is that it's going to get worse, because it can only get worse. The degree of confidence varies depending on what x is. It is interesting to read these reports.

The IPCC, writing in AR5 WG1 Chapter 2, does not appear to show a great deal of confidence in these extreme weather impact forecasts, as seen in the extracted text below. The media's obsession with attributing every bad weather event to AGW climate change is not credible and must be understood purely as activism.

In the graphic below are some data on weather extremes from the US national climate report. It shows that cold extremes have declined but no trend is seen in the warm extremes. There may be a role for agricultural intensification in these data, and some of it is just natural variability.

The data for sea level rise are presented in the graphic below. The left frame shows the global mean sea level reconstruction since 1900, derived from a network of tide gauges around the world. It shows sea level rise of about 200 mm over a period of 100 years, an average rate of about 2 mm per year. The right frame shows satellite data for sea level since 1992. The satellite-derived global mean sea level rise is consistent with the reconstruction from tide gauges in the left frame.

The significant sea level rise at the end of the last glaciation into the Holocene interglacial is shown in the chart below. It shows that sea level rise predates the industrial revolution. Sea level has been rising for 15,000 years and has risen by 120 meters (120,000 mm, an average of 8 mm/year), and the rate of rise has not gone up; rather, it has gradually flattened to a small fraction of the rate at the initiation of the Holocene. This history makes it difficult to attribute the observed sea level rise in the post-industrial era to humans.
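The rates quoted in the last two passages reduce to simple arithmetic. The sketch below uses only the round numbers from the text (120 m over 15,000 years; 200 mm over the last 100 years); the comparison of averages is illustrative, since the deglaciation rate was far from constant.

```python
# Average sea level rise rates implied by the numbers quoted above.
holocene_rate_mm_yr = 120_000 / 15_000   # 120 m over 15,000 years
modern_rate_mm_yr = 200 / 100            # 200 mm over the last century

# How the long-run deglaciation average compares with the tide-gauge era:
ratio = holocene_rate_mm_yr / modern_rate_mm_yr
print(f"deglaciation average: {holocene_rate_mm_yr} mm/yr, "
      f"modern: {modern_rate_mm_yr} mm/yr ({ratio:.0f}x)")
```

The long-run average comes out at 8 mm/yr against the modern 2 mm/yr, which is the arithmetic behind the point that substantial natural sea level rise long predates industrial emissions.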

Here is how consensus climate science looks at this issue. The essential argument is that the current observed sea level rise is faster than what we would expect from an extrapolation of the historical deglaciation, and that the difference is human caused. They say that the sea level rise in the most recent century is greater than in the previous century, and that it is therefore human caused.

If you look at the data since 1950, which NASA GISS and James Hansen identify as the time that global warming started, what we find is shown in the graphic below: the natural variability in sea level rise is too large to interpret the tail end of it as human-caused sea level rise.

Now, projections of future sea level rise. In the left frame of the graphic below we see projections in which future rates of rise are much higher and accelerating. Comparing that behavior with the observations in the right frame, we find no evidence of such acceleration in a time of global warming.

The graphic below shows tropical cyclone data, often mislabeled as "hurricane" data. The top frame shows global tropical cyclone frequency in the two categories of "all" and "major", while the lower frame shows ACE (accumulated cyclone energy). These data do not show the rising trend assumed in the climate change consensus and in media alarm about the impact of AGW on tropical cyclones. This conclusion is supported by the IPCC's "low confidence" and NOAA's "premature" assessments on the question of whether global warming is increasing the intensity and destructiveness of tropical cyclones. The media's assessment of this issue is dramatically inconsistent with these findings.

On the matter of agriculture and the projection of its devastation by AGW, we find that the opposite is true: CO2 fertilization is making the world greener and agriculture more productive, as seen in the graphic below.

On the matter of the impact of global warming on the US economy, the claim that global warming will damage the US economy is not found in the IPCC assessment.

via Thongchai Thailand

Posted by: chaamjamal on: March 22, 2021