Surface Radiation Balance


Guest Post by Willis Eschenbach

Let me invite you to wander with me through some research I’ve been doing. I got to thinking about the surface radiation balance. The earth’s surface absorbs radiation in the form of sunlight plus thermal radiation from the atmosphere. On a global 24/7 basis, the surface absorbs a “downwelling” flow of about half a kilowatt of radiative energy per square meter.

I investigate science in pictures. The raw numbers mean little to me. So I made a global map of variations in the absorption of total energy. Figure 1 shows that result.

Figure 1. Average total radiation absorbed by the surface, shortwave + longwave, on a 1° latitude by 1° longitude gridcell basis.

As you might imagine, the most absorption is in the tropics. The least is up on the Antarctic Plateau. And you can see the light line above the equator in the Pacific. That’s the location of the ITCZ, the “intertropical convergence zone” where the atmospheric circulations of the northern and southern hemispheres meet. It’s an area that gets thunderstorms most days, which reflect lots of sunlight back out to space.

But that’s not what first caught my eye …

I often start researching my way down one road and then get sidetracked onto a different path … in this case I eventually noticed an oddity in that graphic. Most curiously, the total energy absorbed by the surface in the two hemispheres is exactly the same …

… moments like this are when I’m happy that I’m an independent unfunded researcher, free to follow my monkey mind. So now, as I’m in the process of writing this up, I want to know if this exact equality between the total downwelling surface radiation in the northern and southern hemispheres is a coincidence, or whether it is stable over time. So hang on while I go take the annual averages … OK, Figure 2 shows the annual surface radiation absorption by hemisphere.

Figure 2. Annual total downwelling radiation absorbed by hemisphere, 2000 to 2020.
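For those who want to check such a split themselves, here’s a minimal sketch of the hemispheric averaging, using cos(latitude) area weights. The array names and the 1° grid layout are my assumptions, not actual CERES variable names:

```python
import numpy as np

def hemispheric_means(field, lats_deg):
    """Area-weighted mean of a (lat, lon) field for each hemisphere, using
    cos(latitude) weights so each gridcell counts by its true area."""
    w = np.cos(np.deg2rad(lats_deg))
    north = lats_deg > 0
    nh = np.average(field[north].mean(axis=1), weights=w[north])
    sh = np.average(field[~north].mean(axis=1), weights=w[~north])
    return nh, sh

# Hypothetical usage, with 1-degree gridcell centers:
# lats = np.arange(-89.5, 90.0, 1.0)
# nh, sh = hemispheric_means(absorbed_mean, lats)   # per the text, nh ≈ sh
```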

Now, this is most fascinating … despite the fact that the southern hemisphere has 30% more ocean than the northern, despite different cloud patterns and weather formations, despite one pole being a seasonally frozen ocean and the other a 5,000 foot (1,500 m) ice-covered stony plateau, despite the annual average amount absorbed in different places ranging from 120 to 670 W/m2, despite the hugely changing absorption due to the seasons, despite all of that, every single year the amount of energy absorbed by the surface is split evenly, to within half a percent, between the two hemispheres. Cries out for an explanation …

… my guess is that it’s the result of thermoregulatory emergent phenomena, but that’s a topic for another time. See, this is what my research is like. No straight lines. I prefer to wander side-trails I’ve never trodden. So I’m as surprised as you are that the absorption is almost exactly 50/50 north and south and that that is so stable … but I digress, that’s a topic for another day.

To return to the theme of the post, at the same time that the surface is absorbing half a kilowatt per square meter of downwelling radiation, the earth (like all solid objects and most gases) is also constantly emitting a certain amount of thermal “upwelling” longwave radiation. The amount that gets radiated is a function of temperature. So you can calculate the temperature from the amount of emitted radiation using something called the Stefan-Boltzmann equation. It’s how IR stand-off thermometers work.
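For anyone who wants to play along at home, here’s a minimal sketch of that inversion in Python, assuming blackbody emission (emissivity of one):

```python
import numpy as np

SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W/m2/K^4

def emission_to_temperature(emitted_w_m2, emissivity=1.0):
    """Invert the Stefan-Boltzmann law, E = eps * sigma * T^4, to recover
    temperature in kelvin from emitted thermal radiation."""
    return (np.asarray(emitted_w_m2) / (emissivity * SIGMA)) ** 0.25

print(emission_to_temperature(400.0))  # ~289.8 K, about 17 degrees C
```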

The amount of radiation that is emitted is almost always smaller than the amount absorbed, because:

• some energy is lost from the surface as “sensible” (feel-able) energy.

• some energy is lost to evaporation as “latent” energy in the form of water vapor, which releases energy when it condenses.

• some energy is “advected”, meaning moved horizontally from one point to another.

Figure 3 shows another global map, this time of upwelling thermal energy emitted from the surface.

Figure 3. Average upwelling thermal longwave radiation from the surface. Legend colors are the same as in Figure 1.

So in rough terms, the surface receives 500 W/m2 of downwelling radiation, and only emits about 80% of that, 400 W/m2, in the form of upwelling radiation. The rest leaves the surface as sensible and latent heat.

From this, an interesting question arises—if the absorbed radiation goes up or down by one W/m2, how much does the emitted radiation change? Naively we might assume that since the amount emitted is 80% of the amount absorbed, for each additional watt per square meter of absorbed radiation, we would expect an increase in emitted radiation of 0.8 W/m2 … but does it work like that?

There are several ways that we can look at this question. Let me start with a gridcell-by-gridcell time series analysis. Here are the short-term (monthly) trends of upwelling versus downwelling radiation at the surface, calculated for each 1° latitude by 1° longitude gridcell over 21 years of records. This looks at the monthly changes in both variables after the removal of seasonal variations. Let me call this the “Monthly Analysis”.
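In rough outline, the calculation for each gridcell looks something like the following sketch; the array names and data layout are my assumptions, not actual CERES variable names:

```python
import numpy as np

def deseasonalize(monthly):
    """Remove the mean seasonal cycle from a 1-D array of monthly values."""
    x = monthly.astype(float).copy()
    for m in range(12):
        x[m::12] -= monthly[m::12].mean()
    return x

def monthly_slope(up, down):
    """OLS slope of deseasonalized upwelling vs. downwelling anomalies:
    W/m2 emitted per W/m2 absorbed, for one gridcell."""
    return np.polyfit(deseasonalize(down), deseasonalize(up), 1)[0]

# Hypothetical layout: 252 months x 180 lats x 360 lons of surface radiation.
# slopes = np.array([[monthly_slope(up_all[:, i, j], down_all[:, i, j])
#                     for j in range(360)] for i in range(180)])
```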

Figure 4. Change in emitted surface thermal radiation for a 1 W/m2 change in absorbed surface radiation.

The most unexpected part of Figure 4 is that unlike the rest of the world, in the warmest areas of the ocean, when radiation absorbed at the surface goes up … emitted surface radiation goes down.

This is quite odd. If we take, say, a block of steel at steady state, the more radiation it is absorbing, the more radiation it is emitting. And as we would expect, over most of the earth’s surface, and over all of the land, the absorbed radiation controls the emitted radiation: just as with the block of steel, the more radiative energy absorbed, the more energy radiated away. It’s just “Simple physics”, as mainstream climate scientists like to say.

But in the blue zones in Figure 4, the reverse is true—above a certain threshold, the more radiative energy that is absorbed, the less is radiated. “Complex physics”, as I like to say …

I say that this is because of the combined action of tropical cumulus fields and thunderstorms. These cool and remove energy from the surface in a host of ways, leaving in their wake lowered temperatures and less energy to be radiated. Here’s a movie about how thunderstorms chase the hot spots.

Cloud top altitude (a proxy for the intensity of thunderstorms) and sea surface temperatures.

Thunderstorms are unique in that they are not a simple feedback—particularly over the ocean, they are able to drive surface temperatures down to below the thunderstorm initiation temperature. This accounts for the fact that as absorbed surface radiation increases in those tropical areas, the amount of radiation emitted decreases … but again I digress …

The reality is clear from Figure 4. We are looking at a very complex system. There is no single simple linear relationship between emitted radiation and absorbed radiation. Depending on the location and the nature of the surface (land vs. ocean vs. ice vs. …), the response ranges from an additional 0.8 W/m2 of surface emissions per additional W/m2 absorbed, down to a decrease of 0.2 W/m2 of emissions per additional W/m2 absorbed. Not only are the responses not even approximately the same size, they’re not even the same sign.

As Figure 4 shows, the short-term area-weighted global average trend is only an increase of about a quarter of an additional watt per square meter of emitted radiation for each one W/m2 of additional radiation absorbed.

The various areas do act as we might expect. For example, there’s a greater response in land radiation emitted than in the ocean, due to the ocean’s temperature changing more slowly because of its greater thermal mass. And the tropics show the least change from a one W/m2 change in absorbed radiation.

But Figure 4 only shows short-term, month-by-month changes. And because it takes time for earth and ocean to heat up and cool down, these short-term changes will be smaller than the long-term changes. For longer-term changes, we have to look elsewhere.

My insight on this was that we have some 64,800 gridcells in each of the maps above. Over the centuries, they’ve settled into a slowly-changing steady state. Each of them has a long-term average absorption of total radiation, and a long-term average emission of thermal radiation. After many, many years, these absorption and emission levels have come to incorporate and encompass all feedbacks and delayed responses. We know this because the average of the first half of the CERES data is almost indistinguishable from the average of the full dataset. Figure 1a below shows the same analysis as Figure 1, except for the first half of the data.

Figure 1a. As in Figure 1, but for the first half of the CERES data. As you can see, there’s almost no change from the full dataset. This shows we’re dealing with long-term steady-state analysis. The largest difference is about 0.5% in Antarctica.
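Checking that kind of stability is straightforward. A sketch, with down_all standing in for a hypothetical (months × latitude × longitude) stack of monthly values:

```python
import numpy as np

def steady_state_check(field_t):
    """Compare the first-half time mean of a (time, lat, lon) stack with the
    full-record time mean, gridcell by gridcell. Small fractional differences
    suggest we're looking at a steady state."""
    half = field_t[: field_t.shape[0] // 2].mean(axis=0)
    full = field_t.mean(axis=0)
    return np.abs(half - full) / np.abs(full)

# rel = steady_state_check(down_all)  # down_all: hypothetical (252, 180, 360)
# print(rel.max())                    # the text reports a worst case of ~0.5%
```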

So … consider several adjacent gridcells in the mid-Pacific or somewhere. Each one has slightly different long-term averages of absorption and emission of radiation. This lets us know what we can expect to happen in that area of the world if the absorption changes by, say, a watt per square meter. We can know that because in some adjacent gridcell, this is actually happening.

For Figure 5, for every individual gridcell, I looked at a 5° latitude by 5° longitude area with the individual gridcell at its center. Using two of these 5×5 patches of cells, one showing emission and one showing absorption, I used standard linear regression to calculate the local change in emission from a one W/m2 increase in absorbed radiation for that small area. Since this uses local data from adjacent gridcells, let me call this the “Local Analysis”.
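In code, such a moving-window regression might look like the following sketch; again, the variable names are assumptions, and the few rows at the poles are simply skipped:

```python
import numpy as np

def local_slopes(up_mean, down_mean, half=2):
    """For each gridcell, regress the long-term mean emission on the long-term
    mean absorption across the surrounding 5x5 patch of gridcells."""
    nlat, nlon = up_mean.shape
    out = np.full((nlat, nlon), np.nan)
    for i in range(half, nlat - half):          # polar edge rows are skipped
        for j in range(nlon):
            cols = [(j + dj) % nlon for dj in range(-half, half + 1)]  # wrap lon
            u = up_mean[i - half:i + half + 1][:, cols].ravel()
            d = down_mean[i - half:i + half + 1][:, cols].ravel()
            out[i, j] = np.polyfit(d, u, 1)[0]  # W/m2 emitted per W/m2 absorbed
    return out
```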

Figure 5. Local Analysis. Two views (Pacific and Atlantic centered) of a different way to measure the longer-term change in surface upwelling radiation as a function of the absorbed radiation.

There are some interesting things about Figure 5. First, overall, as we’d expect, over the longer term the surface responds more to absorbed radiation than is shown in Figure 4, which is looking at month-to-month variations. In Figure 5 it averages about 0.4 watts emitted per watt absorbed, almost double the value shown in Figure 4.

However, the areas of negative correlation (blue areas) are all in the same locations, in the tropical ocean above and below the equator. And the ocean is still changing less than the land, and the tropics are still changing the least.

Moving forward, there’s a third and separate way to calculate the long-term response of the surface to absorbed radiation. This depends on a scatterplot of the amount emitted by the surface as a function of the amount absorbed, as shown in Figures 7 to 9.

To begin with, Figure 7 shows the scatterplot for the entire globe. Note that this uses exactly the same data as in the two previous analyses, the “monthly” and “local” analyses. In all cases, I’m using the CERES 21-year gridcell-by-gridcell average values for the surface radiation absorbed and emitted.

Figure 7. Gridcell-by-gridcell (red dots) scatterplot of radiation absorbed versus radiation emitted. The right-hand scale shows the temperature corresponding to the surface emission on the left-hand scale.

Now, the general trend that we’ve been looking at in the two analyses above, the change in emission for a one watt per square meter increase in absorbed radiation, is given by the slope of the yellow line above. And it shows something quite curious …

Most of the data shows a pretty linear relationship between absorption and emission. From ~ 100 W/m2 to ~ 275 W/m2 of absorption, it’s pretty much a straight line. And the same is true, although with a somewhat lesser slope, from ~ 275 W/m2 to ~ 600 W/m2 of absorption.

But once the average absorption (longwave plus shortwave) goes above ~ 600 W/m2, there is no further increase in emission. In other words, all of the additional incoming energy simply is lost as sensible, latent, and advected heat, and the emission doesn’t increase … and of course, “no increase in emission” means no increase in temperature.
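One simple way to put numbers on those three regimes is to fit a straight line within each absorption band. A sketch, with the band edges taken from the eyeballed breakpoints above and the array names my own:

```python
import numpy as np

def binned_slopes(absorbed, emitted, edges):
    """Slope of emission vs. absorption within each absorption band."""
    slopes = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        m = (absorbed >= lo) & (absorbed < hi)
        slopes.append(np.polyfit(absorbed[m], emitted[m], 1)[0])
    return slopes

# With the 21-year gridcell means flattened to 1-D (names are assumptions):
# print(binned_slopes(down_mean.ravel(), up_mean.ravel(), [100, 275, 600, 700]))
# Expect a steep slope, a shallower slope, and a near-zero slope respectively.
```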

Remembering that in the monthly and local analyses above, negative trends almost entirely occurred over the ocean, I split the data into land and ocean gridcells and looked at the two responses. Figure 8 shows the response over the land.

Figure 8. As in Figure 7, but for the land only.

Now, this is much more what one would expect to find. As with the aforementioned steel block, as absorption increases, emission increases. Everywhere we look on the land, when absorption goes up, emission (and thus temperature) goes up. “Simple physics”.

But this is just the land … what about the ocean?

Figure 9. As in Figures 7 and 8, but for the ocean only.

Here we can see what we saw in Figures 4 and 5—there are areas of the ocean where, when the absorbed radiation increases, the emitted radiation decreases … which also means that the temperature is decreasing.

How much does this affect the trend worldwide? We can use the data above to get the trends for each gridcell with a given amount of absorbed radiation. Figure 10 shows that result. Let me call it the “Global Analysis”. Of course, since it is looking at global averages it doesn’t have the fine detail of the other methods.
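Here’s a sketch of one way such a calculation might go—estimate the slope in bins along the scatterplot, then hand each gridcell the slope corresponding to its own long-term absorption. The binning is my own choice, for illustration:

```python
import numpy as np

def global_analysis(absorbed, emitted, nbins=50):
    """Estimate d(emitted)/d(absorbed) as a function of absorption from the
    scatterplot, then give each gridcell the slope matching its own
    long-term absorption."""
    order = np.argsort(absorbed.ravel())
    a, e = absorbed.ravel()[order], emitted.ravel()[order]
    chunks = np.array_split(np.arange(a.size), nbins)   # equal-count bins
    centers = np.array([a[c].mean() for c in chunks])
    slopes = np.array([np.polyfit(a[c], e[c], 1)[0] for c in chunks])
    return np.interp(absorbed, centers, slopes)         # same shape as input
```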

Figure 10. “Scatterplot Method”: change in surface emission per one W/m2 increase in surface absorption.

But it does give the same general pattern, with land emissions all increasing with increased absorption, and with large areas of the tropical Pacific where the emissions move in the opposite direction from the absorption.

So we have three different estimates of the changes in surface emission resulting from a 1 W/m2 increase in surface absorption. The first one, the “Monthly Analysis”, is short-term so it doesn’t include any feedbacks or slow changes. Thus it gives smaller results than the other two methods.

The other two include all of those slow changes, because they are based on two-decade averages showing the long-term steady-state conditions. Here is a comparison of the three methods.

Figure 11. Comparison of the three analysis methods discussed above.

As you can see, because it’s showing short-term variations, the monthly analysis (blue) gives smaller answers across the board. However, it is closer regarding the land trend, because the land changes temperature faster. The other two methods are long-term and are in reasonable agreement. I would say that the local analysis method is the more accurate of the two longer-term methods. It is location-specific as opposed to absorbed radiation-specific, and so it captures finer detail.

Steady-State

“Steady-state” describes a condition where variables don’t change much. For example, over the entire 20th century the globe warmed by something less than one kelvin. The average temperature of the planet is about 288 kelvin. So over the 20th century, the total change in temperature was about a third of one percent. This is steady state, where at every level energy in is generally equal to energy out.

Suppose we took a cold world, dropped it into orbit around a nice warm sun, and watched what happened.

It would warm up … but it wouldn’t warm up forever. The warmer it got, the more energy would be lost from the surface in the form of sensible, latent, and advected heat. Eventually, it would hit a balance point where the surface neither heats nor cools appreciably.

For the earth, this occurs at the point where on average, for every watt per square meter absorbed by the earth’s surface, about 0.8 watts are emitted.

However, and this is the important point, for excursions around the steady-state, the surface emits much less radiation for every watt per square meter absorbed. After the effect of all short- and long-term losses and feedbacks, the surface only emits an additional ~ 0.4 watt/m2 for every additional watt/m2 absorbed.

Discussion

Why is all of this important? It’s the lowest-level, simplest, and most straightforward part of a larger question. That question relates to the central paradigm of mainstream climate science, which says that the change in temperature (∆T) is a linear function of the change in top-of-atmosphere (TOA) downwelling radiation (“radiative forcing”, ∆F). Mathematically, this is expressed as:

∆T (change in temperature) = lambda (“climate sensitivity” constant) times ∆Ftoa (change in downwelling radiation at top-of-atmosphere), or

∆T = λ ∆Ftoa

Me, I think this equation is fatally flawed, in part for the reason visible in the graphs above—even within just the surface itself, there is no constant lambda “λ” that relates radiation emitted (a measure of temperature) to radiation absorbed. Instead, it varies widely by location and surface type, in both the short- and long-term trends.
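To see why that matters, note that the Stefan-Boltzmann law lets us translate any of the emission responses above into a λ-like temperature response, since dE/dT = 4σT³. A sketch, assuming blackbody emission:

```python
import numpy as np

SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W/m2/K^4

def slope_to_lambda(emission_slope, emitted_w_m2):
    """Convert d(emitted)/d(absorbed) into d(T)/d(absorbed), in K per W/m2,
    using the Stefan-Boltzmann derivative dE/dT = 4*sigma*T^3 (blackbody)."""
    T = (emitted_w_m2 / SIGMA) ** 0.25
    return emission_slope / (4.0 * SIGMA * T**3)

print(slope_to_lambda(0.4, 400.0))    # long-term response: ~0.07 K per W/m2
print(slope_to_lambda(-0.2, 460.0))   # warm-pool response: a negative "lambda"
```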

In fact, to the exact contrary of the idea of a linear relationship, in large parts of the tropical ocean when absorbed radiation goes up, emitted surface radiation (which is to say surface temperature) goes down … the climate, my friends, she is very complex, and not “simple physics” in the slightest.

More energy is absorbed by an object and in response, it cools down? Say what? “Complex physics” at its very finest.

My best to all,

w.
