By Paul Homewood


This paper came out a couple of months ago, another one claiming to prove that global warming is leading to more extreme rainfall. It focusses on hourly rain data and attempts to correlate it with dew point temperatures, a proxy for absolute humidity:

The first question is why they don't simply analyse the hourly rainfall trends themselves, instead of using dew points, which are not accurate anyway.

The study uses 7000 rain gauges spread around the world, with England one of the areas focussed on. On the map below, the orange and red circles indicate where hourly precipitation has increased in line with the Clausius‐Clapeyron equation (6.5%/K):

So according to the theory, Oxfordshire should be seeing much more extreme rainfall.
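The scaling being tested is simple compounding: intensity rises by the assumed rate for each kelvin of dew-point warming. A minimal sketch (the 6.5%/K rate is taken from the paper; the example figures are illustrative only):

```python
# Expected change in extreme hourly rainfall intensity under
# Clausius-Clapeyron scaling, assuming a 6.5% per kelvin rate.
def cc_scaled_intensity(base_mm_per_hr: float, warming_k: float,
                        rate: float = 0.065) -> float:
    """Intensity after compounding the CC rate over `warming_k` kelvin."""
    return base_mm_per_hr * (1.0 + rate) ** warming_k

# e.g. a 17.3 mm/hr event under 1 K of dew-point warming:
print(round(cc_scaled_intensity(17.3, 1.0), 2))  # -> 18.42
```

So if the theory held at Benson, a degree of warming should have pushed the threshold for the wettest hours up by a clearly measurable margin.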

Hourly rainfall data is not readily available for most stations, nor is there much history. But the Met Office do have data for Benson, Oxfordshire back to 1975, available through their MIDAS system.

Each year is filed separately, so it is a bit of a ball-aching task! But I have downloaded all of the data and extracted the top 100 wettest hours (17.3mm and over):
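The extraction step can be sketched in a few lines: read each year's file, collect the hourly totals, and keep the largest 100. The file-name pattern and the column names (`prcp_amt`, `ob_time`) are assumptions about the MIDAS layout, not confirmed:

```python
# Sketch: combine per-year hourly rainfall files and pull the wettest
# n hours. File names and column names are assumed, not confirmed.
import csv
import glob
import heapq

def top_hours(pattern: str = "benson_*.csv", n: int = 100):
    events = []  # (rainfall_mm, timestamp) pairs
    for path in glob.glob(pattern):
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                try:
                    events.append((float(row["prcp_amt"]), row["ob_time"]))
                except (KeyError, ValueError):
                    continue  # skip missing or malformed readings
    return heapq.nlargest(n, events)  # wettest n hours, largest first
```

`heapq.nlargest` avoids sorting forty-odd years of hourly readings just to keep the top 100.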

You don’t need any fancy statistical tools to work out that hourly rainfall has not been getting more extreme. The top three were:

August 1977

September 1992

September 1980

With 100 events over the 43 years, the average is 2.33 pa. The last ten years have seen 19 of these, which is of course below the roughly 23 that the long-term average would predict.

This is only one station, but if the theory is correct, it should be apparent at all stations, including Benson.

It should also be pointed out that the daily data leads us to the same conclusion:

Regardless of the theories, I suspect what the actual data is telling us is that weather is always the dominant factor. By that I mean that rainfall is determined by the meteorological conditions, which are fundamentally random. Whatever effect climate change may or may not be having, it is minuscule and unmeasurable in comparison.


March 3, 2021 at 10:06AM