A recent Rutgers University press release says the rate of sea-level rise in the 20th century along much of the U.S. Atlantic coast was the fastest in 2,000 years, with southern New Jersey recording the fastest rate of rise. Real-world data suggests Rutgers’ claim is false.
According to the study, published in the journal Nature Communications, the global rise in sea level from melting ice and warming oceans between 1900 and 2000 occurred at a rate more than twice the average for the years 0 to 1800 – the most significant change over the entire period.
The study, for the first time, looked at the phenomena that contributed to sea-level change over 2,000 years at six sites along the coast (in Connecticut, New York City, New Jersey and North Carolina) using a sea-level budget.
Using a statistical model, scientists developed sea-level budgets for six sites … f[inding] that regional land subsidence – sinking of the land since the Laurentide ice sheet retreated thousands of years ago – dominates each site’s budget over the last 2,000 years. Other regional factors, such as ocean dynamics, and site-specific local processes, such as groundwater withdrawal that helps cause land to sink, contribute much less to each budget and vary over time and by location.
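The idea of a sea-level budget can be illustrated with a toy calculation: the total rate of relative sea-level change at a site is treated as the sum of its component rates. The numbers below are hypothetical placeholders chosen for illustration, not values from the study.

```python
# Minimal sketch of a sea-level "budget": the total rate of relative
# sea-level rise at a site is decomposed into additive components.
# All rates below are hypothetical illustrations, not study values.
budget_mm_per_yr = {
    "regional_land_subsidence": 1.3,  # sinking land since the Laurentide ice sheet retreated
    "global_mean_rise": 1.2,          # melting ice and warming oceans
    "ocean_dynamics": 0.2,            # regional circulation effects
    "local_processes": 0.1,           # e.g. groundwater withdrawal
}

total_rate = sum(budget_mm_per_yr.values())
print(f"Total relative sea-level rise: {total_rate:.1f} mm/yr")  # 2.8 mm/yr
```

In this framing, the study's conclusion is simply that the subsidence term dominates the sum at each of the six sites, while the other terms are smaller and vary by time and location.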
The data show the rate of sea level rise in Atlantic City, New Jersey, has been essentially constant since 1910, indicating rising levels of carbon dioxide (CO2) have had little if any effect on sea level rise there. What New Jersey has experienced is the result of natural global warming since the end of the Little Ice Age, combined with regional land subsidence. As a result, there has been no acceleration in the pace of rise, as would be expected if sea level rise were being driven by warming associated with accelerating CO2 emissions. Figure 1, below, shows this.
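A standard way to test a tide-gauge record for acceleration is to fit a quadratic: a constant rate of rise yields a quadratic coefficient near zero. The sketch below uses a synthetic, perfectly linear series with an assumed rate of 4.1 mm/yr as a stand-in for the real Atlantic City record.

```python
import numpy as np

# Sketch: testing for acceleration by fitting a quadratic to a
# sea-level series. The data here are synthetic (exactly linear),
# standing in for a real tide-gauge record.
years = np.arange(1910, 2021)
sea_level_mm = 4.1 * (years - 1910)  # hypothetical constant 4.1 mm/yr trend

# np.polyfit returns coefficients highest-degree first: [accel, rate, intercept]
accel, rate, intercept = np.polyfit(years - 1910, sea_level_mm, 2)
print(f"fitted rate: {rate:.2f} mm/yr, quadratic (acceleration) term: {accel:.6f}")
```

On a record with a genuinely constant rate, the fitted quadratic term stays indistinguishable from zero; a record driven by accelerating forcing would show a clearly positive term.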
The study claims that 20th century sea level rise is double that of the years from 0 to 1800. But the climate models used by the researchers fail to account for fluctuating trends across that span, both century to century and at the decadal scale. We know for a fact that glaciers advanced significantly between the 17th and 19th centuries. With more seawater locked up in ice, sea level rise slowed significantly during that time, and sea levels may even have declined.
When the Little Ice Age ended in the mid-19th century, and we returned to a warmer climate naturally, sea levels began to rise again.
The news gets worse. Comparing actual sea level rise with official climate model projections provided by the U.S. National Oceanic and Atmospheric Administration (NOAA) in the graphic (Figure 2 below), we discover climate models are projecting a rate of rise that hasn’t happened.
The data show sea level rise in Atlantic City is having trouble keeping up with even the lowest climate model scenario.
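The gap between a constant observed trend and an accelerating model scenario compounds over time, which is why the projection curves pull away from the tide-gauge line. The sketch below makes that arithmetic concrete; the observed rate and the scenario's rate and acceleration parameters are hypothetical placeholders, not NOAA's actual figures.

```python
# Sketch: linear extrapolation of an observed trend versus an
# accelerating (quadratic) model scenario. All parameters are
# hypothetical placeholders, not NOAA's published values.
OBSERVED_RATE_MM_PER_YR = 4.1  # assumed constant historical trend

def observed(year):
    """Linear extrapolation of the observed trend from a year-2000 baseline."""
    return OBSERVED_RATE_MM_PER_YR * (year - 2000)

def scenario(year, rate=4.0, accel=0.08):
    """Accelerating projection: rate*t + 0.5*accel*t^2, t in years since 2000."""
    t = year - 2000
    return rate * t + 0.5 * accel * t * t

gap_2100_mm = scenario(2100) - observed(2100)
print(f"model-minus-observation gap by 2100: {gap_2100_mm:.0f} mm")  # 390 mm
```

Even a modest assumed acceleration term roughly doubles the scenario's century total relative to the linear extrapolation, which is the mismatch the article says the Atlantic City record does not support.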
The researchers behind the Nature Communications study claim “climate change,” driven by man-made CO2 emissions, has made the rates of sea level rise accelerate. As with many climate studies, the objective seems to be to prove climate change is “worse than we thought,” to make people clamor for action to prevent climate change.
Comparing actual data with modeled projections suggests there is nothing to worry about. Seas are rising as they always have, and society can manage and/or adapt to it, as it always has.
The post No, Rutgers, Sea-Level Rise Isn’t Accelerating on the U.S. East Coast appeared first on Climate Realism.
By Anthony Watts – June 15, 2021