Battery Issues Continue To Plague Electric Cars…BMW Orders “Large-Scale” Recall Of Plugin Hybrids

Technical snafus lead to mass recall of BMW hybrid vehicles. Image: Mario Roberto Durán Ortiz, Creative Commons Attribution-Share Alike 4.0 International license.

Explosive car batteries? BMW recalls large number of electric hybrid cars

By A.R. Göhring
(Translated by P. Gosselin)

Reader M.P. points out how ‘auto motor sport’ and other magazines are reporting that BMW is recalling its plugin hybrids on a large scale. What is the problem?

Since August this year, BMW has been recalling its plug-in hybrid electric vehicle (PHEV) models X1 to X3 and X5, the 3 Series and 3 Series Touring, the 2 Series Active Tourer, the 7 Series, the 5 Series (incl. Touring), and the Mini Countryman.

The reasons given are welding defects and impurities introduced during production, some of which can cause short circuits in the high-voltage (traction) battery. Batches produced between 20 January and 18 September are partially affected.

In addition to the recall, there is also a delivery stop. More than 25,000 cars are affected worldwide, 8,000 of which are already in customers' hands (Germany: 5,300/1,800). The models already sold may currently not be charged and may be driven only with restrictions. This should not be much of a problem, as many customers do not charge their company cars from the grid anyway.

At the end of October, the Federal Motor Transport Authority will begin checking the cars already sold. The check takes about 30 minutes if no repair is needed.

The PHEV problem is not limited to BMW. Ford, too, has already had to recall its Kuga model. The reason was a fault in the battery storage system posing a fire hazard, which likewise meant the battery could not be recharged.

Read more: europe.autonews.com/hybrids-fire-risk-samsung-battery.


via NoTricksZone


October 27, 2020 at 12:16PM


Climate Hysteria Not Grounded in Science

The iconic Metronome clock in New York City was repurposed as an 80-foot-wide climate clock that shows our remaining time to take urgent action on climate change. (photo credit: BEN WOLF)

Glenn T. Stanton writes at The Federalist: New Data Shows Climate Change Hysteria Isn't Grounded In Science. Excerpts in italics with my bolds.

While we must steward the planet God has gifted to us, there is no empirical basis for apocalyptic predictions of impending doom.

The “Climate Clock” looms ten stories above Manhattan’s Union Square so all passersby can track the precise moment the world passes its supposed tipping-point toward irreversible, apocalyptic environmental demise. This clock has that moment of doom pegged at a little more than seven years from today. One of the men who created the clock, artist Gan Golan, said his motivation for the project was the birth of his daughter two years ago.

“This is arguably the most important number in the world,” the team explained to The New York Times, adding, “You can’t argue with science, you just have to reckon with it.” And that is where the problem lies with the environmental doom and gloom — you can absolutely argue with science. That is precisely what the scientific method is: the careful, relentless discipline of skepticism and discovery. It’s testing and questioning what others claim is beyond debate.

How many times has Doomsday been predicted, only to fail to arrive at midnight?

Nine leading climate scientists from Germany, France, Finland, and Ireland have, indeed, questioned whether anyone can reliably determine how much time remains between now and an irreversible trajectory toward environmental ruin.

Drawing from 36 different meta-analyses on the question, involving more than 4,600 individual studies spanning the last 45 years, their findings were recently published in the journal Nature Ecology and Evolution. They conclude that the empirical data doesn’t allow scientists to establish ecological thresholds or tipping points. As natural bio-systems are dynamic, ever-evolving, and adapting over the long-term, determining longevity timeframes is currently impossible.

These scholars write that frankly, “we lack systematic quantitative evidence as to whether empirical data allow definitions of such thresholds” and “our results thus question the pervasive presence of threshold concepts” in environmental politics and policy. Their findings also reinforced the contention that “global change biology needs to abandon the general expectation that system properties allow defining thresholds as a way to manage nature under global change.”

Professor José M. Montoya, one of the nine authors and an ecologist at the Theoretical and Experimental Ecology Station in France, told the French National Center for Scientific Research “many ecologists have long had this intuition” that setting reliable, empirically situated tipping-points “was difficult to verify until now for lack of sufficient computing power to carry out a wide-ranging analysis.” But that has now changed.

So no, there is no reliable science behind the new seven-years-to-the-point-of-no-return countdown of the Climate Clock in Union Square, nor for Rep. Alexandria Ocasio-Cortez's infamous "The world is going to end in 12 years if we don't act now" scare, or Thunberg's just-10-years-til-inevitable-doom drum pounding. Such claims simply are not, and cannot be, firmly grounded in any scientific knowledge we currently possess.

Evidence for this conclusion goes beyond the new study just discussed. 2020 saw the publication of two extremely important books from leading, mainstream environmental-climate scholars on what science says about the earth's future.

The first is by Michael Shellenberger, a Time magazine "Hero of the Environment," who explains in his book "Apocalypse Never: Why Environmental Alarmism Hurts Us All" that nearly every piece of scare data presented by the likes of AOC, Leonardo DiCaprio, and Thunberg is not only incorrect but tells a story that is the opposite of the scientific truth. Not only is the world not going to end due to climate change, but in many important ways, the environment is getting markedly better.

Another major environmentalist voice challenging hysteria is Bjorn Lomborg of the Copenhagen Consensus Center think tank, listed by the UK’s liberal Guardian newspaper as one of the 50 people who could save the planet. In his book “False Alarm,” he explains how “climate change panic” is not only unfounded, it’s also wasting trillions of dollars globally, hurting the poor, and failing to fix the very problems it warns us about.

So, what is science genuinely telling us? "Science shows us that fears of a climate apocalypse are unfounded," Lomborg explains, admitting that while "global warming is real … it is not the end of the world." "It is a manageable problem," he adds. He is dismayed that we live in a world "where almost half the population believes climate change will extinguish humanity," and does so under the mistaken assumption that science concludes this. It doesn't, and he is vexed that this mantra parades under the banner of enlightenment.

It's imperative we properly steward this beautiful planet God has gifted to us. It was the second command He gave to humanity, after the charge to populate it with generation after generation of new people. But hysteria is not what is called for in this work. Shellenberger, Lomborg, and the nine international ecologists tell us that there is no empirical basis for the apocalyptic prognostications so needlessly disturbing the dreams of the world's young people.

This is your brain on CO2 hysteria. Just say no!


via Science Matters


October 27, 2020 at 09:17AM


CMIP6 Update

26 October 2020

by Pat Frank

This essay extends the previously published evaluation of CMIP5 climate models to the predictive and physical reliability of CMIP6 global average air temperature projections.

Before proceeding, a heartfelt thank-you to Anthony and Charles the Moderator for providing such an excellent forum for the open communication of ideas, and for publishing my work. Having a voice is so very important. Especially these days when so many work to silence it.

I've previously posted about the predictive reliability of climate models on Watts Up With That (WUWT), here, here, here, and here. Those preferring a video presentation of the work can find it here. Full transparency requires noting Dr. Patrick Brown's (now Prof. Brown at San Jose State University) video critique posted here, which was rebutted in the comments section below that video, starting here.

Those reading through those comments will see that Dr. Brown displays no evident training in physical error analysis. He made the same freshman-level mistakes common to climate modelers, which are discussed in some detail here and here.

In our debate Dr. Brown was very civil and polite. He came across as a nice guy, and well-meaning. But in leaving him with no way to evaluate the accuracy and quality of data, his teachers and mentors betrayed him.

Lack of training in the evaluation of data quality is apparently an educational lacuna of most, if not all, AGW consensus climate scientists. They find no meaning in the critically central distinction between precision and accuracy. There can be no possible progress in science at all, when workers are not trained to critically evaluate the quality of their own data.

The best overall description of climate model errors is still Willie Soon, et al., 2001, Modeling climatic effects of anthropogenic carbon dioxide emissions: unknowns and uncertainties. Pretty much all of the simulation errors and shortcomings described there remain true today.

Jerry Browning recently published some rigorous mathematical physics that exposes at their source the simulation errors Willie et al., described. He showed that the incorrectly formulated physical theory in climate models produces discontinuous heating/cooling terms that induce an “orders of magnitude” reduction in simulation accuracy.

These discontinuities would cause climate simulations to rapidly diverge, except that climate modelers suppress them with a hyper-viscous (molasses) atmosphere. Jerry’s paper provides the way out. Nevertheless, discontinuities and molasses atmospheres remain features in the new improved CMIP6 models.

In the 2013 Fifth Assessment Report (5AR), the IPCC used CMIP5 models to predict the future of global air temperatures. The upcoming 6AR will employ the upgraded CMIP6 models to forecast the thermal future awaiting us, should we continue to use fossil fuels.

CMIP6 cloud error and detection limits: Figure 1 compares the CMIP6-simulated global average annual cloud fraction with the measured cloud fraction, and displays their difference, between 65 degrees north and south latitude. The average annual root-mean-squared (rms) cloud fraction error is ±7.0%.

This error calibrates the average accuracy of CMIP6 models versus a known cloud fraction observable. Average annual CMIP5 cloud fraction rms error over the same latitudinal range is ±9.6%, indicating a CMIP6 27% improvement. Nonetheless, CMIP6 models still make significant simulation errors in global cloud fraction.

Figure 1 lines: red, MODIS + ISCCP2 annual average measured cloud fraction; blue, CMIP6 simulation (9 model average); green, (measured minus CMIP6) annual average calibration error (latitudinal rms error = ±7.0%).
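The quoted 27% improvement follows directly from the two rms values; a quick check of the arithmetic:

```python
# Reproduce the quoted CMIP6-vs-CMIP5 improvement in cloud-fraction rms error.
cmip5_rms = 9.6  # percent, CMIP5 annual average cloud-fraction rms error
cmip6_rms = 7.0  # percent, CMIP6 annual average cloud-fraction rms error

improvement = (cmip5_rms - cmip6_rms) / cmip5_rms * 100
print(f"{improvement:.0f}% improvement")  # prints "27% improvement"
```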

The analysis to follow is a straightforward extension to CMIP6 models of the previous propagation of error applied to the air temperature projections of CMIP5 climate models.

Errors in simulating global cloud fraction produce downstream errors in the long-wave cloud forcing (LWCF) of the simulated climate. LWCF is a source of thermal energy flux in the troposphere.

Tropospheric thermal energy flux is the determinant of tropospheric air temperature. Simulation errors in LWCF produce uncertainties in the thermal flux of the simulated troposphere. These in turn inject uncertainty into projected air temperatures.

For further discussion, see here — Figure 2 and the surrounding text. The propagation of error paper linked above also provides an extensive discussion of this point.

The global annual average long-wave top-of-the-atmosphere (TOA) LWCF rms calibration error of CMIP6 models is ±2.7 Wm⁻² (28 model average obtained from Figure 18 here).

I was able to check the validity of that number, because the same source also provided the average annual LWCF error for the 27 CMIP5 models evaluated by Lauer and Hamilton. The Lauer and Hamilton CMIP5 rms annual average LWCF error is ±4 Wm⁻². Independent re-determination gave ±3.9 Wm⁻²; the same within round-off error.
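A multi-model rms error of this kind is just the root of the mean squared per-model error. A minimal sketch, using hypothetical per-model LWCF errors for illustration only (not the actual Lauer and Hamilton values):

```python
import math

def rms(errors):
    """Root-mean-square of a list of per-model calibration errors (W m^-2)."""
    return math.sqrt(sum(e * e for e in errors) / len(errors))

# Hypothetical per-model LWCF errors, illustration only.
model_errors = [3.1, -4.5, 2.8, -5.0, 4.2]
print(f"rms LWCF error = ±{rms(model_errors):.1f} W m^-2")  # ≈ ±4.0 W m^-2
```

Note that the sign of each model's error cancels nowhere in an rms: over- and under-prediction both inflate the statistic, which is why it serves as a calibration of accuracy rather than a net bias.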

The small matter of resolution: In comparison with CMIP6 LWCF calibration error (±2.7 Wm⁻²), the annual average increase in CO2 forcing between 1979 and 2015, data available from the EPA, is 0.025 Wm⁻². The annual average increase in the sum of all the forcings for all major GHGs over 1979-2015 is 0.035 Wm⁻².

So, the annual average CMIP6 LWCF calibration error (±2.7 Wm⁻²) is ±108 times larger than the annual average increase in forcing from CO2 emissions alone, and ±77 times larger than the annual average increase in forcing from all GHG emissions.

That is, a lower limit of CMIP6 resolution is ±77 times larger than the perturbation to be detected. This is a bit of an improvement over CMIP5 models, which exhibited a lower limit resolution ±114 times too large.

Analytical rigor typically requires the instrumental detection limit (resolution) to be 10 times smaller than the expected measurement magnitude. So, to fully detect a signal from CO2 or GHG emissions, current climate models will have to improve their resolution by nearly 1000-fold.
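The ratios above, and the roughly 1000-fold figure, follow from the numbers already given:

```python
# Compare the CMIP6 LWCF calibration error with the annual increases
# in GHG forcing quoted in the text.
lwcf_error = 2.7         # W m^-2, CMIP6 annual average LWCF calibration error
co2_forcing = 0.025      # W m^-2, annual average increase in CO2 forcing, 1979-2015
all_ghg_forcing = 0.035  # W m^-2, annual average increase for all major GHGs

print(round(lwcf_error / co2_forcing))      # 108x larger than CO2 forcing alone
print(round(lwcf_error / all_ghg_forcing))  # 77x larger than all-GHG forcing

# With the usual requirement that resolution be 10x finer than the signal,
# the needed improvement is about 77 * 10, i.e. nearly 1000-fold:
print(round(lwcf_error / all_ghg_forcing * 10))  # 771
```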

Another way to put the case is that CMIP6 climate models cannot possibly detect the impact, if any, of CO2 emissions or of GHG emissions on the terrestrial climate or on global air temperature.

This fact is destined to be ignored in the consensus climatology community.

Emulation validity: Papalexiou et al., 2020 observed that, the “credibility of climate projections is typically defined by how accurately climate models represent the historical variability and trends.” Figure 2 shows how well the linear equation previously used to emulate CMIP5 air temperature projections, reproduces GISS Temp anomalies.

Figure 2 lines: blue, GISS Temp 1880-2019 Land plus SST air temperature anomalies; red, emulation using only the Meinshausen RCP forcings for CO2+N2O+CH4+volcanic eruptions.

The emulation passes through the middle of the trend, and is especially good in the post-1950 region where air temperatures are purportedly driven by greenhouse gas (GHG) emissions. The non-linear temperature drops due to volcanic aerosols are successfully reproduced at 1902 (Mt. Pelée), 1963 (Mt. Agung), 1982 (El Chichón), and 1991 (Mt. Pinatubo). We can proceed, having demonstrated credibility to the published standard.

CMIP6 World: The new CMIP6 projections have new scenarios, the Shared Socioeconomic Pathways (SSPs).

These scenarios combine the Representative Concentration Pathways (RCPs) of the 5AR, with “quantitative and qualitative elements, based on worlds with various levels of challenges to mitigation and adaptation [with] new scenario storylines [that include] quantifications of associated population and income development … for use by the climate change research community.”

Increasingly developed descriptions of those storylines are available here, here, and here.

Emulation of CMIP6 air temperature projections below follows the identical method detailed in the propagation of error paper linked above.

The analysis here focuses on projections made using the CMIP6 IMAGE 3.0 earth system model. IMAGE 3.0 was constructed to incorporate all the extended information provided in the new SSPs. The IMAGE 3.0 simulations were chosen merely as a matter of convenience. The paper published in 2020 by van Vuuren et al. conveniently included both the SSP forcings and the resulting air temperature projections in its Figure 11. The published data were converted to points using DigitizeIt, a tool that has served me well.

Here’s a short descriptive quote for IMAGE 3.0: “IMAGE is an integrated assessment model framework that simulates global and regional environmental consequences of changes in human activities. The model is a simulation model, i.e. changes in model variables are calculated on the basis of the information from the previous time-step.

“[IMAGE simulations are driven by] two main systems: 1) the human or socio-economic system that describes the long-term development of human activities relevant for sustainable development; and 2) the earth system that describes changes in natural systems, such as the carbon and hydrological cycle and climate. The two systems are linked through emissions, land-use, climate feedbacks and potential human policy responses. (my bold)”

On Error-ridden Iterations: The sentence bolded above describes the step-wise simulation of a climate, in which each prior simulated climate state in the iterative calculation provides the initial conditions for subsequent climate state simulation, up through to the final simulated state. Simulation as a stepwise iteration is standard.

When the physical theory used in the simulation is wrong or incomplete, each new iterative initial state transmits its error into the subsequent state. Each subsequent state is then additionally subject to further-induced error from the operation of the incorrect physical theory on the error-ridden initial state.

Critically, and as a consequence of the step-wise iteration, systematic errors in each intermediate climate state are propagated into each subsequent climate state. The uncertainties from systematic errors then propagate forward through the simulation as the root-sum-square (rss).

Pertinently here, Jerry Browning’s paper analytically and rigorously demonstrated that climate models deploy an incorrect physical theory. Figure 1 above shows that one of the consequences is error in simulated cloud fraction.

In a projection of future climate states, the simulation physical errors are unknown because future observables are unavailable for comparison.

However, rss propagation of known model calibration error through the iterated steps produces a reliability statistic, by which the simulation can be evaluated.

The above summarizes the method used to assess projection reliability in the propagation paper and here: first calibrate the model against known targets, then propagate the calibration error through the iterative steps of a projection as the root-sum-square uncertainty. Repeat this process through to the final step that describes the predicted final future state.

The final root-sum-square (rss) uncertainty indicates the physical reliability of the final result, given that the physically true error in a prediction of future states is unknowable.

This method is standard in the physical sciences, when ascertaining the reliability of a calculated or predictive result.
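The bookkeeping just described can be sketched in a few lines. The per-step uncertainty value below is a hypothetical placeholder for illustration, not a figure from the paper:

```python
import math

def rss_uncertainty(step_uncertainties):
    """Propagate per-step uncertainties through an iterated simulation as
    the root-sum-square: u_total = sqrt(u1^2 + u2^2 + ... + un^2)."""
    return math.sqrt(sum(u * u for u in step_uncertainties))

# With a constant per-step uncertainty u, the envelope grows as u * sqrt(n),
# which is why the uncertainty bounds in Figure 3b widen with projection time.
u_step = 1.8  # hypothetical per-step temperature uncertainty (C), illustration only
for n in (1, 10, 50, 100):
    print(n, round(rss_uncertainty([u_step] * n), 1))
```

The square-root growth is the key point: even a modest calibration error, compounded over decades of annual steps, produces an uncertainty envelope far wider than the projected anomaly itself.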

Emulation and Uncertainty: One of the major demonstrations in the error propagation paper was that advanced climate models project air temperature merely as a linear extrapolation of GHG forcing.

Figure 3, panel a: points are the IMAGE 3.0 air temperature projection of, blue, scenario SSP1; and red, scenario SSP3. Full lines are the emulations of the IMAGE 3.0 projections: blue, SSP1 projection, and red, SSP3 projection, made using the linear emulation equation described in the published analysis of CMIP5 models. Panel b is as in panel a, but also showing the expanding 1 s root-sum-square uncertainty envelopes produced when ±2.7 Wm⁻² of annual average LWCF calibration error is propagated through the SSP projections.

In Figure 3a above, the points show the air temperature projections of the SSP1 and SSP3 storylines, produced using the IMAGE 3.0 climate model. The lines in Figure 3a show the emulations of the IMAGE 3.0 projections, made using the linear emulation equation fully described in the error propagation paper (also in a 2008 article in Skeptic Magazine). The emulations are 0.997 (SSP1) or 0.999 (SSP3) correlated with the IMAGE 3.0 projections.
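The emulation equation itself is fully specified in the linked paper; structurally, it projects the anomaly as a linear function of cumulative GHG forcing, which can be sketched as follows (the slope value here is a hypothetical placeholder, not the published coefficient):

```python
def emulate_anomaly(forcing_increments, slope=0.6, intercept=0.0):
    """Toy linear emulator: anomaly_t = slope * (cumulative forcing at t) + intercept.
    slope (C per W m^-2) and intercept are hypothetical fit parameters."""
    cumulative, anomalies = 0.0, []
    for delta_f in forcing_increments:
        cumulative += delta_f
        anomalies.append(slope * cumulative + intercept)
    return anomalies

# Hypothetical constant annual forcing increments (W m^-2):
print([round(a, 3) for a in emulate_anomaly([0.035] * 5)])
```

That a one-parameter linear map can track a GCM projection to a 0.997+ correlation is the substance of the emulation claim: whatever the model's internal complexity, its output behaves as a linear extrapolation of forcing.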

Figure 3b shows what happens when ±2.7 Wm⁻² of annual average LWCF calibration error is propagated through the IMAGE 3.0 SSP1 and SSP3 global air temperature projections.

The uncertainty envelopes are so large that the two SSP scenarios are statistically indistinguishable. It would be impossible to choose either projection or, by extension, any SSP air temperature projection, as more representative of evolving air temperature because any possible change in physically real air temperature is submerged within all the projection uncertainty envelopes.

An Interlude –There be Dragons: I’m going to entertain an aside here to forestall a previous hotly, insistently, and repeatedly asserted misunderstanding. Those uncertainty envelopes in Figure 3b are not physically real air temperatures. Do not entertain that mistaken idea for a second. Drive it from your mind. Squash its stirrings without mercy.

Those uncertainty bars do not imply future climate states 15 C warmer or 10 C cooler. Uncertainty bars describe a width where ignorance reigns. Their message is that projected future air temperatures are somewhere inside the uncertainty width. But no one knows the location. CMIP6 models cannot say anything more definite than that.

Inside those uncertainty bars is Terra Incognita. There be dragons.

For those who insist the uncertainty bars imply actual real physical air temperatures, consider how that thought succeeds against the necessity that a physically real ±C uncertainty requires a simultaneity of hot-and-cold states.

Uncertainty bars are strictly axial. They stand plus and minus on each side of a single (one) data point. To suppose two simultaneous, equal in magnitude but oppositely polarized, physical temperatures standing on a single point of simulated climate is to embrace a physical impossibility.

The idea impossibly requires Earth to occupy hot-house and ice-house global climate states simultaneously. Please, for those few who entertained the idea, put it firmly behind you. Close your eyes to it. Never raise it again.

And Now Back to Our Feature Presentation: The following Table provides selected IMAGE 3.0 SSP1 and SSP3 scenario projection anomalies and their corresponding uncertainties.

Table: IMAGE 3.0 Projected Air Temperatures and Uncertainties for Selected Simulation Years

Storyline | 1 Year (C) | 10 Years (C) | 50 Years (C) | 90 Years (C)

Not one of those projected temperatures is different from physically meaningless. Not one of them tells us anything physically real about possible future air temperatures.

Several conclusions follow.

First, CMIP6 models, like their antecedents, project air temperatures as a linear extrapolation of forcing.

Second, CMIP6 climate models, like their antecedents, make large scale simulation errors in cloud fraction.

Third, CMIP6 climate models, like their antecedents, produce LWCF errors enormously larger than the tiny annual increase in tropospheric forcing produced by GHG emissions.

Fourth, CMIP6 climate models, like their antecedents, produce uncertainties so large and so immediate that air temperatures cannot be reliably projected even one year out.

Fifth, CMIP6 climate models, like their antecedents, will have to show about 1000-fold improved resolution to reliably detect a CO2 signal.

Sixth, CMIP6 climate models, like their antecedents, produce physically meaningless air temperature projections.

Seventh, CMIP6 climate models, like their antecedents, have no predictive value.

As before, the unavoidable conclusion is that an anthropogenic air temperature signal cannot have been, nor presently can be, evidenced in climate observables.

I’ll finish with an observation made once previously: we now know for certain that all the frenzy about CO₂ and climate was for nothing.

All the anguished adults; all the despairing young people; all the grammar school children frightened to tears and recriminations by lessons about coming doom, and death, and destruction; all the social strife and dislocation. All of it was for nothing.

All the blaming, all the character assassinations, all the damaged careers, all the excess winter fuel-poverty deaths, all the men, women, and children continuing to live with indoor smoke, all the enormous sums diverted, all the blighted landscapes, all the chopped and burned birds and the disrupted bats, all the huge monies transferred from the middle class to rich subsidy-farmers:

All for nothing.

Finally, a page out of Willis Eschenbach's book (Willis always gets to the core of the issue): if you take issue with this work in the comments, please quote my actual words.

via Watts Up With That?


October 27, 2020 at 08:50AM


Guardian Needlessly Alarmed By Late Freeze

By Paul Homewood

The Guardian is working itself up into a lather over the Arctic again!

For the first time since records began, the main nursery of Arctic sea ice in Siberia has yet to start freezing in late October.

The delayed annual freeze in the Laptev Sea has been caused by freakishly protracted warmth in northern Russia and the intrusion of Atlantic waters, say climate scientists who warn of possible knock-on effects across the polar region.

Ocean temperatures in the area recently climbed to more than 5C above average, following a record-breaking heatwave and the unusually early decline of last winter's sea ice.

The trapped heat takes a long time to dissipate into the atmosphere, even at this time of the year when the sun creeps above the horizon for little more than an hour or two each day.

Graphs of sea-ice extent in the Laptev Sea, which usually show a healthy seasonal pulse, appear to have flat-lined. As a result, there is a record amount of open sea in the Arctic.

“The lack of freeze-up so far this fall is unprecedented in the Siberian Arctic region,” said Zachary Labe, a postdoctoral researcher at Colorado State University. He says this is in line with the expected impact of human-driven climate change.

“2020 is another year that is consistent with a rapidly changing Arctic. Without a systematic reduction in greenhouse gases, the likelihood of our first ‘ice-free’ summer will continue to increase by the mid-21st century,” he wrote in an email to the Guardian.

The warmer air temperature is not the only factor slowing the formation of ice. Climate change is also pushing more balmy Atlantic currents into the Arctic and breaking up the usual stratification between warm deep waters and the cool surface. This also makes it difficult for ice to form.

“This continues a streak of very low extents. The last 14 years, 2007 to 2020, are the lowest 14 years in the satellite record starting in 1979,” said Walt Meier, senior research scientist at the US National Snow and Ice Data Center. He said much of the old ice in the Arctic is now disappearing, leaving thinner seasonal ice. Overall the average thickness is half what it was in the 1980s.

The downward trend is likely to continue until the Arctic has its first ice-free summer, said Meier. The data and models suggest this will occur between 2030 and 2050. “It’s a matter of when, not if,” he added.


Let’s deal with a couple of points first:

1) As Walt Meier notes, all of these so-called “records” only date back to 1979, in the middle of the period when the Arctic was undergoing substantial cooling and a massive increase in sea ice extent, as HH Lamb observed:



HH Lamb: Climate, History & The Modern World

The idea that the 1970s and 80s represent some kind of norm, either in the short or long term, is unscientific and absurd.

2) The article also notes:

The warmer air temperature is not the only factor slowing the formation of ice. Climate change is also pushing more balmy Atlantic currents into the Arctic and breaking up the usual stratification between warm deep waters and the cool surface.

In fact, the influx of warmer Atlantic waters is key to the recent warming of the Arctic, just as it was in a similar period of Arctic warming between the 1920s and 50s,

It is that factor which is increasing air temperatures, and there is no evidence that this influx has been caused by global warming.

3) Once again, we see the nonsense about "ice-free Arctics," which keeps getting put back another decade or two. Previous scares have not materialised, and this latest one won't either, for a very good reason. The Arctic is a very cold place from autumn through to spring, when the sun goes down, and as a consequence there is always far too much sea ice around by June for it all to melt away in the short Arctic summer.

Now to the current situation.

Ice growth has just begun in the Laptev, about a week later than last year:





However, if we compare the whole of the Arctic basin with the same date last year, we find that sea ice is much more extensive this year on the western side, off the Canadian coast. Ice is also currently much thicker in the central Arctic than it was last year.

As a result, sea ice volume is actually up on last year:


In other words, swings and roundabouts.

One final consideration. At this time of year, virtually no heat from the sun enters the Laptev Sea. Instead, open seas mean that a lot of the heat escapes into the atmosphere and is thence lost to space.

Low ice extent in the Arctic actually cools the earth, not the opposite. It is one of the ways in which the earth’s climate regulates itself.



October 27, 2020 at 07:27AM




Winter is coming in hard and strong this year, and it’s taking names ACROSS the Lower-48. Hundreds of new low temperature records have been set over the past few days alone, but all have been eclipsed by the “biggie” set Sunday in Montana.

HUNDREDS of cold and snow records have fallen of late: from Texas to Montana, many of the lowest temperatures and the highest snowfalls ever recorded at this time of year are not only being broken, they're being SMASHED.

Serving as just a few examples:

The National Weather Service reported two broken snowfall records at their Marquette office: “We recorded 8.3 inches [on Sunday], which breaks the old record of 3.1 inches set in 1976 [solar minimum of cycle 20]! This recent snowfall also established a new monthly snowfall record for the month of October at our office. Total snowfall recorded for the month stands at 19.2 inches! This breaks the old record of 18.6 inches set in 1979.”

Eastern Idahoans woke to bone-chilling weather Monday morning, reports eastidahonews.com. According to NWS data, Idaho Falls saw a low of just 1F, utterly shattering the previous record of 17F. In addition, Pocatello reached 3F, smashing its previous record low of 13F. The previous day, Sunday, also saw new record lows of 8 degrees in Idaho Falls and 11 degrees in Pocatello.

For more:

Hundreds of All-Time Records Fall Across North America: “This cold weather is not normal!”


“Killing Freeze” hits Kansas as all-time Cold and Snow Records Fall

As detailed within the articles linked above, the record books have been rewritten from Texas to Montana–but it’s that latter state which claimed the “biggie” during the early hours of Sunday morning, October 25, 2020.

According to NWS data, and as reported by ABCnews.com (one of only a few MSM outlets covering this, but even they've buried it under the Cali wildfires): “the temperature in Montana fell to a record breaking 29 degrees below zero, the lowest temperature measured at an official climate station anywhere in the lower 48 states so early in the season in any year.”

That’s a rather ugly, long-winded sentence — so I’ll break it down for you: “the Grand Solar Minimum is upon us, so get your s**t together already!”

The Washington Post has since covered it too, to be fair (though they don’t run it as the headline), writing late Monday evening: “Temperatures throughout much of the Rockies dipped below zero to start the week, falling as low as minus-29.2 in Potomac, Mont., early Sunday — the coldest temperature ever observed this early in the season across the Lower 48.”




The WP calls the ongoing cold “off the charts, with an air mass more typical of December or January than late October.”

Corby Dickerson, a meteorologist at the NWS in Missoula, notes that the U.S. historical temperature database contains 14.5 million observations from Oct 1 to Oct 25, and that Potomac’s reading on Sunday morning was the coldest!

“It’s truly remarkable,” said Dickerson. “There’s no other way to describe it.”


-29°F: Potomac

-20°F: Marias Pass Summit

-19°F: Seeley Lake

-17°F: Polebridge

-17°F: Clearwater Junction

-13°F: Ronan Airport


A slew of other all-time benchmarks have come crashing down.

Two out of the past three mornings have been among Missoula’s coldest three in October, and more records are expected to fall later in the week.

“I’ve been describing it as a once-in-a-century event,” concluded Dickerson. But I’m not so sure it’ll be another 100 years before the return of such extreme LOW TEMPERATURES. Evidence is building to suggest the mid-latitudes are beginning to REFREEZE in line with historically low solar activity, cloud-nucleating Cosmic Rays, and a meridional jet stream flow.

Both NOAA and NASA appear to agree, if you read between the lines, with NOAA saying we’re entering a ‘full-blown’ Grand Solar Minimum in the late-2020s, and NASA seeing this upcoming solar cycle (25) as “the weakest of the past 200 years”, with the agency correlating previous solar shutdowns to prolonged periods of global cooling here.

Furthermore, we can’t ignore the slew of new scientific papers stating the immense impact The Beaufort Gyre could have on the Gulf Stream, and therefore the climate overall.

Prepare accordingly— learn the facts, relocate if need be, and grow your own.

Social Media channels are restricting Electroverse’s reach: Twitter are purging followers while Facebook are labeling posts as “false” and have slapped on crippling page restrictions.

Be sure to subscribe to receive new post notifications by email (the box is located in the sidebar >>> or scroll down if on mobile).

And/or become a Patron, by clicking here: patreon.com/join/electroverse, and/or consider “allowing ads” for http://www.electroverse.net if you use a blocker.

The site receives ZERO funding, and never has. So any way you can, help us spread the message so others can survive and thrive in the coming times.

Grand Solar Minimum + Pole Shift

The post The United States (Lower-48) just set its Coldest Temperature ever Recorded this early in the season appeared first on Electroverse.


Reviewing the IPCC’s Reports: Are Critics Even Heard?

By Kalte Sonne

The IPCC regularly publishes climate assessment reports as well as special reports on particular topics, written by thousands of scientists. Who actually checks the accuracy of the IPCC texts? Does the quality assurance work, or do certain groups stamp their personal mark on the reports? A new video by Sebastian Lüning explains the review process for IPCC reports and analyzes its strengths and weaknesses.

If you like the video, why not subscribe to the channel “Klimawandel Crashkurs”? To do so, click “subscribe” on the clip’s YouTube page.

The video follows on from this previous clip:

German IPCC coordination office: https://www.de-ipcc.de/226.php
IPCC Procedures: https://www.ipcc.ch/documentation/pro…
Chris Landsea: http://landscapesandcycles.net/chris-…
Richard Tol (1): http://richardtol.blogspot.com/2014/0…
Richard Tol (2): http://nofrakkingconsensus.blogspot.c…
Richard Tol (3): https://voxeu.org/article/regulating-…
Ross McKitrick: https://www.thegwpf.org/images/storie…
Groupthink: https://lexikon.stangl.eu/3606/gruppe…
Advocatus Diaboli: https://de.wikipedia.org/wiki/Advocat…
Red Teams: https://de.wikipedia.org/wiki/Red_Team
Book by Fritz Vahrenholt and Sebastian Lüning: “Unerwünschte Wahrheiten: Was Sie über den Klimawandel wissen sollten”, Langen Müller Verlag, Munich, 347 pages, published September 2020. List of sources at https://www.unerwuenschte-wahrheiten.de


Climate-neutral by 2050: the plan for a “green economic miracle” has one decisive flaw. So runs the headline in DIE WELT. Daniel Wetzel examines the plans and finds that feasibility in principle does not yet amount to a plan. Even climate think tanks agree with him.

“The criticism that climate-protection experts recently leveled at a similar study by the Fridays for Future movement therefore applies in principle to the new Agora study as well: simply demanding the multiplication of all eco-technologies is not, in itself, a plan. Once again, only targets are set, without naming concrete implementation steps.” … “The week before, the Wuppertal Institute had calculated for Fridays for Future what the even more ambitious goal of climate neutrality by as early as 2035 would mean. Brigitte Knopf, Secretary General of the Mercator Research Institute on Global Commons and Climate Change (MCC), had questioned the validity of these calculations on Twitter: ‘An analysis of economic feasibility is almost completely missing.’”

Read the full article in DIE WELT.


Brazil’s forests are burning again, or should one say, still? DER SPIEGEL reports. So far the great public outcry has failed to materialize, even though the number of fires has reached a new high. According to the Spiegel, Brazil’s president has ordered the firefighters back. Forests are important carbon sinks, and losing them is bad in several respects. Professor Hans-Werner Sinn’s proposal that the EU should buy the Amazon seems odd at first glance. At second glance, much less so.

It would also be consistent to stop burning wood worldwide, because forests everywhere are important climate factors. The film Burned depicts the catastrophic damage done in the USA by clear-cutting forests and then burning the wood in so-called biomass power plants. We are always glad to point to it. In Germany, too, there are efforts to make wood the new coal. Even studies such as the recently presented one from the Wuppertal Institute envisage it. Germany’s Nabu has a clear position on the matter.

Consumers can do something as well, by the way. Anyone who settles for meat from the local region instead of from South America makes it less attractive to torch forests for cattle ranching or soybean cultivation.


A short video illustrates what this looks like at large scale. You can watch it here and consider whether this is the right way to go.


Tesla enjoys the reputation of a savior. Until now, car recalls were more the privilege of the traditional manufacturers. Well, Elon Musk cannot walk on water either, and Tesla currently has to recall 30,000 vehicles in China. Read more in the LA Times.


Is wood the new coal? In Hamburg there are serious plans to burn bush wood from Namibia for electricity generation. Namibia, in turn, imports coal for its own power supply. Could it get any crazier? Robin Wood is protesting against the project.


Der Spiegel presents the climate topic in multimedia form. The third part of the series deals with forests, or more precisely the loss of forests. It looks impressive, without a doubt, and it is also very informative. But how they manage to leave out the burning of trees from Western forests (also known as biomass) entirely is astonishing. The focus lies solely on agriculture in South America and Asia as the culprits for forest loss. Not a word about studies like the one by Fridays For Future that envisage biomass (burning wood), nor about lobbyists who have no problem with burning wood at home but gladly point at every forest fire in the world and claim that more wind turbines in Germany would put it out at once. It takes skill to be blind in that eye, or does the Spiegel have a blind spot?


In a column in the TAZ, Peter Unfried addresses the radicalization of climate movements, for which he has little sympathy.

“But if the climate-policy movement only radicalizes itself, it too will end up in the elitist nirvana of professional know-it-alls.”


Second episode of the Quaschning podcast. This time’s studio guest: Reiner Wahlkampf.


Book author Dr. Daniel Stelter has conducted an exceptionally readable interview with Prof. Dr.-Ing. Holger Watter. Watter is a professor of systems engineering at the Flensburg University of Applied Sciences. In the interview he addresses many scientific and physical facts that will certainly not sit well with some protagonists of the Energiewende, who otherwise love to point to the science. Stelter calls such people the alchemists of the modern age.

“The main challenge lies in society’s capacity for discussion, because broad sections of the population and a large share of the supposed experts cannot distinguish between ‘kW’ and ‘kWh’ and simplify the challenges with gross negligence. This enables political and economic business models that serve lobby interests, privatize profits, socialize risks and contribute nothing to solving the problem.”


“The best-known and, until recently, largest battery storage facility in the world is the Hornsdale Power Reserve in Australia, with a capacity of 194 MWh (for roughly €100 million). Suppose this storage facility were to carry Germany through one windless night at a low load of about 50 GW: that would mean 194 MWh / 50,000 MW = 3.88 × 10⁻³ h = 14 seconds! Conclusion: batteries find broad application only in small mobile devices with low power draw (in the milliwatt range)…”
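Watter's back-of-envelope figure is easy to verify; the 194 MWh capacity and the assumed 50 GW night load are the values quoted in the interview, and the rest is plain unit conversion:

```python
# Check: how long could Hornsdale (194 MWh) carry a 50 GW load?
capacity_mwh = 194          # Hornsdale Power Reserve capacity, as quoted
load_mw = 50_000            # assumed low German night load of 50 GW

duration_h = capacity_mwh / load_mw   # hours of supply at that load
duration_s = duration_h * 3600        # same figure in seconds

print(f"{duration_h:.2e} h  =  {duration_s:.1f} s")
```

This reproduces the quoted 3.88 × 10⁻³ hours, i.e. roughly 14 seconds of national supply.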

This truly fascinating interview (there is also a podcast) can be found here.


The many colors of hydrogen. Now another color is being added: white. That is the hydrogen which forms naturally and can be extracted like natural gas. In Europe there are deposits in Scandinavia and, to a limited extent, in Germany. Extraction costs amount to only 20% of the costs incurred in electrolysis. Daniel Wetzel covers this hydrogen in DIE WELT. The article is behind a paywall.

The post Die Begutachtung der Berichte des Weltklimarats: Werden Kritiker überhaupt gehört? first appeared on Kalte Sonne.


The Curious Incident of the Test that was Negative in the Night Time

Before I start, I must confess that I am no Sherlock Holmes. What is more, my understanding of virology extends no further than is to be expected after having caught influenza more than once. Nevertheless, such experience alone should be sufficient to instil a healthy fear of what SARS-CoV-2 may do to an ailing and aging male body – no matter how sceptical that body may be. But when one witnesses and experiences the civic and economic damage that a government is prepared to inflict upon its people in order to manage a pandemic, the fear can become anything but healthy.

Given such mental health challenges, one certainly would not welcome any further distress arising from the simple desire to understand the case statistics upon which governments are basing their decision-making. Unfortunately, that is exactly the position I am in. There are things I think I know for certain, and there are things that have happened that appear to flatly contradict those certainties. This is all very destabilizing. I’ll start, if I may, with the widely understood certainties, after which you are invited to follow me down the rabbit hole.

Firstly, when interpreting a medical diagnostic test result, one has to take into account the possibility of false negatives (i.e. tests that fail to detect the presence of a disease) and false positives (i.e. tests that record the presence of the disease, notwithstanding its absence). The rates at which a test avoids these two errors are referred to, respectively, as its sensitivity and specificity. RT-PCR testing is no exception to this rule. Indeed, The Lancet has advised that the specificity of RT-PCR testing is such that between 0.8% and 4% of positive test results are likely to be false positives. When the a priori probability of the disease is high (for example, when testing those who are presenting symptoms or have been in contact with a confirmed case) the number of false positives will be significantly exceeded by true positives, and so a positive test result is highly significant. However, once testing becomes more random, the a priori probability drops and the false positives start to dominate, to the extent that the test results become pretty meaningless. All of this is very uncontroversial; it is just standard Bayesian statistics and a reminder of the dangers of base rate neglect. Indeed, the British Medical Journal has produced an online tool that enables anyone to try various a priori probabilities to see how this affects the reliability of RT-PCR test results.
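The base-rate effect described above can be made concrete with Bayes' theorem. The sketch below assumes, purely for illustration, a 99% specificity (a 1% false-positive rate, within the 0.8–4% range attributed to The Lancet) and an 80% sensitivity:

```python
def positive_predictive_value(prevalence, sensitivity, specificity):
    """P(infected | positive test), via Bayes' theorem."""
    true_pos = prevalence * sensitivity
    false_pos = (1 - prevalence) * (1 - specificity)
    return true_pos / (true_pos + false_pos)

# As the a priori probability (prevalence) falls, false positives dominate.
for prevalence in (0.20, 0.01, 0.001):
    ppv = positive_predictive_value(prevalence, 0.80, 0.99)
    print(f"prior {prevalence:>6.1%}:  P(infected | positive) = {ppv:.1%}")
```

With a 20% prior a positive result is about 95% reliable; at 0.1% prevalence it is right less than 8% of the time, which is exactly the base rate neglect trap.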

So imagine my surprise when the UK’s Office of National Statistics wrote this about their national COVID-19 Infection Survey:

“We know the specificity of our test must be very close to 100%”

Their logic was impeccable. If, as they claimed, only 159 positive test results were found in a sample of 208,000, then the least that the specificity could be was 99.92% — a full order of magnitude more specific than the most optimistic figure quoted by Lancet. Given the random nature of the ONS testing, and the relatively low prevalence of Covid-19 within the broader community, the specificity suggested by Lancet would have meant encountering far more false positive test results than genuine ones, and it seems more than a little convenient to me that this had not proven to be the case with the ONS survey. Even more puzzling was the apparent lack of curiosity within the scientific and journalistic communities. Rather than question these results, everyone seemed happy to assume that the ONS was using some especially accurate test technology, despite there being nothing on the ONS website to justify such an assumption. On the contrary, the ONS academic partners have confirmed there was nothing out of the ordinary about their testing arrangements:

“The nose and throat swabs are sent to the National Biosample Centre at Milton Keynes. Here, they are tested for SARS-CoV-2 using reverse transcriptase polymerase chain reaction (RT-PCR). This is an accredited test that is part of the national testing programme.”

On the face of it, a team of top-class statisticians were working back from their data to deduce a test specificity that flew in the face of all of the known science regarding RT-PCR testing, and no one seemed the least bit concerned about this.
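The ONS's claimed specificity floor is simple arithmetic: even in the worst case where every one of the 159 positives were false, the false-positive rate could not exceed 159 out of 208,000. A two-line check:

```python
# Lower bound on specificity implied by the ONS survey figures quoted above.
positives, sample_size = 159, 208_000

min_specificity = 1 - positives / sample_size   # assume ALL positives are false
print(f"specificity >= {min_specificity:.4%}")  # i.e. at least 99.92%
```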

Normally, in these circumstances, it is safe to assume that one is missing something very significant. It would only require someone to point out my mistake and I would be able to move on, albeit somewhat chastened and embarrassed. I have tried to resolve the mystery myself, but the best I have come up with is the rather outlandish theory that the ONS sample size of 208,000 was completely misleading. If (let’s say, due to quality control problems) the effective number was nearer to 50,000, then the small number of positive results can still be reconciled with the expected Covid-19 prevalence and a more plausible RT-PCR specificity. But other than to point to the fact that survey participants from 12 years old upwards were allowed to self-administer the swabs, I could think of no credible excuse for assuming that such a catastrophic failure in quality control had taken place. I had no alternative but to live with the prima facie contradiction and get on with life. But then I came across the New Zealand Ministry of Health’s Covid-19 statistics.

If New Zealand is to be believed, by early May, only 25 of its 1,138 Covid-19 cases had been asymptomatic. That represents only 2.2% of the cases, and it contrasts sharply with the statistics arising in other countries (e.g. 40% in US nursing homes and 90% at Northumbria University). Just as problematic is the fact that the New Zealand figures were determined as a result of extensive community testing, i.e. circumstances where false positives would be certain to dominate the asymptomatic Covid-19 headcount, and single-handedly account for far more than 25 individuals. Not only does New Zealand owe the world an explanation for its low asymptomatic count, it also needs to explain how, like the UK’s ONS, they were able to achieve near 100% specificity with RT-PCR testing. Furthermore, there is this online statement to be accounted for:

 “When tests were done on samples without the virus, the tests correctly gave a negative result 96% of the time.”

This is a far from impressive specificity, and one which should result in a significant false positive problem for the NZ Ministry of Health to deal with. And yet, only a couple of paragraphs later they say:

“We expect very few (if any) false positive test results…”

And yet, despite this completely illogical expectation, they are proven correct? This is beginning to make the ONS conundrum look perfectly straightforward in comparison.
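The scale of the false-positive problem at the stated 96% specificity is easy to estimate; the community-testing volumes below are assumptions purely for illustration, since the text does not give New Zealand's exact test counts:

```python
# Expected false positives among uninfected people tested, at 96% specificity.
specificity = 0.96  # "correctly gave a negative result 96% of the time"

for uninfected_tested in (1_000, 10_000, 100_000):
    expected_fp = (1 - specificity) * uninfected_tested
    print(f"{uninfected_tested:>7,} uninfected tested -> "
          f"~{expected_fp:,.0f} false positives expected")
```

Even a few thousand community tests should, on these numbers, generate more false positives than the 25 asymptomatic cases New Zealand reported.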

I trust that you can now see why I should be left so utterly confused. Two organisations that we should presume to be above reproach are making statements that just do not add up. It is no wonder that I am beginning to doubt my own rationality and powers of comprehension. I am hugely sceptical regarding the ONS and New Zealand figures but I feel obliged to be simultaneously sceptical of my own scepticism. Sir Arthur Conan Doyle famously believed in fairies, so I ought to feel in good company. However, I can’t help but suspect that entertaining such cognitive dissonance for any length of time is the sure path to madness. If someone doesn’t rush to my rescue soon and point out where I am going wrong I may end up in an institution listening to the sceptical voices in my head.

Oh yes I will.

Posted on 27 Oct 20 by JOHN RIDGWAY





NWS Kansas meteorologists warn of a “widespread killing freeze” after unprecedented October cold and snow laid waste to the record books.

As reported by kansas.com, the National Weather Service (NWS) in Wichita issued a winter weather advisory on Sunday running through 1 a.m. Tuesday for central, south-central and southeast Kansas. The forecast called for snow, sleet and freezing rain: “Plan on slippery road conditions,” reads the advisory. “The hazardous conditions will impact the morning and evening commutes.”

The city wasn’t able to pre-treat its roads with salt on Sunday due to wet conditions, but efforts belatedly began in the early hours of Monday: “We did activate our full response as of midnight,” Ben Nelson, a city public works administrator, said Monday morning. “Once our crews got on scene, we deployed all 60 of our trucks and began to apply the salt and the sand mix across all 1,500 lane miles of arterial (roads) and the 300 lane miles of our secondary and school routes,” Nelson said.

The flakes started falling early Monday morning, as forecast — however, that original NWS advisory vastly underestimated the volume. The snow continued throughout the morning, to levels far greater than city crews had expected.

So much snow fell that city workers needed to use the plows on the front of the dump trucks to clear the roads, something crews try to avoid because 1) it significantly slows down the trucks, and 2) it runs the risk of scraping the already applied salt off the road.



After initially forecasting just a trace, NWS “officially” measured 1.3 inches of snow as of 10:50 a.m. Monday–although the scene on the ground looked far worse in places. Still, that official reading of 1.3 inches almost tripled the previous Oct 26 record of 0.5 inches set way back in 1913 (solar minimum of cycle 14).

Monday’s snow also set another, even more impressive record. According to an NWS tweet, Monday witnessed “the most snow Wichita has ever received this early in the season.”


This beat out the previous earliest 1+ inch snowfall, set on Oct 28, 1905:

Note Wichita’s average date for the first 1+ inch of snow: Dec 19th!

Record cold accompanied the record snow, further hampering city clearing efforts. Monday morning’s low of 24F broke the Wichita record for coldest ever low for the date — the old mark being the 25F set in 1957.


The city also broke its lowest-max for Oct 26, busting the 32F, also set in 1957 — though this record has yet to be officially logged.

Looking forward, the NWS Wichita hazardous weather outlook predicts “a widespread killing freeze” Monday night, to be followed by a wintry mix of precipitation across much of the area on Tuesday, continuing into early Wednesday morning.

Additional snow and ice accumulations are possible through Wednesday afternoon, and as kansas.com points out: “Any measurable snowfall on Tuesday in Wichita would set a record, as the weather service has never recorded snow accumulations on Oct 27. The record low temperature of 23 degrees, set in 1957, and the coolest high of 37 degrees, set in 1911, are both in jeopardy.”

The COLD TIMES are returning; the mid-latitudes are REFREEZING in line with historically low solar activity, cloud-nucleating Cosmic Rays, and a meridional jet stream flow.


The post “Killing Freeze” hits Wichita, Kansas as all-time Cold and Snow Records Fall appeared first on Electroverse.

It’s The Poor Who Will Pay

By Paul Homewood

A couple of thoughts!

We have talked about air source heat pumps, and we all understand that the colder the weather, the less efficient and more costly they become to operate.

What this means, of course, is that the further north you live, the more you will be penalised financially.
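The physics behind that penalty can be sketched with the ideal (Carnot) heating COP, T_hot / (T_hot - T_cold) in kelvin, of which real units achieve only a fraction. The figures below (a 35 °C flow temperature and a 0.4 real-world factor) are illustrative assumptions, not measured data:

```python
# Why an air-source heat pump costs more to run in colder weather: the
# coefficient of performance (heat out per unit of electricity in) falls
# as the outdoor temperature drops.
FLOW_TEMP_K = 35 + 273.15   # assumed radiator flow temperature
CARNOT_FRACTION = 0.4       # assumed fraction of ideal COP a real unit achieves

def cop(outdoor_c):
    t_cold = outdoor_c + 273.15
    return CARNOT_FRACTION * FLOW_TEMP_K / (FLOW_TEMP_K - t_cold)

for outdoor in (7, 0, -10):
    print(f"{outdoor:>4} degC outside: COP ~ {cop(outdoor):.1f}")
```

On these assumptions the COP drops from about 4.4 at 7 °C outside to below 2.8 at -10 °C, i.e. roughly 60% more electricity for the same heat.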

Now combine this thought with electric cars.

Last year our local shopping centre installed half a dozen electric car chargers. (Incidentally, they also had to build a new substation to feed them – I’ve no idea what rating it has, but physically the building that houses it is the same size as the original substation which feeds the whole centre).

I had a closer look yesterday, and the charge is 35p/kWh, which compares to a domestic rate of about 14p. Obviously the mark-up is perfectly fair, as whoever runs it needs to recover their capital costs and overheads.

According to Nissan, the Leaf’s 62 kWh battery gives a range of up to 239 miles. At 35p/kWh, this equates to 9.1p/mile. At the domestic rate, this comes down to 3.6p/mile.

If you do 10,000 miles a year, public charging will cost you £550 extra a year. And that is what you might end up having to do if you are not lucky enough to have off street parking.

A comparable diesel car, the Ford Focus, gets up to 55 mpg, which at current prices would cost 9.8p/mile. But more importantly, when you exclude fuel duty of 57.95p/litre, this cost drops to 5p/mile.
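The per-mile arithmetic above can be reproduced in a few lines. The pump price of about 118.6p/litre is back-calculated here to match the quoted 9.8p/mile and is an assumption; the other inputs are the figures given in the text:

```python
LITRES_PER_UK_GALLON = 4.546

# Nissan Leaf: 62 kWh battery, up to 239 miles of range.
public_p_per_mile = 62 * 35 / 239     # 35p/kWh at the public charger
home_p_per_mile = 62 * 14 / 239       # 14p/kWh domestic rate
extra_per_10k_miles = (public_p_per_mile - home_p_per_mile) * 10_000 / 100  # £

# Ford Focus diesel: up to 55 mpg (UK gallons).
pump_price_p_per_litre = 118.6        # assumed, to match the quoted 9.8p/mile
fuel_duty_p_per_litre = 57.95
diesel_p_per_mile = pump_price_p_per_litre * LITRES_PER_UK_GALLON / 55
diesel_ex_duty = (pump_price_p_per_litre - fuel_duty_p_per_litre) \
    * LITRES_PER_UK_GALLON / 55

print(f"EV: public {public_p_per_mile:.1f}p/mile, home {home_p_per_mile:.1f}p/mile, "
      f"extra ~£{extra_per_10k_miles:.0f} per 10,000 miles")
print(f"Diesel: {diesel_p_per_mile:.1f}p/mile "
      f"({diesel_ex_duty:.1f}p/mile excluding duty)")
```

On the unrounded figures the extra cost comes out near £545 a year; the £550 in the text reflects the rounded 9.1p and 3.6p per mile.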

So not only would the poor sucker without a driveway be much worse off than the rich guy with the big house down the road, he would even be much worse off than he was before, when he was driving his diesel.

And if he happens to live up north in Newcastle, he had better take out a bank loan!



November 24, 2020 at 05:51AM

A New Sediment Archive for Historical Climate Research

TU Graz, Andre Baldermann, Institute of Geosciences

By Kalte Sonne

Press release from Graz University of Technology (TU Graz), October 21, 2020:

TU Graz researchers discover new sediment archive for historical climate research

Geological investigations of low-temperature, geologically young deposits at the Styrian Erzberg provide paleoclimatology with new data on Earth’s history and its development.

How has the climate changed over the course of Earth’s history? Which climatic processes have influenced the Earth and its atmosphere? Paleoclimatology seeks answers to such questions in order to better understand climate change and to derive prognoses for future climate scenarios. So-called sedimentary archives serve as the basis: rock deposits whose components and properties reveal what temperatures and climatic conditions prevailed at the time of their formation. Correspondingly young geological deposits shed light on the climate development of recent Earth history, from the last ice age 20,000 years ago onward. Compared with widespread seawater deposits, however, sedimentary archives on land, such as in the Alpine region, are very rare.

New data for paleoclimate research

An international consortium led by the Institute of Applied Geosciences (IAG) at TU Graz has now made a sensational discovery here. In a publication in Communications Earth and Environment, the group presents newly discovered, geologically very young deposits at the Styrian Erzberg, whose significance as a sedimentary archive for paleoclimate research has been investigated for the first time. “That we have now found such young geological deposits in a continental sediment archive, of a kind otherwise found only in marine sediment archives, is sensational and a treasure trove of data for climate research,” explains the study’s first author, Andre Baldermann of the IAG.

Low formation temperature and young depositional age

Specifically, these are sedimentary infillings of faults and fissures consisting of the carbonate minerals dolomite, aragonite and calcite. It is known that the carbonate mineral dolomite crystallizes out when seawater evaporates, which in turn requires high temperatures. Baldermann and his team have now shown for the first time that the mineral can also form at temperatures between zero and twenty degrees; until now there were no absolute data on this.
In addition, the researchers found that these are comparatively young minerals in geological terms, formed shortly after the last ice age, about 20,000 years ago, in a non-marine (continental) depositional environment. Baldermann: “This represents a novelty, since young formations of the mineral have until now been tied almost exclusively to seawater deposits.”

Material analysis via a multi-method approach

The analyses employed the full repertoire of geological investigation methods. The rock samples were described microscopically and systematically classified. The mineralogical composition was determined by X-ray diffraction, and the chemical properties were defined using high-resolution electron microscopy. For the age dating and temperature reconstructions, the samples were analyzed elementally and isotopically by state-of-the-art mass spectrometry.
“The wealth of results allowed us to draw conclusions about the water flow, the water composition, the mineral growth and the formation temperatures,” says Baldermann.

Benefits for climate research

“Climate research works largely through the analysis of marine deposits, because here we have a great many sediments (marine sediments, ed.) archived over the entire course of Earth’s history. Continental sediment archives are rare and are only very seldom taken into account. Their deposits usually provide only little information about past environmental conditions,” says Baldermann. He is convinced that the newly published data on the Erzberg deposits will remedy this and open up new perspectives on the climate development of the recent past.

This research is anchored in the Field of Expertise “Advanced Materials Science”, one of five strategic focus areas at TU Graz.

The work was supported by the NAWI Graz Central Lab Water, Minerals and Rocks, by the mining company VA Erzberg, and by researchers from the Universities of Vienna and Graz. Funding was provided by the European Regional Development Fund (ERDF).

Paper: Fracture dolomite as an archive of continental palaeo-environmental conditions
Andre Baldermann, Florian Mittermayr, Stefano M. Bernasconi, Martin Dietzel, Cyrill Grengg, Dorothee Hippler, Tobias Kluge, Albrecht Leis, Ke Lin, Xianfeng Wang, Andrea Zünterl, Ronny Boch
Nature Communications Earth and Environment
DOI: 10.1038/s43247-020-00040-3

The post Ein neues Sedimentarchiv für die historische Klimaforschung first appeared on Kalte Sonne.


via Kalte Sonne

A Poor Climate Balance for Wood Pellets

By Kalte Sonne

On the subject of pellet heating, DER SPIEGEL is unfortunately barking up the wrong tree. The article in question sits behind a paywall, which does not make the misinformation any better, only more expensive. Heating with wood is not climate-neutral. Let’s hope the editor and the perpetually outraged Christian Stöcker never run into each other in the hallways….


Bayerischer Rundfunk, November 6, 2020:

A poor climate balance for wood pellets

Heating with wood pellets is considered environmentally friendly and climate-neutral. As a renewable raw material, the little sawdust sticks are even subsidized by the federal government. But European researchers are now giving wood pellets a poor climate report card.

The idea is simple: when wood is burned, only exactly as much CO2 is released as the tree previously removed from the atmosphere for its growth over past decades. Unlike with fossil fuels such as coal, no additional climate-accelerating CO2 enters the atmosphere. Wood pellets are therefore classed as a renewable fuel and subsidized. One consequence: according to the Centre of Wood Science at the University of Hamburg, wood consumption in Germany has doubled since the early 1990s. Fifty percent of the wood harvested here is now used for energy.

Read more at br.de.


Spektrum.de, 2016:

How wood pellets are ruining the forests

Around half of the renewable energy in Europe is obtained using an archaic technology: the burning of wood. Firewood, pellets and wood chips are replacing gas and oil heating, often in the name of climate protection. But is this return to wood as an energy source really sustainable and environmentally sound?

Read more at Spektrum.de


Planet e, eine Umweltserie des ZDF, hat diesmal das Thema: Wieviel Energiewende verträgt Deutschland? Wenig Licht und leider sehr viel Schatten liegt über der Sendung. Loben sollte man, dass immerhin eine kritische Stimme gehört wurde, es ist die Anna Veronika Wendland, die auf das Dilemma aufmerksam macht, dass Atomkraftwerke demnächst durch fossile Kraftwerke ersetzt werden (müssen), weil die grünen Stromquellen unstet sind und Deutschland sich gleichzeitig aus der Kohle verabschiedet. Ansonsten werden ausschließlich Verfechter der grünen Energien befragt.

Whether it is the inevitable activist Quaschning, the wind-power entrepreneur Lachmann or Agora director Graichen: only the faction of unconditional supporters gets a hearing. Apart from Wendland, not a single critical voice is heard. The author of the program apparently never thought to ask even one of her interviewees how the power supply is to be secured in times of little wind and no sun. Yet that would have been the exciting question, and the protagonists' answers all the more so.

Nor is the fact really addressed that, once the generous subsidies have been pocketed, the installations no longer pay off, and that operators prefer to avoid a market economy with free price formation. Instead, regulation gets the blame and the bureaucracy is bemoaned. Bureaucracy is celebrated, however, when it comes to obligating home builders in Baden-Württemberg to mount a solar array on their roofs.

The portrait of the Dresden company Sunfire, and the problems this company has with the EEG, is certainly interesting. The Saxons' technology is exciting because it apparently suffers smaller efficiency losses when producing hydrogen or methane; the high electricity costs, however, are the stumbling block. But those costs are so high partly because the so-called renewable energies are heavily subsidized and Germany effectively maintains two infrastructures. A classic circular argument.

Granted, 30 minutes is a short time to present a complex topic comprehensively, but a bit more balance would have done the program good. Then the "madness" in the subheadings would not have referred exclusively to the fact that we do not already have far more than the 30,000 wind turbines now standing in the country, enough to satisfy entrepreneurs like Lachmann and activists like Quaschning. The program is available until November 22, 2021.


How do we want to live in the future? According to the left-wing activist Mario Sixtus, it's quite simple: everyone moves to Berlin-Friedrichshain, Hamburg-Ottensen or Köln-Kalk. Are these the first subtle hints of forced resettlement, should politicians from Sixtus's ideological neighborhood ever take the helm? Until now, that sort of thing was known mainly from the former USSR or China. Yikes.


"The fear of the blackout." So reads the headline in WELT, addressing the possible end of the relatively modern Moorburg coal-fired power plant.

"Hamburg's economics senator Michael Westhagemann (independent) shares business's concern. 'We have to make Hamburg weatherproof when it comes to energy supply,' he told WELT AM SONNTAG. 'This is not only a question of security of supply, but also of the running costs, above all for the big electricity consumers in industry.'"

The economics senator and the environment senator of the Hanseatic city assess the situation in entirely different ways. While the Green environment senator Jens Kerstan assumes an oversupply of electricity, economics senator Westhagemann appears concerned about the 380,000-volt grid, because certain industries require power at that level. Once Moorburg closes, there will be no power plant near Hamburg producing it. The full article is here.


Alfred Brandenberger writes about and collects material on hydrogen in his online Vademecum.



Plusminus (ARD, October 7, 2020, 9:45 p.m.) aired a report on struggling wind-power operators. Turbines falling out of the EEG scheme can no longer be operated profitably without further subsidies, even though wind power is supposedly one of the cheapest forms of electricity generation. Yet all investors and operators knowingly accepted this now-emerging situation 20 years ago. It is therefore rather disingenuous to slip into the victim role and keep begging for financial support for a product that cannot survive on the market on its own. There are also growing calls for repowering, which for permitting reasons is not possible everywhere. Politicians are supposed to create the framework, because otherwise, allegedly, climate change cannot be stopped. The alternative is that the turbines must effectively be demolished and elaborately disposed of. That is expensive and also burdens the environment. This, too, was known 20 years ago but was apparently completely suppressed. The worst part of the report was the suggestion that bird shredding should simply be accepted, on the grounds that climate protection equals nature protection. The examples offered in support are simply perfidious.

The goal (per the dena study) of having 210,000 wind turbines by 2050 is a major "challenge" not only because of the lack of public acceptance, but also because it would require putting 16 new turbines into operation every single day for the next 30 years, at a cost of at least €50 million per day. And there is not enough land available on the market; to make it happen, countless private plots would first have to be expropriated by court order. Building a new wind turbine requires a building permit along with expert reports on soil, wind intensity and noise emissions, plus compliance with setback distances and height limits: an unparalleled bureaucratic effort. A brief note: in the first six months of 2020, 245 onshore turbines (781 MW) and 32 offshore turbines (218.9 MW) were added. To reach the target of 210,000 turbines by 2050, however, 3,000 would have had to be added in that half-year alone. On top of that, at the beginning of 2021 some 6,000 wind turbines in Germany will fall out of the EEG scheme as their subsidies expire; in the years after, 1,000 to 2,000 per year will need to be replaced. The madness of a subsidized planned economy becomes obvious. Stefan Aust wrote an apt piece on this in "Welt" on January 25, 2020: "Luftreich der Träume."
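The buildout arithmetic above can be checked in a few lines. A minimal sketch, using the article's own figures (210,000 turbines by 2050, roughly 30,000 standing today, a 30-year horizon); these are the text's numbers, not official statistics:

```python
# Rough check of the wind-turbine buildout figures cited above.
TARGET_2050 = 210_000   # dena-study target cited in the text
EXISTING = 30_000       # approximate current fleet per the text
YEARS = 30

new_turbines = TARGET_2050 - EXISTING         # 180,000 still to build
per_day = new_turbines / (YEARS * 365)        # daily pace the target implies
print(f"{per_day:.1f} new turbines per day")  # ≈ 16.4

# First-half-2020 additions vs. the pace the target implies
h1_2020_added = 245 + 32                # onshore + offshore units actually built
h1_required = new_turbines / YEARS / 2  # per half-year: 3,000
print(h1_2020_added, "built vs.", round(h1_required), "required")
```

So the reported half-year addition of 277 turbines covers less than a tenth of the pace the 2050 target would demand, which is the article's point.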

What is alarming in all this, however, is that neither politicians nor plant operators inform citizens about the difficulties to be expected. Everything is couched in vague, nebulous generalities. A cynic might suspect bad intentions. Citizens actually have a right to open and honest information.


Sensation: Novel charging station delivers free electricity entirely without power plants

Watch the amusing video here.


Focus on November 16, 2020:

Almost 7 percent guaranteed: Grid expansion is sinfully expensive, yet operators can look forward to dream returns

Who wouldn't like a guaranteed 6.91 percent return on their capital? Well, you'd just have to be a big power-grid operator. The many small regional grid operators also profit from the state-cemented profit margins. FOCUS Online shows who pays the bill.

Continue reading at Focus


DGS on November 6, 2020:

So where is it actually produced? My tour of not-always-green hydrogen

A first-hand report by Heinz Wraneschitz

Hydrogen, always hydrogen. Twice this week I attended online events devoted to the chemical element with atomic number 1. I also looked at a new study. And I constantly had the feeling I was in the wrong movie.

Yes, of course hydrogen (H2) is a super energy carrier. When it burns, catalytically or "normally," no CO2 is released as with gasoline, natural gas or coal: 2H2 plus O2 (oxygen) becomes 2H2O (water) plus energy. But where to get the hydrogen? Unlike said fossil fuels, H2 does not normally occur in nature; it first has to be produced, and in precisely the reverse direction: supplying energy (e.g. from electricity) turns 2H2O into 2H2 plus O2. This is called electrolysis, something everyone knows from physics class, fourth grade or so. Alternatively, H2 is sometimes released as a by-product of chemical processes. But energy has to be expended for that, too.
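A back-of-the-envelope sketch of the electrolysis energy balance described above. The heating value, efficiency and price figures are common textbook assumptions, not from the article:

```python
# Rough energy balance for producing hydrogen by electrolysis.
# Assumed values (not from the article): lower heating value of H2
# about 33.3 kWh/kg, electrolyzer efficiency about 70%.
LHV_H2 = 33.3           # kWh of usable energy per kg of hydrogen
ELECTROLYZER_EFF = 0.70 # fraction of input electricity stored in the H2

electricity_per_kg = LHV_H2 / ELECTROLYZER_EFF
print(f"{electricity_per_kg:.1f} kWh of electricity per kg H2")  # ≈ 47.6

# At an assumed electricity price of €0.30/kWh, the electricity
# cost alone per kg of "green" hydrogen:
cost_per_kg = electricity_per_kg * 0.30
print(f"≈ €{cost_per_kg:.2f} per kg")  # ≈ €14.27
```

This is why the article stresses that high electricity prices, not the chemistry, are the bottleneck for green hydrogen.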

Electricity, actually, we still have enough of here. Enough to produce H2, too. But for H2 to really be "green," the electricity must be that color as well. Fine: by now half of the electricity generated here comes from sun, wind, water and waste. But a surplus? At the moment we could skim one off only when the baseload from coal-fired plants (no, we're told they aren't dirty) clogs the north-south transmission grids and thereby robs the wind farms in the north and east of the republic of their feed-in opportunities.

Continue reading at dgs.de

The post "Schlechte Klima-Bilanz für Holzpellets" first appeared on Kalte Sonne.


via Kalte Sonne

Part 2 now online: 10 tips for the fight against CO2

by Kalte Sonne

Part 2 of the series "10 Tips for the Fight Against CO2" covers these topics:

4. Rein in your appetite for meat
5. Sweater and winter jacket instead of heating
6. An end to the cleanliness mania
7. Click shame: climate killer internet

Here is the video:

Part 1 covered:

1. The car has to go
2. Flight shame: forgo travel to distant lands
3. Climate sin clothing

If you haven't seen Part 1 yet, here is the video:

The post "Teil 2 jetzt online: 10 Tipps für den Kampf gegen das CO2" first appeared on Kalte Sonne.


via Kalte Sonne



Last week, videos began circulating on YouTube containing the views of an expert and perfectly positioned pathologist regarding COVID-19. The views were expressed in a five-minute audio recording of a recent Edmonton City Council Community and Public Services Committee meeting in Alberta, Canada.

During the recording, top pathologist Dr. Roger Hodkinson calls the latest COVID strain currently sweeping the globe "the greatest hoax ever perpetrated on an unsuspecting public."

YouTube has since taken a dim view of Dr. Hodkinson's stance, in which he also says COVID-19 "is nothing more than a bad flu season," and the platform is now systematically deleting each and every video containing the eminent doctor's opinion; they don't want people to hear what he has to say.

But it’s even worse than that.

YouTube, in their lofty and self-appointed position as arbiters of truth, are not only deleting each and every video (some of which had gained close to a million views) but are "terminating" each and every account that uploaded them.

At least one of the "terminated" accounts has since started anew and has been quick to re-upload the Dr. Hodkinson audio. The account owner (GN4GN) had this to say on the censorship: "YouTube is blocking and striking my videos, already deleted my other channel that I had for six years."

The likes of YouTube don't want us forming our opinions based on a wide range of views. They instead want us channeled down one very particular path, believing their version of reality. Our fear and compliance are paramount to them "winning," and I'm afraid they played the game perfectly; or rather, society lost it perfectly, aided in no small part by the mainstream media and their 24/7 slew of fear-porn.

Snopes Fact-Check

Snopes claims to be fighting an "infodemic" of rumors and misinformation. But as with every other "fact-checker" out there, they're merely tasked with maintaining the status quo and promoting the mainstream consensus.

Regarding Dr. Hodkinson and his recently expressed views, the best Snopes can do is attack his listed credentials, namely the claim that Dr. Hodkinson is Chairman of the Royal College of Physicians and Surgeons of Canada.

Snopes writes, correctly, that this is false, that the doctor has never held this position; but the video and all subsequent articles that I've read never claimed that he did. His accredited position is "Chairman of a Royal College of Physicians and Surgeons committee in Ottawa"; this is in fact the opening line of my original article on the topic, and it is listed on Dr. Hodkinson's company website.

Embarrassingly, this is the best Snopes can do. They rate the Dr. Hodkinson audio as "Mixture" or "Misleading" but fail to debunk any of his points (other than the aforementioned credential mix-up). Snopes accepts the doctor's six other professional qualifications and achievements, then goes on to baselessly attack his views. But again: 1) Dr. Hodkinson is perfectly positioned to comment on this topic; 2) who fact-checks the fact-checkers?; and 3) all Snopes proceeds to do is cite the mainstream consensus, repeating the fearmongering dross that COVID-19 is the worst thing to plague humanity since, well, the plague, and that we should all be on our knees, bare arm outstretched, begging Big Pharma for a shot.

Also worth noting are the stark similarities that COVID-19 and its end-of-the-world forerunner "Climate Change" share: each is painted as an unmitigated catastrophe, but a catastrophe we can be saved from if only we relinquish our freedoms and alter every aspect of the way we live: The Great Reset.

The Facts

There is no evidence that lockdowns work: see the Great Barrington Declaration.

There is no evidence that masks work: “There is no evidence base for their effectiveness whatsoever,” says Dr. Hodkinson. “[Masks] are simply virtue signalling. Seeing these people walking around like lemmings obeying, without any knowledge base, to put the mask on their face.”

There is no evidence that social distancing works: "COVID is spread by aerosols, which travel 30 meters or so before landing."

There isn't even any evidence that testing works: "Positive test results do not mean a clinical infection," says Dr. Hodkinson, who is chairman of a medical biotechnology company selling a COVID-19 test. "All testing should stop, unless you're presenting to hospital with some respiratory problem. It's driving public hysteria, and all testing should stop. All that should be done is to protect the vulnerable."

The stats are now widely available for anyone to Google: COVID-19 is comparable to the seasonal flu. It has a mortality rate similar to the seasonal flu (even with the massaged figures), and the age group it most impacts is the same as the seasonal flu's. So the response, if we were to contract it, should be the same as in every other year a person contracted the seasonal flu: "We stayed home, we took chicken noodle soup, we didn't visit granny, we decided when we would return to work, we didn't need anyone to tell us."

I will not hand over my freedoms for the seasonal flu.

In fact, even within the bowels and darkest depths of a genuine pandemic, no government should have the right to destroy the livelihoods of its citizens, or to lock people in their homes; this is an abuse of power, absolutely.

And what’s next?

People have so dutifully relinquished their freedoms, most without batting an eyelid. Many are even now fighting on behalf of the restrictors. Society has taken ten steps back in 2020, and my biggest concern is we won’t regain nearly half of them during 2021, and beyond…

“Let people make their own decisions,” concludes Dr. Hodkinson.

“I’m absolutely outraged that this has reached this level. It should all stop tomorrow.”

Dr. Roger Hodkinson on COVID: “This is the Biggest Hoax ever perpetrated on an Unsuspecting Public”


Social media channels are restricting Electroverse's reach: Twitter is purging followers while Facebook is labeling posts as "false" and has slapped on crippling page restrictions.

Be sure to subscribe to receive new post notifications by email (the box is located in the sidebar >>> or scroll down if on mobile).

And/or become a Patron, by clicking here: patreon.com/join/electroverse, and/or consider “allowing ads” for http://www.electroverse.net if you use a blocker.

The site receives ZERO funding, and never has. So any way you can, help us spread the message so others can survive and thrive in the coming times.

Grand Solar Minimum + Pole Shift

Facebook and Climate Feedback: Assisting the Great Reset


The post YouTube Deletes Dr. Hodkinson Audio in Telling act of Censorship appeared first on Electroverse.


via  Electroverse