Expert: Compulsory Face Masks In Public Is ‘Very Dangerous’

Sweden's top coronavirus expert is refusing to force people to wear face masks in public, arguing that donning them is ‘very dangerous’ because it gives the impression it is safe to be in crowded rooms or on public transport.

Anders Tegnell, chief epidemiologist at Sweden’s Public Health Agency, has expressed scepticism that face masks will control Covid-19 outbreaks.

The infectious diseases expert, who refused to follow European governments in locking down in March, also noted that countries with widespread mask compliance, such as Belgium and Spain, were still experiencing rising cases of Covid.

In an interview with the Financial Times, Dr Tegnell said: ‘It is very dangerous to believe face masks would change the game when it comes to Covid-19.

‘Face masks can be a complement to other things when other things are safely in place. But to start with having face masks and then think you can crowd your buses or your shopping malls – that’s definitely a mistake.’

Dr Tegnell previously brushed off the prospect of compelling Swedes to wear face masks, and called evidence of their effectiveness ‘astonishingly weak’.

Sweden, which has stood out among European countries for its low-key approach to fighting the coronavirus pandemic, recorded its highest tally of deaths in the first half of 2020 for 150 years, the Statistics Office said on Wednesday.

Covid-19 claimed about 4,500 lives in the period to the end of June – a number which has now risen to 5,800 – a much higher percentage of the population than in other Nordic nations, though lower than in some others including Britain and Spain.

In total, 51,405 Swedes died in the January to June period, a higher number than any year since 1869 when 55,431 died, partly as a result of a famine. The population of Sweden was around 4.1 million then, compared to 10.3 million now.

Covid-19 meant that deaths were 10 percent higher than the average for the period over the last five years, the Statistics Office said. In April the number of deaths was almost 40% higher than average due to a surge in Covid-related fatalities.

Sweden has taken a different approach to most European countries in dealing with the pandemic, relying to a greater extent on voluntary measures focused on social distancing and opting against a strict lockdown.

Most schools have remained open and many businesses have continued to operate to some extent, meaning the economy has fared better than many others.

However, the death toll has been higher than in its Nordic neighbours, which opted for tougher lockdown measures. Norway, with around half the population, has had only around 260 Covid deaths in total.

Finland’s economy outperformed its larger neighbour in the second quarter, despite a tougher lockdown. Finland’s gross domestic product shrank around 5 per cent, against an 8.6 per cent contraction in Sweden over the same three-month period.

Last month Dr Tegnell’s public health agency shrugged off claims that people should wear face masks in crowded public spaces during the pandemic.

Speaking to German newspaper Bild, the coronavirus expert described ‘the belief that masks can solve our problem’ as ‘very dangerous’.

‘The findings that have been produced through face masks are astonishingly weak, even though so many people around the world wear them,’ he said.

‘I’m surprised that we don’t have more or better studies showing what effect masks actually have. Countries such as Spain and Belgium have made their populations wear masks but their infection numbers have still risen.


Published on August 22, 2020

Written by Jack Wright

Fake Media Blaming Heatwave On Climate Change

Dozens of prominent media outlets have published stories in the last few days claiming climate change is causing wildfires and a deadly heatwave. Objective data, however, show there has been no increase in either one as the Earth modestly warms.

A story on the Colorado Public Radio website, titled “Colorado Wildfires Are Climate Change ‘In The Here And Now’ — And A Sign Of Summers To Come,” explicitly links wildfires in the state to human-caused climate change, warning of worse to come.

The Washington Post (“Why California Wildfires are So Extreme Right Now,”) and Capitol Public Radio News (CPRN), “As Californians Deal With Heat, Lightning, Fire, Scientists Point To Climate Change,” also claim a heatwave and wildfires in California right now are caused by climate change.

“The heatwave, the fires and weather patterns are in part related to climate change, says UCLA climate scientist Daniel Swain, because warming temperatures are ‘with great certainty’ increasing these conditions,” CPRN’s reporter writes.

“This whole event started as a record-breaking heatwave … and we also know that climate change is increasing the severity and the acres burned by wildfires in California,” Swain told CPRN.

The CPRN reporter doesn’t question Swain’s asserted link between climate change and wildfires, despite the fact he provides no evidence to back it up. In fact, no such evidence exists.

Data show the number and severity of heatwaves, droughts, and wildfires have all decreased over the past century and a half, even as the planet has modestly warmed.

As summarized in Climate at a Glance: Heatwaves, data from the U.S. Climate Reference Network and the National Oceanic and Atmospheric Administration prove climate change has not increased the number or severity of heatwaves.

Indeed, in recent decades heatwaves have been far less frequent and severe, for example, than in the 1930s – nearly 100 years of global warming ago.

In fact, 40 states’ record-high temperatures were set before 1960, with 25 of the record highs being set or tied in the 1930s alone.

The most accurate nationwide temperature station network, implemented in 2005, shows no sustained increase in daily high temperatures in the United States since its inception.

Concerning droughts and wildfires, the data is just as clear – there has been a downward trend during the past century.

Data from the National Integrated Drought Information System (NIDIS) cited in Climate at a Glance: Drought shows droughts have not become more frequent or severe in recent years.

In point of fact, the evidence shows the United States is undergoing its longest period in recorded history without at least 40 percent of the country experiencing “very dry” conditions.

Indeed, in 2017 and 2019, the United States registered its smallest percentage of land area experiencing drought in recorded history.

And the U.N. Intergovernmental Panel on Climate Change (IPCC) reports with “high confidence” precipitation over mid-latitude land areas of the Northern Hemisphere (including the United States) has increased during the past 70 years, while IPCC has “low confidence” about any negative trends globally.

Regarding wildfires, since drought is the key contributing climate factor, one should not be surprised to find, as reported in Climate at a Glance: Wildfires, records from the U.S. National Interagency Fire Center (NIFC) show wildfires have declined in number and severity in recent decades.

The NIFC tracks data on U.S. wildfires back as far as 1926, and it shows the number of acres burned is far less now than it was throughout the early 20th century.

As the Figure below shows, current acres burned run about one-fifth to one-quarter of the record values, which occurred in the 1930s.

Globally, the data on wildfires is just as clear. On page 67 of Bjorn Lomborg’s book False Alarm, he points to research demonstrating:

“There is plenty of evidence for a reduction in the level of devastation caused by fire, with satellites showing a 25 percent reduction globally in burned areas just over the past 18 years … In total, the global amount of area burned has declined by more than 540,000 square miles, from 1.9 million square miles in the early part of the last century to 1.4 million square miles today.”

While the economic costs of wildfires have increased in recent decades, that is due to ever greater numbers of people moving into, and communities expanding into, areas historically prone to wildfires.

Also, people are erecting ever more expensive homes, commercial developments, and related infrastructure there. Urban development in formerly rural, wildfire-prone areas is the reason economic costs from wildfires are increasing.

Scientists can say what they want and journalists can write what they want about climate change, but, if they say climate change is causing an increase in the number or severity of heatwaves, drought, and wildfires, they are lying. The data prove it.

Read more at Climate Realism

Hubble Hooks a Supernova Host Galaxy


purplish-blue "Meathook galaxy" against black backdrop of space

This image from the NASA/ESA Hubble Space Telescope features the spectacular galaxy NGC 2442, nicknamed the Meathook galaxy owing to its extremely asymmetrical and irregular shape.

This galaxy was host to a supernova explosion spotted in March 2015, known as SN 2015F, that was created by a white dwarf star. The white dwarf was part of a binary star system and siphoned mass from its companion, eventually becoming too greedy and taking on more than it could handle. This unbalanced the star and triggered runaway nuclear fusion that eventually led to an intensely violent supernova explosion. The supernova shone brightly for quite some time and was easily visible from Earth through even a small telescope until months later.

Text credit: ESA (European Space Agency)
Image credit: ESA/Hubble & NASA, S. Smartt et al.
Last Updated: Aug. 21, 2020
Editor: Rob Garner



via Watts Up With That?

August 23, 2020 at 04:28PM

The Mathematics Of ‘COVID19’ Testing

The Basic Arithmetic: As usual, I am writing about the SARS-CoV2 (SC2) numbers from a Scottish perspective but the general conclusions apply to any country.

It is quite apparent from pronouncements from the First Minister, Ms Sturgeon, that she has no idea what the numbers she is communicating to the general public actually mean.

The Daily Record journalists Chris McCall and Peter Davidson wrote the following nonsense: “Over the last 24 hours 47 new cases of the deadly virus were identified with no deaths since July 16.”

It is perhaps the noted failings of the Scottish education system (once regarded as one of the finest in the world) that a journalist could write such condescending drivel.

A deadly virus has not killed anyone in a month?

The 47 cases increased to 65 following Sturgeon’s usual cringe-worthy daily speech on the 14th August.

Tara Fitzpatrick of the same rag tells us that “The total number of cases of the virus now stands at 19,238 throughout the country.”

The first major problem is that there were 13,991 ‘tests’ given to identify these 65 ‘cases’. That’s a 0.46% rate of detection.

The second major problem is that there cannot possibly be 19,238 cases throughout the country.

Quite apart from the fact that nearly 5,000 of them have died (with COVID19 mentioned on the Death Certificate), the figure is a cumulative number of those who have tested ‘positive’ for SC2.

So when does a ‘COVID19 case’ cease to be a ‘case’?

The estimated time from infection to either death or release from hospital is 22-24 days. Anyone who tested ‘positive’ more than 24 days before today cannot be considered a case. They have either recovered or are now dead.

Let’s be kind and measure the Government figures from July 1st 2020. They are, as I expected, a complete mess. Before 1st April (17,007 tests) we don’t know where they were taken. The total test numbers do not agree with each other. They are all over the place. A note excusing the mess reads:

As of 8 July, (iv) includes numbers of home tests and other tests through the social care portal that were not previously available from the UK Government (UKG) testing programme.

The increase in the cumulative figures in (iv) on 8 July is due to the addition of the backlog of care home portal tests and home tests to the database. It is not possible to incorporate these in the previous daily figures.

However, we have some numbers we can work with from 1st July

  • 987 people have tested positive.
  • There have been 435,617 tests. (179,933 through NHS labs, 261,684 through UK Gov Tests)

The number tested is, without doubt, a biased sample. We don’t know who was tested, where or by whom. But it amounts to 8% of the total 5.5 million population of Scotland.

The worst case scenario is that 987 people in the country may have been infected by SC2.

That is, roughly 0.23% of people tested have shown as ‘positive’. This is the prevalence of SC2 in Scotland.
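These positivity figures are simple ratios; a minimal sketch checking the article's numbers (variable names are mine, and the rounding may differ slightly from the article's):

```python
# Back-of-envelope check of the positivity rates quoted above.
# All input figures are taken from the article.
daily_tests, daily_positives = 13_991, 65      # 14 August figures
cum_tests, cum_positives = 435_617, 987        # cumulative since 1 July

daily_rate = 100 * daily_positives / daily_tests
cum_rate = 100 * cum_positives / cum_tests

print(f"daily positivity: {daily_rate:.2f}%")              # prints 0.46%
print(f"positivity since 1 July: {cum_rate:.2f}%")         # prints 0.23%
```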

This is the number that Sturgeon relies on to continue the “war against the virus”.

The graph above shows the true extent of SC2 in Scotland. I prefer to show a graph from actual data to scale to give a realistic picture of what is actually happening.

You will need a magnifying glass to see the number of deaths per day and the number of ‘positive’ test results per day. That’s not an illusion. It is the actual numbers.

Feel free to disagree, but the numbers strongly suggest that SC2 had burned out by mid May. Positive test results have been almost non-existent since then despite the massive increase in testing.

Before we get on to the mathematics of SC2 testing, I would like to look at the overall cost of this search for the proverbial needle in a haystack.

Now, I don’t know what the Scottish government actually pays for a single test. I did look at the Lloyds pharmacy website: the cost of one antibody kit is £59 and a Swab Test Kit is £99. I know that the government will have extra overheads including labour, lab costs, transportation, handling etc. So let’s say a range from £90 to £150 per test.

Given  435,617 tests from 1st July we have guesstimates of:

  • £39.21 million to £65.34 million spent so far to find nothing.
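Recomputing that guesstimate (the £90–£150 per-test range is the article's assumption, not an official figure; note that 435,617 × £90 works out to roughly £39.2 million):

```python
# Rough cost range for the testing programme since 1 July,
# using the article's assumed £90-£150 all-in cost per test.
tests = 435_617
low, high = 90, 150   # assumed cost per test in pounds

low_m = tests * low / 1e6
high_m = tests * high / 1e6
print(f"£{low_m:.1f}m to £{high_m:.1f}m")   # prints £39.2m to £65.3m
```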

The Mathematics

What we want to know is the probability of being infected with SC2 given that you test positive.

You may think that a PCR test which is 97% accurate means that it is 97% certain that you have been infected. This would seem like a reasonable assumption given the accuracy of the test.

In this example I will use Bayes Theorem to calculate the likelihood of being positive given that you test positive. Things are not quite as simple as they seem.

Bayes can be described in its simplest terms as: “a mathematical equation used in probability and statistics to calculate conditional probability. In other words, it is used to calculate the probability of an event based on its association with another event.”

As an example, let’s say Liverpool FC are playing Norwich at Anfield. Your initial estimate based on prior games, league position and player strength may lead you to the conclusion that the chance of a Liverpool win would be 80%. Then you learn that 3 Liverpool players have been sent off in the first 20 minutes. Liverpool’s chance of a win now drops to 10%.

Your prior beliefs change when new information is known.

Given that an SC2 test is 97% accurate, what is the likelihood that you are really positive, given that there is only a 0.3% chance of having the disease? We start with the assumption that the test is 97% accurate and then update our assumption with the 0.3% prevalence.

At its simplest, Bayes’ Theorem can be written as:

P(A|B) = P(B|A) × P(A) / P(B)

Where P(A|B) means the probability (P) of event A given event B.

Before we start, it is important to describe a few of the terms we will use.


Sensitivity

This is the probability, expressed as a static number, of testing positive when you are actually positive.

Some tests are classified as having 99% sensitivity, but it is important to know that this is not a static number. It can vary tremendously depending on different factors. For this analysis 97% is used.


Specificity

This is the probability, expressed as a static number, of testing negative when you are actually negative. Again, this can vary tremendously. For this analysis 97% is used.


Prevalence

This is the proportion of people in the population who are actually infected, expressed as a static number. 0.3% has been used in this analysis.

No Covid

This is the proportion of people who are not infected: 1 – prevalence. 99.7% has been used in this analysis, based on the positive test returns.

False Positive

This is the probability of testing positive without being infected: 1 – specificity. 3% has been used in this analysis.

False negative

This is the probability of testing negative while being infected: 1 – sensitivity. 3% has been used in this analysis.

When working out the equations, all percentage figures are written as 97% = 0.97 etc.

Writing this out in terms of Bayes we come up with the following formula:

P(Cov|Pos) = P(Pos|Cov) × P(Cov) / [ P(Pos|Cov) × P(Cov) + P(Pos|NoCov) × P(NoCov) ]

Where:

P(Pos|Cov) is Sensitivity = 0.97

P(Cov) is Prevalence = 0.003

P(Pos|NoCov) is the false positive rate = 0.03

P(NoCov) = 0.997

Gives 0.97 * 0.003 = 0.00291

Divided by:

0.00291 + (0.03 * 0.997) = 0.00291 + 0.02991 = 0.03282

Gives 0.00291 / 0.03282 = 0.0887

Or an 8.87% chance of actually having the disease.
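The calculation above can be sketched in a few lines; the function and variable names below are mine, but the inputs are the article's (97% sensitivity and specificity, 0.3% prevalence). The loop at the end shows how strongly the answer depends on prevalence rather than test accuracy:

```python
# Positive predictive value (PPV) via Bayes' theorem,
# following the article's calculation.
def ppv(sensitivity, specificity, prevalence):
    """P(infected | positive test)."""
    true_pos = sensitivity * prevalence                # P(Pos|Cov) * P(Cov)
    false_pos = (1 - specificity) * (1 - prevalence)   # P(Pos|NoCov) * P(NoCov)
    return true_pos / (true_pos + false_pos)

print(f"{ppv(0.97, 0.97, 0.003):.4f}")   # prints 0.0887

# PPV rises sharply with prevalence, even at fixed test accuracy:
for p in (0.001, 0.003, 0.01, 0.05):
    print(f"prevalence {p:.1%} -> PPV {ppv(0.97, 0.97, p):.1%}")
```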


Not only has it been shown that the number of cases has been wildly misreported, but the prevalence of the SC2 virus has almost disappeared: the outbreak almost certainly burned itself out in mid May.

Using Bayes analysis and conditional probability shows that the likelihood of being infected with SC2, even if you test positive, is only around 9%. (Probabilities are not static numbers. They will have a range given the observed data.)

Despite media reports of SC2 infections being underestimated, the opposite is true. They are grossly overestimated.

One further point which I think is vital is that people, whether asymptomatic or symptomatic, are not being tested for anything other than SC2.

There is a large amount of time, money and effort being spent trying to find something which has such a low probability of actually being measured.

I’m not a virologist or an epidemiologist or a serologist. I just look at the numbers.


  1. Thanks to Michael A Lewis for the graphic of Bayes formula as related to SC2. This is a must read article from the Significance magazine published by the Royal Statistical Society and the American Statistical Association.
  2. In medicine the use of Bayes Theorem is called PPV (Positive Predictive Value). There are many online calculators available to check the numbers for yourself.

About the author: Graeme McMillan graduated with Honours from Edinburgh University  (Politics, Philosophy and Economics) then gained further qualifications in Business Analysis, Project management, Statistics and Programming. He has had a long career in the UK mining industry in mine planning, geological modelling and computer systems management and programming. He then moved to the metals and minerals sector developing simulation modelling of materials handling (scrap, slag, semi-finished and finished products) as well as financial modelling of large mining and steel plant projects. Latterly, he ran his company’s Middle East and North African division. Graeme is now retired.

Learn The Truth About The Risk Of Nuclear Power

This past 11th of March 2020 was the 9th anniversary of the 2011 Fukushima Daiichi nuclear accident. Fears of radiation from the triple meltdown led to the long-term relocation of over 100,000 people. In a landmark study completed in late 2017, a group of UK scientists set out to find the truth about the risk of nuclear.

They showed that not only was the scale of this relocation far too large, but that the evacuation itself led to thousands of unnecessary deaths from mental and physical exhaustion.

Despite initial interest from UK and US authorities, little has changed in the way governments plan to deal with future nuclear accidents. In this deep-dive interview with the group’s lead author, I look at why, almost a decade on, governments, regulators and the nuclear industry are so resistant to change, and whether this means we might be sleepwalking into another nuclear public health disaster.

Key Points

For those short on time, here are the key take-aways:

  • Between 5 and 10 times too many people were moved away from the Chernobyl area between 1986 and 1990
  • No one should have been evacuated from Fukushima due to radiation
  • If Japan and Germany had reduced coal instead of nuclear after Fukushima, they could together have prevented about 28,000 air pollution-induced premature deaths
  • There were 1,121 deaths in the first 3 years from physical and mental exhaustion among Fukushima evacuees, translating to an average loss of life expectancy from being evacuated greater than the loss people would have incurred by staying put
  • Loss of life expectancy in the worst-affected Fukushima town was less than that experienced by Londoners due to air pollution
  • The J-value provides an ethical and mathematically rigorous way to make decisions about what to do during and after a nuclear accident.
  • Remediation and food bans are good value for money
  • The presumption that long term relocations are a good policy tool needs re-evaluating

This article was originally published in the UK Nuclear Institute’s Nuclear Future magazine in July 2020.

The tsunami turned to rubble whole towns like Rikuzentakata, Iwate.

The 11th of March, 2011, felt like any other Friday in Ishinomaki, Miyagi Prefecture, Japan. In a town known for its oysters, local fishermen rose early to put to sea. Shopkeepers busied themselves with the end of week trade. Office workers sat at their computer screens. At the Okawa elementary school, children practiced their reading and recited their times tables, awaiting the school bell that would announce the weekend. But the school bell never did ring that day, and it hasn’t rung since.

At 14:46 JST a magnitude 9.0–9.1 earthquake occurred 70 km off the coast of Oshika Peninsula. It was the most powerful ever recorded in Japan, and the fourth most powerful ever recorded anywhere in the world. The earthquake moved the main island of Japan 2.4 m to the east, and shifted the whole Earth up to 25 cm on its axis.

More destructive than the earthquake itself was the ensuing tsunami, whose waves reached 40.5 m high, traveled at speeds of 700 km/h and raced up to 10 km inland. Those living close to the sea had little time to react following the earthquake before raging torrents of water, mud and debris engulfed them.

Ishinomaki was one of the worst-affected areas. Along with most of the town, Okawa school was completely destroyed. 74 out of a total of 108 students lost their lives, along with 10 out of the 13 teachers.

The 2011 Tōhoku earthquake and tsunami was a disaster of epic proportions. It killed around 19,000 people and forced the evacuation of hundreds of thousands more. Roads and rail were destroyed. Power lines were toppled, water supply and sewage treatment lost. Schools, workplaces and most government services ceased to operate. Despite this, most of the world associates but one word with this disaster: Fukushima.

The Fukushima Daiichi Accident

The earthquake and tsunami victims in Miyagi, Iwate and Fukushima received relatively little media coverage outside Japan, with most airtime given over to grainy shots of the three damaged Fukushima Daiichi plants and rushed TV interviews with nuclear “experts”.

Imagine having everything you know swept away in a tsunami, to lose family and friends under a crashing wave of destruction, only to pick up from TV and radio that the real danger was of you being fatally irradiated by a triple nuclear meltdown. Nuclear radiation must be more dangerous than anything you can possibly imagine.

Very serious decisions were made on the assumption that nuclear meltdowns are the deadliest of all possible accidents. Following the incident, 111,000 people were required to leave areas around Fukushima Daiichi, and an additional 49,000 joined the exodus voluntarily; about 85,000 had not returned to their homes by 2015.

The Challenge Of Mass Evacuations

Mass evacuations are not simple things. They mean taking over 100,000 people out of their homes, away from their jobs and their local services. In Japan’s case, you have to find somewhere for them to live on a small, densely-populated island. For the adults, you have to find them a job. For the children, you have to find them a school. Any residents who do stay are now living in a ghost town with no local services, including even basic things like food, water and medicine.

Now, imagine that tens of thousands of the residents are elderly and may have limited mobility and poor health. Some may need caregivers and medical facilities to support them. Some of the adults will have disabilities or mental health problems. Such people are likely to require support to evacuate, and possibly part or even full-time health and social care. Next, consider the potential thousands of hospital patients, some of whose lives may depend on modern hospital equipment, put them onto trolleys, drive them to another hospital and hope they can be taken in.

Remember, all this may be happening in a wider disaster zone, such as happened with the tsunami in Japan. Roads may be blocked, rail infrastructure inoperable, power out, and food and water in short supply. Hazardous chemicals may have been released from nearby industry, such as from the fires at petrochemical plants in Sendai during the Tōhoku tsunami.

Smoke from the Sendai Nippon Oil refinery

Consider the mental stress of being relocated, losing your job, your school, your neighbours, as well as the paralysing fear that you and your family have received a potentially fatal dose of radiation (otherwise, why were you evacuated?). As the relocation draws on, imagine the social dislocation this creates. In desperation, you might turn to drinking, smoking or drug abuse. You might start taking excessive risks because you think you’re doomed anyway, or perhaps you retreat into a shell and suffer from PTSD (post-traumatic stress disorder).

All these effects were detected among Fukushima evacuees, as well as those relocated from around the Chernobyl nuclear accident. This begs the question: do such evacuations represent good policy? How do we know whether a mass evacuation is “worth it”? Are there alternatives?

Coping With A Nuclear Accident

It was with these questions in mind that Professor Philip Thomas of Bristol University started the NREFS project (Management of Nuclear Risk Issues: Environmental, Financial and Safety) in 2012, jointly funded by the UK Engineering and Physical Sciences Research Council (EPSRC) and the Atomic Energy Commission of India.

While accidents at nuclear plants are very rare, it is impossible to say that they will never occur. As Prof Thomas says, “I’ve often met with the reaction that we should make sure accidents don’t happen. And that’s fine. But accidents do happen, they have happened — and what do you do then?” The NREFS project sought to measure objectively the effectiveness of actions (usually referred to as ‘countermeasures’) a government could take following an accident, principally evacuation, sheltering (staying indoors for a period of hours to days), bans on the consumption of locally grown foods, remediation (cleaning of buildings and soils to remove contamination) and long-term relocation.

Believe it or not, no one had ever before measured — objectively — the effectiveness of such countermeasures, despite the huge social impact they incur. To do this, four independent strategies were developed by academics at Open University, University of Warwick, University of Manchester and City, University of London (Prof. Thomas moved from City to Bristol University during the project). The first strategy was to deploy a tool called the Judgement, or J-value.

Prof. Thomas developed the J-value in the 2000s to help engineers decide which safety systems provided the best value-for-money in terms of safety benefit. Most recently, the J-value has been used to assess the effectiveness of lockdown measures during the COVID-19 outbreak. The J-value builds on a widely-used economics indicator called the Life Quality Index, which is an ethical way to combine life expectancy at birth with gross domestic product (GDP) per person to determine how much money it makes sense to spend on reducing exposure to risk.

Unlike the United Nations’ Human Development Index (HDI), the Life Quality Index is derived rigorously from the economics of human welfare. As governments only have a limited amount of money to spend on all the services they provide, this means that there are not infinite funds to be spent responding to a nuclear accident; £1 spent on evacuating residents is £1 less that is spent on schools, hospitals and roads.

The J-value thus cleverly provides an objective way to measure whether countermeasures such as evacuations are “worth it”. It all gets a bit technical, but the J-value is measuring whether the gain in life expectancy afforded by a countermeasure like evacuation is worth the cost in £s to implement that measure.
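The real J-value is derived rigorously from the Life Quality Index in the NREFS papers; purely as an illustration of the idea (the ratio, numbers and variable names below are my own simplification, not Prof. Thomas's actual formulation), a countermeasure can be screened by comparing its cost against the value of the life expectancy it buys:

```python
# Illustrative cost-effectiveness screen in the spirit of the J-value.
# All figures and the simple ratio are assumptions for illustration only;
# the genuine J-value is derived from the Life Quality Index.
def j_ratio(cost, people, life_years_gained_each, value_per_life_year):
    """Ratio of actual spend to the maximum spend justified by the
    life expectancy gained. A ratio above 1 suggests the measure
    costs more than the benefit it delivers."""
    max_justified = people * life_years_gained_each * value_per_life_year
    return cost / max_justified

# e.g. a £10bn relocation of 100,000 people that buys each of them one
# extra month of life expectancy, valuing a life-year at £60,000 (assumed):
r = j_ratio(10e9, 100_000, 1 / 12, 60_000)
print(f"{r:.1f}")   # prints 20.0 -> spending ~20x what the gain justifies
```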

As the J-value is a mathematically rigorous approach, economists and statisticians are able to pick apart the methodology and propose tweaks where they disagree. However, there have been no papers that refute NREFS’s findings. And almost uniquely for the social sciences, the NREFS team were able to empirically validate the J-value method by using it to predict life expectancy in 180 out of the 193 countries recognised by the United Nations.

The x-axis shows GDP per capita in 2005 international dollars, the y-axis shows life expectancy at birth. Each dot represents a particular country. (Source: By Radeksz — Own work, Public Domain,

The Big Surprise

A massive 335,000 people were relocated after Chernobyl, never to return. For Fukushima Daiichi, around 111,000 people were forced to evacuate, with close to 50,000 following voluntarily; about 85,000 had not returned to their homes by 2015. Prof. Thomas and his team set about applying the J-value to both these accidents. What they discovered was so surprising that even they found the results hard to believe.

Chernobyl was pretty much the worst nuclear accident imaginable. Due to poor design and mismanagement, the reactor exploded, raining pieces of fuel into the nearby fields and setting alight graphite in the core.

The J-value method suggested that for Chernobyl an evacuation was indeed a good idea. “We looked at Chernobyl,” says Prof. Thomas, “and we came to the conclusion that you would be looking to evacuate people once their loss of life expectancy was greater than 9 months.” As the area around Chernobyl was relatively poor and the cost of moving people was relatively high, the J-value says it would have been better for local people to stay put unless their loss of life expectancy from staying was greater than 9 months.

In 2009, over two decades after the Chernobyl incident, the Azure Swimming Pool in Pripyat shows decay after years of disuse (Source: Photo by Timm Suess from Basel, Switzerland — Swimming Pool Hall 4 (Flickr), CC BY-SA 2.0)

Prof. Thomas continues, “Because we had quite extensive data on contamination levels on the various places around Chernobyl, we could work out how many people would be losing that amount of life expectancy. The answers we came back with surprised us.”

The analysis showed that between 5 and 10 times too many people were moved away from the Chernobyl area between 1986 and 1990. The initial evacuation should have been limited to 31,000 people, instead of 116,000. The second evacuation in 1990 of 220,000 people was entirely unjustified, affording a gain in life expectancy of just 24 days.

The surprising results don’t arise from some new, slap-dash way of measuring risk from radiation. The methodology uses approaches recommended by various United Nations committees, including the “Linear No Threshold” model for low doses of radiation, which some radiation experts believe overestimates the risk of a small amount of radiation. “The evacuation was very effective in reducing the dose and increasing life expectancy,” says Prof. Thomas, “but we found that it was well over the top.”

As for measuring the costs of evacuation, “We were very conservative about this,” says Prof. Thomas, “we looked purely at the cost of relocation, and didn’t factor in the economic, social and health costs of the disruption caused by evacuation.” If such effects were factored in, he says, this would reduce the recommended number of evacuees even further.

Prof. Thomas fears that as well as the Chernobyl relocations being far too large in scale, those that were relocated, many of whom received government pensions as accident victims, suffered psychological effects for many years: “A lot of the people evacuated thought, ‘If the government is spending this amount of money on me, I must have suffered something truly dreadful.’ I think there has been enormous psychological damage, which has led to loss of life expectancy in itself.” If such negative effects were accounted for this would also strengthen the argument for no evacuation, or at least only a short, temporary one, Prof. Thomas adds.

So what about Fukushima, where residents affected by the meltdowns were also living in a tsunami disaster zone? The authors found it difficult to justify moving anyone on grounds of the risk from radiation. The average loss of life expectancy from radiation if everyone had stayed put was found to be 19 days.

Prof. Thomas explains that “for the worst-affected town, Tomioka, the loss of life expectancy there would have been less than 3 months, and they shouldn’t have been evacuated at all,” although he noted temporary evacuation may well have been necessary for some due to tsunami and earthquake damage.

To put that in context, the two and a half month loss in life expectancy in Tomioka would have been less than the four and a half months loss of life expectancy experienced by all Londoners due to air pollution.

Sadly, of the 160,000 people evacuated due to radiation fears, there were 1,121 deaths in the first 3 years from physical and mental exhaustion. This translates to an average loss of life expectancy due to being evacuated of 37 days, which is more than the loss of life expectancy people would have incurred by staying put.
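The arithmetic behind that 37-day figure can be sketched as a back-of-the-envelope check. The average years of life lost per evacuation-related death is not given in the article; the roughly 14.5-year figure below is an illustrative assumption, chosen as plausible for a largely elderly evacuee population.

```python
# Back-of-the-envelope check of the Fukushima evacuation figures quoted above.
# The years-of-life-lost-per-death value is an illustrative assumption,
# not a figure taken from the article or the underlying study.
evacuees = 160_000
deaths = 1_121
years_lost_per_death = 14.5  # assumed average for an elderly population

total_years_lost = deaths * years_lost_per_death
avg_loss_days = total_years_lost * 365.25 / evacuees
print(f"Average loss of life expectancy per evacuee: {avg_loss_days:.0f} days")
```

Under that assumption the per-evacuee average comes out at roughly 37 days, consistent with the figure quoted above.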

One interesting finding was that remediation, i.e. the cleaning and decontaminating of urban and agricultural areas, was found to be very good value-for-money. Food bans were also found to be effective.

The J-value was just one aspect of NREFS. Mathematicians at the University of Manchester performed an independent economic analysis looking at hundreds of potential nuclear accidents around the world. Prof. Thomas explains they concluded that “in very few cases was it sensible to relocate anyone permanently [following an accident]. In some cases it might be an idea to move people out temporarily, for a month or so, look at remediation and other things, but to get them back home.”

A third approach managed by the Open University and Public Health England (the public body in the UK responsible for human health issues, including those related to radiation) used the latest computer modelling tools to predict what would happen if there were an accident at a fictional nuclear reactor on the English South Downs. “[The researchers] came to the conclusion that, on average, long-term relocation would be limited to around 600 people,” says Prof. Thomas.

The only rebuttal to NREFS’s findings came in the form of a letter from a group of history and communications academics (Kasperski et al.), who questioned the J-value methodology as well as some of the NREFS team’s assumptions. Prof. Thomas believes the team’s published reply addressed all of the points raised in the letter.

Ground-Breaking Results

NREFS surprised everyone. Even the authors found it hard to believe what the numbers were telling them. The results seem to fly in the face of everything society assumes about the risk of nuclear accidents.

The papers were published in a special issue of Process Safety and Environmental Protection in December 2017. At close to 40,000 downloads, it’s among the top 5 downloaded issues of all time. Prof. Thomas and his team were invited to present their findings at the UK All-party Parliamentary Group for Nuclear Energy, as well as at the US Federal Emergency Management Agency, the American Nuclear Society, an Anglo-French cooperation event at the British Embassy in Paris, and many international conferences on radiation protection.

In Japan, where the NREFS findings are perhaps most sensitive, the response was overwhelmingly positive. Prof. Thomas was invited to speak at Fukushima Medical University and Hiroshima University, and has even been awarded a Japan Society for the Promotion of Science International Fellowship for 2020, which will see him work with local universities on remediation at Fukushima.

Fukushima Medical University (Source: Photo by Kozo, taken by the uploader, Public Domain)

“One of the things we’re keen to avoid is putting any blame on Japanese authorities,” says Prof. Thomas. “No one has experience of this, we’ve only had one big nuclear accident [before Fukushima], and they more or less copied the mass evacuations that happened at Chernobyl. They didn’t have the benefit of our work — no one knew better at the time. The key thing is that anything that happens from now on needs to bear [our findings] in mind.”

Scientists at Fukushima Medical University and Hiroshima University have independently reached the same conclusions as the NREFS project: that the scale and duration of the Fukushima relocations cannot be justified based on the risk from radiation. They also looked at public perception of the severity of the accident and found that the further away from Fukushima Daiichi you lived, the worse you believed the accident to have been.


Published on August 20, 2020

Written by David Watson

Really Concerned About CO2 Emissions? Embrace Nuclear

In the words of James Hansen, the scientist most responsible for bringing global warming to public attention, wind and solar are “grotesque” solutions for reducing CO2 emissions.

Michael Shellenberger, a prominent activist, has the same opinion. Hansen and Shellenberger, as well as many other global warming activists, have come to the conclusion that nuclear energy is the only viable method of reducing CO2 emissions from the generation of electricity.

Nuclear reactors don’t emit CO2. Coal and natural gas do. Hydroelectric electricity does not emit CO2 either, but opportunities for expansion are limited. In the United States, most of the good sites have already been developed.

Wind and solar are grotesque because they suffer from many problems, and their promoters simply lie about those problems.

Reducing emissions of CO2 by one metric tonne, 1,000 kilograms, or 2,204 pounds, is called a carbon offset. Carbon offsets are bought and sold, usually for less than $10 each.

If you build wind or solar plants to displace electricity from natural gas or coal plants, you will generate carbon offsets. Each carbon offset generated will cost about $60 if electricity from a coal plant is displaced.

If electricity from a natural gas plant is displaced, the cost per carbon offset will be about $160. Wind and solar are expensive methods of generating carbon offsets.

Wind and solar are not remotely competitive with coal or natural gas for generating electricity. The promoters of wind and solar lie about this constantly, claiming that they are close to competitive.

The lies have two major components. They ignore or misrepresent the massive subsidies that wind and solar get, amounting to 75% of the cost. Then they compare the subsidized cost of wind or solar with the total cost of gas or coal.

But wind or solar can’t replace existing fossil fuel infrastructure because they are erratic sources of electricity.

The existing infrastructure has to be retained when you add wind or solar because sometimes the wind doesn’t blow or the sun doesn’t shine.

The only fair comparison is to compare the total cost of wind or solar per kilowatt-hour (kWh) with the marginal cost of gas or coal electricity. That marginal cost is essentially the cost of the fuel.

The only economic benefit of wind or solar is reducing fuel consumption in existing fossil fuel plants.

It is hard to build wind or solar installations that generate electricity for less than 8 cents per kWh, but the cost of the fuel, for either gas or coal, is about 2 cents per kWh. Wind and solar cost four times too much to be competitive.
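These figures tie back to the offset costs quoted earlier ($60 per tonne against coal, $160 against gas). A minimal sketch of that arithmetic, assuming typical emission factors of roughly 1.0 tonne of CO2 per MWh for coal and 0.37 for gas (the emission factors are assumptions for illustration, not figures from the article):

```python
# Reproducing the article's carbon-offset cost arithmetic.
wind_solar_cost = 0.08   # $/kWh, total cost of wind/solar (article's figure)
fuel_cost_saved = 0.02   # $/kWh, marginal fuel cost displaced (article's figure)

# Extra cost of each MWh of wind/solar over the fuel it displaces: $60/MWh
extra_cost_per_mwh = (wind_solar_cost - fuel_cost_saved) * 1000

# Assumed emission factors (tonnes CO2 per MWh generated) -- not from the article
coal_tonnes_per_mwh = 1.0
gas_tonnes_per_mwh = 0.37

coal_offset_cost = extra_cost_per_mwh / coal_tonnes_per_mwh  # about $60 per tonne
gas_offset_cost = extra_cost_per_mwh / gas_tonnes_per_mwh    # about $160 per tonne
print(f"Offset cost vs coal: ${coal_offset_cost:.0f}/t, vs gas: ${gas_offset_cost:.0f}/t")
```

With these assumed emission factors, the 6-cent cost gap reproduces the article's $60 and roughly $160 per-tonne offset costs.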

Wind and solar run into difficulty if they are the source of more than about 25% of the electricity in a grid.

Getting to 50% generally involves adding expensive batteries, further destroying the economics and undermining their usefulness for CO2 reduction.

The only justification for wind and solar is the reduction of CO2 emissions, but wind and solar are limited and costly for this purpose. CO2-free nuclear energy can be both economical and practical.

That, clearly, is the reason why prominent global warming activists are advocating nuclear, rather than wind and solar, to alleviate the supposed global warming crisis.

Neither nuclear nor coal is currently cost-competitive with natural gas. It’s not so much that nuclear or coal is expensive as that natural gas, thanks to fracking, is incredibly cheap.

Gas that cost more than $10 per MMBtu (million British thermal units) a decade ago now costs less than $2. Gas-generating plants are very cheap to build and incredibly efficient.

A gas plant using a combination of a gas turbine and a steam turbine can turn 65% of the energy in the gas into electricity. By contrast, a coal plant struggles to reach 40%.
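The efficiency figures translate into fuel cost per kWh through the heat rate: one kWh of electricity corresponds to 3,412 BTU of heat, so dividing by plant efficiency gives the heat input needed. A sketch under the $2/MMBtu gas price above; the matching coal fuel price below is an assumption for comparison:

```python
# Fuel cost per kWh from plant efficiency and fuel price.
# 1 kWh = 3,412 BTU of heat; dividing by efficiency gives the heat input needed.
BTU_PER_KWH = 3412

def fuel_cost_per_kwh(price_per_mmbtu, efficiency):
    """Dollars of fuel burned per kWh of electricity generated."""
    heat_input_mmbtu = BTU_PER_KWH / efficiency / 1_000_000
    return price_per_mmbtu * heat_input_mmbtu

gas_cost = fuel_cost_per_kwh(2.0, 0.65)   # article's $2/MMBtu, 65% combined cycle
coal_cost = fuel_cost_per_kwh(2.0, 0.40)  # the $2/MMBtu coal price is assumed here
print(f"gas: {gas_cost*100:.1f} cents/kWh, coal: {coal_cost*100:.1f} cents/kWh")
```

At $2/MMBtu, a 65%-efficient combined-cycle plant burns about 1 cent of gas per kWh; the roughly 2-cent fuel cost quoted earlier corresponds to somewhat higher fuel prices or lower plant efficiency.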

Both coal and nuclear are handicapped by well-organized and unprincipled political opposition from the Sierra Club and similar organizations. The Sierra Club hates natural gas too, but most of their efforts go into scaring people with the imaginary danger of coal.

The Sierra Club doesn’t need to expend much effort scaring people with nuclear because the nuclear industry has already been destroyed in the U.S. thanks to previous efforts of the environmental movement.

Coal and nuclear have one very important advantage over gas: they have fuel on-site to continue operating if fuel deliveries are interrupted. For coal, this is around 30 days; for nuclear, more than a year.

Some gas plants can temporarily use oil from local tanks, but in most cases that won’t last long. Gas deliveries can be interrupted by pipeline failure or sabotage.

The pumping stations on natural gas pipelines are increasingly powered by electricity rather than gas, creating a circular dependency: a power outage can interrupt the very gas supply that power plants need to restore the grid.

Nuclear electricity is a young industry with a big future. That future is materializing in Asia rather than in the U.S. and much of Europe, where a successful propaganda campaign has made people afraid of nuclear.

Nuclear fuel is extremely cheap, costing around a quarter as much as gas or coal per unit of energy. Nuclear reactors don’t have smokestacks, and they don’t emit CO2.

New designs will dramatically lower costs, increase safety, and effectively remove most of the objections to nuclear.

It is an incredible contradiction that most environmental organizations advocate wind and solar but demonize nuclear. In the future, nuclear may be cost-competitive with natural gas.

It is an intellectual and economic failure that the 30 U.S. states with policies designed to reduce CO2 emissions, called renewable portfolio standards, mostly explicitly exclude nuclear power as part of the plan.

Instead, they effectively mandate wind and solar. There are signs of reform as some states have provided support to prevent nuclear power stations from being closed.

The global warming hysteria movement is surely one of the most successful junk science campaigns ever launched. Predicting a catastrophe is a great way for a science establishment to gain fame and money.

The many responsible scientists who object are attacked, if not fired. Money trumps ethics every time. The environmental movement needs looming catastrophes too, so they act as PR men for the science establishment.

The tragedy is that our legislators swallow these lies and waste billions on boondoggles like wind and solar.

It is ironic that increasing the CO2 in the atmosphere has a bountiful effect on plant growth, greening the Earth, and increasing agricultural production. Rather than a threat, CO2 is a boon.

If you still believe in the global warming hysteria movement, you should face reality and dump wind and solar for nuclear. Wind and solar are not appropriate for the problem they are assigned to solve. Nuclear is.

Norman Rogers is the author of the book Dumb Energy: A Critique of Wind and Solar Energy.

Read more at American Thinker

Published on August 20, 2020

Written by Norman Rogers

Megatons of Solar Panel trash coming to a dump near you soon…

Solar Power, not-so-sustainable?

Solar panels need a special kind of recycling that costs 4 to 8 times as much as the recycled bits and bobs are worth. And the first major generation of solar panels will hit their use-by date soon.

Solar Panels Are Starting to Die, Leaving Behind Toxic Trash

Maddie Stone, Wired

By 2050, the International Renewable Energy Agency projects that up to 78 million metric tons of solar panels will have reached the end of their life, and that the world will be generating about 6 million metric tons of new solar e-waste annually. While the latter number is a small fraction of the total e-waste humanity produces each year, standard electronics recycling methods don’t cut it for solar panels. Recovering the most valuable materials from one, including silver and silicon, requires bespoke recycling solutions.

The solar sleeper awakes:

Most solar manufacturers claim their panels will last for about 25 years, and the world didn’t start deploying solar widely until the early 2000s. As a result, a fairly small number of panels are being decommissioned today. PV Cycle, a nonprofit dedicated to solar panel take-back and recycling, collects […]

via JoNova

August 23, 2020 at 01:21PM