
From Watts Up With That?
News Brief by Kip Hansen
“In 2022, Congress passed the Global Catastrophic Risk Management Act (GCRMA). The GCRMA requires the Secretary of Homeland Security and the administrator of the Federal Emergency Management Agency (FEMA) to coordinate an assessment of global catastrophic and existential risk in the next 30 years.”

Now, the Homeland Security Operational Analysis Center (HSOAC) has produced this assessment. In reality, the report was produced by a unit of the RAND Corporation: “RAND’s Homeland Security Research Division (HSRD) operates the Homeland Security Operational Analysis Center (HSOAC)”.
The report, titled simply “Global Catastrophic Risk Assessment”, focuses on risks associated with six topics:
1) artificial intelligence, 2) asteroid and comet impacts, 3) nuclear war, 4) rapid and severe climate change, 5) severe pandemics, and 6) supervolcanoes.
Dr. Roger Pielke Jr. writes about the report on his Substack under the title “Global Existential Risks”. Pielke Jr.’s piece is well worth reading in its entirety, but it is summarized for readers here in one chart excerpted from the full RAND report, followed by Pielke’s own summary chart.
[Excerpted chart from the RAND report and Pielke Jr.’s summary chart appear here in the original post.]
Read Pielke’s whole piece here.
# # # # #
Author’s Comment:
I mostly agree with Pielke Jr. And, yes, this section is longer than the main post.
AI will not become “sentient” and threaten mankind, though it may cause havoc if allowed to direct or control anything at all: AI is neither intelligent nor rational; it cannot tell truth from error or fact from fiction, and, like your five-year-old, it is perfectly happy just making things up and passing them off as reality.
Supervolcanoes are a geological hazard; an eruption could cause vast destruction, but it would not represent an existential or globally catastrophic risk.
Comet and asteroid strikes have occurred in the past and remain a possibility, with the risk depending on the size of the object. Nuclear (atomic) weapons have been used and could be used again; a widespread, all-out nuclear war would have the potential to be globally catastrophic and even existential.
Disease agents made super-lethal, whether intentionally or accidentally, could wipe out humanity, or enough of humanity to force us back into the Stone Age. It wouldn’t take much of a reduction in population for us to lose our advanced technological abilities: even the smartest of us could not start from scratch and produce computer chips, provide cell phone service, or manufacture vaccines.
So, what risks should our governments and think tanks be focused on?
Hint: not climate change.
Thanks for reading.
# # # # #