[link] "The Survival of Humanity"

https://www.lesswrong.com/posts/GqBQhEmopi8LbKPpM/link-the-survival-of-humanity

The Survival of Humanity, by Lawrence Rifkin (September 13, 2013). Some excerpts:

An existential catastrophe would obliterate or severely limit the existence of all future humanity.

As defined by Nick Bostrom at Oxford University, an existential catastrophe is one which extinguishes Earth-originating intelligent life or permanently destroys a substantial part of its potential. As such it must be considered a harm of unfathomable magnitude, far beyond tragedy affecting those alive at the time. Because such risks jeopardize the entire future of humankind and conscious life, even relatively small probabilities, especially when seen statistically over a long period of time, may become significant in the extreme. It would follow that if such risks are non-trivial, the importance of existential catastrophes dramatically eclipse most of the social and political issues that commonly ignite our passions and tend to get our blood boiling today. [...]

One would think that if we are mobilized to fight for issues that affect a relatively small number of people, we would have an even stronger moral and social emotional motivation to prevent potential catastrophes that could kill or incapacitate the entire human population. But there are significant psychological barriers to overcome. People who would be emotionally crushed just hearing about a tortured child or animal may not register even the slightest emotional response when contemplating the idea that all human life may one day become extinct. As Eliezer Yudkowsky wrote, "The challenge of existential risks to rationality is that, the catastrophes being so huge, people snap into a different mode of thinking." [...]

Here is a partial list of suggestions worthy of consideration. The idea here is not to advocate for some extreme survivalist or "Chicken Little" mentality, but rather to use reason, foresight, and judgment about how best to protect our future.

The idea is not that we should do all these, but that the issue deserves our very highest consideration.
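The excerpt's claim that even small probabilities become significant over long timescales can be made concrete with a quick calculation. A minimal sketch, assuming an illustrative 0.1% annual probability of catastrophe (a figure chosen for the example, not taken from the article), of how such a risk compounds:

```python
# Illustrative only: how a small annual probability of catastrophe
# compounds over time. The 0.001 (0.1% per year) figure is an assumption
# for this sketch, not a number from the article.

def cumulative_risk(annual_probability: float, years: int) -> float:
    """Probability of at least one catastrophe over the given horizon,
    assuming an independent, identical risk each year."""
    return 1 - (1 - annual_probability) ** years

for horizon in (10, 100, 1000):
    print(f"{horizon:>5} years: {cumulative_risk(0.001, horizon):.1%}")

# Output:
#    10 years: 1.0%
#   100 years: 9.5%
#  1000 years: 63.2%
```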

Comment

https://www.lesswrong.com/posts/GqBQhEmopi8LbKPpM/link-the-survival-of-humanity?commentId=77sRuQyDRbXD6RKHv

Most of those (8/14) have little to no bearing on actual short- or medium-term x-risks.

Comment

https://www.lesswrong.com/posts/GqBQhEmopi8LbKPpM/link-the-survival-of-humanity?commentId=hRYFp78sMoRfpX8qC

Global food reserves currently amount to less than one year's worth (according to a UN report). As a result, if some kind of plague wipes out grain yields (or some other major food source) for a year, we would be looking at a massive die-off or a collapse of modern society.

Infectious diseases are a pretty obvious x-risk.

Asteroids are also pretty obvious x-risks.

As is open-sourced bio-terrorism.

Genetic diversity of wild species: granted, its loss isn't going to cause human extinction in the short or medium term.

For space stations and other off-world sustainable settlements, see the above threats of asteroids and plagues, as well as the principle of not putting all your eggs in one basket. The expansion of the sun is not a near-term threat, but having multiple colonies is a pretty blanket protection against x-threats.

Protecting food production from environmental change and environmental threats: see above regarding our current food stocks and how vulnerable we are to damage to this industry.

Pandemics caused by climate change, due to the migration of pathogens into areas whose populations have no prepared immunity. Mapping these movements sounds like an important way of reducing a near-term x-risk.

Nano-tech. I’m not sure how you would define short- or medium-term, but I suspect "within the next few decades" counts. Having legislation in place before nano-robotics takes off seems like an important near-term step to take.

Nuclear weapons. The ability of multiple competing nations to nuke each other back to the Stone Age, held in check only by public opinion and a mode of thought so insane that even the acronym is MAD. Nuclear weapons count as an x-threat (IMHO).

For a protected and isolated colony, see my notes on space colonies above. A large number of threats can be prevented by having a backup population.

Ideological absolutism, a.k.a. the kind of mentality that leads to terrorism or otherwise to thinking "everyone who X must die, at any collateral cost".

Evidence-based thinking is another one of those general things that make new x-risks less likely to emerge and more likely to be discovered before it's too late, but I'll agree that its main effects would be other benefits.

Pandemics again.

I got 12-13 with clear relevance, so clearly we disagree about some of them. Which of the above would you not count as an x-threat?

Comment

https://www.lesswrong.com/posts/GqBQhEmopi8LbKPpM/link-the-survival-of-humanity?commentId=BezaStKRfkQxod929

I’m not positive which ones you mean, but if you mean things like famine prevention and climate change, then I think I disagree with you. I think that anything that triggers the collapse of civilization does constitute an existential risk, because I think there is a significant chance that, in a deep and total civilizational collapse (on the scale of the fall of the Roman Empire, say), humanity would never recover.

People in Europe survived the collapse of the Roman Empire because they were able to fall back on older Iron Age technologies, which were good enough to keep a decent percentage of the population alive. But we are much farther away from those technologies now: practical knowledge of how to survive without modern technology is much rarer, the environment is much more degraded, population would have to fall much faster, and in any case a collapse that large and that broad would carry a significant risk of leading to nuclear war or some other catastrophe. I would say there is at least a 10%-20% chance that a total collapse of civilization would lead either to human extinction or to humans never again reaching the current level of technology.
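The commenter's reasoning can be written out as rough arithmetic: collapse contributes to existential risk in proportion to the chance of collapse times the chance of never recovering given collapse (their 10%-20% estimate). A minimal sketch, where the 5% collapse probability is an assumption for illustration and not a figure from the comment:

```python
# Illustrative only: the comment's argument that civilizational collapse
# counts toward existential risk. Only the 10%-20% non-recovery range
# comes from the comment; the collapse probability is an assumption.

def collapse_contribution(p_collapse: float, p_never_recover: float) -> float:
    """Probability that a collapse occurs AND humanity never recovers,
    treating non-recovery as conditional on collapse."""
    return p_collapse * p_never_recover

p_collapse = 0.05  # assumed 5% chance of a deep, total collapse (not from the source)
for p_never_recover in (0.10, 0.20):  # the comment's 10%-20% range
    risk = collapse_contribution(p_collapse, p_never_recover)
    print(f"P(collapse)={p_collapse:.0%}, "
          f"P(no recovery | collapse)={p_never_recover:.0%} "
          f"-> contribution to x-risk: {risk:.1%}")

# Output:
# P(collapse)=5%, P(no recovery | collapse)=10% -> contribution to x-risk: 0.5%
# P(collapse)=5%, P(no recovery | collapse)=20% -> contribution to x-risk: 1.0%
```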