Combination Existential Risks

By ozymandias @ 2019-01-14T19:29 (+27)

This is a linkpost to https://thingofthings.wordpress.com/2019/01/14/combination-existential-risks/

I think this idea is pretty obvious, so I’m sure someone has thought of it before, but I haven’t been able to find it in the existential risk literature, so I thought it was worth writing up.

Existential risk is often talked about in terms of single causes of human extinction: for example, a misaligned superintelligence, a bioengineered pandemic, or an asteroid. That makes sense, because if a single event can eradicate humanity, it’s worth your time to try to prevent it.

However, among nonhuman animals, extinctions very commonly have many causes. For example, the rhino is endangered because of poaching and habitat loss. Giraffes are endangered due to habitat loss and fragmentation, poaching, and the effects of civil unrest. The black-footed ferret is endangered because of habitat loss and population decline among its prey, the prairie dog, which in turn was caused by a combination of hunting and plague.

I don’t expect these case studies to be completely analogous to human extinction: habitat loss and hunting are not serious existential risks for humans. However, I think it is worth noting that there are many species for which the combination of several threats posed an existential risk, even if no individual threat was an existential risk in and of itself. I will call these “combination existential risks.”

What combination existential risks might humans face?

Most obviously and analogously to the nonhuman case, several forces may interact to reduce human populations. For example, a nuclear war may eradicate human life outside of New Zealand. The population shrinks as New Zealanders have difficulty adjusting to the new post-nuclear-war world. New Zealand then experiences a serious pandemic and a series of severe natural disasters, which reduce the population further. At that point, Allee effects (the tendency of very small populations to decline further, because individuals benefit from the presence of others) come into play, and humanity slowly goes extinct. In this scenario, stressors that individually would not be sufficient to cause human extinction (a nuclear war, a pandemic, natural disasters in New Zealand) combine to make it inevitable.
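
Since Allee effects carry the weight of this scenario, here is a minimal sketch of the dynamic, using the standard strong-Allee-effect growth model. Every parameter is an illustrative assumption, not an estimate for humanity; the point is only the qualitative behavior.

```python
# Minimal sketch of a strong Allee effect: below a critical
# population threshold, per-capita growth turns negative and the
# population drifts to extinction on its own. All numbers are
# illustrative, not estimates for any real scenario.

def step(n, r=0.02, threshold=50_000, carrying_capacity=1_000_000):
    """One generation of logistic growth with an Allee threshold."""
    growth = r * n * (n / threshold - 1) * (1 - n / carrying_capacity)
    return max(n + growth, 0.0)

def final_population(n0, generations=500):
    n = float(n0)
    for _ in range(generations):
        n = step(n)
    return n

# A population left above the threshold recovers toward carrying
# capacity; one pushed below it by successive shocks (war, pandemic,
# disasters) declines toward extinction with no further shocks needed.
print(final_population(60_000))  # above threshold: recovers
print(final_population(40_000))  # below threshold: dwindles toward 0
```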

Secondly, an event may cause extinction to occur when it would otherwise not have occurred. For example, climate change may lead to severe international tensions, which result in a World War III that causes the annihilation of mankind. Alternatively, the international tensions associated with climate change may cause an artificial intelligence arms race, which leads to the deployment of a misaligned superintelligence.

Third, two events may both be necessary for human extinction to happen. For example, a misaligned somewhat-but-not-extremely superhuman intelligence commissions a bioengineering lab to make a virus. The lab fails to recognize that the virus could cause a pandemic capable of wiping out humanity, and manufactures it. In this case, both the misaligned somewhat-superhuman intelligence and the poor biosecurity protocols were necessary for human extinction to occur. (Of course, both poor biosecurity protocols and misaligned superintelligences are very risky on their own.)
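
A toy calculation may make the conjunction point concrete. The numbers below are invented purely for illustration; the point is that when the two failures are correlated (an AI actively seeking out the lab with the weakest screening), the joint probability can be much higher than an independence assumption would suggest.

```python
# Toy arithmetic for a conjunction risk: extinction requires both
# event A (a misaligned somewhat-superhuman AI orders the virus)
# and event B (the lab's biosecurity screening fails). Every
# probability here is invented for illustration.

p_a = 0.05          # assumed P(A) over some period
p_b = 0.10          # assumed P(B) for a randomly chosen order
p_b_given_a = 0.50  # assumed P(B | A): an AI can seek out the
                    # lab most likely to miss the red flags

print(f"naive independent estimate: {p_a * p_b:.3f}")          # 0.005
print(f"correlated estimate:        {p_a * p_b_given_a:.3f}")  # 0.025
```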

This is, of course, not intended to be a complete taxonomy of potential combination existential risks.

I think people who think about existential risk should devote some of their energy to thinking about risks that are not themselves existential but might be existential if combined with other risks. For example, climate change is not an existential risk, but it plausibly plays a role in many combination existential risks, such as by increasing international tensions or by rendering much of the globe difficult to inhabit. Similarly, many global catastrophic risks may in fact be existential if combined with other global catastrophic risks, such as a nuclear war combined with a pandemic.


Evan_Gaensbauer @ 2019-01-14T22:46 (+7)

The Global Catastrophic Risks Institute (GCRI) has a webpage up with their research on this topic under the heading 'cross-risk evaluation and prioritization.' Alexey Turchin also made this map of 'double scenarios' for global catastrophic risk, which lays out the pairwise possibilities for how two global catastrophic risks could interact.

avturchin @ 2019-01-15T21:30 (+5)

In fact, I also tried to explore this idea, which I find crucial, in my Russian book "Structure of global catastrophe", but my attempts to translate it into English didn't work well, so I am now slowly converting its content into articles.

I would add an important link to A Singular Chain of Events by Tonn and MacGregor, as well as the work of Seth Baum on double catastrophes. The idea of "Peak everything", the simultaneous depletion of all natural resources, also belongs here, but it should be combined with the idea of the Singularity as an acceleration of everything; together they create a very unstable situation.

Yannick_Muehlhaeuser @ 2019-01-16T13:37 (+2)

> I think people who think about existential risk should devote some of their energy to thinking about risks that are not themselves existential but might be existential if combined with other risks. For example, climate change is not an existential risk, but it plausibly plays a role in many combination existential risks, such as by increasing international tensions or by rendering much of the globe difficult to inhabit. Similarly, many global catastrophic risks may in fact be existential if combined with other global catastrophic risks, such as a nuclear war combined with a pandemic.

I think those would be called 'context risks'. I haven't seen that term in many places, but I first heard of it in Phil Torres' book about x-risks.

John_Maxwell_IV @ 2019-01-16T08:00 (+1)

Has there been any research into ways to counteract Allee effects? Seems like that could address a range of combination existential risks simultaneously.

Yoav_Ravid @ 2019-01-15T18:22 (+1)

80,000 Hours wrote an article calculating the chance that any existential risk will happen (combining their probabilities). I'm not sure if this fully applies to what you meant, but it's something.
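
For what it's worth, the basic version of that kind of calculation is simple: under an (unrealistic) independence assumption, the chance that at least one risk occurs is one minus the product of the survival probabilities. The probabilities below are placeholders, not figures from the 80,000 Hours article.

```python
from math import prod

# Chance that at least one of several risks occurs, assuming
# (unrealistically) that they are independent. The probabilities
# are placeholders, not figures from the 80,000 Hours article.
risks = {"nuclear war": 0.01, "pandemic": 0.02, "misaligned AI": 0.05}

p_any = 1 - prod(1 - p for p in risks.values())
print(f"{p_any:.3f}")  # 1 - 0.99 * 0.98 * 0.95 ≈ 0.078
```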