Probability of extinction for various types of catastrophes

By Vasco Grilo🔸 @ 2022-10-09T15:30 (+16)

Summary

| Type of catastrophe (in the 21st century) | Probability of extinction (%) |
| --- | --- |
| Any* | 4.26 |
| Any | 1.77 |
| Artificial intelligence | 2.84 |
| Climate change and geoengineering | 0.0106 |
| Nanotechnology | 0.245 |
| Nuclear war | 0.299 |
| Synthetic biology | 0.220 |
| Other | 0.695 |

Acknowledgements

Thanks to David Denkenberger, Eli Lifland, Gregory Lewis, Misha Yagudin, Nuño Sempere, and Tamay Besiroglu.

Methods

I calculated the probability of extinction for catastrophes in the 21st century caused by:

The results for “any” do not explicitly depend on those of the first 6 types of catastrophe mentioned above, whereas those for “any*” are calculated assuming independence between them.
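Under the independence assumption, the summary value for “any*” can be recovered by combining the per-type probabilities of extinction from the summary table (a minimal sketch; probabilities expressed as fractions):

```python
# Probability of extinction for "any*" in the 21st century, assuming
# independence between the types of catastrophe. Per-type values (as
# fractions) are taken from the summary table.
p_by_type = {
    "artificial intelligence": 0.0284,
    "climate change and geoengineering": 0.000106,
    "nanotechnology": 0.00245,
    "nuclear war": 0.00299,
    "synthetic biology": 0.00220,
    "other": 0.00695,
}
p_no_extinction = 1.0
for p in p_by_type.values():
    p_no_extinction *= 1 - p  # extinction avoided for every type
p_any_star = 1 - p_no_extinction  # ~0.0426, i.e. 4.26 %
```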

The inputs to the calculations are:

Concretely, I calculated the probability of extinction from the sum of the following 3 products (see tab “Probability of extinction by catastrophe” of this Sheet):
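To illustrate the sum-of-products structure, the total for nuclear war can be reproduced from the values in the results tables (a sketch; all probabilities in %):

```python
# Sum-of-products sketch for one type of catastrophe (nuclear war).
# For each population loss range, P(extinction | range) mixes the 3
# scenarios, and the total is the sum over ranges of
# P(loss in range) * P(extinction | range).
p_loss = [90.4, 9.22, 0.384]  # P(loss in range): 0-10 %, 10-95 %, 95 %-1
p_scenario = [0, 1 / 3, 2 / 3]  # P(scenario | loss) for nuclear war
# P(extinction | scenario, range) in %; rows = scenarios, columns = ranges.
p_ext = [
    [0, 0.0413, 25.1],    # no major infrastructure damage nor climate change
    [0.176, 2.05, 55.1],  # major infrastructure damage and climate change
    [0, 0.291, 37.2],     # either major infrastructure damage or climate change
]
total = sum(
    p_loss[r] / 100 * sum(p_scenario[s] * p_ext[s][r] for s in range(3))
    for r in range(3)
)  # ~0.299 %, matching the summary value for nuclear war
```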

Population loss

I computed the probability of the population loss falling into each of the 3 population loss ranges presented above based on the complementary cumulative distribution function (CCDF) of the population loss (see tab “Probability of the population loss”). I assumed the CCDF decreases linearly between each consecutive pair of the following points[3] (see tab “CCDF of the population loss”):

I set the probabilities required to determine the CCDF for the population losses of 10 % and 95 % to Metaculus’ community predictions (collected in tab “Metaculus' predictions”).
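A minimal sketch of how the interval probabilities follow from a piecewise-linear CCDF; the breakpoint values below are purely illustrative, not the actual Metaculus community predictions:

```python
# Piecewise-linear CCDF of the population loss, specified at a few points
# (loss, P(loss >= x)). The values at losses of 10 % and 95 % would come
# from Metaculus' community predictions; these numbers are illustrative.
ccdf_points = [(0.0, 1.0), (0.10, 0.096), (0.95, 0.004), (1.0, 0.0)]

def prob_loss_in(a, b, points):
    """P(a <= loss <= b) = CCDF(a) - CCDF(b), interpolating linearly."""
    def ccdf(x):
        for (x0, y0), (x1, y1) in zip(points, points[1:]):
            if x0 <= x <= x1:
                return y0 + (y1 - y0) * (x - x0) / (x1 - x0)
        return 0.0
    return ccdf(a) - ccdf(b)
```

For instance, with these illustrative breakpoints, `prob_loss_in(0.10, 0.95, ccdf_points)` gives 0.092.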

Probability of extinction

I calculated the probability of extinction for each of the 3 population loss ranges presented above for 3 exhaustive scenarios (see tab “Probability of extinction by scenario”):

To illustrate what is intended by “major infrastructure damage” and “major climate change”, Luisa writes:

For the 1st and 2nd scenarios, I determined the probability of extinction from its mean value for each of the population loss ranges. For the 3rd scenario, I computed it as the geometric mean of the values for the 1st and 2nd scenarios.
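For instance, for a population loss between 95 % and 1, the value for the 3rd scenario follows from the geometric mean of the values for the other 2 scenarios (a sketch using the figures from the results section):

```python
import math

# Probability of extinction for the "either major infrastructure damage or
# climate change" scenario, for a population loss between 95 % and 1, as
# the geometric mean of the values for the other 2 scenarios.
p_neither = 0.251  # no major infrastructure damage nor climate change
p_both = 0.551     # major infrastructure damage and climate change
p_either = math.sqrt(p_neither * p_both)  # ~0.372, i.e. 37.2 %
```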

I assumed the probability of extinction, as a function of the population loss, increases linearly between each consecutive pair of the following points:

PE_1, PL, PE_2 and PE_3 are the geometric means between the lower and upper bounds of the best guesses provided by Luisa here:

Probability of the scenarios

My guesses for the probability of each of the 3 scenarios defined in the previous section, given a population loss caused by a certain type of catastrophe, are in the table below (and in tab “Probability of extinction scenarios by catastrophe”). I calculated the probability for the type “other” as the mean of the probabilities for the other types of catastrophe (excluding “any”), and the one for the type “any” as the mean of the probabilities of the various types weighted by their probability of leading to a population loss between 95 % and 1.

Probability of the scenario given a population loss:

| Type of catastrophe (in the 21st century) | No major infrastructure damage nor climate change | Major infrastructure damage and climate change | Either major infrastructure damage or climate change |
| --- | --- | --- | --- |
| Any | 29.2 % | 22.5 % | 48.3 % |
| Artificial intelligence | 1/4 | 1/4 | 1/2 |
| Climate change and geoengineering | 0 | 0 | 1 |
| Nanotechnology | 0 | 1/3 | 2/3 |
| Nuclear war | 0 | 1/3 | 2/3 |
| Synthetic biology | 1 | 0 | 0 |
| Other | 25.0 % | 18.3 % | 56.7 % |
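The weighted mean behind the “any” row can be sketched as follows, using the probabilities of a population loss between 95 % and 1 from the results section as weights (shown for the 1st scenario):

```python
# Weights: probability (%) of a population loss between 95 % and 1,
# from the results section.
weights = {
    "artificial intelligence": 7.20,
    "climate change and geoengineering": 0.0160,
    "nanotechnology": 0.422,
    "nuclear war": 0.384,
    "synthetic biology": 0.864,
    "other": 1.69,
}
# Probability of the "no major infrastructure damage nor climate change"
# scenario given a population loss, by type of catastrophe.
p_scenario = {
    "artificial intelligence": 1 / 4,
    "climate change and geoengineering": 0,
    "nanotechnology": 0,
    "nuclear war": 0,
    "synthetic biology": 1,
    "other": 0.250,
}
total_weight = sum(weights.values())
p_any = sum(weights[k] * p_scenario[k] for k in weights) / total_weight
# ~0.292, matching the 29.2 % in the table
```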

Results

The tables below contain the results for:

Probability (%) of a population loss between the bounds of each range:

| Type of catastrophe (in the 21st century) | 0 to 10 % | 10 % to 95 % | 95 % to 1 |
| --- | --- | --- | --- |
| Any* | 100 | 25.3 | 10.3 |
| Any | 68.0 | 27.8 | 4.16 |
| Artificial intelligence | 90.4 | 2.40 | 7.20 |
| Climate change and geoengineering | 98.4 | 1.58 | 0.0160 |
| Nanotechnology | 99.0 | 0.538 | 0.422 |
| Nuclear war | 90.4 | 9.22 | 0.384 |
| Synthetic biology | 90.4 | 8.74 | 0.864 |
| Other | 92.6 | 5.67 | 1.69 |
Probability of extinction (%) for a population loss in each range, by scenario:

| Scenario | 0 to 10 % | 10 % to 95 % | 95 % to 1 |
| --- | --- | --- | --- |
| No major infrastructure damage nor climate change | 0 | 0.0413 | 25.1 |
| Major infrastructure damage and climate change | 0.176 | 2.05 | 55.1 |
| Either major infrastructure damage or climate change | 0 | 0.291 | 37.2 |
Probability of extinction (%) for a population loss in each range, by type of catastrophe:

| Type of catastrophe (in the 21st century) | 0 to 10 % | 10 % to 95 % | 95 % to 1 | 0 to 1 (total) |
| --- | --- | --- | --- | --- |
| Any* | 0.180 | 0.141 | 3.95 | 4.26 |
| Any | 0.0269 | 0.171 | 1.57 | 1.77 |
| Artificial intelligence | 0.0397 | 0.0160 | 2.78 | 2.84 |
| Climate change and geoengineering | 0 | 0.00461 | 0.00595 | 0.0106 |
| Nanotechnology | 0.0580 | 0.00471 | 0.182 | 0.245 |
| Nuclear war | 0.0529 | 0.0808 | 0.166 | 0.299 |
| Synthetic biology | 0 | 0.00361 | 0.217 | 0.220 |
| Other | 0.0298 | 0.0312 | 0.634 | 0.695 |

Discussion

Probability of extinction by scenario

The relative importance of major infrastructure damage and climate change decreases as the severity of the population loss increases. The ratio between the probability of extinction with neither major infrastructure damage nor climate change and the probability of extinction with both is (see cells F3:F5 of tab “Probability of extinction by scenario”):

This tendency seems correct, as the probability of extinction is 1 for a population loss of 1 regardless of infrastructure damage and climate change.

Probability of extinction by type of catastrophe

Comparison of absolute values with the GCRS

In the table below (and in tab “Comparison of absolute values with the GCRS”), I compare my estimates of the probability of extinction by type of catastrophe in the 21st century with estimates I derived from the 2008 Global Catastrophic Risks Survey (GCRS), whose results are presented in this report by Anders Sandberg and Toby Ord from the Future of Humanity Institute[4] (see tab “2008 Global Catastrophic Risks Survey”). The GCRS estimates refer to the period from 2009 to 2099, but I adjusted them to the period from 2023 to 2100 assuming constant risk. Additionally, I derived the GCRS estimate for “other” risks assuming independence between the types of catastrophes[5].
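The constant-risk adjustment can be sketched as follows, using the GCRS median of 5 % for extinction caused by superintelligent AI (the 5 % figure is from the survey report; the rest is arithmetic):

```python
# Adjusting a GCRS estimate (for 2009-2099) to the period 2023-2100,
# assuming a constant annual risk.
p_gcrs = 0.05                    # GCRS median for AI-caused extinction
years_gcrs = 2099 - 2009 + 1     # 91 years
years_target = 2100 - 2023 + 1   # 78 years
p_adjusted = 1 - (1 - p_gcrs) ** (years_target / years_gcrs)
# ~0.0430, i.e. the 4.30 % shown for artificial intelligence below
```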

| Type of catastrophe (in the 21st century) | My analysis (%) | GCRS (%) | Absolute difference to GCRS (pp) | Relative difference to GCRS (%) |
| --- | --- | --- | --- | --- |
| Any* | 4.26 | 16.5 | -12.3 | -74.2 |
| Any | 1.77 | 16.5 | -14.8 | -89.3 |
| Artificial intelligence | 2.84 | 4.30 | -1.46 | -34.0 |
| Nanotechnology | 0.245 | 4.30 | -4.06 | -94.3 |
| Nuclear war | 0.299 | 0.858 | -0.558 | -65.1 |
| Synthetic biology | 0.220 | 1.72 | -1.50 | -87.2 |
| Other | 0.695 | 6.46 | -5.76 | -89.2 |

My probabilities of extinction are lower than those I derived from the GCRS for all types of catastrophe. Nanotechnology has the largest relative difference, and artificial intelligence the smallest.

The GCRS did not address “climate change and geoengineering”, but my estimate of 0.0106 % is similar to:

Comparison of priorities with The Precipice

Ultimately, what is most relevant for prioritisation is how the various probabilities compare with each other. With this in mind, in the table below (and in tab “Comparison of priorities with The Precipice”), I present the probability of extinction in the 21st century as a fraction of that for “any*”, alongside the existential risk between 2021 and 2120 guessed by Toby Ord in The Precipice (see tab “Existential risk estimates from The Precipice”) as a fraction of the total. The existential risk for “other” was estimated from those for “unforeseen anthropogenic risk” and “other anthropogenic risk” assuming independence between them.

| Type of catastrophe | Normalised probability of extinction for a catastrophe in the 21st century (%) | Normalised existential risk from 2021 to 2120 (%) | Ratio | Decimal logarithm of the ratio |
| --- | --- | --- | --- | --- |
| Artificial intelligence | 66.6 | 60.0 | 1.11 | 0.0455 |
| Climate change and geoengineering | 0.248 | 0.600 | 0.413 | -0.384 |
| Nuclear war | 7.03 | 0.600 | 11.7 | 1.07 |
| Synthetic biology | 5.17 | 20.0 | 0.259 | -0.587 |
| Other | 16.3 | 31.6 | 0.516 | -0.287 |
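The normalisation can be sketched for artificial intelligence (using Toby Ord's guesses of 1/10 for AI existential risk and 1/6 for the total, from The Precipice):

```python
import math

# Normalised comparison for artificial intelligence: my probability of
# extinction as a fraction of "any*", versus Ord's existential risk
# estimate as a fraction of his total.
mine = 2.84 / 4.26         # ~0.667, i.e. 66.6 % after normalisation
ord_ = (1 / 10) / (1 / 6)  # 0.6, i.e. 60.0 %
ratio = mine / ord_        # ~1.11
log_ratio = math.log10(ratio)  # ~0.0455
```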

Relative to Toby Ord’s best guesses, my analysis suggests the relative importance of:

The adequacy of this comparison depends on the extent to which probability of extinction is a good proxy for existential risk.

Quality of the inputs

In essence, the results I obtained are a function of guesses from Metaculus’ forecasters, Luisa Rodriguez, and me. I should note there is margin to improve the quality of the inputs:

That being said, for the reasons outlined by Scott Alexander here, I believe establishing priorities based on a quantitative model with guessed inputs is often better than guessing priorities.

  1. ^

     To clarify, the probability refers to catastrophes occurring during the 21st century, but the extinction may happen afterwards.

  2. ^

     The results in the Sheet are updated automatically as Metaculus’ community predictions change.

  3. ^

     This implies the probability density function (PDF) of the population loss is uniform for each of the 3 ranges.

  4. ^

     More existential risk estimates are available in this database, which was introduced by Michael Aird here.

  5. ^

     This implies the GCRS’ estimates for “any*” are the same as for “any”.

  6. ^

     “With those caveats in my mind, my best guess estimate is that the indirect risk of existential catastrophe due to climate change is on the order of 1 in 100,000, and I struggle to get the risk above 1 in 1,000. Working directly on US-China, US-Russia, India-China, or India-Pakistan relations seems like a better way to reduce the risk of Great Power War than working on climate change”. I guess John’s best guess for the total risk of existential catastrophe due to climate change is similar to John’s best guess for the indirect risk, which equals John’s upper bound for the direct risk: “I [John] construct several models of the direct extinction risk from climate change but struggle to get the risk above 1 in 100,000 over all time”.

  7. ^

     “That said, we [80,000 Hours] still think this risk is relatively low. If climate change poses something like a 1 in 1,000,000 risk of extinction by itself, our guess is that its contribution to other existential risks is at most a few orders of magnitude higher — so something like 1 in 10,000”.

  8. ^

     Metaculus’ predictions are at the bottom of Eli’s personal tier list for how much weight to give to AI existential risk forecasts (see this footnote for details).