Nuclear safety/security: Why doesn't EA prioritize it more?
By Rockwell @ 2023-08-30T21:43 (+33)
I'm dissatisfied with my explanation of why there is not more attention from EAs and EA funders on nuclear safety and security, especially relative to e.g. AI safety and biosecurity. This has come up a lot recently, especially after the release of Oppenheimer. I'm worried I'm not capturing the current state of affairs accurately and consequently not facilitating fully contextualized dialogue.
What is your best short explanation?
(To be clear, I know many EAs and EA funders are working on nuclear safety and security, so this is more a question of resource allocation than of inclusion in the broader EA cause portfolio.)
Jackson Wagner @ 2023-08-30T22:36 (+22)
I'm definitely not deeply familiar with any kind of "official EA thinking" on this topic (ie, I don't know any EAs that specialize in nuclear security research / grantmaking / etc). But here are some things I just thought up, which might possibly be involved:
- Neglectedness in the classic sense. Although not as crowded as climate change, there are other large organizations / institutions that address nuclear risk and have been working in this space since the early Cold War. (Here I am thinking not just about charitable foundations, but also DC think-tanks, university departments, and even the basic structure of the US military-industrial complex which naturally involves a lot of people trying to figure out what to do about nuclear weapons and war.)
- Nuclear war might be slightly lower-ranked on the importance scale of a very committed and philosophically serious longtermist, since it seems harder for a nuclear war to literally kill everyone (wouldn't New Zealand still make it? etc) than for a sufficiently super-intelligent AI or a sufficiently terrifying engineered bioweapon. So this places nuclear war risk somewhere on a spectrum between being a direct existential threat and being more of an "existential risk factor" (like climate change). Personally, I find it hard to bite that longtermist bullet all the way, emotionally. (ie, "The difference between killing 99% of people and 100% of people is actually a bazillion times worse than the difference between killing 99% versus 0%".) So I feel like nuclear war pretty much maxes out my personal, emotional "importance scale". But other people might be better than me at shutting up and multiplying! (And/or place higher odds than I do on civilization eventually recovering fully after a nuclear war.)
- Tractability, in the sense that a lot of nuclear policy is decided by the US military-industrial complex (and people like the US president), in a way that seems pretty hard for the existing EA movement to influence? And then it gets even worse, because of course the OTHER half of the equation is being decided by the military-industrial complexes of Russia, China, India, etc -- this seems even harder to influence! By contrast, AI safety is hugely influenceable by virtue of the fact that the top AI labs are right in the bay area and their researchers literally go to some of the same social events as bay-area EAs. Biosecurity seems like a middle-ground case, where on the downside there isn't the crazy social overlap, but on the plus side it's a partly academic field which is amenable to influence via charities, academic papers, hosting conferences, advocating for regulation, trying to spread good ideas via podcasts and blog posts, etc...
- Tractability, in a different sense, namely that it's pretty unclear exactly HOW to reduce the risk of a nuclear war, which interventions are helpful vs harmful, etc. For instance, lots of anti-nuclear activists advocate for reducing nuclear stockpiles (which certainly seems like it would help reduce the severity of a worst-case nuclear war), but my impression is that many experts (both within EA and within more traditional bastions of nuclear security research) are very uncertain about the impact of unilaterally reducing our nuclear stockpiles -- for example, maybe it would actually increase the damage caused by a nuclear war if we got rid of our land-based "nuclear sponge" ICBMs? Besides severity, what impact might reduced stockpiles have on the likelihood of nuclear war, if any? My impression is that these kinds of tricky questions are even more common in nuclear security than they are in the already troublesome fields of AI safety and biosecurity.
If I had to take a wild guess, I would say that my first Tractability point (as in, "I don't know anybody who works at STRATCOM or the People's Liberation Army Rocket Force") is probably the biggest roadblock in an immediate sense. But maybe EA would have put more effort into building more influence here if we had prioritized nuclear risk more from the start -- and perhaps that lack of historical emphasis is due to some mix of the other problems I mentioned?
Davidmanheim @ 2023-08-31T05:58 (+6)
This gets a lot of things right, but (knowing some of the EAs who did look into this or work on it now) I would add a few:
1. Lindy effect and stability - we're nearly 80 years in, and nuclear weapons haven't been used since the first use in 1945, so we expect the situation is somewhat stable - not very stable, but under this type of estimation the risk from newer technologies is higher, because we have less of a track record.
2. The current inside-view stability of the nuclear situation, where strong norms against use exist and are already being reinforced by large actors with deep pockets.
3. There seems to be a pretty robust expert consensus about the problem, and it concludes that there is little to be done other than on the margin.
Also, note that this was investigated as a cause area early on by Open Philanthropy, and then was looked at by Longview more recently. Both decided to have it as a small focus, rather than a key area. Edit (to correct a mistake): It was looked at by Longview more recently, and they have highlighted the topic significantly more, especially in the wake of other funders withdrawing support.
jackva @ 2023-08-31T10:09 (+18)
This characterization seems to me pretty at odds with recent EA work, e.g. from Longview but also my colleague Christian Ruhl at FP, who tend to argue that the philanthropic space on nuclear risk is very funding-constrained and that plenty of good funding margins are left unfilled.
christian.r @ 2023-08-31T13:07 (+14)
For anyone who is interested, Founders Pledge has a longer report on this (with a discussion of funding constraints as well as funding ideas that could absorb a lot of money), as well as some related work on specific funding opportunities like crisis communications hotlines.
Davidmanheim @ 2023-08-31T15:57 (+4)
Thanks for the correction. I think you're right, and have edited the last bit above to say:
Also, note that this was investigated as a cause area early on by Open Philanthropy, and then was looked at by Longview more recently. Both decided to have it as a small focus, rather than a key area. Edit (to correct a mistake): It was looked at by Longview more recently, and they have highlighted the topic significantly more, especially in the wake of other funders withdrawing support.
Denkenberger @ 2023-09-06T07:13 (+4)
Neglectedness in the classic sense. Although not as crowded as climate change, there are other large organizations / institutions that address nuclear risk and have been working in this space since the early Cold War.
I agree that the nuclear risk field as a whole is less neglected than AGI safety (and probably than engineered pandemics), but I think that resilience to nuclear winter is more neglected. That's why I think the overall cost-effectiveness of resilience is competitive with AGI safety.
Nathan Young @ 2023-08-31T13:09 (+3)
It seems very unlikely that a nuclear war will kill all of us, compared to biorisk where this seems more possible.
Not sure this should affect funding in general, but explicitly longtermist funders will therefore weight biorisk more.
Davidmanheim @ 2023-08-31T17:44 (+9)
I'll point out that I'm skeptical that most biorisks could be existential risks - I think they are plausible global catastrophic risks, and overlapping risks could lead to extinction, but I think a disease that kills over 99.9% of the global population - the scenario people tend to imagine - is very unlikely, even accounting for near-term potential bioweapons.
christian.r @ 2023-08-31T13:21 (+4)
This does help answer the question, but it conflates extinction risk with existential risk, which I think is a big mistake in general. This chapter in How Worlds Collapse does a nice job of explaining this:
"Currently, existential risk scholars tend to focus on events and processes for which a fairly direct, simple story can be told about how they could lead to extinction. [...] However, if there is a substantial probability that collapse could destroy humanity's longterm potential [including by recovery with bad values], this should change one's view of catastrophic risks..."
To be clear, I think there are other good reasons to weight biorisk more (as I do).
Nathan Young @ 2023-08-31T14:03 (+2)
Okay, but I just think that's not that common a view. If you leave 1,000 - 10,000 humans alive, the longterm future is probably fine. So that cuts the existential-risk weighting of nuclear war down by 60 - 90%.
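A minimal sketch of the arithmetic I have in mind (the 60 - 90% recovery probability is the only number from my comment; the collapse probability is a placeholder):

```python
# Minimal sketch: if recovery from a ~1,000-10,000-person bottleneck is
# 60-90% likely, the extinction-relevant share of nuclear risk shrinks by
# that same 60-90%. The collapse probability is a placeholder, not an estimate.
p_collapse = 0.01  # hypothetical P(nuclear war reduces humanity to 1,000-10,000 people)

for p_recovery in (0.6, 0.9):
    p_extinction_via_collapse = p_collapse * (1 - p_recovery)
    reduction = 1 - p_extinction_via_collapse / p_collapse
    print(f"P(recovery) = {p_recovery:.0%}: extinction-relevant risk = "
          f"{p_extinction_via_collapse:.4f} ({reduction:.0%} lower than treating collapse as extinction)")
```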
Matt_Lerner @ 2023-08-31T15:51 (+9)
If you leave 1,000 - 10,000 humans alive, the longterm future is probably fine
This is a very common claim that I think needs to be defended somewhat more robustly instead of simply assumed. If we have one strength as a community, it is in not simply assuming things.
My read is that the evidence here is quite limited, that the outside view suggests that losing 99.9999% of a species / having a very small population is a significant extinction risk, and that the uncertainty around the long-term viability of collapse scenarios is enough reason to want to avoid near-extinction events.
Nathan Young @ 2023-08-31T16:15 (+2)
Why do I think 1,000 - 10,000 humans is probably (60 - 90%) fine?
According to Luisa Rodriguez, you need about 300 people to rebuild the human race.
And again, even if this happens in some places — even if some groups fought each other until they literally ended up starving to death — it would be completely bizarre for it to happen to every group in the world. You just need one group of around 300 people to survive for them to be able to rebuild the species.
These people seem likely to be very incentivised towards survival - humans generally like surviving. It would be awful for them, sure, but the question is whether they would rebuild us as a species - and I think the answer is probably yes.
And let's remember that this is the absolute worst case scenario. The human race has twice dropped nuclear bombs in war and then never again. It seems a big leap to imagine that not only will we do so again, but that we will wipe ourselves out so thoroughly that only one such group remains.
Every successive group that could rebuild the human race is extra. I imagine that hundreds of millions would actually survive a worldwide nuclear war, so the point we are litigating is a very small chance anyway.
the outside view suggests that losing 99.9999% of a species / having a very small population is a significant extinction risk
I don't really know what base rates I'd use here. It feels like you want natural disasters rather than predation. When the meteor hit, do we know how population size affected repopulation? Even then, humans are just way more competent than any other animals. So, as I said originally, we might be looking at a 10 - 40% chance of not recovering given the near worst case scenario, but I don't buy your outside view.
I'd be curious what others' outside views are here, and if anyone has actual base rates on disaster-driven animal populations and repopulation.
As an aside,
I think needs to be defended somewhat more robustly instead of simply assumed
I disagree. I've said what I think, you can push back on it if you want, but why is it bad to "simply assume" my view rather than yours?
Matt_Lerner @ 2023-08-31T16:51 (+7)
My point is precisely that you should not assume any view. My position is that the uncertainties here are significant enough to warrant some attention to nuclear war as a potential extinction risk, rather than to simply bat away these concerns on first principles and questionable empirics.
Where extinction risk is concerned, it is potentially very costly to conclude on little evidence that something is not an extinction risk. We do need to prioritize, so I would not for instance propose treating bad zoning laws as an X-risk simply because we can't demonstrate conclusively that they won't lead to extinction. Luckily there are very few things that could kill very large numbers of people, and nuclear war is one of them.
I don't think my argument says anything about how nuclear risk should be prioritized relative to other X-risks. I think the arguments for deprioritizing it relative to others are strong, and reasonable people can disagree; YMMV.
Nathan Young @ 2023-08-31T17:09 (+1)
My argument does say something about how nuclear risk should be prioritised. If both risks existed, nuclear would be a lower priority. Maybe much lower.
The complicated thing is that nuclear risks do exist whereas biorisk and AI risk are much more speculative in terms of actually existing. In this sense I can believe nuclear should be funded more.
Matt_Lerner @ 2023-08-31T17:11 (+4)
I think your arguments do suggest good reasons why nuclear risk might be prioritized lower; since we operate on the most effective margin, as you note, it is also possible at the same time for there to be significant funding margins in nuclear that are highly effective in expectation.
Nathan Young @ 2023-08-31T17:38 (+2)
Do you work on researching nuclear risk?
How do you think this disagreement could be more usefully delineated? It seems like there is some interesting disagreement here.
Denkenberger @ 2023-09-06T07:10 (+2)
I'm not Matt, but I do work on nuclear risk. If we went down to 1,000 to 10,000 people, recovery would take a long time, so there is a significant chance of a supervolcanic eruption or asteroid/comet impact causing extinction. People note that agriculture/cities developed independently in multiple regions, indicating that recovery is high probability. However, that only happened when we had a stable, moderate climate, which might not recur. Furthermore, the Industrial Revolution only happened once, so there is less confidence that it would happen again. In addition, it would be more difficult with depleted fossil fuels, phosphorus, etc. Even if we did recover industry, I think our current values are better than randomly chosen values (e.g. slavery might continue longer or democracy be less prevalent).
Nathan Young @ 2023-09-07T12:58 (+4)
This feels too confident. A nuclear war into a supervolcano is just really unlikely. Plus if there were 1,000 people then there would be so many canned goods left over - just go to a major city and sit in a supermarket.
If a major city can support a million people for 3 days on its reserves, it can support 1,000 people for roughly eight years.
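The back-of-the-envelope arithmetic (the three-day reserve figure is my assumption about typical city food stocks, not a sourced number):

```python
# Back-of-the-envelope: stored food in one major city, in person-days.
# The three-day reserve figure is an assumption, not a sourced number.
city_population = 1_000_000
reserve_days = 3
person_days_of_food = city_population * reserve_days  # 3,000,000 person-days

survivors = 1_000
years_of_food = person_days_of_food / survivors / 365
print(f"{years_of_food:.1f} years")  # ~8.2 years for 1,000 survivors in one city
```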
Again, I'm not saying that I think it doesn't matter, but I think my answers are good reasons why it's a lower priority than AI.
Denkenberger @ 2023-09-08T01:14 (+2)
A nuclear war into a supervolcano is just really unlikely.
A nuclear war happening at the same time as a supervolcano is very unlikely. However, it could take a hundred thousand years to recover population, so if supervolcanic eruptions happen roughly every 30,000 years on average, it's quite likely there would be one before we recover.
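A rough way to see this, treating eruptions as a Poisson process with the round numbers above (both figures are order-of-magnitude estimates, not precise values):

```python
import math

# Chance of at least one supervolcanic eruption during a ~100,000-year
# recovery, treating eruptions as a Poisson process with a ~30,000-year
# mean interval. Both figures are rough, order-of-magnitude numbers.
recovery_years = 100_000
mean_eruption_interval_years = 30_000

expected_eruptions = recovery_years / mean_eruption_interval_years  # ~3.3
p_at_least_one = 1 - math.exp(-expected_eruptions)
print(f"P(at least one eruption before recovery) ~ {p_at_least_one:.0%}")  # ~96%
```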
Plus if there were 1,000 people then there would be so many canned goods left over - just go to a major city and sit in a supermarket.
The scenario I'm talking about is one where the worsening climate and loss of technology mean there would not be enough food, so the stored food would be consumed quickly. Furthermore, edible wild species, including fish, may be eaten to extinction.
Again, I'm not saying that I think it doesn't matter, but I think my answers are good reasons why it's a lower priority than AI.
I agree that more total money should be spent on AGI safety than on nuclear issues. However, resilience to sunlight reduction is much more neglected than AGI safety. That's why the Monte Carlo analyses found that the cost-effectiveness of resilience to loss of electricity (e.g. high-altitude detonations of nuclear weapons causing electromagnetic pulses) and of resilience to nuclear winter is competitive with AGI safety.
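For readers unfamiliar with that kind of analysis, here is a toy sketch of what a Monte Carlo cost-effectiveness comparison looks like; the distributions and ranges are purely illustrative placeholders, not the parameters of the published models:

```python
import random

# Toy Monte Carlo sketch of comparing two interventions by expected
# existential risk reduced per million dollars spent. All ranges are
# illustrative placeholders, NOT the parameters of any published analysis.
random.seed(0)
N = 100_000

def cost_effectiveness(risk_reduction_range, cost_range_million_usd):
    """One Monte Carlo draw of risk reduced per million dollars spent."""
    risk_reduced = random.uniform(*risk_reduction_range)
    cost = random.uniform(*cost_range_million_usd)
    return risk_reduced / cost

resilience_draws = [cost_effectiveness((1e-5, 1e-3), (30, 300)) for _ in range(N)]
agi_safety_draws = [cost_effectiveness((1e-4, 1e-2), (300, 3000)) for _ in range(N)]

print("mean cost-effectiveness, resilience:", sum(resilience_draws) / N)
print("mean cost-effectiveness, AGI safety:", sum(agi_safety_draws) / N)
```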
Mohammad Ismam Huda @ 2023-09-01T05:48 (+2)
I agree with you; it is disappointing that EA is doing so little in this area.
In Australia, we have a speaker from ICAN (the Nobel Prize-winning anti-nuclear weapons NGO) attending the 2023 EAGxAustralia conference in Melbourne. In my opinion, it's a particularly promising area for big impact (and especially for Aussie EAs) due to the recently developed AUKUS alliance. The details of the alliance are still being fleshed out, and a big opportunity exists to shape the alliance to reduce the risk of a conflict between great powers.